Evaluation Method of English-Speaking Self-Learning System Based on Natural Language Processing Technology

Xiaolan Guo

Abstract

In education, advanced natural language processing (NLP) techniques have gained prominence for their potential to transform the oral learning experience. Self-directed English oral learning has become increasingly popular owing to its flexibility and accessibility: with the advent of digital resources and language-learning apps, students can now engage in language acquisition on their own terms. This paper presents a novel framework that combines the Bidirectional Encoder Representations from Transformers (BERT) model with the Hidden Condition Random Model (HCRM) to enhance English oral learning. The primary goal is to provide educators and institutions with a robust tool for evaluating the relevance and quality of oral learning materials. The HCRM architecture incorporates sentiment analysis, feature extraction, and classification, making it a comprehensive solution for assessing the suitability of documents in the context of English oral learning. The model takes into account the opinions of both students and teachers, ensuring a holistic perspective on the effectiveness of oral learning materials. By analyzing sentiments and extracting pertinent features, the HCRM supports a nuanced understanding of the potential impact of educational content. The findings suggest that integrating BERT with the HCRM can substantially enhance English oral learning by providing a more accurate, holistic, and data-driven approach to material assessment. The framework presented in this research therefore holds promise for improving the quality and relevance of oral learning materials in English instruction.
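To make the evaluation idea concrete, the sketch below illustrates the kind of pipeline the abstract describes: review texts are scored for sentiment and the opinions of students and teachers are combined into a single relevance score for a learning material. This is a minimal, hypothetical illustration only; the word lists, weights, and function names are assumptions for demonstration, and the toy cue-word scorer stands in for the paper's BERT-based sentiment component, which is not reproduced here.

```python
# Illustrative sketch of combining student and teacher sentiment into
# one holistic relevance score for a learning material. The cue-word
# scorer below is a toy stand-in for a BERT sentiment head; all names
# and weights are hypothetical, not taken from the paper.

from collections import Counter

POSITIVE = {"clear", "useful", "engaging", "helpful"}
NEGATIVE = {"confusing", "boring", "outdated"}

def sentiment_score(text: str) -> float:
    """Fraction of positive minus negative cue words, in [-1, 1]."""
    words = Counter(text.lower().split())
    pos = sum(words[w] for w in POSITIVE)
    neg = sum(words[w] for w in NEGATIVE)
    total = pos + neg
    return 0.0 if total == 0 else (pos - neg) / total

def material_relevance(student_reviews, teacher_reviews,
                       student_weight=0.5, teacher_weight=0.5) -> float:
    """Weight both stakeholder groups into one holistic score."""
    def mean(xs):
        return sum(xs) / len(xs) if xs else 0.0
    s = mean([sentiment_score(r) for r in student_reviews])
    t = mean([sentiment_score(r) for r in teacher_reviews])
    return student_weight * s + teacher_weight * t

score = material_relevance(
    ["very clear and engaging lesson", "a bit boring at times"],
    ["useful dialogues, helpful exercises"],
)
print(round(score, 2))  # prints 0.5
```

In a full implementation, `sentiment_score` would instead run each review through a fine-tuned BERT classifier, and the weights could be learned rather than fixed.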

Article Details

Section
Articles
Author Biography

Xiaolan Guo

Department of Foreign Language, Ganzhou Teachers College, Ganzhou, 341000, China

*Corresponding author e-mail: Lanmoon08@sina.com

Copyright © JES 2023 on-line : journal.esrgroups.org

References

Mishra, A. (2021). Self-Learning System for Child Development Using Conversational AI and Natural Language Processing (NLP). In Impact of AI Technologies on Teaching, Learning, and Research in Higher Education (pp. 124-133). IGI Global.

Fan, X., Cho, E., Huang, X., & Guo, E. (2021). Search-based self-learning query rewrite system in conversational AI.

Zirape, S., Sharma, S., Ali, I. H., Kumbhar, M., Nalawade, R., & Jansari, M. (2023). Add Self-Learning Ability to NLP for Automatic Test Case Generation.

Xiang, X., & Foo, S. (2021). Recent advances in deep reinforcement learning applications for solving partially observable Markov decision processes (POMDP) problems: Part 1—fundamentals and applications in games, robotics and natural language processing. Machine Learning and Knowledge Extraction, 3(3), 554-581.

Budiharto, W., Andreas, V., & Gunawan, A. A. S. (2021). A novel model and implementation of humanoid robot with facial expression and natural language processing (NLP). ICIC Express Letters, Part B: Applications, 12(3), 275-281.

El Essawy, E. S., & Dowydar, M. (2022). Effect of a counseling program based on neuro-linguistic programming (NLP) on self-learning skills and psychological resilience for students of the Faculty of Sport Education during the novel coronavirus pandemic. The Scientific Journal of Physical Education and Sports Sciences, 45(1), 197-237.

Wei, L., Li, Q., Song, Y., Stefanov, S., Siriwardane, E., Chen, F., & Hu, J. (2022). Crystal transformer: Self-learning neural language model for generative and tinkering design of materials. arXiv preprint arXiv:2204.11953.

Li, W., Ma, K., Qiu, Q., Wu, L., Xie, Z., Li, S., & Chen, S. (2021). Chinese Word Segmentation Based on Self‐Learning Model and Geological Knowledge for the Geoscience Domain. Earth and Space Science, 8(6), e2021EA001673.

Joaquim, C. E. D. L., & Faleiros, T. D. P. (2022, June). BERT Self-Learning Approach with Limited Labels for Document Classification. In International Conference on Learning and Intelligent Optimization (pp. 278-291). Cham: Springer International Publishing.

Yurchenko, O., Cherednichenko, O., Trofimova-Herman, A., & Kupriianov, Y. (2023, April). Towards Cross-Lingual Transfer Based on Self-Learning Conversational Agent Model. In 6th International Conference on Computational Linguistics and Intelligent Systems (CoLInS 2023) (Vol. 3396, pp. 194-205).

Joaquim, C. E. D. L. (2022). BERT self-learning approach with limited labels for document classification of a Brazilian Army’s administrative documentary set.

Neto, J. R. C., & Faleiros, T. D. P. (2021). Deep Active-Self Learning Applied to Named Entity Recognition. In Intelligent Systems: 10th Brazilian Conference, BRACIS 2021, Virtual Event, November 29–December 3, 2021, Proceedings, Part II 10 (pp. 405-418). Springer International Publishing.

Dong, X. (2023). Methods for Leveraging Auxiliary Signals for Low-Resource NLP (Doctoral dissertation, Rutgers The State University of New Jersey, School of Graduate Studies).

Fang, Q., Ye, R., Li, L., Feng, Y., & Wang, M. (2022). Stemm: Self-learning with speech-text manifold mixup for speech translation. arXiv preprint arXiv:2203.10426.

Fan, S., Liu, J., An, G., & Li, X. (2021, December). Research on data processing method of railway safety information based on NLP technology. In 2021 3rd International Academic Exchange Conference on Science and Technology Innovation (IAECST) (pp. 1304-1307). IEEE.

Alsafari, S., & Sadaoui, S. (2021, October). Semi-supervised self-learning for Arabic hate speech detection. In 2021 IEEE International Conference on Systems, Man, and Cybernetics (SMC) (pp. 863-868). IEEE.

Dash, B., Swayamsiddha, S., & Ali, A. I. (2023). Evolving of Smart Banking with NLP and Deep Learning. In Enabling Technologies for Effective Planning and Management in Sustainable Smart Cities (pp. 151-172). Cham: Springer International Publishing.

Xu, L., Zhang, X., Zhao, X., Chen, H., Chen, F., & Choi, J. D. (2021). Boosting cross-lingual transfer via self-learning with uncertainty estimation. arXiv preprint arXiv:2109.00194.

Zhang, H., Chao, B., Huang, Z., & Li, T. (2022). Construction and Research on Chinese Semantic Mapping Based on Linguistic Features and Sparse Self-Learning Neural Networks. Computational Intelligence and Neuroscience, 2022.

Kazakova, M. A., & Sultanova, A. P. (2022). Analysis of natural language processing technology: modern problems and approaches. Advanced Engineering Research, 22(2), 169-176.

Nayak, S., Kanetkar, A., Hirudkar, H., Ghotkar, A., Sonawane, S., & Litake, O. (2022). Suggesting Relevant Questions for a Query Using Statistical Natural Language Processing Technique. arXiv preprint arXiv:2204.12069.

Gu, Z., Wang, Q., Li, F., & Ou, Y. (2021, November). Design of Intelligent QA for Self-learning of College Students Based on BERT. In ISCTT 2021; 6th International Conference on Information Science, Computer Technology and Transportation (pp. 1-5). VDE.

Nugroho, R., Ruliyanta, & Nugroho, E. R. (2023). Directional Flat Panel Antenna Design for Analog to Digital TV Broadcast Transition in Indonesia. International Journal of Intelligent Systems and Applications in Engineering, 11(1), 63–69. Retrieved from https://ijisae.org/index.php/IJISAE/article/view/2444

Geometric interpretations and reversed versions of Young’s integral inequality. (2021). Advances in the Theory of Nonlinear Analysis and Its Application, 5(1), 1-6. https://atnaea.org/index.php/journal/article/view/177

Müller, M., Salathé, M., & Kummervold, P. E. (2023). COVID-Twitter-BERT: A natural language processing model to analyse COVID-19 content on Twitter. Frontiers in Artificial Intelligence, 6, 1023281.

Xu, S., Zhang, C., & Hong, D. (2022). BERT-based NLP techniques for classification and severity modeling in basic warranty data study. Insurance: Mathematics and Economics, 107, 57-67.

Shahid, M. R., & Debar, H. (2021, December). Cvss-bert: Explainable natural language processing to determine the severity of a computer security vulnerability from its description. In 2021 20th IEEE International Conference on Machine Learning and Applications (ICMLA) (pp. 1600-1607). IEEE.

Koroteev, M. V. (2021). BERT: a review of applications in natural language processing and understanding. arXiv preprint arXiv:2103.11943.

Wu, Y., Liu, Z., Wu, L., Chen, M., & Tong, W. (2021). BERT-Based Natural Language Processing of Drug Labeling Documents: A Case Study for Classifying Drug-Induced Liver Injury Risk. Frontiers in Artificial Intelligence, 4, 729834.

Nugroho, K. S., Sukmadewa, A. Y., & Yudistira, N. (2021, September). Large-scale news classification using BERT language model: Spark NLP approach. In Proceedings of the 6th International Conference on Sustainable Information Engineering and Technology (pp. 240-246).

Liu, Z., Li, G., & Cheng, J. (2021, February). Hardware acceleration of fully quantized BERT for efficient natural language processing. In 2021 Design, Automation & Test in Europe Conference & Exhibition (DATE) (pp. 513-516). IEEE.

Özçift, A., Akarsu, K., Yumuk, F., & Söylemez, C. (2021). Advancing natural language processing (NLP) applications of morphologically rich languages with bidirectional encoder representations from transformers (BERT): an empirical case study for Turkish. Automatika: časopis za automatiku, mjerenje, elektroniku, računarstvo i komunikacije, 62(2), 226-238.

Olaniyan, R., Stamate, D., & Pu, I. (2021). A Two-Step Optimised BERT-Based NLP Algorithm for Extracting Sentiment from Financial News. In Artificial Intelligence Applications and Innovations: 17th IFIP WG 12.5 International Conference, AIAI 2021, Hersonissos, Crete, Greece, June 25–27, 2021, Proceedings 17 (pp. 745-756). Springer International Publishing.

Donnelly, L. F., Grzeszczuk, R., & Guimaraes, C. V. (2022, April). Use of natural language processing (NLP) in evaluation of radiology reports: an update on applications and technology advances. In Seminars in Ultrasound, CT and MRI (Vol. 43, No. 2, pp. 176-181). WB Saunders.

Turchin, A., Masharsky, S., & Zitnik, M. (2023). Comparison of BERT implementations for natural language processing of narrative medical documents. Informatics in Medicine Unlocked, 36, 101139.

Qiu, Z., Wu, X., Gao, J., & Fan, W. (2021, May). U-BERT: Pre-training user representations for improved recommendation. In Proceedings of the AAAI Conference on Artificial Intelligence (Vol. 35, No. 5, pp. 4320-4327).

Wang, R., Chen, D., Wu, Z., Chen, Y., Dai, X., Liu, M., ... & Yuan, L. (2022). BEVT: BERT pretraining of video transformers. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (pp. 14733-14743).

Perez, I., & Reinauer, R. (2022). The topological BERT: Transforming attention into topology for natural language processing. arXiv preprint arXiv:2206.15195.