USING LONG SHORT-TERM MEMORY NETWORKS FOR NATURAL LANGUAGE PROCESSING

Authors

DOI:

https://doi.org/10.20998/2079-0023.2023.01.14

Keywords:

natural language processing, neural network, natural language, long short-term memory networks, text classification, emotional text analysis

Abstract

Emotion classification is a complex and non-trivial language-interpretation task due to the structure and dynamic nature of natural language. The significance of the study lies in addressing the important problem of automatically processing client feedback, collecting opinions, and catching trends. In this work, a number of existing solutions to the emotion classification problem were considered, and their advantages and shortcomings were illustrated. The performance of the considered models was evaluated on emotion classification over four emotion classes, namely Happy, Sad, Angry, and Others. A model for emotion classification in three-sentence conversations is proposed in this work. The model is based on smileys and on word embeddings with domain specificity for present-day conversations on the Internet. The importance of taking into account the information extracted from smileys as an additional source of emotional coloring is investigated. The model's performance is evaluated and compared with the language-processing model BERT (Bidirectional Encoder Representations from Transformers). The proposed model achieved better emotion-classification performance than BERT (an F1 score of 78 versus 75). It should be noted that further study is needed to improve the model's handling of mixed reviews, represented by the emotion class Others. However, the current performance of models for language representation and understanding has not yet reached human performance. A variety of factors must be considered when choosing word embeddings and training methods to design the model architecture.
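As an illustration of the idea of treating smileys as an additional emotion signal, the sketch below counts smileys per emotion class over a three-turn conversation and produces a feature vector that could be concatenated with word-embedding features. The smiley lexicon and class mapping here are assumptions for demonstration only, not the lexicon used by the authors.

```python
from collections import Counter

# Hypothetical mapping from common smileys to the paper's four emotion
# classes; the actual lexicon used in the study is not specified here.
SMILEY_CLASSES = {
    ":)": "Happy", ":-)": "Happy", ":D": "Happy",
    ":(": "Sad", ":'(": "Sad",
    ">:(": "Angry", ":@": "Angry",
}
CLASSES = ["Happy", "Sad", "Angry", "Others"]

def smiley_features(conversation):
    """Count smileys per emotion class over a list of conversation turns
    and return a 4-dimensional feature vector (one entry per class)."""
    counts = Counter()
    for turn in conversation:
        for smiley, label in SMILEY_CLASSES.items():
            counts[label] += turn.count(smiley)
    return [counts[c] for c in CLASSES]

conversation = ["I lost my keys :(", "Oh no, that is bad", "Yeah :( :("]
print(smiley_features(conversation))  # [0, 3, 0, 0]
```

In a full model, such a vector would be appended to the LSTM's text representation before the final softmax over the four classes.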

Author Biographies

Kostiantyn Onyshchenko, Kharkiv National University of Radio Electronics

Senior Lecturer at the Department of Software Engineering, Kharkiv National University of Radio Electronics, Kharkiv, Ukraine

Yana Daniiel, Kharkiv National University of Radio Electronics

Assistant at the Department of Software Engineering, Kharkiv National University of Radio Electronics, Kharkiv, Ukraine

References

Graves A., Schmidhuber J. Framewise phoneme classification with bidirectional LSTM and other neural network architectures. Neural Networks. 2005, vol. 18(5), pp. 602–610.

Srivastava N., Hinton G., Krizhevsky A., Sutskever I., Salakhutdinov R. Dropout: A simple way to prevent neural networks from overfitting. Journal of Machine Learning Research. 2014, vol. 15, pp. 1929–1958.

Daniiel Y., Onyshchenko K. Implementation of Recursive Deep Learning Algorithms for Natural Language Processing. Information Systems and Technologies 2021. Kharkiv-Odesa, 2021, pp. 141–145.

Afanasieva I., Golian N., Hnatenko O., Daniiel Y., Onyshchenko K. Data exchange model in the internet of things concept. Telecommunications and Radio Engineering. 2019, vol. 78(10), pp. 869–878.

Collobert R., Weston J., Bottou L., Karlen M., Kavukcuoglu K., Kuksa P. Natural language processing (almost) from scratch. CoRR, 2011, pp. 201–244.

Joulin A., Grave E., Bojanowski P., Mikolov T. Bag of tricks for efficient text classification. CoRR, 2016, pp. 11–32.

Otter D.W., Medina J.R., Kalita J.K. A survey of the usages of deep learning in natural language processing. CoRR, 2018, pp. 112–123.

Allen J. Natural Language Understanding. 2nd ed. Benjamin-Cummings Publishing Co., Inc. Redwood City, CA, USA, 1995. 512 p.

Gupta U., Chatterjee A., Srikanth R., Agrawal P. A sentiment-and-semantics-based approach for emotion detection in textual conversations. Neu-IR: The SIGIR 2017 Workshop on Neural Information Retrieval. 2017, pp. 21–28.

Jurafsky D., Martin J.H. Speech and language processing: an introduction to natural language processing, computational linguistics, and speech recognition. Upper Saddle River, NJ, USA, Prentice Hall PTR, 2000. 1044 p.

Manning C. D., Schütze H. Foundations of Statistical Natural Language Processing. MIT Press, Cambridge, MA, USA, 1999. 718 p.

Yao L., Guan Y. An Improved LSTM Structure for Natural Language Processing, 2018 IEEE International Conference of Safety Produce Informatization (IICSPI). Chongqing, China. 2018, pp. 565–569.

Korti S. S., Kanakaraddi S. G. Depression detection from Twitter posts using NLP and Machine learning techniques, 2022 Fourth International Conference on Emerging Research in Electronics, Computer Science and Technology (ICERECT). Mandya, India. 2022, pp. 1–6.

Sharma D., Dhiman C., Kumar D. Automated Image Caption Generation Framework using Adaptive Attention and Bi-LSTM, IEEE Delhi Section Conference (DELCON). New Delhi, India, 2022, pp. 1–5.

Aziz A. A., Djamal E. C., Ilyas R. Paraphrase Detection Using Manhattan's Recurrent Neural Networks and Long Short-Term Memory, 6th International Conference on Electrical Engineering, Computer Science and Informatics (EECSI), Bandung, Indonesia. 2019, pp. 432–437.

Yang H., Feng Y. Authoritative Prediction of Website Based on Deep Learning. IEEE Fourth International Conference on Big Data Computing Service and Applications (BigDataService). Bamberg, Germany, 2018, pp. 208–212.

Published

2023-07-15

How to Cite

Onyshchenko, K., & Daniiel, Y. (2023). USING LONG SHORT-TERM MEMORY NETWORKS FOR NATURAL LANGUAGE PROCESSING. Bulletin of National Technical University "KhPI". Series: System Analysis, Control and Information Technologies, (1 (9)), 89–96. https://doi.org/10.20998/2079-0023.2023.01.14

Section

INFORMATION TECHNOLOGY