
Neural network in an artificial intelligence model for realization of affective computing based on electroencephalogram analysis
A.G. Choban 1, D.G. Stadnikov 1, A.E. Sulavko 1

Omsk State Technical University,
644050, Omsk, Russia, Mira 11


DOI: 10.18287/2412-6179-CO-1417

Pages: 782-790.

Full text of article: Russian language.

Abstract:
This paper analyzes the possibility of assessing a person's emotional state from features of their brain activity using non-invasive brain-computer interfaces. Recent publications on the use of electroencephalogram (EEG) signals for assessing the emotional state are reviewed and topical problems in this area are identified. The main approaches to brain stimulation for obtaining informative EEG signals are outlined, and methods for EEG analysis and recognition are described. The architecture of a deep convolutional neural network for EEG data analysis is proposed, as well as a neural network model, based on two convolutional neural networks, for classifying four emotions (fear, happiness, sadness, calmness) according to Russell's valence-arousal scale. An experiment was conducted with 50 participants who watched emotion-laden videos; the collected EEG data were used to train and test the neural network model. The results showed a high emotion classification accuracy of 94%±3.4% while using a wireless neural interface.
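For illustration, the four target classes can be placed on Russell's valence-arousal plane as quadrants. The sketch below is a hypothetical helper, not the paper's method: the article does not publish its thresholds or labeling rule, so the quadrant assignment (and the tie-breaking at zero) is an assumption.

```python
def quadrant_to_emotion(valence: float, arousal: float) -> str:
    """Map a point on Russell's valence-arousal plane to one of the
    four emotion classes used in the study (assumed quadrant rule)."""
    if valence >= 0 and arousal >= 0:
        return "happiness"   # positive valence, high arousal
    if valence < 0 and arousal >= 0:
        return "fear"        # negative valence, high arousal
    if valence < 0:
        return "sadness"     # negative valence, low arousal
    return "calmness"        # positive valence, low arousal
```

In such a scheme, a classifier (e.g., the two-CNN model described above) can either predict the four classes directly or regress valence and arousal and then discretize them with a rule like this one.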

Keywords:
emotion recognition, electroencephalogram, convolutional neural networks, EEG signal, brain-computer interface, neural interface, analysis of biometric parameters.

Citation:
Choban AG, Stadnikov DG, Sulavko AE. Neural network in an artificial intelligence model for realization of affective computing based on electroencephalogram analysis. Computer Optics 2024; 48(5): 782-790. DOI: 10.18287/2412-6179-CO-1417.

Acknowledgements:
The research was financially supported by the Ministry of Science and Higher Education of the Russian Federation (theme No. FSGF-2023-0004).


© 2009, IPSI RAS
151, Molodogvardeiskaya str., Samara, 443001, Russia; E-mail: journal@computeroptics.ru; Tel: +7 (846) 242-41-24 (Executive secretary), +7 (846) 332-56-22 (Issuing editor); Fax: +7 (846) 332-56-20