Neural network in an artificial intelligence model for realization of affective computing based on electroencephalogram analysis
 A.G. Choban 1, D.G. Stadnikov 1, A.E. Sulavko 1
 1 Omsk State Technical University (OmSTU),
     11 Mira Ave., Omsk 644050, Russia
 
 PDF, 1073 kB
DOI: 10.18287/2412-6179-CO-1417
Pages: 782-790.
Abstract:
The article analyzes the feasibility of assessing a person's emotional state from features of brain activity using non-invasive brain-computer interfaces. Recent publications on the use of electroencephalogram (EEG) signals for emotional state assessment are reviewed, and the open problems in this area are identified. The main approaches to brain stimulation for obtaining informative EEG signals are described, along with methods for their analysis and recognition. A deep convolutional neural network architecture for EEG data analysis is proposed, as well as a neural network artificial intelligence model, built on two convolutional neural networks, for classifying four emotions (fear, happiness, sadness, calm) on Russell's valence-arousal scale. An experiment was conducted with 50 participants who watched emotionally charged video clips. The EEG data collected from the 50 subjects were used to train and test the neural network model. The results show a high emotion classification accuracy (94%±3.4%) achieved with a wireless neural interface.
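The four emotions named in the abstract correspond to the quadrants of Russell's valence-arousal plane. The following is a minimal sketch of that quadrant mapping; the function name and the exact class-to-quadrant assignment are illustrative assumptions, not the paper's implementation.

```python
def quadrant_emotion(valence: float, arousal: float) -> str:
    """Map a point on Russell's valence-arousal plane (both axes
    centered at 0) onto one of the four emotion classes from the
    abstract. Quadrant layout is an assumption for illustration."""
    if arousal >= 0:
        # High arousal: pleasant -> happiness, unpleasant -> fear
        return "happiness" if valence >= 0 else "fear"
    # Low arousal: pleasant -> calm, unpleasant -> sadness
    return "calm" if valence >= 0 else "sadness"

# Example: a positive-valence, high-arousal prediction
print(quadrant_emotion(0.7, 0.5))  # happiness
```

In such a scheme, a model predicting continuous valence and arousal scores could be reduced to the paper's four-class problem by thresholding both axes at their midpoints.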
Keywords:
emotion recognition, electroencephalogram, convolutional neural networks, EEG signal, brain-computer interface, neural interface, analysis of biometric parameters.
Acknowledgements
The work was carried out at OmSTU under a state assignment of the Ministry of Science and Higher Education of the Russian Federation for 2023-2025 (FSGF-2023-0004).
Citation:
Choban AG, Stadnikov DG, Sulavko AE. Neural network in an artificial intelligence model for realization of affective computing based on electroencephalogram analysis. Computer Optics 2024; 48(5): 782-790. DOI: 10.18287/2412-6179-CO-1417.
  © 2009, IPSI RAS
    151 Molodogvardeyskaya St., Samara 443001, Russia; e-mail: journal@computeroptics.ru; tel: +7 (846) 242-41-24 (executive secretary), +7 (846) 332-56-22 (technical editor), fax: +7 (846) 332-56-20