
A method for analyzing biometric data to assess the cognitive load and stress resistance of an agricultural unmanned aerial vehicle operator during mission planning
L.A. Taskina¹, L.A. Abakumov¹, T.D. Kazarkin¹

¹ Samara National Research University,
Moskovskoye Shosse 34, Samara, 443086, Russia


DOI: 10.18287/COJ1849

Pages: 1202-1210.

Full text of article: English.

Abstract:
We present a computer vision method for assessing the cognitive load and stress resistance of an agricultural unmanned aerial vehicle operator during mission planning in virtual reality. The approach combines three elements: geometrically rigorous gaze-to-user-interface mapping, which projects the eye-tracker ray into widget space to obtain metrically correct hits on areas of interest; behavioral and ocular biomarkers; and image-like attention representations such as heatmaps and recurrence plots. In a study with twelve participants across four scenarios, we recorded 1,198 interaction events and achieved 85.3% accuracy of gaze-to-interface hits. With increasing task difficulty, fixation durations shortened, transition entropy increased, and event-locked pupil responses grew larger and recovered more slowly; planning and replanning times increased, while route quality decreased under time pressure. The approach relies only on aggregate signals from the software platform and does not use raw eye images, which supports privacy-preserving deployment and portability to ground control software.
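As a companion to the abstract, the sketch below illustrates the core geometric step in Python: intersecting the eye-tracker gaze ray with the planning-widget plane, converting the hit to normalized widget coordinates, resolving an area of interest (AOI), and computing the transition-entropy metric mentioned above. This is a minimal sketch under our own assumptions; all names (gaze_hit_on_widget, to_widget_uv, aoi_of, transition_entropy) are hypothetical and do not reproduce the authors' implementation.

```python
# Minimal sketch (hypothetical names, not the authors' code): gaze-ray to
# widget-space mapping, AOI resolution, and AOI transition entropy.
import numpy as np
from collections import Counter
from math import log2

def gaze_hit_on_widget(origin, direction, plane_point, plane_normal):
    """Intersect the eye-tracker gaze ray with the UI widget plane.
    Returns the 3-D hit point, or None if the ray is parallel or points away."""
    denom = float(np.dot(direction, plane_normal))
    if abs(denom) < 1e-8:
        return None
    t = float(np.dot(plane_point - origin, plane_normal)) / denom
    return origin + t * direction if t > 0 else None

def to_widget_uv(hit, widget_origin, u_axis, v_axis):
    """Express a 3-D hit point in normalized (0..1) widget coordinates;
    u_axis/v_axis span the widget's width and height in world units."""
    d = hit - widget_origin
    return (float(np.dot(d, u_axis) / np.dot(u_axis, u_axis)),
            float(np.dot(d, v_axis) / np.dot(v_axis, v_axis)))

def aoi_of(uv, aois):
    """Return the first area of interest containing the (u, v) hit, if any."""
    for name, (u0, v0, u1, v1) in aois.items():
        if u0 <= uv[0] <= u1 and v0 <= uv[1] <= v1:
            return name
    return None

def transition_entropy(aoi_sequence):
    """Shannon entropy (bits) of AOI-to-AOI transitions; more erratic
    scanning between interface regions yields higher entropy."""
    pairs = [(a, b) for a, b in zip(aoi_sequence, aoi_sequence[1:]) if a != b]
    counts = Counter(pairs)
    total = sum(counts.values())
    return -sum(c / total * log2(c / total) for c in counts.values()) if total else 0.0

if __name__ == "__main__":
    # Toy scene: a 1 m x 1 m planning panel 2 m in front of the operator.
    hit = gaze_hit_on_widget(np.zeros(3), np.array([0.0, 0.0, 1.0]),
                             np.array([0.0, 0.0, 2.0]), np.array([0.0, 0.0, -1.0]))
    uv = to_widget_uv(hit, np.array([-0.5, -0.5, 2.0]),
                      np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0]))
    aois = {"map": (0.0, 0.0, 0.6, 1.0), "waypoint_list": (0.6, 0.0, 1.0, 1.0)}
    print(aoi_of(uv, aois))                                   # -> "map"
    print(transition_entropy(["map", "waypoint_list", "map",
                              "map", "waypoint_list"]))       # ~0.918 bits
```

In an Unreal Engine deployment, the widget origin and axes would come from the widget component's world transform; they are hard-coded here purely for illustration.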

Keywords:
agricultural; unmanned aerial vehicle; stress resistance; virtual reality; Unreal Engine; gaze tracking; mission planning.

Citation:
Taskina LA, Abakumov LA, Kazarkin TD. A method for analyzing biometric data to assess the cognitive load and stress resistance of an agricultural unmanned aerial vehicle operator during mission planning. Computer Optics 2025; 49(6): 1202-1210. DOI: 10.18287/COJ1849.

Acknowledgements:
This work was supported by the Ministry of Science and Higher Education of the Russian Federation, grant No. 075-15-2025-610.

