
The study of skeleton description reduction in the human fall-detection task
O.S. Seredin 1, A.V. Kopylov 1, E.E. Surkov 1

Tula State University, 300012, Tula, Russia, Lenin Ave. 92


DOI: 10.18287/2412-6179-CO-753

Pages: 951-958.

Article language: English

Abstract:
Accurate and reliable real-time fall detection is a key aspect of any intelligent care system for elderly people. Many modern RGB-D cameras can provide a skeleton description of the human figure as a compact pose representation. This makes it possible to use the skeleton description for further analysis without access to the real video stream and thus to increase the privacy of the whole system. A skeleton description reduction based on the anthropometric characteristics of the human body is proposed. An experimental study on the TST Fall Detection dataset v2 using the Leave-One-Person-Out protocol shows that the proposed skeleton description reduction technique provides better recognition quality and increases the overall performance of the fall-detection system.
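The page does not spell out the reduction rule itself, so the Python sketch below is only a rough illustration under stated assumptions. It assumes a Kinect v2-style 25-joint skeleton, builds a scale-invariant pose descriptor by centering the joints on the spine base and dividing by an anthropometric reference (torso length), keeps a reduced subset of joints, and scores a generic RBF-SVM under the Leave-One-Person-Out protocol. The joint indices, the choice of torso length as the reference, and the classifier are all illustrative assumptions, not the authors' implementation.

    import numpy as np
    from sklearn.model_selection import LeaveOneGroupOut
    from sklearn.svm import SVC
    from sklearn.metrics import accuracy_score

    # Kinect v2 numbers its 25 joints 0..24; 0 is SpineBase, 20 is SpineShoulder.
    SPINE_BASE, SPINE_SHOULDER = 0, 20

    def reduced_skeleton_features(joints, keep):
        """joints: (25, 3) array of 3D joint coordinates for one frame.
        keep: indices of the joints retained by the reduction.
        Returns a pose descriptor built from the kept joints only."""
        # Anthropometric reference: torso length for this subject/frame.
        torso = np.linalg.norm(joints[SPINE_SHOULDER] - joints[SPINE_BASE])
        # Centering and scaling make the descriptor independent of the
        # subject's height and of the distance to the RGB-D camera.
        return ((joints[keep] - joints[SPINE_BASE]) / torso).ravel()

    def lopo_accuracy(X, y, person_ids):
        """Leave-One-Person-Out: each subject is held out in turn, so the
        classifier is never tested on a person seen during training."""
        scores = []
        for tr, te in LeaveOneGroupOut().split(X, y, groups=person_ids):
            clf = SVC(kernel="rbf").fit(X[tr], y[tr])
            scores.append(accuracy_score(y[te], clf.predict(X[te])))
        return float(np.mean(scores))

In this form, reducing the description simply means passing a shorter keep list; the experimental question studied in the paper is which subset preserves, or improves, fall-detection quality under this protocol.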

Keywords:
fall detection, human activity detection, skeleton description, RGB-D camera, elderly people care system.

Acknowledgements
The work is supported by the Russian Foundation for Basic Research, grants 18-07-00942, 18-07-01087, and 20-07-00441. The results of the research project are published with the financial support of Tula State University within the framework of the scientific project 2019-21NIR. Part of the research was carried out using the equipment of the shared research facilities of HPC computing resources at Lomonosov Moscow State University.

Citation:
Seredin OS, Kopylov AV, Surkov EE. The study of skeleton description reduction in the human fall-detection task. Computer Optics 2020; 44(6): 951-958. DOI: 10.18287/2412-6179-CO-753.

References:

  1. Falls. World Health Organization. Source: <https://www.who.int/en/news-room/fact-sheets/detail/falls>.
  2. Wild K, et al. Unobtrusive in-home monitoring of cognitive and physical health: Reactions and perceptions of older adults. J Appl Gerontol 2008; 27: 181-200.
  3. Mastorakis G, Makris D. Fall detection system using Kinect’s infrared sensor. J Real-Time Image Process 2012; 9(4): 635-646.
  4. Demiris G, et al. Older adults’ privacy considerations for vision based recognition methods of eldercare applications. Technol Health Care 2009; 17(1): 41-48.
  5. Seredin OS, Kopylov AV, Huang S-C, Rodionov DS. A skeleton features-based fall detection using Microsoft Kinect v2 with one class-classifier outlier removal. Int Arch Photogramm Remote Sens Spat Inf Sci 2019; 42(2/W12): 189-195.
  6. Mundher Z, Zhong J. A real-time fall detection system in elderly care using mobile robot and Kinect sensor. Int J Mater Mech Manuf 2014; 2(2): 133-138.
  7. Wang J, et al. Mining actionlet ensemble for action recognition with depth cameras. Proc IEEE Comp Soc Conf Comp Vis Pattern Recognit 2012: 1290-1297.
  8. Vemulapalli R, Arrate F, Chellappa R. Human action recognition by representing 3D skeletons as points in a lie group. Proc IEEE Comput Soc Conf Comput Vis Pattern Recognit 2014: 588-595.
  9. Hussein ME, et al. Human action recognition using a temporal hierarchy of covariance descriptors on 3D joint locations. IJCAI Int Jt Conf Artif Intell 2013: 2466-2472.
  10. Papandreou G, et al. Towards accurate multi-person pose estimation in the wild. Proc 30th IEEE Conf Comput Vis Pattern Recognit (CVPR) 2017: 3711-3719.
  11. Pathak D, Bhosale VK. Fall detection for elderly people in homes using Kinect Sensor. Int J Innov Res Comput Commun Eng 2017; 5(2): 1468-1474.
  12. Bevilacqua V, et al. Fall detection in indoor environment with Kinect sensor. 2014 IEEE Int Symp Innov Intell Syst Appl Proc 2014: 319-324.
  13. Chen C, et al. Learning a 3D human pose distance metric from geometric pose descriptor. IEEE Trans Vis Comput Graph 2011; 17(11): 1676-1689.
  14. Zhang S, Liu X, Xiao J. On geometric features for skeleton-based action recognition using multilayer LSTM networks. Proc 2017 IEEE Winter Conference on Applications of Computer Vision (WACV) 2017: 148-157.
  15. Zhang X, Xu C, Tao D. Graph edge convolutional neural networks for skeleton based action recognition. arXiv Preprint 2018: 1-22.
  16. Du Y, Wang W, Wang L. Hierarchical recurrent neural network for skeleton based action recognition. Proc IEEE Comput Soc Conf Comput Vis Pattern Recognit 2015: 1110-1118.
  17. Yan S, Xiong Y, Lin D. Spatial temporal graph convolutional networks for skeleton-based action recognition. arXiv Preprint 2018. Source: <https://arxiv.org/abs/1801.07455>.
  18. TST Fall detection dataset v2. IEEE DataPort. Source: <https://ieee-dataport.org/documents/tst-fall-detection-dataset-v2>.
  19. Sung J, et al. Unstructured human activity detection from RGBD images. Proc IEEE Int Conf on Robotics and Automation 2012: 842-849.
  20. Page ES. Continuous inspection schemes. Biometrika 1954; 41(1/2): 100-115.
  21. Gasparrini S, Cippitelli E, Gambi E, Spinsante S, Wåhslén J, Orhan I, Lindh T. Proposal and experimental evaluation of fall detection solution based on wearable and depth data fusion. In Book: Loshkovska S, Koceski S, eds. ICT Innovations 2015: Advances in intelligent systems and computing. Cham: Springer; 2016: 99-108.
  22. Fakhrulddin AH, Fei X, Li H. Convolutional neural networks (CNN) based human fall detection on Body Sensor Networks (BSN) sensor data. Proc 2017 4th Int Conf Syst Informatics (ICSAI) 2018: 1461-1465.
  23. Hwang S, Ahn D, Park H, Park T. Maximizing accuracy of fall detection and alert systems based on 3D convolutional neural network. Proc Second Int Conf Internet-of-Things Des Implement (IoTDI’17) 2017: 343-344.
  24. Min W, Yao L, Lin Z, Liu L. Support vector machine approach to fall recognition based on simplified expression of human skeleton action and fast detection of start key frame using torso angle. IET Comput Vis 2018; 12(8): 1133-1140.
