Abstract
The automatic classification of Autism Spectrum Disorder (ASD) has recently been approached from a gait analysis perspective in controlled environments. This paper compares three Monocular Human Pose Estimation models, CLIFF, VideoPose using PoseAug, and PSTMO, on an ASD gait dataset captured using a mobile phone. The ability to collect 3D joint positions from readily available devices, such as mobile phone cameras, removes both 1) the need to provide a bespoke collection device and 2) the need for the recorded subject to travel to a controlled environment. However, the performance of phone cameras relative to more laboratory-oriented methods, such as those built on the Kinect camera sensor or equivalent, has not been fully assessed for ASD gait data collection. A combination of peak detection, Dynamic Time Warping, and position correction using average offset values is used to synchronise and match gait data from two devices with different sampling rates: the Kinect V2 and the Samsung Note 9 RGB camera. Experimental results show that CLIFF is the most suitable of the three models for gait analysis.
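The synchronisation pipeline described in the abstract can be sketched in miniature. The snippet below is an illustrative reconstruction, not the paper's implementation: it assumes 1D joint-trajectory signals (e.g. vertical ankle position), crops each signal at its first detected gait peak, aligns the two streams with a textbook dynamic-programming DTW, and then removes the residual positional bias using the average offset over the matched frame pairs. The function names (`dtw_path`, `synchronise`) and the synthetic sinusoidal test signals are hypothetical.

```python
import numpy as np
from scipy.signal import find_peaks

def dtw_path(a, b):
    """Classic O(len(a) * len(b)) Dynamic Time Warping on 1D signals.
    Returns the list of aligned index pairs (i, j)."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    # Backtrack from (n, m) to recover the warping path.
    path, i, j = [], n, m
    while i > 0 and j > 0:
        path.append((i - 1, j - 1))
        step = np.argmin([cost[i - 1, j - 1], cost[i - 1, j], cost[i, j - 1]])
        if step == 0:
            i, j = i - 1, j - 1
        elif step == 1:
            i -= 1
        else:
            j -= 1
    return path[::-1]

def synchronise(kinect_sig, phone_sig):
    """Align two gait signals captured at different sampling rates, then
    estimate the positional bias as the average offset over matched frames."""
    # 1) Peak detection: crop both signals to start at their first gait peak.
    pk_a, _ = find_peaks(kinect_sig)
    pk_b, _ = find_peaks(phone_sig)
    a = kinect_sig[pk_a[0]:] if len(pk_a) else kinect_sig
    b = phone_sig[pk_b[0]:] if len(pk_b) else phone_sig
    # 2) DTW on mean-centred copies so a constant bias cannot distort matching.
    path = dtw_path(a - a.mean(), b - b.mean())
    # 3) Position correction: average offset between the matched raw frames.
    offset = float(np.mean([a[i] - b[j] for i, j in path]))
    return path, offset

# Same motion sampled at two rates (60 vs 120 samples), phone biased by +0.5.
t_kinect = np.linspace(0, 2 * np.pi, 60)
t_phone = np.linspace(0, 2 * np.pi, 120)
path, offset = synchronise(np.sin(t_kinect), np.sin(t_phone) + 0.5)
print(round(offset, 2))  # recovered bias, close to -0.5
```

Mean-centring before the DTW step is one design choice among several; it keeps the amplitude bias between the two devices from warping the temporal alignment, so the bias can then be estimated cleanly in the correction step.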
Original language | English |
---|---|
Pages | 1-4 |
Number of pages | 4 |
DOIs | |
Publication status | Published online - 20 Mar 2024 |
Event | Irish Conference on Artificial Intelligence and Cognitive Science (AICS 2023), Ireland. Duration: 7 Dec 2023 → 8 Dec 2023 |
Conference
Conference | Irish Conference on Artificial Intelligence and Cognitive Science (AICS 2023) |
---|---|
Country/Territory | Ireland |
Period | 7/12/23 → 8/12/23 |
Bibliographical note
Publisher Copyright: © 2023 IEEE.
Keywords
- gait analysis
- monocular human pose estimation
- video analysis