On the Pose Estimation Software for Measuring Movement Features in the Finger-to-Nose Test

Enrico Martini, Nicola Valè, Michele Boldo, Anna Righetti, N. Smania, N. Bombieri

2022 IEEE International Conference on Digital Health (ICDH), July 2022. DOI: 10.1109/ICDH55609.2022.00021
Abstract
Assessing upper limb (UL) movements post-stroke is crucial to monitoring and understanding sensorimotor recovery. Recently, several research works have focused on the relationship between reach-to-target kinematics and clinical outcomes. Since the assessment of sensorimotor impairments is conventionally based on clinical scales and observation, and hence likely to be subjective, one of the challenges is to quantify such kinematics through automated platforms such as inertial measurement units and optical or electromagnetic motion capture systems. Even more challenging is quantifying UL kinematics through non-invasive systems, to avoid any influence or bias in the measurements. In this context, tools based on video cameras and deep learning software have been shown to achieve high accuracy in estimating the human pose. Nevertheless, an analysis of their accuracy in measuring kinematic features for the Finger-to-Nose Test (FNT) is missing. We first present an extended quantitative evaluation of such inference software (i.e., OpenPose) for measuring a clinically meaningful set of UL movement features. Then, we propose an algorithm and the corresponding software implementation that automate the segmentation of the FNT movements. This allows us to automatically extract the whole set of measures from the videos with no manual intervention. We measured the software accuracy against an infrared motion capture system on a total of 26 healthy and 26 stroke subjects.
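To make the pipeline described in the abstract more concrete, the sketch below illustrates one plausible way to segment FNT repetitions from OpenPose output: load the per-frame BODY_25 keypoints, track the wrist-to-nose distance, and split the signal at the full-extension peaks between nose touches. This is not the authors' implementation; the directory layout, confidence threshold, and peak-detection parameters are assumptions for illustration only.

```python
"""Illustrative sketch (not the paper's code): segment Finger-to-Nose
repetitions from OpenPose BODY_25 keypoints via the wrist-to-nose distance."""
import json
from pathlib import Path

import numpy as np
from scipy.signal import find_peaks

NOSE, R_WRIST = 0, 4  # BODY_25 joint indices for nose and right wrist


def load_keypoints(json_dir: str) -> np.ndarray:
    """Read per-frame OpenPose JSON files into an array of shape (frames, 25, 3)."""
    frames = []
    for path in sorted(Path(json_dir).glob("*_keypoints.json")):
        people = json.loads(path.read_text())["people"]
        if not people:
            frames.append(np.full((25, 3), np.nan))  # no person detected in this frame
            continue
        kp = np.asarray(people[0]["pose_keypoints_2d"]).reshape(25, 3)
        kp[kp[:, 2] < 0.3] = np.nan  # assumed confidence threshold: drop weak detections
        frames.append(kp)
    return np.stack(frames)


def wrist_nose_distance(kp: np.ndarray) -> np.ndarray:
    """Euclidean wrist-to-nose distance per frame, in pixels."""
    return np.linalg.norm(kp[:, R_WRIST, :2] - kp[:, NOSE, :2], axis=1)


def segment_repetitions(dist: np.ndarray, fps: float) -> list[tuple[int, int]]:
    """Split the distance signal into repetitions: each cycle runs from one
    full-extension peak to the next, passing through a nose-touch minimum."""
    dist = np.nan_to_num(dist, nan=np.nanmedian(dist))
    peaks, _ = find_peaks(dist, distance=int(0.5 * fps),
                          prominence=0.2 * np.ptp(dist))  # assumed tuning values
    return list(zip(peaks[:-1], peaks[1:]))
```

Under these assumptions, a typical call would be `segment_repetitions(wrist_nose_distance(load_keypoints("out/")), fps=30.0)`, yielding the frame intervals of the individual FNT movements.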
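Building on that segmentation, a second hypothetical sketch shows how per-repetition movement features could be computed and how a video-based series might be compared against the motion-capture reference. The specific feature set and the RMSE-based comparison are illustrative choices, not the feature definitions or validation protocol used in the paper.

```python
def repetition_features(dist: np.ndarray, segments, fps: float) -> list[dict]:
    """Example per-repetition features from the wrist-to-nose distance signal."""
    feats = []
    for start, end in segments:
        d = dist[start:end]
        v = np.gradient(d) * fps  # rate of change of the distance, per second
        feats.append({
            "movement_time_s": (end - start) / fps,
            "peak_velocity": float(np.nanmax(np.abs(v))),
            "mean_velocity": float(np.nanmean(np.abs(v))),
            "n_velocity_peaks": int(len(find_peaks(np.abs(v))[0])),  # crude smoothness proxy
        })
    return feats


def rmse_against_mocap(estimated: np.ndarray, reference: np.ndarray) -> float:
    """Root-mean-square error between a video-derived series and the
    motion-capture reference, ignoring frames missing in either system."""
    mask = ~np.isnan(estimated) & ~np.isnan(reference)
    return float(np.sqrt(np.mean((estimated[mask] - reference[mask]) ** 2)))
```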