Philipp Barzyk, Philip Zimmermann, Manuel Stein, Daniel Keim, Markus Gruber
{"title":"反向运动跳跃时髋关节、膝关节和踝关节运动学的人工智能智能手机无标记运动捕捉。","authors":"Philipp Barzyk, Philip Zimmermann, Manuel Stein, Daniel Keim, Markus Gruber","doi":"10.1002/ejsc.12186","DOIUrl":null,"url":null,"abstract":"<p>Recently, AI-driven skeleton reconstruction tools that use multistage computer vision pipelines were designed to estimate 3D kinematics from 2D video sequences. In the present study, we validated a novel markerless, smartphone video-based artificial intelligence (AI) motion capture system for hip, knee, and ankle angles during countermovement jumps (CMJs). Eleven participants performed six CMJs. We used 2D videos created by a smartphone (Apple iPhone X, 4K, 60 fps) to create 24 different keypoints, which together built a full skeleton including joints and their connections. Body parts and skeletal keypoints were localized by calculating confidence maps using a multilevel convolutional neural network that integrated both spatial and temporal features. We calculated hip, knee, and ankle angles in the sagittal plane and compared it with the angles measured by a VICON system. We calculated the correlation between both method's angular progressions, mean squared error (MSE), mean average error (MAE), and the maximum and minimum angular error and run statistical parametric mapping (SPM) analysis. Pearson correlation coefficients (r) for hip, knee, and ankle angular progressions in the sagittal plane during the entire movement were 0.96, 0.99, and 0.87, respectively. SPM group-analysis revealed some significant differences only for ankle angular progression. MSE was below 5.7°, MAE was below 4.5°, and error for maximum amplitudes was below 3.2°. 
The smartphone AI motion capture system with the trained multistage computer vision pipeline was able to detect, especially hip and knee angles in the sagittal plane during CMJs with high precision from a frontal view only.</p>","PeriodicalId":93999,"journal":{"name":"European journal of sport science","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2024-08-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11451555/pdf/","citationCount":"0","resultStr":"{\"title\":\"AI-smartphone markerless motion capturing of hip, knee, and ankle joint kinematics during countermovement jumps\",\"authors\":\"Philipp Barzyk, Philip Zimmermann, Manuel Stein, Daniel Keim, Markus Gruber\",\"doi\":\"10.1002/ejsc.12186\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p>Recently, AI-driven skeleton reconstruction tools that use multistage computer vision pipelines were designed to estimate 3D kinematics from 2D video sequences. In the present study, we validated a novel markerless, smartphone video-based artificial intelligence (AI) motion capture system for hip, knee, and ankle angles during countermovement jumps (CMJs). Eleven participants performed six CMJs. We used 2D videos created by a smartphone (Apple iPhone X, 4K, 60 fps) to create 24 different keypoints, which together built a full skeleton including joints and their connections. Body parts and skeletal keypoints were localized by calculating confidence maps using a multilevel convolutional neural network that integrated both spatial and temporal features. We calculated hip, knee, and ankle angles in the sagittal plane and compared it with the angles measured by a VICON system. We calculated the correlation between both method's angular progressions, mean squared error (MSE), mean average error (MAE), and the maximum and minimum angular error and run statistical parametric mapping (SPM) analysis. 
Pearson correlation coefficients (r) for hip, knee, and ankle angular progressions in the sagittal plane during the entire movement were 0.96, 0.99, and 0.87, respectively. SPM group-analysis revealed some significant differences only for ankle angular progression. MSE was below 5.7°, MAE was below 4.5°, and error for maximum amplitudes was below 3.2°. The smartphone AI motion capture system with the trained multistage computer vision pipeline was able to detect, especially hip and knee angles in the sagittal plane during CMJs with high precision from a frontal view only.</p>\",\"PeriodicalId\":93999,\"journal\":{\"name\":\"European journal of sport science\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-08-28\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11451555/pdf/\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"European journal of sport science\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://onlinelibrary.wiley.com/doi/10.1002/ejsc.12186\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"European journal of sport science","FirstCategoryId":"1085","ListUrlMain":"https://onlinelibrary.wiley.com/doi/10.1002/ejsc.12186","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
AI-smartphone markerless motion capturing of hip, knee, and ankle joint kinematics during countermovement jumps
Recently, AI-driven skeleton reconstruction tools that use multistage computer vision pipelines have been designed to estimate 3D kinematics from 2D video sequences. In the present study, we validated a novel markerless, smartphone video-based artificial intelligence (AI) motion capture system for hip, knee, and ankle angles during countermovement jumps (CMJs). Eleven participants performed six CMJs. We used 2D videos recorded by a smartphone (Apple iPhone X, 4K, 60 fps) to create 24 different keypoints, which together built a full skeleton including joints and their connections. Body parts and skeletal keypoints were localized by calculating confidence maps using a multilevel convolutional neural network that integrated both spatial and temporal features. We calculated hip, knee, and ankle angles in the sagittal plane and compared them with the angles measured by a VICON system. For both methods' angular progressions, we calculated the correlation, the mean squared error (MSE), the mean absolute error (MAE), and the maximum and minimum angular errors, and ran a statistical parametric mapping (SPM) analysis. Pearson correlation coefficients (r) for the hip, knee, and ankle angular progressions in the sagittal plane during the entire movement were 0.96, 0.99, and 0.87, respectively. SPM group analysis revealed significant differences only for the ankle angular progression. MSE was below 5.7°, MAE was below 4.5°, and the error for maximum amplitudes was below 3.2°. The smartphone AI motion capture system with the trained multistage computer vision pipeline was able to detect hip and knee angles in particular, in the sagittal plane during CMJs, with high precision from a frontal view only.
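The agreement metrics reported in the abstract (Pearson r, MSE, MAE, and the extreme angular errors between the AI-derived and VICON-derived joint-angle traces) can be sketched as follows. This is a minimal illustration, not the authors' analysis code; the function name and the toy input arrays are hypothetical, and it assumes both traces are already time-synchronized and sampled at the same rate.

```python
import numpy as np

def compare_angle_traces(ai_angles, vicon_angles):
    """Agreement metrics between two synchronized joint-angle
    time series given in degrees (e.g., AI system vs. VICON)."""
    ai = np.asarray(ai_angles, dtype=float)
    ref = np.asarray(vicon_angles, dtype=float)
    err = ai - ref                            # signed angular error per frame
    return {
        "r": np.corrcoef(ai, ref)[0, 1],      # Pearson correlation coefficient
        "mse": np.mean(err ** 2),             # mean squared error
        "mae": np.mean(np.abs(err)),          # mean absolute error
        "max_err": err.max(),                 # largest positive deviation
        "min_err": err.min(),                 # largest negative deviation
    }

# Toy usage with made-up knee-angle samples (degrees):
metrics = compare_angle_traces([90.0, 70.0, 50.0, 80.0],
                               [91.0, 70.5, 49.0, 79.0])
```

A statistical parametric mapping (SPM) analysis, as used in the study, would additionally test the error trace frame by frame across participants rather than collapsing it into scalar summaries.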