Activity recognition from trunk muscle activations for wearable and non-wearable robot conditions

Smart and Sustainable Built Environment · IF 3.5 · Q3 (Green & Sustainable Science & Technology) · Published: 2022-11-24 · DOI: 10.1108/sasbe-07-2022-0130
Nihar J. Gonsalves, O. Ogunseiju, A. Akanmu
Citations: 1

Abstract

Purpose
Recognizing construction workers' activities is critical for on-site performance and safety management. This study therefore presents the potential of automatically recognizing construction workers' actions from activations of the erector spinae muscles.

Design/methodology/approach
A lab study was conducted in which participants (n = 10) performed a rebar task, comprising placing and tying subtasks, with and without a wearable robot (exoskeleton). Trunk muscle activations for both conditions were used to train nine well-established supervised machine learning algorithms. Hold-out validation was carried out, and model performance was evaluated using accuracy, precision, recall and F1 score.

Findings
Classification models performed well for both experimental conditions, with the support vector machine achieving the highest accuracy: 83.8% for the “exoskeleton” condition and 74.1% for the “without exoskeleton” condition.

Research limitations/implications
The study paves the way for the development of smart wearable robotic technology that can adapt itself based on the tasks performed by construction workers.

Originality/value
This study contributes to research on construction workers' action recognition using trunk muscle activity. Most human actions are performed largely with the hands, and advances in ergonomic research have provided evidence of a relationship between trunk muscles and hand movements. This relationship has not previously been explored for action recognition of construction workers, a gap in the literature that this study attempts to address.
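The evaluation pipeline the abstract describes (hold-out validation of a supervised classifier, scored with accuracy, precision, recall and F1) can be sketched as follows. This is a minimal illustration only: the feature matrix and the placing/tying label encoding below are synthetic stand-ins, not the study's erector spinae EMG data, and the SVM hyperparameters are assumptions rather than the authors' settings.

```python
# Sketch of a hold-out evaluation of an SVM activity classifier.
# X simulates windowed trunk-muscle features; y simulates subtask
# labels (0 = "placing", 1 = "tying") -- both are synthetic stand-ins.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import (accuracy_score, precision_score,
                             recall_score, f1_score)

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))                  # 200 windows x 8 features
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)  # synthetic labels

# Hold-out validation: a single stratified train/test split.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42, stratify=y)

clf = SVC(kernel="rbf").fit(X_train, y_train)
y_pred = clf.predict(X_test)

metrics = {
    "accuracy": accuracy_score(y_test, y_pred),
    "precision": precision_score(y_test, y_pred),
    "recall": recall_score(y_test, y_pred),
    "f1": f1_score(y_test, y_pred),
}
print(metrics)
```

In the study itself, this scoring would be repeated for each of the nine algorithms and for each condition (with and without the exoskeleton), and the models compared on the resulting metrics.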
Source journal
Smart and Sustainable Built Environment (Green & Sustainable Science & Technology)
CiteScore: 9.20
Self-citation rate: 8.30%
Articles per year: 53