Towards an intuitive human-robot interaction based on hand gesture recognition and proximity sensors

Gorkem Anil Al, P. Estrela, Uriel Martinez-Hernandez
{"title":"Towards an intuitive human-robot interaction based on hand gesture recognition and proximity sensors","authors":"Gorkem Anil Al, P. Estrela, Uriel Martinez-Hernandez","doi":"10.1109/MFI49285.2020.9235264","DOIUrl":null,"url":null,"abstract":"In this paper, we present a multimodal sensor interface that is capable of recognizing hand gestures for human-robot interaction. The proposed system is composed of an array of proximity and gesture sensors, which have been mounted on a 3D printed bracelet. The gesture sensors are employed for data collection from four hand gesture movements (up, down, left and right) performed by the human at a predefined distance from the sensorised bracelet. The hand gesture movements are classified using Artificial Neural Networks. The proposed approach is validated with experiments in offline and real-time modes performed systematically. First, in offline mode, the accuracy for recognition of the four hand gesture movements achieved a mean of 97.86%. Second, the trained model was used for classification in real-time and achieved a mean recognition accuracy of 97.7%. The output from the recognised hand gesture in real-time mode was used to control the movement of a Universal Robot (UR3) arm in the CoppeliaSim simulation environment. Overall, the results from the experiments show that using multimodal sensors, together with computational intelligence methods, have the potential for the development of intuitive and safe human-robot interaction.","PeriodicalId":446154,"journal":{"name":"2020 IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems (MFI)","volume":"26 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-09-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"9","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2020 IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems (MFI)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/MFI49285.2020.9235264","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 9

Abstract

In this paper, we present a multimodal sensor interface capable of recognising hand gestures for human-robot interaction. The proposed system is composed of an array of proximity and gesture sensors mounted on a 3D-printed bracelet. The gesture sensors collect data from four hand gesture movements (up, down, left and right) performed by the human at a predefined distance from the sensorised bracelet. The hand gesture movements are classified using Artificial Neural Networks. The proposed approach is validated with systematic experiments in offline and real-time modes. First, in offline mode, recognition of the four hand gesture movements achieved a mean accuracy of 97.86%. Second, the trained model was used for classification in real time and achieved a mean recognition accuracy of 97.7%. The recognised hand gesture in real-time mode was used to control the movement of a Universal Robot (UR3) arm in the CoppeliaSim simulation environment. Overall, the experimental results show that combining multimodal sensors with computational intelligence methods has the potential to enable intuitive and safe human-robot interaction.
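The abstract describes the pipeline only at a high level. As a rough illustration of the classification stage, the sketch below trains a small multilayer perceptron to map a window of bracelet sensor readings to one of the four gesture classes. Everything here is an assumption for illustration: the sensor count, window length, and network size are not given in the abstract, scikit-learn's MLPClassifier stands in for the paper's unspecified Artificial Neural Network, and the data are random placeholders rather than recorded gestures.

```python
# Hypothetical sketch of a four-class hand-gesture classifier, assuming
# each training sample is a fixed-length window of proximity/gesture
# sensor readings flattened into one feature vector. All sizes below are
# illustrative assumptions, not values from the paper.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

GESTURES = ["up", "down", "left", "right"]
N_SENSORS = 4   # assumed number of sensors in the bracelet array
WINDOW = 20     # assumed readings per gesture window

# Placeholder data standing in for recorded gesture windows:
# each row flattens a (WINDOW, N_SENSORS) window of readings.
rng = np.random.default_rng(0)
X = rng.normal(size=(400, WINDOW * N_SENSORS))
y = rng.integers(0, len(GESTURES), size=400)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

# Small MLP as a stand-in for the paper's ANN classifier.
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0)
clf.fit(X_train, y_train)
print("offline accuracy:", clf.score(X_test, y_test))

# Real-time use would feed each new sensor window through the trained
# model and map the predicted label to a UR3 motion command in CoppeliaSim.
window = rng.normal(size=(1, WINDOW * N_SENSORS))
print("recognised gesture:", GESTURES[int(clf.predict(window)[0])])
```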