JPL BioSleeve for gesture-based control: Technology development and field trials

C. Assad, Michael T. Wolf, Jaakko T. Karras, Jason I. Reid, A. Stoica
Published in: 2015 IEEE International Conference on Technologies for Practical Robot Applications (TePRA)
Publication date: 2015-05-11
DOI: 10.1109/TePRA.2015.7219668
Citations: 4

Abstract

The JPL BioSleeve is a wearable gesture-based human interface for natural robot control. Activity of the user's hand and arm is monitored via surface electromyography sensors and an inertial measurement unit that are embedded in a forearm sleeve. Gesture recognition software then decodes the sensor signals, classifies gesture type, and maps the result to output commands to be sent to a robot. The BioSleeve interface can accurately and reliably decode as many as sixteen discrete hand and finger gestures and estimate the continuous orientation of the forearm. Here we report development of a new wireless BioSleeve prototype that enables portable field use. Gesture-based commands were developed to control a QinetiQ Dragon Runner tracked robot, including a 4 degree-of-freedom manipulator and a stereo camera pair. Gestures can be sent in several modes: for supervisory point-to-goal driving commands, virtual joystick for teleoperation of driving and manipulator, and pan-tilt of the camera. Hand gestures and arm positions are mapped to various commands recognized by the robot's onboard control software, and are meant to integrate with the robot's perception of its environment and its ability to complete tasks with various levels of autonomy. The portable BioSleeve interface was demonstrated through control of the Dragon Runner during participation in field trials at the 2014 Intuitive Robotic Operator Control Challenge. The successful completion of Challenge events demonstrated the versatility of the system to provide multiple commands in different modes of control to a robot operating under difficult real-world environmental conditions.
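The abstract describes a pipeline in which sEMG features are classified into one of up to sixteen discrete gestures, the IMU supplies continuous forearm orientation, and a mode-dependent mapping turns the result into robot commands (supervisory point-to-goal driving, virtual-joystick teleoperation, camera pan-tilt). A minimal sketch of that flow is shown below; the classifier, gesture names, mode labels, and thresholds are all illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of a BioSleeve-style pipeline: sEMG features -> discrete
# gesture label, IMU -> continuous forearm orientation, then a mode-dependent
# mapping to robot commands. All names and values here are illustrative
# assumptions, not the paper's implementation.

from dataclasses import dataclass

@dataclass
class Command:
    mode: str     # "drive", "manipulator", or "camera"
    action: str   # e.g. "goto_point", "joystick", "pan_tilt"
    args: tuple

# Toy nearest-centroid classifier standing in for the paper's gesture
# recognizer (which decodes as many as sixteen discrete gestures).
GESTURE_CENTROIDS = {
    "fist":      (0.9, 0.1),
    "point":     (0.2, 0.8),
    "open_hand": (0.5, 0.5),
}

def classify_gesture(emg_features):
    """Return the gesture label whose centroid is closest to the features."""
    def dist2(centroid):
        return sum((a - b) ** 2 for a, b in zip(emg_features, centroid))
    return min(GESTURE_CENTROIDS, key=lambda g: dist2(GESTURE_CENTROIDS[g]))

def map_to_command(gesture, forearm_rpy, mode):
    """Mode-dependent mapping from gesture + arm orientation to a command."""
    roll, pitch, yaw = forearm_rpy
    if mode == "drive" and gesture == "point":
        # Supervisory point-to-goal: arm heading selects the goal direction.
        return Command("drive", "goto_point", (yaw,))
    if mode == "manipulator" and gesture == "fist":
        # Virtual joystick: pitch/yaw of the forearm act as joystick axes.
        return Command("manipulator", "joystick", (pitch, yaw))
    if mode == "camera" and gesture == "open_hand":
        return Command("camera", "pan_tilt", (yaw, pitch))
    return Command(mode, "idle", ())

cmd = map_to_command(classify_gesture((0.25, 0.75)), (0.0, 0.1, 0.6), "drive")
print(cmd.action)  # -> "goto_point"
```

The mode switch mirrors the abstract's three command modes; in a real system the classifier would be trained on windowed sEMG features and the IMU orientation filtered, neither of which is detailed here.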