MoMa: An assistive mobile manipulator with a webcam-based gaze control system

HardwareX · IF 2.0 · Q3 (Engineering, Electrical & Electronic) · Publication date: 2024-10-19 · DOI: 10.1016/j.ohx.2024.e00599
James Dominic O. Go, Neal Garnett T. Ong, Carlo A. Rafanan, Brian G. Tan, Timothy Scott C. Chu
{"title":"MoMa:带有基于网络摄像头的凝视控制系统的辅助移动机械手","authors":"James Dominic O. Go,&nbsp;Neal Garnett T. Ong,&nbsp;Carlo A. Rafanan,&nbsp;Brian G. Tan,&nbsp;Timothy Scott C. Chu","doi":"10.1016/j.ohx.2024.e00599","DOIUrl":null,"url":null,"abstract":"<div><div><em>Mobile Manipulators (MoMa) is a category of mobile robots</em> designed to assist people with motor disabilities to perform object retrieval tasks using a webcam-based gaze control system. Using off-the-shelf components such as reproducible acrylic and 3D-printed plates, and a webcam for eye tracking, <em>MoMa</em> serves as an inexpensive, open-source, and customizable solution in assistive robotics. The robotic system consists of a mobile base that can move forward and backward, as well as turn in place; and a 2-axis cartesian arm equipped with a claw gripper that opens and closes. The simple movement of the robot also allows for a simple control method and graphical user interface (GUI). The user receives information about what is in front of the robot through a mounted camera, and, by looking at parts of the screen that correspond to controls, has their gaze predicted by a convolutional neural network and sends commands wirelessly. The performance of the entire system has been validated through testing of the gaze prediction model, the integration of the control system, as well as its task completion capabilities. All the design, construction and software files are freely available online under the CC BY 4.0 license at <span><span>https://doi.org/10.17632/k7yfn6wdv7.2</span><svg><path></path></svg></span>.</div></div>","PeriodicalId":37503,"journal":{"name":"HardwareX","volume":null,"pages":null},"PeriodicalIF":2.0000,"publicationDate":"2024-10-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"MoMa: An assistive mobile manipulator with a webcam-based gaze control system\",\"authors\":\"James Dominic O. Go,&nbsp;Neal Garnett T. Ong,&nbsp;Carlo A. Rafanan,&nbsp;Brian G. Tan,&nbsp;Timothy Scott C. Chu\",\"doi\":\"10.1016/j.ohx.2024.e00599\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div><em>Mobile Manipulators (MoMa) is a category of mobile robots</em> designed to assist people with motor disabilities to perform object retrieval tasks using a webcam-based gaze control system. Using off-the-shelf components such as reproducible acrylic and 3D-printed plates, and a webcam for eye tracking, <em>MoMa</em> serves as an inexpensive, open-source, and customizable solution in assistive robotics. The robotic system consists of a mobile base that can move forward and backward, as well as turn in place; and a 2-axis cartesian arm equipped with a claw gripper that opens and closes. The simple movement of the robot also allows for a simple control method and graphical user interface (GUI). The user receives information about what is in front of the robot through a mounted camera, and, by looking at parts of the screen that correspond to controls, has their gaze predicted by a convolutional neural network and sends commands wirelessly. The performance of the entire system has been validated through testing of the gaze prediction model, the integration of the control system, as well as its task completion capabilities. 
All the design, construction and software files are freely available online under the CC BY 4.0 license at <span><span>https://doi.org/10.17632/k7yfn6wdv7.2</span><svg><path></path></svg></span>.</div></div>\",\"PeriodicalId\":37503,\"journal\":{\"name\":\"HardwareX\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":2.0000,\"publicationDate\":\"2024-10-19\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"HardwareX\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S2468067224000932\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q3\",\"JCRName\":\"ENGINEERING, ELECTRICAL & ELECTRONIC\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"HardwareX","FirstCategoryId":"1085","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S2468067224000932","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"ENGINEERING, ELECTRICAL & ELECTRONIC","Score":null,"Total":0}
Citations: 0

Abstract

Mobile Manipulators (MoMa) are a category of mobile robots designed to assist people with motor disabilities in performing object retrieval tasks using a webcam-based gaze control system. Built from off-the-shelf components such as reproducible acrylic and 3D-printed plates, with a webcam for eye tracking, MoMa serves as an inexpensive, open-source, and customizable solution in assistive robotics. The robotic system consists of a mobile base that can move forward and backward as well as turn in place, and a 2-axis Cartesian arm equipped with a claw gripper that opens and closes. The robot's simple motions allow for an equally simple control method and graphical user interface (GUI). The user sees what is in front of the robot through a mounted camera; by looking at the parts of the screen that correspond to controls, they have their gaze predicted by a convolutional neural network, and the corresponding commands are sent to the robot wirelessly. The performance of the entire system has been validated through testing of the gaze prediction model, the integration of the control system, and its task completion capabilities. All design, construction, and software files are freely available online under the CC BY 4.0 license at https://doi.org/10.17632/k7yfn6wdv7.2.
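The control pipeline the abstract describes (webcam gaze prediction, on-screen control regions, wireless command transmission) can be illustrated with a minimal sketch. The 3x3 grid layout, command strings, gaze_to_command/send_command helpers, and the UDP transport below are illustrative assumptions rather than the authors' implementation; the actual code is in the project files at the DOI above.

```python
# Minimal sketch of a gaze-to-command mapping, assuming the GUI is split
# into a fixed grid of control regions and commands are sent to the robot
# as short UDP text packets. Region names, command strings, and the network
# address are placeholders, not the published implementation.
import socket

# Hypothetical 3x3 grid of on-screen control regions, indexed (row, col).
REGION_COMMANDS = {
    (0, 1): "FORWARD",
    (2, 1): "BACKWARD",
    (1, 0): "TURN_LEFT",
    (1, 2): "TURN_RIGHT",
    (0, 0): "ARM_UP",
    (0, 2): "ARM_DOWN",
    (2, 0): "GRIPPER_OPEN",
    (2, 2): "GRIPPER_CLOSE",
    (1, 1): "STOP",
}

ROBOT_ADDR = ("127.0.0.1", 5005)  # placeholder robot address/port


def gaze_to_command(gx: float, gy: float) -> str:
    """Map a normalized gaze point (gx, gy in [0, 1]) to a command string."""
    col = min(int(gx * 3), 2)
    row = min(int(gy * 3), 2)
    return REGION_COMMANDS[(row, col)]


def send_command(cmd: str, sock: socket.socket) -> None:
    """Transmit the selected command wirelessly (here: one UDP packet)."""
    sock.sendto(cmd.encode("ascii"), ROBOT_ADDR)


if __name__ == "__main__":
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    # In the real system the coordinates would come from the CNN gaze
    # predictor running on the webcam feed; a fixed point is used here.
    predicted_gaze = (0.5, 0.15)  # near the top-centre region -> FORWARD
    send_command(gaze_to_command(*predicted_gaze), sock)
    sock.close()
```

In gaze interfaces of this kind, a dwell-time threshold (holding the gaze on a region for a second or two before the command fires) is a common addition to avoid unintended activations.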
Source journal: HardwareX (Engineering – Industrial and Manufacturing Engineering)
CiteScore: 4.10
Self-citation rate: 18.20%
Articles per year: 124
Review time: 24 weeks
Aims and scope: HardwareX is an open access journal established to promote free and open source designing, building and customizing of scientific infrastructure (hardware). HardwareX aims to recognize researchers for the time and effort in developing scientific infrastructure while providing end-users with sufficient information to replicate and validate the advances presented. HardwareX is open to input from all scientific, technological and medical disciplines. Scientific infrastructure will be interpreted in the broadest sense, including hardware modifications to existing infrastructure; sensors and tools that perform measurements and other functions outside of the traditional lab setting (such as wearables, air/water quality sensors, and low cost alternatives to existing tools); and the creation of wholly new tools for either standard or novel laboratory tasks. Authors are encouraged to submit hardware developments that address all aspects of science, not only the final measurement, for example, enhancements in sample preparation and handling, user safety, and quality control. The use of distributed digital manufacturing strategies (e.g. 3-D printing) is encouraged. All designs must be submitted under an open hardware license.