James Dominic O. Go, Neal Garnett T. Ong, Carlo A. Rafanan, Brian G. Tan, Timothy Scott C. Chu
{"title":"MoMa:带有基于网络摄像头的凝视控制系统的辅助移动机械手","authors":"James Dominic O. Go, Neal Garnett T. Ong, Carlo A. Rafanan, Brian G. Tan, Timothy Scott C. Chu","doi":"10.1016/j.ohx.2024.e00599","DOIUrl":null,"url":null,"abstract":"<div><div><em>Mobile Manipulators (MoMa) is a category of mobile robots</em> designed to assist people with motor disabilities to perform object retrieval tasks using a webcam-based gaze control system. Using off-the-shelf components such as reproducible acrylic and 3D-printed plates, and a webcam for eye tracking, <em>MoMa</em> serves as an inexpensive, open-source, and customizable solution in assistive robotics. The robotic system consists of a mobile base that can move forward and backward, as well as turn in place; and a 2-axis cartesian arm equipped with a claw gripper that opens and closes. The simple movement of the robot also allows for a simple control method and graphical user interface (GUI). The user receives information about what is in front of the robot through a mounted camera, and, by looking at parts of the screen that correspond to controls, has their gaze predicted by a convolutional neural network and sends commands wirelessly. The performance of the entire system has been validated through testing of the gaze prediction model, the integration of the control system, as well as its task completion capabilities. All the design, construction and software files are freely available online under the CC BY 4.0 license at <span><span>https://doi.org/10.17632/k7yfn6wdv7.2</span><svg><path></path></svg></span>.</div></div>","PeriodicalId":37503,"journal":{"name":"HardwareX","volume":null,"pages":null},"PeriodicalIF":2.0000,"publicationDate":"2024-10-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"MoMa: An assistive mobile manipulator with a webcam-based gaze control system\",\"authors\":\"James Dominic O. Go, Neal Garnett T. Ong, Carlo A. 
Rafanan, Brian G. Tan, Timothy Scott C. Chu\",\"doi\":\"10.1016/j.ohx.2024.e00599\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div><em>Mobile Manipulators (MoMa) is a category of mobile robots</em> designed to assist people with motor disabilities to perform object retrieval tasks using a webcam-based gaze control system. Using off-the-shelf components such as reproducible acrylic and 3D-printed plates, and a webcam for eye tracking, <em>MoMa</em> serves as an inexpensive, open-source, and customizable solution in assistive robotics. The robotic system consists of a mobile base that can move forward and backward, as well as turn in place; and a 2-axis cartesian arm equipped with a claw gripper that opens and closes. The simple movement of the robot also allows for a simple control method and graphical user interface (GUI). The user receives information about what is in front of the robot through a mounted camera, and, by looking at parts of the screen that correspond to controls, has their gaze predicted by a convolutional neural network and sends commands wirelessly. The performance of the entire system has been validated through testing of the gaze prediction model, the integration of the control system, as well as its task completion capabilities. 
All the design, construction and software files are freely available online under the CC BY 4.0 license at <span><span>https://doi.org/10.17632/k7yfn6wdv7.2</span><svg><path></path></svg></span>.</div></div>\",\"PeriodicalId\":37503,\"journal\":{\"name\":\"HardwareX\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":2.0000,\"publicationDate\":\"2024-10-19\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"HardwareX\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S2468067224000932\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q3\",\"JCRName\":\"ENGINEERING, ELECTRICAL & ELECTRONIC\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"HardwareX","FirstCategoryId":"1085","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S2468067224000932","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"ENGINEERING, ELECTRICAL & ELECTRONIC","Score":null,"Total":0}
Citations: 0
MoMa: An assistive mobile manipulator with a webcam-based gaze control system
Mobile Manipulators (MoMa) is a category of mobile robots designed to assist people with motor disabilities in performing object-retrieval tasks using a webcam-based gaze control system. Built from off-the-shelf components, reproducible acrylic and 3D-printed plates, and a webcam for eye tracking, MoMa serves as an inexpensive, open-source, and customizable solution in assistive robotics. The robotic system consists of a mobile base that can move forward and backward and turn in place, plus a 2-axis Cartesian arm equipped with a claw gripper that opens and closes. The robot's simple motion set also allows for a simple control method and graphical user interface (GUI). A mounted camera shows the user what is in front of the robot; when the user looks at the on-screen regions that correspond to controls, a convolutional neural network predicts their gaze and the matching command is sent to the robot wirelessly. The performance of the entire system has been validated through testing of the gaze prediction model, the integration of the control system, and its task-completion capabilities. All design, construction, and software files are freely available online under the CC BY 4.0 license at https://doi.org/10.17632/k7yfn6wdv7.2.
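The control scheme described above, where looking at a screen region corresponding to a control issues a command, can be sketched as a simple gaze-point-to-command mapping. The grid layout, region names, and command strings below are illustrative assumptions, not the actual MoMa GUI (which is defined in the project's released files):

```python
def gaze_to_command(x: float, y: float, width: int = 1280, height: int = 720) -> str:
    """Map a predicted gaze point (in screen pixels) to a robot command.

    Hypothetical layout: the screen is split into a 3x2 grid of control
    regions (the real MoMa GUI layout may differ):
        top row:    turn_left | forward  | turn_right
        bottom row: open_claw | backward | close_claw
    """
    col = min(int(3 * x / width), 2)   # column index 0..2
    row = min(int(2 * y / height), 1)  # row index 0..1
    grid = [
        ["turn_left", "forward", "turn_right"],
        ["open_claw", "backward", "close_claw"],
    ]
    return grid[row][col]
```

In a full pipeline, the (x, y) input would come from the gaze prediction network and the returned command string would be serialized and sent over the wireless link to the mobile base and arm.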
HardwareX · Engineering: Industrial and Manufacturing Engineering
CiteScore: 4.10
Self-citation rate: 18.20%
Articles published: 124
Review time: 24 weeks
Journal overview:
HardwareX is an open access journal established to promote the free and open-source design, building, and customization of scientific infrastructure (hardware). HardwareX aims to recognize researchers for the time and effort spent developing scientific infrastructure, while providing end users with sufficient information to replicate and validate the advances presented. HardwareX is open to input from all scientific, technological, and medical disciplines. Scientific infrastructure will be interpreted in the broadest sense. This includes hardware modifications to existing infrastructure; sensors and tools that perform measurements and other functions outside the traditional lab setting (such as wearables, air/water quality sensors, and low-cost alternatives to existing tools); and the creation of wholly new tools for either standard or novel laboratory tasks. Authors are encouraged to submit hardware developments that address all aspects of science, not only the final measurement, for example, enhancements in sample preparation and handling, user safety, and quality control. The use of distributed digital manufacturing strategies (e.g., 3-D printing) is encouraged. All designs must be submitted under an open hardware license.