A Spatio-Temporal Prediction and Planning Framework for Proactive Human-Robot Collaboration

IF 2.4 · CAS Tier 3 (Engineering) · Q3 ENGINEERING, MANUFACTURING · Journal of Manufacturing Science and Engineering-Transactions of the ASME · Pub Date: 2023-10-17 · DOI: 10.1115/1.4063502
Jared Flowers, Gloria Wiens
Citations: 1

A Spatio-Temporal Prediction and Planning Framework for Proactive Human-Robot Collaboration
Abstract A significant challenge in human–robot collaboration (HRC) is coordinating robot and human motions. Discoordination can lead to production delays and human discomfort. Prior works seek coordination by planning robot paths that consider humans or their anticipated occupancy as static obstacles, making them nearsighted and prone to entrapment by human motion. This work presents the spatio-temporal avoidance of predictions-prediction and planning framework (STAP-PPF) to improve robot–human coordination in HRC. STAP-PPF predicts multi-step human motion sequences based on the locations of objects the human manipulates. STAP-PPF then proactively determines time-optimal robot paths considering predicted human motion and robot speed restrictions anticipated according to the ISO15066 speed and separation monitoring (SSM) mode. When executing robot paths, STAP-PPF continuously updates human motion predictions. In real-time, STAP-PPF warps the robot’s path to account for continuously updated human motion predictions and updated SSM effects to mitigate delays and human discomfort. Results show the STAP-PPF generates robot trajectories of shorter duration. STAP-PPF robot trajectories also adapted better to real-time human motion deviation. STAP-PPF robot trajectories also maintain greater robot/human separation throughout tasks requiring close human–robot interaction. Tests with an assembly sequence demonstrate STAP-PPF’s ability to predict multi-step human tasks and plan robot motions for the sequence. STAP-PPF also most accurately estimates robot trajectory durations, within 30% of actual, which can be used to adapt the robot sequencing to minimize disruption.
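The abstract's reference to the ISO/TS 15066 speed and separation monitoring (SSM) mode can be made concrete with a minimal sketch: the robot's allowed speed is capped so that it can come to rest before the human can close the remaining separation. This is not the paper's implementation; the simplified stopping model and all parameter values (human speed, reaction and stopping times, deceleration, clearance) are assumptions chosen only for illustration.

```python
# Illustrative SSM-style speed cap (assumed model, not the authors' code):
# during the robot's reaction + stopping interval the human may advance
# toward the robot, and the robot itself travels while braking; the cap
# is the largest robot speed for which the two never meet.

def max_robot_speed(separation, human_speed=1.6,
                    reaction_time=0.1, stop_time=0.3,
                    robot_decel=2.0, clearance=0.2):
    """Return a conservative robot speed cap (m/s) for a given
    human-robot separation (m). All defaults are illustrative."""
    # Distance the human can cover before the robot is fully stopped.
    human_travel = human_speed * (reaction_time + stop_time)
    # Separation budget left for the robot's own motion while stopping.
    budget = separation - human_travel - clearance
    if budget <= 0.0:
        return 0.0  # too close: the robot must already be stopped
    # Robot travels v*reaction_time + v**2 / (2*robot_decel) while
    # stopping; solve v*Tr + v^2/(2a) = budget for the positive root v.
    a = 1.0 / (2.0 * robot_decel)
    b = reaction_time
    c = -budget
    return (-b + (b * b - 4.0 * a * c) ** 0.5) / (2.0 * a)

# At 1.0 m separation the robot may still move; at 0.3 m it must stop.
print(max_robot_speed(1.0))
print(max_robot_speed(0.3))
```

A planner like the one described in the abstract would evaluate such a cap along each candidate path against the predicted human positions, which is what makes the resulting trajectories time-optimal with respect to the anticipated speed restrictions rather than only collision-free.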
Source journal
CiteScore: 6.80
Self-citation rate: 20.00%
Articles published: 126
Review time: 12 months
Journal description: Areas of interest include, but are not limited to: Additive manufacturing; Advanced materials and processing; Assembly; Biomedical manufacturing; Bulk deformation processes (e.g., extrusion, forging, wire drawing, etc.); CAD/CAM/CAE; Computer-integrated manufacturing; Control and automation; Cyber-physical systems in manufacturing; Data science-enhanced manufacturing; Design for manufacturing; Electrical and electrochemical machining; Grinding and abrasive processes; Injection molding and other polymer fabrication processes; Inspection and quality control; Laser processes; Machine tool dynamics; Machining processes; Materials handling; Metrology; Micro- and nano-machining and processing; Modeling and simulation; Nontraditional manufacturing processes; Plant engineering and maintenance; Powder processing; Precision and ultra-precision machining; Process engineering; Process planning; Production systems optimization; Rapid prototyping and solid freeform fabrication; Robotics and flexible tooling; Sensing, monitoring, and diagnostics; Sheet and tube metal forming; Sustainable manufacturing; Tribology in manufacturing; Welding and joining
Latest articles from this journal:
CONTINUOUS STEREOLITHOGRAPHY 3D PRINTING OF MULTI-NETWORK HYDROGELS IN TRIPLY PERIODIC MINIMAL STRUCTURES (TPMS) WITH TUNABLE MECHANICAL STRENGTH FOR ENERGY ABSORPTION
A Review of Prospects and Opportunities in Disassembly with Human-Robot Collaboration
The Effect of Microstructure on the Machinability of Natural Fiber Reinforced Plastic Composites: A Novel Explainable Machine Learning (XML) Approach
A Digital Twin-based environment-adaptive assignment method for human-robot collaboration
Combining Flexible and Sustainable Design Principles for Evaluating Designs: Textile Recycling Application