Photogrammetry-based Dynamic Path Tracking of Industrial Robots Using Adaptive Neuro-PID Control Method and Robust Kalman Filter

Jianyu Tang, Tao Zhou, E. Zakeri, Tingting Shu, W. Xie
{"title":"Photogrammetry-based Dynamic Path Tracking of Industrial Robots Using Adaptive Neuro-PID Control Method and Robust Kalman Filter","authors":"Jianyu Tang, Tao Zhou, E. Zakeri, Tingting Shu, W. Xie","doi":"10.1109/ICARA56516.2023.10125681","DOIUrl":null,"url":null,"abstract":"This paper proposes a novel accurate dynamic path tracking (DPT) method for industrial robots based on photogrammetry sensors and an adaptive neuro-PID (ANPID) control method. First, the pose of the robot's end-effector is detected by the photogrammetry sensor (C-Track stereo camera). It passes through a robust Kalman filter to reduce the noise in the signals. Then, the filtered signals are fed to the ANPID, whose gains are tuned online using an adaptive multi-layer perceptron neural network (AMLPNN). The steepest descent optimization method is adopted online. The cost function is the least mean square of the system states errors. Experimental results on FANUC M-20iA robot show the tracking accuracy reaches ±0.08mm and ±0.04deg, which exhibits the superiority of the proposed method over the conventional methods such as PID (tracking error±0.2mm and ±0.1deg) [4].","PeriodicalId":443572,"journal":{"name":"2023 9th International Conference on Automation, Robotics and Applications (ICARA)","volume":"265 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-02-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2023 9th International Conference on Automation, Robotics and Applications (ICARA)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICARA56516.2023.10125681","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 2

Abstract

This paper proposes a novel, accurate dynamic path tracking (DPT) method for industrial robots based on photogrammetry sensors and an adaptive neuro-PID (ANPID) control method. First, the pose of the robot's end-effector is measured by the photogrammetry sensor (a C-Track stereo camera) and passed through a robust Kalman filter to reduce the noise in the signals. The filtered signals are then fed to the ANPID controller, whose gains are tuned online by an adaptive multi-layer perceptron neural network (AMLPNN). The network is updated online with the steepest-descent optimization method, using the mean square of the system state errors as the cost function. Experimental results on a FANUC M-20iA robot show that the tracking accuracy reaches ±0.08 mm and ±0.04 deg, demonstrating the superiority of the proposed method over conventional methods such as PID (tracking error ±0.2 mm and ±0.1 deg) [4].
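The abstract describes the gain-tuning idea only at a high level. As a rough illustration of online PID gain adaptation by steepest descent on a squared-error cost, the following minimal single-axis Python sketch updates the gains from the instantaneous tracking error. The class name, initial gains, learning rate, toy plant, and the sign-only plant-sensitivity approximation are all illustrative assumptions, not from the paper, which instead tunes the gains with an adaptive multi-layer perceptron neural network (AMLPNN) acting on the Kalman-filtered 6-DOF pose.

```python
class AdaptivePID:
    """Single-axis PID whose gains are adapted online by steepest descent on
    the instantaneous squared tracking error J = 0.5 * e^2.
    Illustrative sketch only; the paper produces the gains with an AMLPNN."""

    def __init__(self, kp=1.0, ki=0.1, kd=0.05, lr=1e-4, dt=0.01):
        self.kp, self.ki, self.kd = kp, ki, kd   # initial gains (assumed values)
        self.lr = lr                             # steepest-descent step size
        self.dt = dt                             # control period [s]
        self.e_int = 0.0                         # running integral of the error
        self.e_prev = 0.0                        # previous error, for the derivative term

    def step(self, reference, measurement, plant_gain_sign=1.0):
        e = reference - measurement
        self.e_int += e * self.dt
        e_der = (e - self.e_prev) / self.dt

        # PID control law
        u = self.kp * e + self.ki * self.e_int + self.kd * e_der

        # Steepest descent on J = 0.5 * e^2:
        # dJ/dK = -e * (dy/du) * (du/dK). The plant sensitivity dy/du is unknown,
        # so only its sign is used here (a common model-free simplification,
        # not the paper's AMLPNN-based update).
        grad_common = -e * plant_gain_sign
        self.kp -= self.lr * grad_common * e
        self.ki -= self.lr * grad_common * self.e_int
        self.kd -= self.lr * grad_common * e_der

        self.e_prev = e
        return u


# Example: track a step reference on a toy first-order plant (illustrative only).
pid = AdaptivePID()
y = 0.0
for _ in range(1000):
    u = pid.step(reference=1.0, measurement=y)
    y += pid.dt * (-y + u)   # toy plant: dy/dt = -y + u
```

In the paper the adaptation signal is propagated through the AMLPNN rather than using this sign-of-sensitivity shortcut; the sketch only conveys the steepest-descent adaptation loop that drives the online gain tuning.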