High-precision machining with robots is an open challenge. Achieving precise dimensional and geometrical features with robotic machining requires compensation via feedback control, which in turn relies on accurate error prediction. Machining error prediction is a complex problem in high-precision manufacturing: effective solutions must accurately estimate geometrical errors across different workpieces while minimizing quality inspection costs, and are further constrained by the need for real-time estimation to support feedback control. This paper introduces a novel approach for predicting the quality of milled workpieces using low-cost, in-process signals and machine learning. The proposed method fuses internal machine controller commands, comprising the end-effector trajectory coordinates and the angular changes of the six revolute joints of the robotic arm, with external laser tracker signals that capture the real trajectory of the milling tool, and predicts dimensional errors as would be obtained by a Coordinate Measuring Machine (CMM). Because the dependence of the part dimensional error on the available signals is not known a priori, models with varying combinations of sensors and varying lengths of the historical time window included in the model were evaluated. In addition, five machine learning algorithms were selected, trained, evaluated, and validated on data from two distinct workpieces and various spatial configurations. The best machine learning model achieved a sevenfold improvement in dimensional error prediction over using laser tracker data alone, reducing the mean absolute error from 0.0756 mm to 0.0097 mm. This study demonstrates the feasibility of using low-cost, in-process sensing signals to predict high-precision dimensional quality data normally measured by costly CMMs, enabling rapid part quality inspection and significant potential cost reduction.
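
The following is a minimal sketch, not the authors' implementation, of the kind of pipeline the abstract describes: fusing controller commands (end-effector XYZ plus six joint angles) with laser tracker XYZ into sliding-window features, then training a regressor to predict the CMM-measured dimensional error and reporting the mean absolute error. The window length, the choice of random forest, and the synthetic data are illustrative assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic in-process signals (assumed channel layout):
# 3 commanded end-effector XYZ + 6 joint angles + 3 laser-tracker XYZ.
n_samples, n_channels = 2000, 12
signals = rng.normal(size=(n_samples, n_channels))

# Synthetic target: dimensional error (mm) at each sampling instant,
# loosely coupled to the signals so the model has something to learn.
dim_error = 0.05 * signals[:, :3].sum(axis=1) + 0.01 * rng.normal(size=n_samples)

def window_features(x, y, window):
    """Stack the last `window` samples of every channel into one feature vector."""
    feats = np.stack([x[i - window:i].ravel() for i in range(window, len(x))])
    return feats, y[window:]

# Assumed historical time window of 10 samples per prediction.
X, y = window_features(signals, dim_error, window=10)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)
print(f"MAE: {mean_absolute_error(y_test, model.predict(X_test)):.4f} mm")
```

In the same spirit as the study, the window length and the subset of fused channels would be treated as hyperparameters and compared across several learning algorithms.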