Vision-Based Velocity Estimations for Autonomous Mobile Robots
S. J. A. Bakar, Richmond Tay Kim Hui, P. Goh, N. S. Ahmad
2023 International Conference on Energy, Power, Environment, Control, and Computing (ICEPECC), 2023-03-08
DOI: 10.1109/ICEPECC57281.2023.10209483
Citations: 0
Abstract
Nowadays, the deployment of autonomous mobile robots (AMRs) in warehouses and factories is becoming increasingly prevalent. Various sensors are often fitted to the robot to identify adjacent objects for navigation. To maintain performance in a dynamic environment, the robot must be able to predict the velocity of surrounding obstacles before a local path can be generated. In this work, a vision-based velocity estimation technique is proposed using pattern matching via LabVIEW machine vision. The focus is on estimating the velocity of oncoming obstacles, which may be robots, humans, or both. Three real-time experiments were conducted to evaluate the performance of the approach. In these experiments, the proposed technique yields average estimation errors of no greater than 0.8° for angle and 0.41 cm/s for speed for single-obstacle detections. For multiple-obstacle detections, average errors of no greater than 1.42° for angle and 2 cm/s for speed were obtained. Based on the recorded numerical results, the AMR is able to make an avoidance decision when the obstacle is at least 39 cm away, which is sufficient to avoid collision.
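The abstract does not detail how speed and angle are derived from the pattern-matching output, but a common approach is to track the matched template's centroid across consecutive frames and convert the pixel displacement into a velocity. The sketch below illustrates this idea in Python; it is not the authors' LabVIEW implementation, and the function name, the calibration factor `scale_cm_per_px`, and the frame interval are all illustrative assumptions.

```python
import math

def estimate_velocity(pos_prev, pos_curr, dt, scale_cm_per_px=1.0):
    """Estimate an obstacle's speed (cm/s) and heading angle (degrees)
    from template-match centroids in two consecutive frames.

    pos_prev, pos_curr: (x, y) pixel positions of the matched pattern.
    dt: time between the two frames, in seconds.
    scale_cm_per_px: hypothetical camera calibration factor (assumed).
    """
    # Pixel displacement converted to centimetres via the calibration factor.
    dx = (pos_curr[0] - pos_prev[0]) * scale_cm_per_px
    dy = (pos_curr[1] - pos_prev[1]) * scale_cm_per_px
    # Speed is the displacement magnitude divided by the frame interval.
    speed = math.hypot(dx, dy) / dt
    # Heading angle measured from the image x-axis, in degrees.
    angle = math.degrees(math.atan2(dy, dx))
    return speed, angle
```

For example, an obstacle whose matched centroid moves from (100, 100) to (103, 104) between frames 0.1 s apart, at 1 cm per pixel, is estimated at 50 cm/s with a heading of about 53.1°.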