Loading of hanging trolleys on overhead conveyor with industrial robots
Pub Date: 2015-05-11 | DOI: 10.1109/TePRA.2015.7219677
Torstein A. Myhre, A. Transeth, O. Egeland
Handling moving objects with robot manipulators is a challenging task, as it requires tracking objects with high accuracy. An industrial application of this type is the loading and unloading of objects on an overhead conveyor. This paper presents a robotic solution to this problem, describing a method for the interaction of an industrial robot with a free-swinging object. Our approach is based on visual tracking using particle filtering, where the equations of motion of the object are included in the filtering algorithm. The first contribution of this paper is the use of the Fisher information matrix to quantify the information content of each image feature. In particular, the Fisher information matrix is used to construct a weighted likelihood function. This improves the robustness of the tracking algorithm significantly compared to the standard approach based on an unweighted likelihood function. The second contribution is that we detect occluded image features and exclude them from the calculation of the likelihood function, further improving its quality. We demonstrate the improved performance of the proposed method in experiments involving the automatic loading of trolleys hanging from a moving overhead conveyor.
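The abstract does not include the authors' implementation, but the weighting idea can be sketched as follows: each feature's Gaussian log-likelihood term is scaled by a weight derived from its Fisher information, here collapsed to a scalar via the trace. The feature model `features_fn`, the shared noise variance, and the weight normalization are illustrative assumptions, not the published algorithm.

```python
# Minimal sketch (not the authors' code): a particle-filter measurement
# update where each image feature's log-likelihood is weighted by a
# scalar derived from its Fisher information.
import numpy as np

def fisher_weight(jacobian, noise_var):
    """Scalar information content of one feature: trace of J^T R^-1 J."""
    info = jacobian.T @ jacobian / noise_var   # Fisher information matrix
    return np.trace(info)                      # collapse to a scalar weight

def weighted_log_likelihood(residuals, jacobians, noise_var):
    """Sum of per-feature Gaussian log-likelihoods, Fisher-weighted."""
    weights = np.array([fisher_weight(J, noise_var) for J in jacobians])
    weights /= weights.sum()                   # normalize weights to sum to 1
    total = 0.0
    for w, r in zip(weights, residuals):
        total += w * (-0.5 * r @ r / noise_var)
    return total

def measurement_update(particles, log_weights, features_fn):
    """Re-weight particles by the weighted likelihood of their predictions.
    features_fn is a hypothetical hook returning predicted-vs-observed
    feature residuals and their Jacobians for a given object state."""
    for i, x in enumerate(particles):
        residuals, jacobians = features_fn(x)
        log_weights[i] += weighted_log_likelihood(residuals, jacobians, 1.0)
    log_weights -= np.max(log_weights)         # numerical stabilization
    return np.exp(log_weights) / np.exp(log_weights).sum()
```

Occluded features would simply be dropped from `residuals`/`jacobians` before this update, matching the paper's second contribution.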
{"title":"Loading of hanging trolleys on overhead conveyor with industrial robots","authors":"Torstein A. Myhre, A. Transeth, O. Egeland","doi":"10.1109/TePRA.2015.7219677","DOIUrl":"https://doi.org/10.1109/TePRA.2015.7219677","url":null,"abstract":"Handling moving objects with robot manipulators is a challenging task as it involves tracking of objects with high accuracy. An industrial application of this type is the loading and unloading of objects on an overhead conveyor. A robotic solution to this problem is presented in this paper, where we describe a method for the interaction of an industrial robot and a free swinging object. Our approach is based on visual tracking using particle filtering where the equations of motion of the object are included in the filtering algorithm. The first contribution of this paper is that the Fisher information matrix is used to quantify the information content from each image feature. In particular, the Fisher information matrix is used to construct a weighted likelihood function. This improves the robustness of tracking algorithm significantly compared to the standard approach based on an unweighted likelihood function. The second contribution of this paper is that we detect occluded image features, and avoid the use of these features in the calculation of the likelihood function. This further improves the quality of the likelihood function. We demonstrate the improved performance of the proposed method in experiments involving the automatic loading of trolleys hanging from a moving overhead conveyor.","PeriodicalId":325788,"journal":{"name":"2015 IEEE International Conference on Technologies for Practical Robot Applications (TePRA)","volume":"24 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-05-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132019736","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A virtual demonstrator environment for robot imitation learning
Pub Date: 2015-05-11 | DOI: 10.1109/TePRA.2015.7219691
Di-Wei Huang, Garrett E. Katz, J. Langsfeld, R. Gentili, J. Reggia
To support studies in robot imitation learning, this paper presents a software platform, SMILE (Simulator for Maryland Imitation Learning Environment), specifically targeting tasks in which exact human motions are not critical. We hypothesize that in this class of tasks, object behaviors are far more important than human behaviors, and thus one can significantly reduce complexity by not processing human motions at all. As such, SMILE simulates a virtual environment in which a human demonstrator can manipulate objects using GUI controls without any body parts being visible to a robot in the same environment. Imitation learning is therefore based solely on the behaviors of the manipulated objects. SMILE also provides a simple Matlab interface for programming a simulated robot, along with an XML interface for initializing objects in the virtual environment. SMILE lowers the barriers to studying robot imitation learning by (1) simplifying learning by making the human demonstrator a virtual presence and (2) eliminating the immediate need to purchase special motion-capture equipment.
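SMILE's actual Matlab and XML interfaces are not reproduced here; the sketch below, with hypothetical names, only illustrates the central premise that a demonstration can be reduced to object-behavior data: one object's pose trajectory is segmented into pick/place-style events with no human motion input at all.

```python
# Illustrative sketch only (not SMILE's API): reduce a demonstration to
# per-object pose trajectories and detect events where an object starts
# or stops moving.
import numpy as np

def segment_object_events(times, poses, vel_eps=1e-3):
    """Split one object's trajectory into pick/place events.
    times: (N,) timestamps; poses: (N, 7) position + quaternion rows."""
    positions = poses[:, :3]                       # x, y, z columns
    speeds = np.linalg.norm(np.diff(positions, axis=0), axis=1) / np.diff(times)
    moving = speeds > vel_eps
    events = []
    for k in range(1, len(moving)):
        if moving[k] and not moving[k - 1]:
            events.append(("pick", times[k]))      # object starts moving
        elif not moving[k] and moving[k - 1]:
            events.append(("place", times[k]))     # object comes to rest
    return events
```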
{"title":"A virtual demonstrator environment for robot imitation learning","authors":"Di-Wei Huang, Garrett E. Katz, J. Langsfeld, R. Gentili, J. Reggia","doi":"10.1109/TePRA.2015.7219691","DOIUrl":"https://doi.org/10.1109/TePRA.2015.7219691","url":null,"abstract":"To support studies in robot imitation learning, this paper presents a software platform, SMILE (Simulator for Maryland Imitation Learning Environment), specifically targeting tasks in which exact human motions are not critical. We hypothesize that in this class of tasks, object behaviors are far more important than human behaviors, and thus one can significantly reduce complexity by not processing human motions at all. As such, SMILE simulates a virtual environment where a human demonstrator can manipulate objects using GUI controls without body parts being visible to a robot in the same environment. Imitation learning is therefore based on the behaviors of manipulated objects only. A simple Matlab interface for programming a simulated robot is also provided in SMILE, along with an XML interface for initializing objects in the virtual environment. SMILE lowers the barriers for studying robot imitation learning by (1) simplifying learning by making the human demonstrator be a virtual presence and (2) eliminating the immediate need to purchase special equipment for motion capturing.","PeriodicalId":325788,"journal":{"name":"2015 IEEE International Conference on Technologies for Practical Robot Applications (TePRA)","volume":"170 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-05-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121218041","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
TDMEC, a new measure for evaluating the image quality of color images acquired in vision systems
Pub Date: 2015-05-11 | DOI: 10.1109/TePRA.2015.7219662
A. Samani, K. Panetta, S. Agaian
In robotic imaging systems, images are often subject to additive Gaussian noise and additive noise in the color components during image acquisition. These distortions can arise from poor illumination, excessive temperatures, or electronic circuit noise. Imaging systems required to perform real-time enhancement best suited to the human visual system often need parameter selection and optimization, which is achieved using a quality metric for image enhancement. Most image quality assessment algorithms require parameter selection of their own to best assess image quality, and some measures require a reference image to be used alongside the test image for comparison. In this article, we introduce a parameter-free, no-reference metric that can determine the most visually pleasing image for human visual perception. The proposed metric is algorithm-independent, so it can be used with a variety of enhancement algorithms. Measures of enhancement can be categorized as either spatial-domain or transform-domain measures. Here we present a DCT transform-domain measure of enhancement to evaluate color images affected by additive noise during image acquisition in robotics applications. Unlike spatial-domain measures of enhancement, the proposed measure is independent of image attributes and does not require parameter selection. It is applicable to both compressed and non-compressed images and can serve as an enhancement metric for different image enhancement methods on both grayscale and color images.
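The abstract does not state the TDMEC formula, so the following is only a generic block-DCT enhancement measure in the same spirit: per 8x8 block, the ratio of AC to DC energy acts as a contrast score requiring no reference image and no parameters beyond the block size. The block size and log compression are illustrative choices, not the published measure.

```python
# Hypothetical block-DCT contrast measure (not the published TDMEC):
# mean log AC-to-DC energy ratio over non-overlapping 8x8 DCT blocks.
import numpy as np
from scipy.fft import dctn

def block_dct_measure(gray, block=8):
    """gray: 2D float array (one channel). Higher score = more contrast."""
    h, w = gray.shape
    scores = []
    for i in range(0, h - block + 1, block):
        for j in range(0, w - block + 1, block):
            coeffs = dctn(gray[i:i + block, j:j + block], norm="ortho")
            dc = coeffs[0, 0] ** 2                 # DC (mean) energy
            ac = (coeffs ** 2).sum() - dc          # all remaining energy
            if dc > 0:
                scores.append(np.log1p(ac / dc))
    return float(np.mean(scores)) if scores else 0.0
```

For color images, such a measure would typically be applied per channel (or on a luminance channel) and combined, which is again an assumption rather than the paper's procedure.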
{"title":"TDMEC, a new measure for evaluating the image quality of color images acquired in vision systems","authors":"A. Samani, K. Panetta, S. Agaian","doi":"10.1109/TePRA.2015.7219662","DOIUrl":"https://doi.org/10.1109/TePRA.2015.7219662","url":null,"abstract":"In robotic imaging systems, images are often subject to additive Gaussian noise and additive noise in the color components during image acquisition. These distortions can arise from poor illumination, excessive temperatures, or electronic circuit noise. Imaging sensors required to perform real time enhancement of images that is best suited to the human visual system often need parameter selection and optimization. This is achieved by using a quality metric for image enhancement. Most image quality assessment algorithms require parameter selection of their own to best assess the image quality. Some measures require a reference image to be used alongside the test image for comparison. In this article, we introduce a no-parameter no-reference metric that can determine the best visually pleasing image for human visual perception. Our proposed metric is algorithm independent such that it can be utilized for a variety of enhancement algorithms. Measure of enhancement methods can be categorized as either spatial or transform domain based measures. In this article, we present a DCT transform domain measure of enhancement to evaluate color images impacted by additive noise during image acquisition in robotics applications. Unlike the spatial domain measure of enhancement methods, our proposed measure is independent of image attributes and does not require parameter selection. The proposed measure is applicable to compressed and non-compressed images. This measure could be used as an enhancement metric for different image enhancement methods for both grayscale and the color images.","PeriodicalId":325788,"journal":{"name":"2015 IEEE International Conference on Technologies for Practical Robot Applications (TePRA)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-05-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131101809","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Service robots: An industrial perspective
Pub Date: 2015-05-11 | DOI: 10.1109/TePRA.2015.7219679
Sang Choi, Gregory F. Rossano, George Zhang, T. Fuhlbrigge
This paper presents an overview and introduction of commercialized robotic products in the industrial service, maintenance, and repair sectors. General facts about industrial service are briefly described, and we then focus on four specific applications: motor/generator inspection, solar panel inspection/cleaning, tank inspection, and pipe inspection. For each application, the service process characteristics, operational details, technical challenges, and requirements are described. Robotic solutions with commercialized products in each application area are introduced and detailed with their special features and specifications.
{"title":"Service robots: An industrial perspective","authors":"Sang Choi, Gregory F. Rossano, George Zhang, T. Fuhlbrigge","doi":"10.1109/TePRA.2015.7219679","DOIUrl":"https://doi.org/10.1109/TePRA.2015.7219679","url":null,"abstract":"This paper presents the overview and the introduction of commercialized robotic products in the industrial service, maintenance and repair sectors. General facts of the industrial service are briefly described, and then we focus on four specific applications including motor /generator inspection, solar panel inspection /cleaning, tank inspection and pipe inspection. For each application, service process characteristics, operational details, technical challenges, requirements are described. Robotics solutions with commercialized products of each application area were introduced and detailed with special features and specification.","PeriodicalId":325788,"journal":{"name":"2015 IEEE International Conference on Technologies for Practical Robot Applications (TePRA)","volume":"20 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-05-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115935501","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A parallel manipulator for mobile manipulating UAVs
Pub Date: 2015-05-11 | DOI: 10.1109/TePRA.2015.7219682
T. Danko, Kenneth Chaney, P. Oh
Manipulating objects using arms mounted to unmanned aerial vehicles (UAVs) is attractive because UAVs can access many locations that are inaccessible to traditional mobile manipulation platforms such as ground vehicles. Most previous efforts to coordinate a combined manipulator-UAV system have focused on using the manipulator to extend the UAV's reach, assuming that both the UAV and the manipulator can reliably reach commanded goal poses. This work accepts the reality that state-of-the-art UAV positioning is not precise enough to reliably perform simple tasks such as grasping objects. A six-degree-of-freedom parallel manipulator is used to robustly maintain precise end-effector positions despite host UAV perturbations. A unique parallel manipulator design with very little moving mass, which stows easily below a quadrotor UAV, is described along with flight-test results and an analytical comparison to a serial manipulator.
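At the rigid-body level, the compensation idea can be sketched as follows; the assumption is that the real system resolves this target through the parallel mechanism's inverse kinematics, which is not shown here.

```python
# Minimal sketch of end-effector station-keeping under UAV perturbation.
# T_* are 4x4 homogeneous transforms; the manipulator is commanded to the
# pose (in the UAV body frame) that keeps the end effector at a fixed
# world-frame goal despite UAV motion.
import numpy as np

def compensation_target(T_world_uav, T_world_goal):
    """T_uav_ee = inv(T_world_uav) @ T_world_goal: as the UAV drifts,
    this target moves oppositely, cancelling the perturbation."""
    return np.linalg.inv(T_world_uav) @ T_world_goal
```

Running this at the manipulator's control rate, with `T_world_uav` from onboard state estimation, is one plausible realization of the behavior the abstract describes.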
{"title":"A parallel manipulator for mobile manipulating UAVs","authors":"T. Danko, Kenneth Chaney, P. Oh","doi":"10.1109/TePRA.2015.7219682","DOIUrl":"https://doi.org/10.1109/TePRA.2015.7219682","url":null,"abstract":"Manipulating objects using arms mounted to unmanned aerial vehicles (UAVs) is attractive because UAVs may access many locations that are otherwise inaccessible to traditional mobile manipulation platforms such as ground vehicles. Most previous efforts seeking to coordinate the combined manipulator-UAV system have focused on using a manipulator to extend the UAV's reach and assume that both the UAV and manipulator can reliably reach commanded goal poses. This work accepts the reality that state of the art UAV positioning precision is not of a high enough quality to reliably perform simple tasks such as grasping objects. A 6 degree of freedom parallel manipulator is used to robustly maintain precise end-effector positions despite host UAV perturbations. A description of a unique parallel manipulator that allows for very little moving mass, and is easily stowed below a quadrotor UAV is presented along with flight test results and an analytical comparison to a serial manipulator.","PeriodicalId":325788,"journal":{"name":"2015 IEEE International Conference on Technologies for Practical Robot Applications (TePRA)","volume":"17 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-05-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127505013","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Trajectory optimization of robotic suturing
Pub Date: 2015-05-11 | DOI: 10.1109/TePRA.2015.7219672
Der-Lin Chow, W. Newman
This paper presents progress towards autonomous knot-tying in robot-assisted minimally invasive surgery. While successful demonstrations of robotic knot-tying have been achieved, objective comparisons of competing approaches have been lacking. Here we describe how to score a proposed procedure in terms of speed and volume. Applying the scoring metric has motivated an improved knot-tying procedure, as well as a pathway to automated discovery of trajectory optimizations.
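The abstract names speed and volume as the scoring criteria but not how they are combined; a weighted sum such as the hypothetical one below is one plausible reading, with the inputs extracted from a recorded tool trajectory.

```python
# Hypothetical scoring function (the paper's exact metric is not given):
# lower is better, penalizing slow execution and large workspace use.
def knot_tying_score(completion_time, swept_volume, w_time=1.0, w_vol=1.0):
    """completion_time in seconds, swept_volume in cm^3; the weights
    trade off the two criteria and are illustrative."""
    return w_time * completion_time + w_vol * swept_volume
```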
{"title":"Trajectory optimization of robotic suturing","authors":"Der-Lin Chow, W. Newman","doi":"10.1109/TePRA.2015.7219672","DOIUrl":"https://doi.org/10.1109/TePRA.2015.7219672","url":null,"abstract":"This paper presents progress towards autonomous knot-tying in Robotic Assisted Minimally Invasive Surgery. While successful demonstrations of robotic knot-tying have been achieved, objective comparisons of competing approaches have been lacking. In this presentation we describe how to score a proposed procedure in terms of speed and volume. Applying the scoring metric has motivated an improved procedure for knot-tying, as well as a pathway to automated discovery for trajectory optimizations.","PeriodicalId":325788,"journal":{"name":"2015 IEEE International Conference on Technologies for Practical Robot Applications (TePRA)","volume":"41 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-05-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121796578","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Autonomous wall cutting with an Atlas humanoid robot
Pub Date: 2015-05-11 | DOI: 10.1109/TePRA.2015.7219673
Zheng-Hao Chong, Robert T. W. Hung, Kit-Hang Lee, Weijia Wang, T. Ng, W. Newman
Autonomous wall cutting is described using an Atlas humanoid robot. An integrated wall-cutting skill is presented, which only requires an operator to issue supervisory-level commands to prescribe a desired cutting path, leading to autonomous cutting.
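The paper's supervisory command set is not given in the abstract; as a purely hypothetical illustration, a single rectangle-cut command could be expanded into a dense waypoint path in the wall plane for the autonomous skill to track.

```python
# Hypothetical expansion of a supervisory "cut rectangle" command into
# waypoints (not the paper's interface).
import numpy as np

def rectangle_cut_path(corner, width, height, step=0.01):
    """Waypoints tracing a width x height rectangle from `corner`
    in the wall plane, spaced roughly `step` meters apart."""
    x0, y0 = corner
    edges = [((x0, y0), (x0 + width, y0)),
             ((x0 + width, y0), (x0 + width, y0 + height)),
             ((x0 + width, y0 + height), (x0, y0 + height)),
             ((x0, y0 + height), (x0, y0))]
    path = []
    for (xa, ya), (xb, yb) in edges:
        n = max(2, int(np.hypot(xb - xa, yb - ya) / step))
        for t in np.linspace(0.0, 1.0, n, endpoint=False):
            path.append((xa + t * (xb - xa), ya + t * (yb - ya)))
    return path
```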
{"title":"Autonomous wall cutting with an Atlas humanoid robot","authors":"Zheng-Hao Chong, Robert T. W. Hung, Kit-Hang Lee, Weijia Wang, T. Ng, W. Newman","doi":"10.1109/TePRA.2015.7219673","DOIUrl":"https://doi.org/10.1109/TePRA.2015.7219673","url":null,"abstract":"Autonomous wall cutting is described using an Atlas humanoid robot. An integrated wall-cutting skill is presented, which only requires an operator to issue supervisory-level commands to prescribe a desired cutting path, leading to autonomous cutting.","PeriodicalId":325788,"journal":{"name":"2015 IEEE International Conference on Technologies for Practical Robot Applications (TePRA)","volume":"11 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-05-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130768048","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Design of fast walking with one- versus two-at-a-time swing leg motions for RoboSimian
Pub Date: 2015-05-11 | DOI: 10.1109/TePRA.2015.7219688
Katie Byl, M. Byl
This paper presents two prototype fast walking gaits for the quadruped robot RoboSimian, along with experimental results for each. The first gait uses a statically stable one-at-a-time swing-leg crawl. The second uses a two-at-a-time swing-leg motion, which requires deliberate planning of the zero-moment point (ZMP) to balance the robot on a narrow support base. Of particular focus is the development of practical means to exploit RoboSimian's high dimensionality, with seven actuators per limb, to partially overcome the low joint velocity limits at each joint. For both gaits, we use an inverse kinematics (IK) table designed to maximize the reachable workspace of each limb while minimizing joint velocities during end-effector motions. Even with the simplification provided by the IK solutions, a wide range of variables remains open in the design of each gait. We discuss these and present practical methodologies for parameterizing and subsequently deriving approximately time-optimal solutions for each gait type, subject to the robot's joint velocity limits and to real-world requirements for safety margins in maintaining adequate balance. Results show that careful choice of parameters improves each gait's walking speed significantly. Finally, we compare the fastest achievable walking speeds of the two gaits and find them nearly equivalent, given the current performance limits of the robot.
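One practical sub-step the abstract describes, deriving approximately time-optimal gaits subject to joint velocity limits, can be sketched as a simple time-scaling computation. The trajectory representation and limits below are illustrative, and the real gait design additionally enforces ZMP balance margins.

```python
# Sketch: shortest duration for a sampled joint trajectory such that no
# joint exceeds its velocity limit (assumed representation, not the
# authors' pipeline).
import numpy as np

def min_cycle_time(joint_traj, nominal_time, vel_limits):
    """joint_traj: (N, J) joint angles sampled uniformly over
    `nominal_time` seconds; vel_limits: (J,) rad/s per joint."""
    dt = nominal_time / (joint_traj.shape[0] - 1)
    peak_vel = np.abs(np.diff(joint_traj, axis=0)).max(axis=0) / dt
    scale = (peak_vel / np.asarray(vel_limits)).max()  # >1 means too fast
    return nominal_time * max(scale, 1.0)              # stretch if needed
```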
{"title":"Design of fast walking with one- versus two-at-a-time swing leg motions for RoboSimian","authors":"Katie Byl, M. Byl","doi":"10.1109/TePRA.2015.7219688","DOIUrl":"https://doi.org/10.1109/TePRA.2015.7219688","url":null,"abstract":"This paper presents two prototype fast walking gaits for the quadruped robot RoboSimian, along with experimental results for each. The first gait uses a statically stable one-at-a-time swing-leg crawl. The second gait uses a two-at-a-time swingleg motion, which requires deliberate planning of zero-moment point (ZMP) to balance the robot on a narrow support base. Of particular focus are the development of practical means to exploit the fact that RoboSimian has high-dimensionality, with seven actuators per limb, as a means of partially overcoming low joint velocity limits at each joint. For both gaits, we use an inverse kinematics (IK) table that has been designed to maximize the reachable workspace of each limb while minimizing joint velocities during end effector motions. Even with the simplification provided by use of IK solutions, there are still a wide range of variables left open in the design of each gait. We discuss these and present practical methodologies for parameterizing and subsequently deriving approximate time-optimal solutions for each gait type, subject to joint velocity limits of the robot and to real-world requirements for safety margins in maintaining adequate balance. Results show that careful choice of parameters for each of the gaits improves their respective walking speeds significantly. Finally, we compare the fastest achievable walking speeds of each gait and find they are nearly equivalent, given current performance limits of the robot.","PeriodicalId":325788,"journal":{"name":"2015 IEEE International Conference on Technologies for Practical Robot Applications (TePRA)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-05-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126145572","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Autonomous vehicles for remote sample collection in difficult conditions: Enabling remote sample collection by marine biologists
Pub Date: 2015-05-11 | DOI: 10.1109/TePRA.2015.7219660
A. Bennett, Victoria L. Preston, Jay Woo, Shivali Chandra, Devynn Diggins, Riley Chapman, Zhecan Wang, M. Rush, L. Lye, Mindy Tieu, Silas Hughes, Iain Kerr, A. Wee
Rapidly dropping costs and increasing capabilities of robotic systems are creating unprecedented opportunities for scientific research. Remote sample collection in conditions that were once impossible due to expense, location, timing, or risk is now becoming a reality. Of particular interest in marine biological research is removing additional stressors, in the form of humans and equipment, from whale monitoring. In a partnership between Olin College of Engineering and Ocean Alliance, a multirotor unmanned air vehicle (UAV) named SnotBot is being developed to enable marine biologists to collect observational data and biological samples from living whales in a less intrusive and more effective way. Tests conducted in the Gulf of Mexico in summer 2014 demonstrated that SnotBot may not be an irritant to the whales under study with respect to the noise and downdraft generated by the UAV [1]. The results from those field tests are being used to apply for research permits to collect samples from real whales. Until formal authorization to operate over whales is granted, controlled testing at Olin College and in Gloucester Harbor of Massachusetts Bay is being conducted to characterize the vehicles and develop autonomy. Beyond cetacean research, the ability to collect physical samples in difficult or sensitive locations, as demonstrated by SnotBot, has far-reaching applications in environmental monitoring, aerial surveying, and the diagnosis of transient events.
{"title":"Autonomous vehicles for remote sample collection in difficult conditions: Enabling remote sample collection by marine biologists","authors":"A. Bennett, Victoria L. Preston, Jay Woo, Shivali Chandra, Devynn Diggins, Riley Chapman, Zhecan Wang, M. Rush, L. Lye, Mindy Tieu, Silas Hughes, Iain Kerr, A. Wee","doi":"10.1109/TePRA.2015.7219660","DOIUrl":"https://doi.org/10.1109/TePRA.2015.7219660","url":null,"abstract":"Rapidly dropping costs and increasing capabilities of robotic systems are creating unprecedented opportunities for the world of scientific research. Remote sample collection in conditions that were once impossible due to expense, location, timing, or risk are now becoming a reality. Of particular interest in marine biological research is the aspect of removing additional stressors in the form of humans and equipment from whale monitoring. In a partnership between Olin College of Engineering and Ocean Alliance, a multirotor unmanned air vehicle (UAV) named SnotBot is being developed to enable marine biologists to collect observational data and biological samples from living whales in a less intrusive and more effective way. In Summer 2014 tests conducted in the Gulf of Mexico it was demonstrated that SnotBot may not be an irritant to whales of study with respect to the noise and downdraft generated by the UAV [1]. The results from those field tests are being used to apply for research permits to collect samples from real whales. Until formal authorization to operate over whales is granted, controlled testing at Olin College and in the Gloucester Harbor of Massachusetts Bay is being conducted to characterize the vehicles and develop autonomy. Beyond cetacean/whale research, the ability to collect physical samples in difficult or sensitive locations, as demonstrated by SnotBot, has far reaching applications in environmental monitoring, aerial surveying, and diagnosis of a transient events.","PeriodicalId":325788,"journal":{"name":"2015 IEEE International Conference on Technologies for Practical Robot Applications (TePRA)","volume":"20 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-05-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130095137","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Autonomous convoy driving by night: The vehicle tracking system
Pub Date: 2015-05-11 | DOI: 10.1109/TePRA.2015.7219675
C. Fries, Hans-Joachim Wünsche
Previous publications of our institute describe a robust vehicle tracking system for daylight conditions. This paper presents an improved vehicle tracking system that can detect and track the convoy leader in twilight and at night as well. The primary sensor equipment consists of a daytime camera, a LiDAR, and an inertial navigation system; an expansion with a thermal camera and a low-light camera was necessary for robustness under all illumination conditions. The system estimates the relative 3D position and orientation, the velocity, and the steering angle of the convoy leader precisely in real time, making it possible to follow the convoy leader's track. Another novelty is the coupling of a Kalman filter with a particle filter for higher stability and accuracy in vehicle tracking. The tracking system showed excellent functionality while driving more than 50 km fully autonomously in urban and unstructured environments at night.
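The coupling details are not given in the abstract; one common construction consistent with it is to feed the particle filter's posterior mean pose into a constant-velocity Kalman filter as a pseudo-measurement, yielding smoothed position and velocity estimates. The sketch below shows a single axis with illustrative noise parameters.

```python
# Schematic KF/PF coupling (assumed construction, not the paper's exact
# filter): z is the particle filter's posterior mean position; the KF
# smooths it and estimates velocity under a constant-velocity model.
import numpy as np

def kf_step(x, P, z, dt, q=0.1, r=0.5):
    """One predict/update cycle. x: state [position, velocity],
    P: 2x2 covariance, z: PF posterior mean position, dt: timestep."""
    F = np.array([[1.0, dt], [0.0, 1.0]])       # constant-velocity model
    H = np.array([[1.0, 0.0]])                  # we observe position only
    Q = q * np.array([[dt**3 / 3, dt**2 / 2],
                      [dt**2 / 2, dt]])         # process noise
    x = F @ x                                   # predict
    P = F @ P @ F.T + Q
    S = H @ P @ H.T + r                         # innovation covariance
    K = P @ H.T / S                             # Kalman gain
    x = x + (K * (z - H @ x)).ravel()           # update with PF output
    P = (np.eye(2) - K @ H) @ P
    return x, P
```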
{"title":"Autonomous convoy driving by night: The vehicle tracking system","authors":"C. Fries, Hans-Joachim Wünsche","doi":"10.1109/TePRA.2015.7219675","DOIUrl":"https://doi.org/10.1109/TePRA.2015.7219675","url":null,"abstract":"Previous publications of our institute describe a robust vehicle tracking system for daylight conditions. This paper presents an improved vehicle tracking system which is able to detect and track the convoy leader also by twilight and night. The primary sensor equipment consists of a daytime camera, a LiDAR and an inertial navigation system. An expansion with a thermal and a lowlight camera was necessary to be robust against any illumination conditions. The system is capable of estimating the relative 3D position and orientation, the velocity and the steering angle of a convoy leader precisely in real-time. This makes it possible to follow the convoy leader's track. Another novelty is coupling a Kalman filter with a particle filter for higher stability and accuracy in vehicle tracking. The tracking system shows excellent functionality while driving more than 50km fully autonomously in urban- and unstructured environments at night.","PeriodicalId":325788,"journal":{"name":"2015 IEEE International Conference on Technologies for Practical Robot Applications (TePRA)","volume":"121 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-05-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114567931","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}