Pub Date: 1994-09-12 | DOI: 10.1109/IROS.1994.407625
A. Nishi, H. Miyagi
Use of a wall-climbing robot for rescue or fire-fighting has long been anticipated. Four quite different models have been developed in the authors' laboratory. The present model moves on a wall using the thrust force of a propeller, and can fly whenever required. Its mechanism and control system are discussed.
Title: Mechanism and control of propeller type wall-climbing robot
Published in: Proceedings of IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS'94)
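The wall-climbing principle rests on simple statics: the propeller thrust presses the robot against the wall, and friction must then support its weight. A minimal sketch under that assumption (the mass and friction coefficient below are illustrative, not values from the paper):

```python
def min_thrust(mass, mu, g=9.81):
    """Minimum propeller thrust (N) pressing a robot of `mass` kg flat
    against a vertical wall so that Coulomb friction (coefficient mu)
    can support its weight: T >= m*g/mu.  A deliberately simplified
    statics sketch -- the paper's machine can also tilt the thrust
    axis, which this model ignores."""
    return mass * g / mu

# A 10 kg robot on a wall with mu = 0.5 needs at least ~196 N.
thrust = min_thrust(10.0, 0.5)
```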
Pub Date: 1994-09-12 | DOI: 10.1109/IROS.1994.407520
K. Kosuge, K. Takeo, T. Fukuda, Tunehiko Sugiura, Akira Sakai, Koji Yamada
This paper proposes an alternative control scheme for a teleoperation system consisting of a master arm, a slave arm and a virtual environment. For a given task, the operator first executes the task in the virtual environment; a skill for the task is extracted from this manipulation and transferred to the remote site, where the task is executed in the real environment. First, the authors design a control algorithm for a master arm manipulating the virtual environment so that the system presents a given dynamics to the operator and the environment. They then design a control algorithm for the master-slave system manipulating the real environment so that it exhibits the same dynamics given to the manipulator of the virtual environment. Because the master arm coupled to the real environment and the master arm coupled to the virtual environment share the same dynamics under this scheme, a skill extracted from operation in the virtual environment can be applied to the real task using the slave arm. The proposed control system is applied to an experimental master-slave manipulator, and experimental results illustrate its effectiveness.
Title: Unified approach for teleoperation of virtual and real environment for skill based teleoperation
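The idea of making the system present "a given dynamics" to the operator can be sketched as an admittance model: the operator's force drives a chosen virtual mass-damper-spring, and rendering the same (M, D, K) against both the virtual and the real environment is what makes a skill learned in one transferable to the other. The gains below are illustrative assumptions, not the paper's values:

```python
def admittance_step(x, v, f_ext, M=2.0, D=8.0, K=50.0, dt=0.001):
    """One semi-implicit Euler step of the prescribed target dynamics
    M*a + D*v + K*x = f_ext.  Whatever is on the far side (virtual or
    real), the operator feels this same second-order behaviour."""
    a = (f_ext - D * v - K * x) / M
    v = v + a * dt
    return x + v * dt, v

# Under a constant 5 N push the model settles at the static
# deflection f/K = 5/50 = 0.1 m.
x, v = 0.0, 0.0
for _ in range(20000):
    x, v = admittance_step(x, v, 5.0)
```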
Pub Date: 1994-09-12 | DOI: 10.1109/IROS.1994.407590
Jong-Eun Byun, T. Nagata
For automatic handling of a flexible object such as an electric wire, cable or rope, it is necessary to determine the 3-D pose of the object. This is difficult because of the object's intrinsic flexibility. We use the k-curvature representation to describe the skeleton image of a flexible object, and present two fast stereo matching methods to determine the pose for automatic handling: one based on a least-squares error method, the other on interpolation between curvature extrema. First, we calculate the curvatures of the object skeleton images taken by two cameras, then apply our algorithms to the two resulting curvature representations. Computer simulations evaluate the validity of the presented algorithms, and the results give guidelines for applications with hand-eye robots. Finally, we carry out a camera calibration for a linear lens model and an experiment to determine the pose of a coaxial cable with the calibrated camera parameters.
Title: Determination of 3-D pose of a flexible object by stereo matching of curvature representations
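A common form of the k-curvature used for such skeleton curves is the turning angle between chords of span k, which can be sketched as follows (the paper's exact variant may differ in normalisation):

```python
import numpy as np

def k_curvature(points, k=5):
    """k-curvature of a 2-D skeleton polyline: at each interior point
    i, the signed turning angle between the chords p[i] - p[i-k] and
    p[i+k] - p[i].  The span k smooths pixel-level noise in the
    skeleton image."""
    pts = np.asarray(points, dtype=float)
    out = []
    for i in range(k, len(pts) - k):
        a = pts[i] - pts[i - k]
        b = pts[i + k] - pts[i]
        out.append(np.arctan2(a[0] * b[1] - a[1] * b[0], a @ b))
    return np.array(out)

# On a uniformly sampled circular arc the turning angle is constant;
# on a straight run it is zero, so extrema mark bends in the cable.
t = np.linspace(0, np.pi, 100)
curv = k_curvature(np.c_[np.cos(t), np.sin(t)], k=5)
```

Matching curvature profiles between the two camera views (rather than raw pixels) is what makes the stereo correspondence fast.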
Pub Date: 1994-09-12 | DOI: 10.1109/IROS.1994.407648
E. Papadopoulos, S. Moosavian
This paper studies the motion control of a multi-arm free-flying space robot chasing a passive object in close proximity. Free-flyer kinematics are developed using a minimum set of body-fixed barycentric vectors. Using a general and a quasi-coordinate Lagrangian formulation, two dynamics models are derived. Control algorithms are developed that allow coordinated tracking control of the manipulators and the spacecraft. The performance of model-based algorithms is compared, by simulation, to that of a transposed-Jacobian algorithm. Results show that the latter can give reasonably good performance with a reduced computational burden.
Title: Dynamics and control of multi-arm space robots during chase and capture operations
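The transposed-Jacobian law the abstract compares against is attractive precisely because it needs no matrix inversion. A minimal sketch on a planar two-link arm (link lengths and gains are illustrative; the arm dynamics are collapsed to a quasi-static toy model, not the paper's free-flyer model):

```python
import numpy as np

L1, L2 = 0.5, 0.4   # link lengths in metres (illustrative)

def fkin(q):
    """End-point position of a planar two-link arm."""
    return np.array([L1 * np.cos(q[0]) + L2 * np.cos(q[0] + q[1]),
                     L1 * np.sin(q[0]) + L2 * np.sin(q[0] + q[1])])

def jacobian(q):
    s1, c1 = np.sin(q[0]), np.cos(q[0])
    s12, c12 = np.sin(q[0] + q[1]), np.cos(q[0] + q[1])
    return np.array([[-L1 * s1 - L2 * s12, -L2 * s12],
                     [ L1 * c1 + L2 * c12,  L2 * c12]])

def jt_step(q, x_des, gain=2.0, dt=0.05):
    """One transposed-Jacobian step, tau = J^T K e: only a matrix
    transpose and multiply, no inversion -- the source of the reduced
    computational burden.  Here q_dot = tau (quasi-static toy)."""
    e = x_des - fkin(q)
    return q + dt * (jacobian(q).T @ (gain * e))

# Drive the end-point to a reachable target.
q = np.array([0.3, 0.5])
x_des = np.array([0.6, 0.3])
for _ in range(5000):
    q = jt_step(q, x_des)
```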
Pub Date: 1994-09-12 | DOI: 10.1109/IROS.1994.407368
G. Hirzinger, K. Landzettel, C. Fagerer
The paper discusses delay-compensating techniques for operating a space robot from the ground or from another remote spacecraft. Such techniques were a key element in the space robot technology experiment ROTEX, which flew successfully on the shuttle COLUMBIA at the end of April 1993. During this Spacelab-D2 mission, for the first time in the history of space flight, a small multisensory robot (i.e., one provided with modest local intelligence) performed prototype tasks on board a spacecraft in several operational modes: preprogrammed (and reprogrammed from the ground), teleoperated by the astronauts, and remotely controlled from the ground both by a human operator and by machine intelligence. In these modes the robot successfully closed and opened connector plugs (bayonet closure), assembled structures from single parts and captured a free-floating object. This paper focuses on the powerful delay-compensating 3D-graphics simulation (predictive simulation) concepts realized in the telerobotic ground station, which allowed the authors to compensate delays of up to 7 s, e.g. when grasping the floating object fully automatically from the ground.
Title: Telerobotics with large time delays-the ROTEX experience
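Stripped of the graphics, the core of predictive simulation is showing the operator the state the remote scene will have reached by the time the command arrives. A minimal constant-velocity sketch (ROTEX used a full 3-D simulation of robot and world; the numbers below are illustrative telemetry, not mission data):

```python
def predict_state(pos, vel, delay):
    """Constant-velocity prediction over the communication round-trip
    delay: the ground station displays this predicted state instead of
    the (stale) received one, so the operator can act on where the
    free-floating object will be, not where it was."""
    return [p + v * delay for p, v in zip(pos, vel)]

# Last telemetry: object at (1.0, 0.5, 0.0) m drifting slowly,
# with a 7 s round trip as in the experiment.
predicted = predict_state([1.0, 0.5, 0.0], [0.02, 0.0, -0.01], 7.0)
```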
Pub Date: 1994-09-12 | DOI: 10.1109/IROS.1994.407472
Shigang Li, M. Chiba, S. Tsuji
A human being often looks around to locate him/herself from the arrangement of surrounding objects. Following this idea, the authors propose determining camera motion from line correspondences in omni-directional images. An omni-directional image has a 360-degree view and contains much more information than an image taken by a conventional method. Since camera motion is estimated from line displacements in the images, and these displacements depend on the arrangement of the lines in space as well as on the motion itself, an omni-directional view makes it possible to select a better line combination and hence estimate the motion more accurately. Experimental results show that camera motion can be estimated more accurately using lines surrounding the camera in omni-directional images than using lines in a narrow view. First, the authors use lines surrounding the camera to determine the rotational component, finding the true solution among multiple candidates by voting over multiple line combinations. Then they select an optimal line combination for estimating the translational component by examining the lines' linear independence. Simulation and real-world experimental results are given.
Title: Estimating camera motion precisely from omni-directional images
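The rotational component can be recovered from matched line directions alone. The paper votes over many line combinations; the sketch below instead uses the standard least-squares (Kabsch/SVD) construction, purely to illustrate why well-spread "surrounding" directions constrain the rotation better than directions clustered in a narrow field of view:

```python
import numpy as np

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def rotation_from_directions(d_ref, d_cur):
    """Least-squares rotation aligning matched unit line directions
    (Kabsch/SVD).  Rows of d_ref/d_cur are corresponding direction
    vectors before and after the camera motion."""
    H = np.asarray(d_ref).T @ np.asarray(d_cur)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    return Vt.T @ D @ U.T   # maps reference directions onto current

# Recover a known rotation from four spread-out line directions.
R_true = rot_z(0.4)
d_ref = np.array([[1.0, 0, 0], [0, 1.0, 0], [0, 0, 1.0],
                  [0.577, 0.577, 0.577]])
d_cur = d_ref @ R_true.T
R_est = rotation_from_directions(d_ref, d_cur)
```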
Pub Date: 1994-09-12 | DOI: 10.1109/IROS.1994.407582
S. Rougeaux, N. Kita, Y. Kuniyoshi, S. Sakane, F. Chavand
This paper presents a stereo active vision system that tracks smoothly moving objects against complex backgrounds. Dynamic control of the vergence angle adapts the horopter geometry to the target position, making the target easy to pick out on the basis of stereoscopic disparity features. We introduce a novel vergence control strategy based on the computation of "virtual horopters" to track target movements that generate rapid changes of disparity. The strategy is implemented on a binocular head whose right and left pan angles are controlled independently. Experimental results of gaze holding on a smoothly moving target translating and rotating in complex surroundings demonstrate the efficiency of the tracking system.
Title: Binocular tracking based on virtual horopters
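The geometry behind vergence control is compact: for a symmetric head, the vergence angle that fixates a target follows directly from the baseline and the target distance. A minimal sketch (the baseline and distance values are illustrative, not the head's actual dimensions):

```python
import math

def vergence_angle(baseline, distance):
    """Symmetric vergence (radians) that makes both optical axes of a
    binocular head intersect at a target `distance` ahead of the
    baseline midpoint.  Points near the resulting horopter then show
    near-zero disparity, which is what lets disparity features pick
    the fixated target out of a cluttered background."""
    return 2.0 * math.atan2(baseline / 2.0, distance)

# A 30 cm baseline verging on a target 1 m away: about 17 degrees.
theta = vergence_angle(0.30, 1.0)
```

A "virtual horopter" in this picture amounts to evaluating the disparity field for a vergence angle the head has not yet adopted, so the controller can anticipate rapid disparity changes.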
Pub Date: 1994-09-12 | DOI: 10.1109/IROS.1994.407503
B. Gery, S. Gottschlich
One factor limiting the range of tasks that robots can perform robustly is the scarcity of useful sensors available to provide feedback to the robot control system. While much progress has been made with vision and force/torque sensors, tactile sensing has fallen behind, and general-purpose tactile sensing systems are not commercially available. The aim of this work is therefore to produce a tactile sensing system that can be manufactured inexpensively, can be used on a wide variety of robotic systems, and provides the kind of output typically required in dexterous manipulation. Such a system is presented in this paper. Its tactile transducers are based on semiconductive-ink technology, which allows transducers of any size, shape, and resistance range to be produced merely by altering the ink printing process and substrate geometry. Each sensor outputs three pieces of information useful in robotic manipulation: two parameters indicating the location of a contact point on the transducer, and one specifying the force exerted at that point. So that the system can support transducers of different shapes and sizes, its analog interface circuitry is fully programmable and includes circuitry enabling self-calibration with appropriate software.
Title: A tactile sensing system for dexterous manipulation
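The "two position parameters plus one force parameter" output can be sketched with a hypothetical readout model for a resistive transducer; the paper does not give its transfer curves, so every constant and formula below is an assumption standing in for what the system's self-calibration step would capture:

```python
def decode_taxel(v_x, v_y, r_through, v_ref=5.0, r_rest=10_000.0):
    """Hypothetical readout for a resistive-ink touch transducer: two
    potentiometric divider voltages locate the contact as fractions of
    the pad dimensions, and the drop of the through-resistance below
    its unloaded value r_rest gives a monotone (uncalibrated) force
    estimate."""
    x = v_x / v_ref                           # contact position, 0..1
    y = v_y / v_ref
    force = max(0.0, r_rest / r_through - 1.0)
    return x, y, force

# Mid-pad contact in x, quarter-pad in y, resistance halved by load.
contact = decode_taxel(2.5, 1.25, 5_000.0)
```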
Pub Date: 1994-09-12 | DOI: 10.1109/IROS.1994.407373
L. Giugovaz, J. Hollerbach
Closed-loop kinematic calibration has been experimentally implemented on the Sarcos Dextrous Arm. The elbow joint is made mobile by adding an unsensed hinge joint at the endpoint attachment to ground. The calibrated parameters include the joint angle offsets and the hinge-related parameters.
Title: Closed-loop kinematic calibration of the Sarcos Dextrous Arm
Pub Date: 1994-09-12 | DOI: 10.1109/IROS.1994.407640
G. Alici, R. Daniel
This paper presents the analysis and characterisation of force-controlled robotic drilling. We demonstrate that a robot manipulator can perform drilling provided that sufficient contact thrust force between the drill and the workpiece is supplied and properly controlled. The key parameters for robotic drilling are shown to be the drill rotational speed and the thrust force. We believe this implementation of a force control strategy for drilling is unique; end-point force control of a robot manipulator using a force/torque sensor for a drilling operation has not previously been reported.
Title: Robotic drilling under force control: execution of a task
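Regulating thrust force during drilling can be sketched as a PI loop on the feed velocity against a simple cutting model: thrust is proportional to the elastic interference between feed depth and cut depth, and material is removed at a rate proportional to thrust. All gains and material constants below are illustrative assumptions, not values from the paper:

```python
def drill_force_sim(f_des=200.0, t_end=5.0, dt=1e-3):
    """Toy force-controlled drilling loop: PI control of the feed
    velocity holds a desired thrust force f_des (N).  Plant model:
    f = k_env * (feed depth - cut depth), and the cut advances at a
    rate proportional to thrust.  Returns the final thrust force."""
    k_env, c_cut = 2.0e4, 1.0e-8     # contact stiffness, removal rate
    kp, ki = 5.0e-4, 1.0e-3          # PI gains on the force error
    x = x_cut = integ = 0.0          # feed depth, cut depth, integral
    f = 0.0
    for _ in range(int(t_end / dt)):
        err = f_des - f
        integ += err * dt
        u = kp * err + ki * integ    # commanded feed velocity
        x += u * dt
        x_cut += c_cut * max(f, 0.0) * dt
        f = k_env * max(x - x_cut, 0.0)
    return f

final_force = drill_force_sim()
```

The integral term is what sustains the steady feed needed to keep cutting at constant thrust once the proportional error has vanished.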