Towards vision assisted space robotics: some examples and experimental results
M. Seitz, Norbert Hartwig, J. Matthiesen
Pub Date: 1995-08-05 | DOI: 10.1109/IROS.1995.525848

Automatic handling, maintenance and repair procedures in orbital space applications will gain importance in the future. Lightweight space robots could be used for many of these tasks, provided they were dexterous enough and more flexible in perceiving their working environment than present systems. The use of vision enables a robot to grasp and manipulate inspected objects autonomously. The paper describes the interaction between a robot, a camera and a multisensory gripper in performing local repair tasks by 3D active inspection, e.g. opening the jammed solar panel of a satellite. The central points of interest are the composition of information acquired from image sequences, object localization in the robot workspace, and the supervision of grasping procedures. Besides a description of the system structure and the underlying ideas, experiences and results from terrestrial lab experiments are presented in order to discuss the limits and possibilities of vision for space robotic tasks.
Modeling foreshortening in stereo vision using local spatial frequency
M. Maimone, S. Shafer
Pub Date: 1995-08-05 | DOI: 10.1109/IROS.1995.525846

Many aspects of the real world continue to plague stereo matching systems. One of these is perspective foreshortening, an effect that occurs when a surface is viewed at a sharp angle. Because each stereo camera has a slightly different view, the image of the surface is more compressed and occupies a smaller area in one view. These effects cause problems because most stereo methods compare similarly sized regions (using the same-sized windows in both images), tacitly assuming that objects occupy the same extents in both images. Clearly this condition is violated by perspective foreshortening. We show how to overcome this problem using a local spatial frequency representation. A simple geometric analysis leads to an elegant solution in the frequency domain which, when applied to a Gabor filter-based stereo system, increases the system's maximum matchable surface angle from 30 degrees to over 75 degrees.
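The core geometric observation can be sketched numerically. The snippet below is a minimal illustration under simplified assumptions (a small planar patch slanted by a given angle, two viewing directions separated by a vergence angle, foreshortening modeled as the cosine of the angle to the surface normal); it is not the paper's Gabor-filter derivation, and the window-scaling rule is an illustrative stand-in for matching equal surface extents:

```python
import math

def foreshortening_ratio(slant_deg, vergence_deg):
    """Ratio of the projected widths of a slanted patch in the two views.

    Simplified geometry (an assumption, not the paper's derivation):
    each camera sees the patch foreshortened by the cosine of the angle
    between its viewing direction and the surface normal, and the two
    viewing directions differ by the vergence angle.
    """
    t = math.radians(slant_deg)
    a = math.radians(vergence_deg) / 2.0
    return math.cos(t - a) / math.cos(t + a)

def matched_window(base_window, slant_deg, vergence_deg):
    """Scale the second image's correlation window so both windows cover
    the same surface extent despite foreshortening (local spatial
    frequencies scale by the inverse of the width ratio)."""
    r = foreshortening_ratio(slant_deg, vergence_deg)
    return max(1, round(base_window / r))
```

At zero slant the ratio is 1 and the windows stay equal; at steep slants the ratio grows, so a fixed-window matcher compares unequal surface extents unless one window shrinks accordingly.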
Nonlinear control of robot manipulators using adaptive fuzzy sliding mode control
F. Hsu, L. Fu
Pub Date: 1995-08-05 | DOI: 10.1109/IROS.1995.525790

This paper presents an adaptive robust fuzzy control architecture for robot manipulators. The control objective is to adaptively compensate for the unknown nonlinearity of the manipulator, which is represented as a fuzzy rule-base consisting of a collection of if-then rules. The algorithm embedded in the proposed architecture automatically updates the fuzzy rules and, consequently, is guaranteed to be globally stable and to drive the tracking errors to a neighborhood of zero. With regard to realization, hardware limitations such as long computation times and excessive memory usage are also relaxed by incorporating heuristic concepts, which shows the flexibility of the architecture. The present work is applied to the control of a five degree-of-freedom (DOF) articulated robot manipulator. Experimental results show that the proposed control architecture achieves fast convergence.
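The sliding-mode idea behind such controllers can be illustrated on a single degree of freedom. The sketch below regulates a 1-DOF plant with dynamics unknown to the controller, using a sliding surface, a smooth saturation in place of the discontinuous switch, and a scalar adaptive term; the scalar estimate is a deliberately simplified stand-in for the paper's fuzzy rule-base, and all gains are illustrative assumptions:

```python
import numpy as np

def simulate(steps=2000, dt=0.002, lam=2.0, k=20.0, gamma=10.0):
    """Regulate a 1-DOF plant xdd = f(x, xd) + u, with f unknown to the
    controller, toward x = 0.

    Sliding surface s = xd + lam*x; tanh(s/phi) is a smooth saturation
    replacing sign(s) to avoid chattering. f_hat is a crude scalar
    adaptive estimate, NOT the paper's fuzzy approximator.
    """
    x, xd, f_hat = 1.0, 0.0, 0.0
    for _ in range(steps):
        s = xd + lam * x                      # sliding surface
        u = -f_hat - lam * xd - k * np.tanh(s / 0.05)
        f_hat += gamma * s * dt               # adaptation law
        f = -2.0 * xd + 3.0 * np.sin(x)       # "unknown" plant nonlinearity
        xd += (f + u) * dt                    # Euler integration
        x += xd * dt
    return abs(x), abs(xd)
```

Once the state reaches the surface s = 0, the error decays at the rate set by lam regardless of the unknown dynamics, which is the robustness property the fuzzy rule-base version exploits at manipulator scale.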
Detection of kinematic constraint from search motion of a robot using link weights of a neural network
H. Seki, K. Sasaki, M. Takano
Pub Date: 1995-08-05 | DOI: 10.1109/IROS.1995.525931

In this paper, a method for detecting kinematic constraints in a plane when the shapes of the grasped object and the environment are not given is presented. This method utilizes the displacement and force information obtained by "active search motion" of a robot. A new neural network configuration for this detection is proposed. It consists of two multilayer networks (a primary and a secondary network). The primary network learns the movable space (constraint) obtained by the search motion. From the resulting link weights, which reflect the movable space, the secondary network determines the type and orientation of the constraint. Simulation and experimental results are presented and analyzed.
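The underlying inference, recovering the constraint from displacement samples of a search motion, can be illustrated without the two-network machinery. The sketch below is a simpler principal-component stand-in, plainly not the paper's method: for a 1-D line constraint in the plane, the movable direction is the dominant eigenvector of the displacement scatter matrix:

```python
import numpy as np

def constraint_orientation(displacements):
    """Estimate the orientation (radians, in [0, pi)) of a line
    constraint in the plane from 2-D displacement samples gathered
    by a search motion.

    PCA stand-in for the paper's neural network: the movable direction
    is the eigenvector of the scatter matrix with the larger eigenvalue.
    """
    D = np.asarray(displacements, float)
    D = D - D.mean(axis=0)                 # center the samples
    scatter = D.T @ D
    w, v = np.linalg.eigh(scatter)         # eigenvalues ascending
    major = v[:, -1]                       # dominant (movable) direction
    return np.arctan2(major[1], major[0]) % np.pi
```

The eigenvalue spectrum also hints at the constraint type: one dominant eigenvalue suggests motion confined to a line, while two comparable eigenvalues suggest free motion.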
Cooperative control of two industrial robots with force control devices
H. Osumi, T. Arai, T. Fukuoka, K. Moriyama, Hajime Torii
Pub Date: 1995-08-05 | DOI: 10.1109/IROS.1995.525939

This paper proposes a new strategy for manipulation of a rigid object by cooperation of multiple position-controlled manipulators using force control devices. The force control device consists of springs and a position-controlled micro-manipulator. Force control is achieved by controlling the deflection of each spring with the micro-manipulator. A cooperative control algorithm considering the flexibility of the macro/micro manipulators is proposed. After the differential equation of the error vector system is derived, a method of designing feedback gain matrices is shown for linearizing the system. To verify the efficiency of the algorithm, experiments in which two industrial robots carry an object are performed.
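The principle of realizing force control through spring deflection reduces to a simple relation: since F = k·δ, a force error maps to a position correction of error/k for the micro-manipulator. A minimal single-axis sketch (the loop gain and the rigid-environment model are illustrative assumptions, not values from the paper):

```python
def micro_position_step(k_spring, f_desired, f_measured, cmd, gain=0.5):
    """One update of the micro-manipulator position command.

    The device realizes force through spring deflection (F = k * delta),
    so a force error converts to a position correction of error / k.
    The loop gain is an illustrative tuning choice.
    """
    return cmd + gain * (f_desired - f_measured) / k_spring

def settle_force(k_spring=1000.0, f_desired=10.0, iters=60):
    """Closed-loop sketch against a rigid environment, where the
    measured spring force is simply k times the commanded deflection."""
    cmd = 0.0
    for _ in range(iters):
        f_measured = k_spring * cmd        # rigid-environment model
        cmd = micro_position_step(k_spring, f_desired, f_measured, cmd)
    return k_spring * cmd                  # achieved contact force
```

The force error shrinks geometrically with the loop gain, which is why a purely position-controlled device plus a known spring can serve as a force actuator.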
Sensor-based local homing using Omnidirectional Range and Intensity Sensing System for indoor mobile robot navigation
S. Bang, Wonpil Yu, M. Chung
Pub Date: 1995-08-05 | DOI: 10.1109/IROS.1995.526269

This paper proposes a novel local homing algorithm for indoor mobile robot navigation. In the algorithm, we divide the whole navigation task into simple local tasks in order to reduce the computational burden and the required memory size. We develop a new environment model based on the omnidirectional sensor data obtained from the Omnidirectional Range and Intensity Sensing System (ORISS), which consists of a set of ultrasonic sensors and a vision sensor. In order to enhance the reliability of the sensor information, we fuse the sensor data by means of the characteristics of the indoor environment structure and the sensor model. To verify the proposed algorithm, experiments with a mobile robot are carried out in a corridor.
Sonar feature based exploration
Rudolf Bauer, W. Rencken
Pub Date: 1995-08-05 | DOI: 10.1109/IROS.1995.525789

A mobile robot wishing to explore unknown and dynamic environments has to build up maps of its environment while at the same time ensuring that it does not become lost in the process. However, current feature-based localisation techniques can become unstable if the robot drives in such a way that its sensors cannot recognise these features. A novel dynamic path planning approach is presented that supports both the data acquisition for the localisation process and the extraction of new features while exploring an a priori unknown indoor environment. Real experiments on the authors' mobile robot Roamer have shown that overall localisation stability and the feature extraction process improved significantly while exploring an unknown environment.
Compensation of abrupt motion changes in target tracking by visual servoing
Farabi Bensalah, F. Chaumette
Pub Date: 1995-08-05 | DOI: 10.1109/IROS.1995.525794

This paper describes real-time visual target tracking using the generalized likelihood ratio (GLR) algorithm. The authors first introduce the visual servoing approach and the application of the task function concept to vision-based tasks. Then they present a complete control scheme that explicitly enables a moving object to be pursued. In order to keep the tracking errors as low as possible, the authors use the GLR test, an algorithm able to detect, estimate and compensate for abrupt jumps in target motion. Finally, real-time experimental results using a camera mounted on the end effector of a six-DOF robot are presented.
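The GLR test the abstract refers to can be sketched in its classical form: for each candidate jump time, compute the log-likelihood ratio of "the innovation mean jumped here" against "no jump", and take the maximum. The sketch below handles only a jump in the mean of white Gaussian innovations; the sliding-window bookkeeping and the target-motion model of the paper are omitted:

```python
import numpy as np

def glr_jump(residuals, sigma, threshold):
    """GLR test for an abrupt jump in the mean of Gaussian innovations.

    For each candidate jump time j, the maximum-likelihood jump
    amplitude is the mean of the residuals after j, and the log
    likelihood ratio is (sum of those residuals)^2 / (2 sigma^2 m).
    Returns (detected, jump_index, mle_amplitude).
    """
    r = np.asarray(residuals, float)
    n = len(r)
    best, best_j, best_nu = 0.0, -1, 0.0
    for j in range(n):
        m = n - j
        s = r[j:].sum()
        g = s * s / (2.0 * sigma * sigma * m)   # log-likelihood ratio
        if g > best:
            best, best_j, best_nu = g, j, s / m
    return best > threshold, best_j, best_nu
```

In a tracking loop, the detected jump time and estimated amplitude are fed back to correct the state estimate, which is what keeps the visual tracking error small through abrupt target maneuvers.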
Parts manipulation on an intelligent motion surface
Wenheng Liu, P. Will
Pub Date: 1995-08-05 | DOI: 10.1109/IROS.1995.525916

This paper introduces the concept of using a dense array of individual manipulator mechanisms as a programmable intelligent motion surface (IMS). The individual robots in the array can be implemented in a variety of technologies with different sizes. Programmability is the common necessary characteristic for an IMS; the array, with groups of contiguous robots acting in unison, can be programmed to various configurations to have the effect of imparting force fields on objects being carried on its surface. The appropriate choice of force fields is shown to cause parts placed on the array to be moved in manners that are useful. These include such functions as translation, rotation, orientation alignment, spatial filtering and the feeding of parts. The use of the IMS is described for primitive assembly operations. Limitations of the approach, extensions and possibilities for future work, particularly in microelectromechanical system (MEMS) implementations, are discussed in detail in the paper. The practicability and the programming of such an IMS-based active assembly bench were explored in a simulated environment.
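One of the simplest programmable force fields is a "squeeze" field that pushes everything toward a line, aligning parts regardless of where they land. The sketch below simulates a point part under such a field; the gains are illustrative assumptions, and a real IMS acts on extended parts (where the field also produces aligning torques), not points:

```python
def settle(x0, y0, k=2.0, damping=4.0, dt=0.01, steps=2000):
    """Point part on a simulated squeeze field: the actuator array
    exerts a restoring force toward the line x = 0 and leaves y free.
    Gains and the linear force law are illustrative choices.
    """
    x, vx = x0, 0.0
    for _ in range(steps):
        ax = -k * x - damping * vx   # squeeze force plus surface damping
        vx += ax * dt                # Euler integration
        x += vx * dt
    return x, y0                     # y unchanged: only x is squeezed
```

Composing such fields over regions of the array yields the translation, rotation, alignment and filtering behaviors listed in the abstract.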
Efficient edge detection from tactile data
N. Chen, R. Rink, Hong Zhang
Pub Date: 1995-08-05 | DOI: 10.1109/IROS.1995.525914

A real-time tactile image processing algorithm for edge contact is presented in this paper. Based on basic elasticity results, closed-form solutions for calculating the contact force and local contact geometry (i.e., the location and orientation of the line of contact) from the first three moments of a tactile image are derived. The computational complexity of the proposed algorithm is compared with that of previous approaches, and passive tactile sensing experiments are performed. It is shown that the proposed algorithm has the advantage of preserving force information and is more consistent and computationally efficient.
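The moment computation itself is standard and compact: the zeroth moment gives the total normal force, the first moments give the contact centroid, and the second central moments give the orientation of the contact line. The sketch below is the generic image-moment version, in the spirit of the paper but without its elasticity-based closed forms:

```python
import numpy as np

def edge_contact_from_moments(pressure):
    """Recover edge-contact parameters from a tactile pressure image.

    Zeroth moment -> total force, first moments -> contact centroid,
    second central moments -> line orientation via the principal axis.
    Returns (total_force, (cx, cy), orientation_radians).
    """
    p = np.asarray(pressure, float)
    ys, xs = np.mgrid[0:p.shape[0], 0:p.shape[1]]
    m00 = p.sum()                                  # total normal force
    cx = (xs * p).sum() / m00                      # centroid of contact
    cy = (ys * p).sum() / m00
    mu20 = ((xs - cx) ** 2 * p).sum()              # second central moments
    mu02 = ((ys - cy) ** 2 * p).sum()
    mu11 = ((xs - cx) * (ys - cy) * p).sum()
    theta = 0.5 * np.arctan2(2 * mu11, mu20 - mu02)  # principal axis
    return m00, (cx, cy), theta
```

Because only three moments of the image are needed, the per-frame cost is a handful of passes over the taxel array, which is what makes the approach viable in real time.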