Tactile sensing system design issues in machine manipulation
Pub Date: 1987-03-01 | DOI: 10.1109/ROBOT.1987.1087755
S. Jacobsen, I. McCammon, K. Biggers, R. Phillips
Research in robotics and automation is gradually revealing the importance of tactile information in the control of machine manipulation systems. Substantial research efforts have been devoted to the construction of compact, high-resolution force sensing arrays which employ sophisticated transduction and processing techniques. A variety of systems have been experimentally investigated, and widespread optimism exists regarding the potential use of complex tactile sensing systems in robotic end effectors. Unfortunately, relatively few multi-detector systems have seen actual use in real manipulation systems. Those designs that have been applied in automated environments have almost invariably been used in static circumstances for simple contact imaging. As a consequence of these efforts, it is becoming clear that much work remains to be done before machine touch can be understood and then used to enhance the performance of a dynamic manipulation process. The slow progress in the development of comprehensive tactile sensing systems indicates that the fundamental problem is not simply one of transducer array design and fabrication. Advancements in this area will require: (1) understanding new concepts related to contact detection and image formation, as well as the use of contact information to control grasp and to aid in task planning; and (2) the development of actual sensing systems which can be used first to experimentally explore important issues in machine manipulation, and later as a basis for the design of practical and economical tactile sensing systems. The development of appropriate tactile sensors for research applications will require an exhaustive design effort aimed at understanding the architecture of these systems at all levels, including: (1) transducers and preprocessors which acquire data indicating the type of contact between end effector surfaces and an object and which prepare that data for transmission; (2) multiplexing and transmission systems which efficiently supply sensor data to the controller; and (3) tactile focus control systems which, in order to maximize transmission efficiency, dynamically select which sensors are interrogated for information to be integrated with other sensory input. This paper reviews preliminary work aimed at understanding the general issues and trade-offs governing the design of real tactile sensing systems. Specific designs emphasizing practical necessities such as simplicity, reliability, and economy are also discussed, along with plans to integrate the system into the Utah/MIT Dextrous Hand.
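The three architectural levels listed in the abstract (acquisition/preprocessing, multiplexed transmission, and tactile focus control) can be pictured with a minimal sketch. The class name, array size, thresholds, and the dilation-based focus policy below are assumptions for illustration only, not the sensor design discussed in the paper.

```python
import numpy as np

class TactileArray:
    """Minimal sketch of a tactile sensing pipeline with three levels:
    (1) acquisition/preprocessing, (2) multiplexed transmission of selected
    taxels, (3) a simple 'tactile focus' policy deciding which taxels to
    interrogate next.  All details are illustrative assumptions."""

    def __init__(self, rows=16, cols=16, noise_floor=0.02):
        self.shape = (rows, cols)
        self.noise_floor = noise_floor          # readings below this are ignored
        self.focus = np.ones(self.shape, bool)  # start by interrogating everything

    def acquire(self, raw_frame):
        """Level 1: preprocess raw transducer outputs (noise removal)."""
        frame = np.asarray(raw_frame, dtype=float).reshape(self.shape)
        frame[frame < self.noise_floor] = 0.0
        return frame

    def transmit(self, frame):
        """Level 2: send only taxels selected by the focus mask, as
        (row, col, value) tuples, to reduce bus traffic."""
        r, c = np.nonzero(self.focus & (frame > 0))
        return list(zip(r.tolist(), c.tolist(), frame[r, c].tolist()))

    def update_focus(self, frame, margin=1):
        """Level 3: concentrate future interrogation on the neighborhood of
        current contact, dilating the contact region by `margin` taxels."""
        contact = frame > 0
        if not contact.any():
            self.focus[:] = True                # no contact: scan whole array
            return
        focus = np.zeros(self.shape, bool)
        r, c = np.nonzero(contact)
        for dr in range(-margin, margin + 1):
            for dc in range(-margin, margin + 1):
                rr = np.clip(r + dr, 0, self.shape[0] - 1)
                cc = np.clip(c + dc, 0, self.shape[1] - 1)
                focus[rr, cc] = True
        self.focus = focus

# Usage: feed one frame per control cycle.
array = TactileArray()
frame = array.acquire(np.random.rand(16, 16) * 0.05)   # mostly noise
packets = array.transmit(frame)
array.update_focus(frame)
print(len(packets), "taxels transmitted")
```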
On the dynamic analysis of a manipulator and its end effector interacting with the environment
Pub Date: 1987-03-01 | DOI: 10.1109/ROBOT.1987.1087876
H. Asada, K. Ogawa
The dynamic behavior of a manipulator arm and its end effector interacting with the environment is analyzed. The inertial properties of the arm and the end effector are represented with respect to a point of contact between the end effector and the environment. The virtual mass is then defined as the equivalent mass of the arm and the end effector reflected to the point of contact; it is given by the ratio of a force acting at that point to the acceleration the force produces there. Unlike a real mass, the virtual mass varies with the direction of the applied force and the location of the contact point. The maximum and minimum values of the virtual mass are then obtained and their physical meanings are discussed. Next, the rotational motion of the end effector is considered. A single rigid body possesses a centroid: a particular point at which rotation and translation of the body are decoupled. The concept of the centroid is extended to a system of rigid bodies, such as the arm links and the members of the end effector. This point is referred to as the generalized centroid: at it, a linear force causes only a linear acceleration and a pure moment causes only an angular acceleration, so rotation and translation are again decoupled. The virtual mass and the generalized centroid are then applied to task planning for chipping, hard-surface contact, and dynamic insertion operations. The orientation of the tool and the configuration of the manipulator arm are optimized so that a desired dynamic behavior is achieved by obtaining an appropriate virtual mass and by locating the generalized centroid at an appropriate point.
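The definition of virtual mass above can be made concrete with a standard operational-space expression. This is a hedged sketch assuming the joint-space inertia matrix M(q) and the contact-point Jacobian J(q) are available; it is not necessarily the exact derivation used in the paper.

```latex
% Virtual mass along a unit force direction u at the contact point:
% the ratio of the applied force to the acceleration it produces along u.
\[
  m_v(u) \;=\; \bigl( u^{\mathsf{T}} J(q)\, M(q)^{-1} J(q)^{\mathsf{T}} u \bigr)^{-1},
  \qquad \|u\| = 1 .
\]
% Its extrema over all directions follow from the eigenvalues of J M^{-1} J^T:
\[
  \frac{1}{\lambda_{\max}\!\bigl(J M^{-1} J^{\mathsf{T}}\bigr)}
  \;\le\; m_v(u) \;\le\;
  \frac{1}{\lambda_{\min}\!\bigl(J M^{-1} J^{\mathsf{T}}\bigr)} .
\]
```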
Navigation algorithm for a nested hierarchical system of robot path planning among polyhedral obstacles
Pub Date: 1987-03-01 | DOI: 10.1109/ROBOT.1987.1087801
M. Montgomery, D. Gaw, A. Meystel
An algorithm, NAVIGATOR, for robot path planning in a 2D world with polygonal obstacles is presented. The method employs A* search in a subset of the visibility graph of obstacle vertices. A procedure is given for finding this subset of the visibility graph without computing line intersections. The NAVIGATOR module is part of a complete hierarchical system for control and world representation for a robot which operates in an unknown and unstructured environment.
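A minimal sketch of the search step may help: A* over a visibility graph whose adjacency is assumed to be given (the paper's procedure for building the vertex subset without line-intersection tests is not reproduced here). The graph in the usage example is invented for illustration.

```python
import heapq
import math

def astar(adjacency, start, goal):
    """A* search over a (partial) visibility graph.  `adjacency` maps each
    vertex, an (x, y) tuple, to the vertices visible from it.  Euclidean
    distance to the goal is an admissible heuristic, so the shortest path
    available in the graph is returned (or None if the goal is unreachable)."""
    def h(p):
        return math.dist(p, goal)

    open_set = [(h(start), start)]
    g_cost = {start: 0.0}
    parent = {start: None}
    closed = set()
    while open_set:
        _, node = heapq.heappop(open_set)
        if node in closed:
            continue                       # stale entry with an outdated cost
        closed.add(node)
        if node == goal:                   # reconstruct path by walking parents
            path = [node]
            while parent[path[-1]] is not None:
                path.append(parent[path[-1]])
            return path[::-1]
        for nbr in adjacency.get(node, ()):
            new_g = g_cost[node] + math.dist(node, nbr)
            if new_g < g_cost.get(nbr, float("inf")):
                g_cost[nbr] = new_g
                parent[nbr] = node
                heapq.heappush(open_set, (new_g + h(nbr), nbr))
    return None

# Toy example: vertices around a square obstacle between start and goal.
graph = {
    (0, 0): [(1, 1), (1, -1)],
    (1, 1): [(0, 0), (2, 1), (1, -1)],
    (1, -1): [(0, 0), (2, -1), (1, 1)],
    (2, 1): [(1, 1), (3, 0)],
    (2, -1): [(1, -1), (3, 0)],
    (3, 0): [(2, 1), (2, -1)],
}
print(astar(graph, (0, 0), (3, 0)))
```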
The NASA/JPL telerobot testbed: An evolvable system demonstration
Pub Date: 1987-03-01 | DOI: 10.1109/ROBOT.1987.1088003
F. Schenker, R. French, A. Sirota
The role of the mobile servicing system on space station
Pub Date: 1987-03-01 | DOI: 10.1109/ROBOT.1987.1088060
H. Werstiuk, D. Gossain
Direct teaching and automatic program generation for the hybrid control of robot manipulators
Pub Date: 1987-03-01 | DOI: 10.1109/ROBOT.1987.1087819
H. Asada, H. Izumi
A methodology for the automatic generation of robot programs for hybrid position/force control is presented. While hybrid control allows a robot to perform skillful manipulations, its programming is more complex and intricate than simple position control schemes. In hybrid control, control modes must be assigned to individual C-frame axes in such a way that the robot motion conforms to geometrical, or natural, constraints. Both position and force reference inputs must be provided as artificial constraints. This is difficult, since it requires the interpretation of a given task and its translation into a set of commands used in the hybrid position/force control. In this paper, an efficient method is developed to eliminate manual programming and task interpretation/translation. The operator teaches a given task by "teaching-by-showing", in which the operator brings the robot end effector into contact with the environment and accommodates the contact force. During the operator's motion, the force applied by the operator as well as the position of the end effector are measured. The acquired motion data are then processed and interpreted so that the information necessary to generate robot programs is obtained. The choice of control modes as well as the reference inputs to the robot controller are derived from the motion data. The result is then translated into a robot program. First, the principle of this method is described. The algorithm to interpret motion data is then developed for a simple palletizing job. The method is implemented on a force-controlled direct-drive arm using a personal computer.
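The interpretation step can be pictured with a minimal sketch: classify each C-frame axis as force- or position-controlled from the recorded demonstration data and extract a reference input for it. The threshold rule and the tolerances below are illustrative assumptions, not the algorithm developed in the paper.

```python
import numpy as np

def assign_control_modes(positions, forces, pos_tol=1e-3, force_tol=0.5):
    """Hedged sketch of the interpretation step: given position and force
    trajectories recorded along each C-frame axis during a demonstration
    (arrays of shape [T, 6]), label each axis 'force' or 'position' and
    extract a reference input for it."""
    modes, references = [], []
    for axis in range(positions.shape[1]):
        moved = np.ptp(positions[:, axis]) > pos_tol          # did the axis move?
        loaded = np.abs(forces[:, axis]).mean() > force_tol   # sustained contact force?
        if loaded and not moved:
            modes.append("force")                             # constrained direction
            references.append(float(np.mean(forces[:, axis])))
        else:
            modes.append("position")                          # free direction
            references.append(float(positions[-1, axis]))
    return modes, references

# Example: axis 2 is pressed against a surface, the other axes move freely.
T = 100
pos = np.cumsum(np.random.randn(T, 6) * 1e-2, axis=0)
pos[:, 2] = 0.0                          # no motion normal to the surface
frc = np.random.randn(T, 6) * 0.1
frc[:, 2] += 5.0                         # ~5 N contact force on that axis
print(assign_control_modes(pos, frc))
```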
Skin materials for robotic fingers
Pub Date: 1987-03-01 | DOI: 10.1109/ROBOT.1987.1087913
M. Cutkosky, John M. Jourdain, P. Wright
Compliant materials were tested both under clean, dry conditions and with environmental contamination to determine which materials were most suitable for the contact areas of a robotic hand and to establish accurate models of the materials' frictional behavior. Some of the most promising materials under clean, dry conditions performed unreliably in the presence of oil and water. Contact shape, surface texture and surface porosity each had a strong effect on the ultimate coefficient of friction. Combining the results of these tests with what we know about the skin on human and primate hands leads to several design conclusions about the ideal skin for the fingers of a robotic hand.
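As a rough illustration of the kind of frictional model such tests support, the sketch below fits a single Coulomb coefficient to paired normal/tangential force measurements by least squares. The data values and the linear model are invented assumptions; a single coefficient ignores the contact-shape, texture, and porosity effects the paper reports.

```python
import numpy as np

# Fit F_t ~ mu * F_n to measurements taken at the onset of slip.
normal_forces = np.array([1.0, 2.0, 4.0, 8.0])        # N, applied loads (illustrative)
tangential_forces = np.array([0.9, 1.7, 3.5, 7.2])    # N, force at onset of slip

mu, *_ = np.linalg.lstsq(normal_forces[:, None], tangential_forces, rcond=None)
print(f"estimated coefficient of friction: {mu[0]:.2f}")
```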
Relationship between payload and speed in legged locomotion
Pub Date: 1987-03-01 | DOI: 10.1109/ROBOT.1987.1088062
Ming Z. Huang, K. Waldron
For legged locomotion, the ability to preplan the allowable payload under specified operating conditions is important, both in design and in high-level task planning applications. In this paper, the dependency of payload on vehicle operating speed, a problem peculiar to legged locomotion, is examined via a case study of a hexapod using wave gaits. As a result, a set of closed-form analytic expressions describing the payload-speed relationship has been obtained. The solution is based on the assumption of equal leg compliances. The utility of such relationships lies in determining the allowable payload given the vehicle speed and weight or, conversely, in preplanning the payload to be carried and determining the maximum speed at which the vehicle can operate safely.
Second generation architecture of the autonomous land vehicle
Pub Date: 1987-03-01 | DOI: 10.1109/ROBOT.1987.1088023
R. Douglass
Analysis of point feature representation of a perspective image
Pub Date: 1987-03-01 | DOI: 10.1109/ROBOT.1987.1087782
K. Rueb, A. Wong
In this paper, we present a method capable of identifying and locating objects with known three-dimensional models in a single perspective image. The identity and location of an object are obtained through a procedure which we refer to as a knowledge-directed search. The search encodes declarative and procedural knowledge in a rule network designed to ensure that only pertinent information is processed. The search is organized as a collection of search activations. Each search activation has a context memory for recording previously inferred or assumed information. As a result, multiple search activations may be used to test various hypotheses of object identity and location.
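A minimal sketch of the search-activation idea follows: each activation keeps its own context memory of inferred or assumed facts, so competing identity/location hypotheses can be expanded and scored independently. The Rule and SearchActivation structures and the scoring are illustrative assumptions, not the paper's rule network.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class Rule:
    name: str
    applies: Callable[[Dict], bool]       # is the rule pertinent in this context?
    infer: Callable[[Dict], Dict]         # facts the rule adds to the context

@dataclass
class SearchActivation:
    hypothesis: Dict                      # e.g. {"object": "plate", "pose": ...}
    context: Dict = field(default_factory=dict)   # previously inferred/assumed facts
    score: float = 0.0

def expand(activation: SearchActivation, rules: List[Rule]) -> SearchActivation:
    """Fire every pertinent rule once, accumulating inferred facts in the
    activation's private context memory and crediting its score."""
    ctx = {**activation.hypothesis, **activation.context}
    for rule in rules:
        if rule.applies(ctx):
            activation.context.update(rule.infer(ctx))
            activation.score += 1.0       # crude: each satisfied rule adds support
    return activation

# Toy usage: two competing identity hypotheses for the same image features.
rules = [Rule("has_four_corners",
              applies=lambda c: c.get("corners") == 4,
              infer=lambda c: {"shape": "rectangular"})]
activations = [SearchActivation({"object": "plate", "corners": 4}),
               SearchActivation({"object": "disk", "corners": 0})]
best = max((expand(a, rules) for a in activations), key=lambda a: a.score)
print(best.hypothesis["object"], best.context)
```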