A real-time VLSI architecture for direct kinematics
V. Seshadri
Pub Date: 1987-03-01 | DOI: 10.1109/ROBOT.1987.1087849
A real-time direct kinematics algorithm has been implemented on a general-purpose signal processor. The implementation features fixed-point calculation and on-chip generation of sinusoidal functions. It is based on a parallel, pipelined architecture including a 16 × 16 multiplier and two 36-bit accumulators. Various algorithms are used for sinusoidal computations to trade off speed against memory usage. The algorithms have been both simulated and run on the actual hardware. The results indicate that the direct kinematics solution is obtained in under 10 microseconds with 16-bit resolution. This amounts to a speed improvement of three orders of magnitude compared to execution on a conventional 16-bit microprocessor.
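The hardware itself is not described in enough detail in the abstract to reproduce here; purely to illustrate the speed-versus-memory trade-off in sinusoid generation that it mentions, the sketch below (Python, with a hypothetical Q15 word length and 256-entry quarter-wave table) shows the kind of table-lookup sine routine a fixed-point direct-kinematics pipeline would call for each joint angle. A larger table costs memory but reduces error; a smaller one trades accuracy for space.

    import math

    # Hypothetical parameters: Q15 fixed point (16-bit), 256-entry quarter-wave table.
    TABLE_BITS = 8                      # table index width; more bits = more memory, less error
    TABLE_SIZE = 1 << TABLE_BITS
    Q15 = 1 << 15

    # Quarter-wave sine table, values scaled to Q15.
    QUARTER_WAVE = [round((Q15 - 1) * math.sin(math.pi / 2 * i / (TABLE_SIZE - 1)))
                    for i in range(TABLE_SIZE)]

    def sin_q15(angle_q15):
        """Approximate sin(2*pi*angle), where angle is a fraction of a turn in Q15 (0..32767)."""
        quadrant = (angle_q15 >> 13) & 0x3          # top two bits of the turn select the quadrant
        index = (angle_q15 >> (13 - TABLE_BITS)) & (TABLE_SIZE - 1)
        if quadrant in (1, 3):                      # falling half of each half-wave: mirror the index
            index = TABLE_SIZE - 1 - index
        value = QUARTER_WAVE[index]
        return value if quadrant < 2 else -value    # second half of the turn is negative

    # Example: sin(30 degrees) ~ 0.5, i.e. about 16384 in Q15.
    angle = round(30 / 360 * Q15)
    print(sin_q15(angle), round(0.5 * Q15))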
{"title":"A real-time VLSI architecture for direct kinematics","authors":"V. Seshadri","doi":"10.1109/ROBOT.1987.1087849","DOIUrl":"https://doi.org/10.1109/ROBOT.1987.1087849","url":null,"abstract":"A real-time direct kinematics algorithm has been implemented on a general-purpose signal processor. The implementation features fixed-point calculation and on-chip generation of sinusoidal functions. It is based on a parallel, pipelined architecture including a 16*16 multiplier and two 36-bit accumulators. Various algorithms are used for sinusoidal computations to trade off speed against memory usage. The algorithms have been both simulated and run on the actual hardware. The results indicate that the direct kinematics solution is obtained in under 10 microseconds with a 16-bit resolution. This amounts to a speed improvement of three orders of magnitude compared to execution on a conventional 16-bit microprocessor.","PeriodicalId":438447,"journal":{"name":"Proceedings. 1987 IEEE International Conference on Robotics and Automation","volume":"44 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1987-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125351192","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Model-driven determination of object pose for a visually servoed robot
Wallace S. Rutkowski, Ronald Benton, E. Kent
Pub Date: 1987-03-01 | DOI: 10.1109/ROBOT.1987.1087821
The National Bureau of Standards robot sensory system employs multiple hierarchical levels of sensory interpretation that interact with matching levels of world modeling. At each level, the world-modeling processes generate hypotheses about the sensory data based on a priori knowledge, prior sensory input, and knowledge of robot motion. The sensory-interpretative processes use these hypotheses to facilitate their analyses of new data. The results of the analyses are used by the world-modeling processes to correct their models of the environment. This interaction requires the development of real-time algorithms for the analysis of sensory data that can usefully employ guidance from models. This paper presents an algorithm for accomplishing this at the level of object location and pose determination. Its desirable features include the ability to deal with underconstrained problems, the ability to employ all the data in a structured-light image, and robustness in the face of several types of error and noise.
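The hypothesis-driven algorithm itself is not reproduced here; as a generic, hedged illustration of the object-location and pose-determination step the abstract refers to, the sketch below fits a least-squares rigid transform aligning model points to sensed structured-light points with assumed correspondences. This is the standard SVD (Kabsch) method, used only as a stand-in, not the authors' technique.

    import numpy as np

    def fit_pose(model_pts, sensed_pts):
        """Least-squares rigid transform (R, t) mapping model points onto sensed points.
        Assumes the two (N, 3) arrays are already in correspondence (Kabsch/SVD method)."""
        mc = model_pts.mean(axis=0)
        sc = sensed_pts.mean(axis=0)
        H = (model_pts - mc).T @ (sensed_pts - sc)       # 3x3 cross-covariance
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:                         # guard against a reflection solution
            Vt[-1, :] *= -1
            R = Vt.T @ U.T
        t = sc - R @ mc
        return R, t

    # Toy check: recover a known rotation about z and a translation.
    model = np.array([[0.0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]])
    theta = np.radians(30)
    R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                       [np.sin(theta),  np.cos(theta), 0],
                       [0, 0, 1]])
    sensed = model @ R_true.T + np.array([0.5, -0.2, 1.0])
    R_est, t_est = fit_pose(model, sensed)
    print(np.allclose(R_est, R_true), t_est)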
{"title":"Model-driven determination of object pose for a visually servoed robot","authors":"Wallace S. Rutkowski, Ronald Benton, E. Kent","doi":"10.1109/ROBOT.1987.1087821","DOIUrl":"https://doi.org/10.1109/ROBOT.1987.1087821","url":null,"abstract":"The National Bureau of Standards robot sensory system employs multiple hierarchical levels of sensory interpretation that interact with matching levels of world modeling. At each level, the world-modeling processes generate hypotheses about the sensory data based on a priori knowledge, prior sensory input, and knowledge of robot motion. The sensory-interpretative processes use these hypotheses to facilitate their analyses of new data. The results of the analyses are used by the world-modeling processes to correct their models of the environment. This interaction requires the development of real-time algorithms for the analysis of sensory data that can usefully employ guidance from models. This paper presents an algorithm for accomplishing this at the level of object location and pose determination. Its desirable features include the ability to deal with underconstrained problems, the ability to employ all the data in a structured-light image, and robustness in the face of several types of error and noise.","PeriodicalId":438447,"journal":{"name":"Proceedings. 1987 IEEE International Conference on Robotics and Automation","volume":"27 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1987-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125351358","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A holdsite method for parts acquisition using a laser rangefinder mounted on a robot wrist
G. Roth, D. O'Hara, M. Levine
Pub Date: 1987-03-01 | DOI: 10.1109/ROBOT.1987.1087916
The acquisition of parts by a robot is an important industrial problem. The parts may be of various types and can be separate or jumbled together. We have mounted a laser rangefinder on the wrist of a robot and use the profile data it provides to determine a location, called a holdsite, at which to grasp the part. The laser rangefinder is lightweight and collects data quickly, with more than sufficient accuracy for this application. The parts are not recognized; instead, the best holdsite according to the criteria of stability and safety is chosen using methods from the field of spatial planning. The method has been tested on a variety of parts, from screws to rings, and works well. The time required to acquire a part is approximately one minute, but we believe this can be reduced substantially, to about ten seconds per part.
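The spatial-planning criteria are only named in the abstract (stability and safety); the sketch below is a toy illustration of that idea, scoring candidate grasp positions along a one-dimensional range profile by local flatness and by clearance above the surrounding material. All widths, weights, and values here are hypothetical, not the paper's.

    import numpy as np

    def best_holdsite(profile, gripper_width=3, clearance=5):
        """Pick the index along a 1-D range profile (one height sample per mm) that
        maximises a toy stability/safety score. Illustrative only: a real planner
        reasons about full part geometry and a 3-D gripper model."""
        half = gripper_width // 2
        best_i, best_score = None, -np.inf
        for i in range(clearance, len(profile) - clearance):
            patch = profile[i - half : i + half + 1]            # material under the gripper
            sides = np.concatenate([profile[i - clearance : i - half],
                                    profile[i + half + 1 : i + clearance + 1]])
            stability = -np.std(patch)                          # flatter contact is more stable
            safety = np.mean(patch) - np.max(sides)             # patch should stand proud of its surroundings
            score = stability + 0.5 * safety                    # hypothetical weighting
            if score > best_score:
                best_i, best_score = i, score
        return best_i

    profile = np.array([0, 0, 1, 5, 5, 5, 5, 5, 1, 0, 0, 0], dtype=float)
    print(best_holdsite(profile))                               # -> 5, inside the flat plateau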
{"title":"A holdsite method for parts acquisition using a laser rangefinder mounted on a robot wrist","authors":"G. Roth, D. O'Hara, M. Levine","doi":"10.1109/ROBOT.1987.1087916","DOIUrl":"https://doi.org/10.1109/ROBOT.1987.1087916","url":null,"abstract":"The acquisition of parts by a robot is an important problem industrially. The parts may be of various types and can be separate or jumbled together. We have mounted a laser rangefinder on the wrist of a robot and use the profile data it gives to determine a location, called a holdsite, at which to grasp the part. The laser rangefinder is lightweight and collects data quickly with more than sufficient accuracy for this application. The parts are not recognized, instead the best holdsite according to the criteria of stability and safety is chosen using methods from the field of spatial planning. The method has been tested on a variety of parts, from screws to rings and works well. The time required to acquire a part is approximately one minute but we believe this time can be decreased substantially to about ten seconds per part.","PeriodicalId":438447,"journal":{"name":"Proceedings. 1987 IEEE International Conference on Robotics and Automation","volume":"48 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1987-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126724288","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Optimal design of multiple arithmetic processor-based robot controllers
Shaheen Ahmad, Bo Li
Pub Date: 1987-03-01 | DOI: 10.1109/ROBOT.1987.1087887
In this paper we discuss preliminary design considerations for the optimal design of robot controllers based on multiple arithmetic processing units (APUs). We justify this design in terms of its ability to adapt to the various control, kinematic, and trajectory computation methods now being developed. We then show that with eight APUs it is possible to compute the inverse kinematics, inverse dynamics, and trajectory for the PUMA arm in less than 3 ms. This design assumes the floating-point processing times of a relatively slow 16.7 MHz 68881.
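The 3 ms figure follows from dividing the per-cycle floating-point work across the eight APUs; the sketch below only illustrates that style of estimate. The operation counts and 68881 instruction times are placeholder values, not figures taken from the paper.

    # Back-of-envelope schedule check: all constants below are hypothetical placeholders.
    MUL_US, ADD_US = 5.0, 3.5          # assumed 16.7 MHz 68881 multiply/add times, microseconds

    tasks = {                          # assumed (multiplies, adds) per control cycle
        "inverse kinematics": (400, 300),
        "inverse dynamics":   (900, 800),
        "trajectory":         (200, 150),
    }

    def serial_time_us(muls, adds):
        return muls * MUL_US + adds * ADD_US

    total_us = sum(serial_time_us(*ops) for ops in tasks.values())
    n_apus = 8
    # Ideal speedup bound only; a real design must also account for partitioning
    # overhead and data dependencies between the three computations.
    print(f"serial: {total_us / 1000:.2f} ms, ideal on {n_apus} APUs: {total_us / n_apus / 1000:.2f} ms")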
{"title":"Optimal design of multiple arithmetic processor-based robot controllers","authors":"Shaheen Ahmad, Bo Li","doi":"10.1109/ROBOT.1987.1087887","DOIUrl":"https://doi.org/10.1109/ROBOT.1987.1087887","url":null,"abstract":"In this paper we discuss preliminary design considerations for the optimal design of multiple-apu (arithmetic processing unit) based robot controllers. We justify this design interms of its ability to adopt to various different control, kinematic and trajectory computation methods which are being developed. We then show that with eight apu's, it is possible to compute the inverse kinematics, inverse dynamics and the trajectory for the PUMA arm in less then 3ms. In this design we assume the floating point processing times of a relatively slow 16.7 MHZ 68881.","PeriodicalId":438447,"journal":{"name":"Proceedings. 1987 IEEE International Conference on Robotics and Automation","volume":"15 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1987-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116031664","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Some experiments with tactile sensing during grasping
R. Fearing
Pub Date: 1987-03-01 | DOI: 10.1109/ROBOT.1987.1087911
A tactile-sensing fingertip for the Stanford/JPL Hand has been developed. The sensor incorporates an 8 × 8 sensing array with complete coverage of the cylindrical fingertip. A preliminary analysis has been done to determine contact center, magnitude, and orientation, and to judge whether the constructed sensor has adequate sensitivity and density. This low-level tactile information provides the first steps needed for reliable object manipulation with a dextrous hand. Experiments were performed during an open-loop re-grasping manipulation to determine how well the sensors and algorithms performed.
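Extracting contact center, magnitude, and orientation from an 8 × 8 pressure image is commonly done with zeroth-, first-, and second-order moments; the sketch below shows that generic approach, which is not necessarily the one used in the paper.

    import numpy as np

    def contact_features(pressure):
        """Center, total magnitude, and principal-axis orientation of an 8x8 pressure
        image, computed from image moments (a generic method, not the paper's)."""
        p = np.asarray(pressure, dtype=float)
        total = p.sum()
        if total == 0:
            return None                                  # no contact detected
        ys, xs = np.mgrid[0:p.shape[0], 0:p.shape[1]]
        cx, cy = (xs * p).sum() / total, (ys * p).sum() / total
        mu20 = ((xs - cx) ** 2 * p).sum() / total
        mu02 = ((ys - cy) ** 2 * p).sum() / total
        mu11 = ((xs - cx) * (ys - cy) * p).sum() / total
        theta = 0.5 * np.arctan2(2 * mu11, mu20 - mu02)  # orientation of the major axis
        return (cx, cy), total, np.degrees(theta)

    # Toy reading: an elongated contact along the array diagonal -> orientation of 45 degrees.
    tactile = np.zeros((8, 8))
    for i in range(2, 7):
        tactile[i, i] = 1.0
    print(contact_features(tactile))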
{"title":"Some experiments with tactile sensing during grasping","authors":"R. Fearing","doi":"10.1109/ROBOT.1987.1087911","DOIUrl":"https://doi.org/10.1109/ROBOT.1987.1087911","url":null,"abstract":"A tactile sensing finger tip for the Stanford/JPL Hand has been developed. This sensor incorporates an 8 × 8 sensor array with complete coverage of the cylindrical finger tip. A preliminary analysis has been done to determine contact center, magnitude, and orientation, and to judge if the constructed sensor has adequate sensitivity and density. This low level tactile information provides the first steps needed for reliable object manipulation in a dextrous hand. Experiments were performed during an open loop re-grasping manipulation to determine how well the sensors and algorithms performed.","PeriodicalId":438447,"journal":{"name":"Proceedings. 1987 IEEE International Conference on Robotics and Automation","volume":"41 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1987-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122486580","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Automated robotic deburring using electronic compliancy; impedance control
H. Kazerooni
Pub Date: 1987-03-01 | DOI: 10.1109/ROBOT.1987.1087862
A new strategy has been developed for precision deburring and grinding that guarantees burr removal while compensating for robot oscillations and small uncertainties in the location of the part relative to the robot. The problem is posed as a frequency-domain control problem. Electronic compliancy (impedance control) is employed as an "adaptive" mechanism to satisfy the requirements of this strategy. This paper examines the development and implementation of the impedance control methodology [7,8,9,10,16] on an active end-effector [15] (or, where possible, the whole robot) for precision deburring and grinding tasks.
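The impedance-control strategy is described only at a high level; as a generic illustration of what "electronic compliancy" computes, the sketch below implements a simple discrete-time target impedance that turns a measured contact force into a compliant position offset. The mass, damping, stiffness, and time step are illustrative values, not the paper's gains.

    class ImpedanceFilter:
        """Discrete target impedance m*x_dd + b*x_d + k*x = f along one axis: the
        measured contact force f drives a compliant displacement x that is added
        to the nominal (stiff) position command. Gains are illustrative only."""

        def __init__(self, m=1.0, b=40.0, k=400.0, dt=0.001):
            self.m, self.b, self.k, self.dt = m, b, k, dt
            self.x, self.v = 0.0, 0.0

        def step(self, force):
            acc = (force - self.b * self.v - self.k * self.x) / self.m
            self.v += acc * self.dt
            self.x += self.v * self.dt
            return self.x                  # compliant offset added to the nominal trajectory

    imp = ImpedanceFilter()
    # A constant 10 N contact force pushes the tool back toward f/k = 0.025 position units.
    for _ in range(3000):
        offset = imp.step(10.0)
    print(round(offset, 4))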
{"title":"Automated roboting deburring using electronic compliancy; Impedance control","authors":"H. Kazerooni","doi":"10.1109/ROBOT.1987.1087862","DOIUrl":"https://doi.org/10.1109/ROBOT.1987.1087862","url":null,"abstract":"A new strategy has been developed for precision deburring and grinding to guarantee burr removal while compensating for robot oscillations and small uncertainties in the location of the part relative to the robot. This problem has been posed as a frequency domain control problem. Electronic compliancy (impedance control) is demanded as an \"adaptive\" mechanism to satisfy the requirements of this new strategy. This paper examines the development and implementation of impedance control methodology [7,8,9,10,16] on an active end-effector[15] (or the whole robot if it is possible) for precision deburring and grinding tasks.","PeriodicalId":438447,"journal":{"name":"Proceedings. 1987 IEEE International Conference on Robotics and Automation","volume":"232 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1987-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122817029","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Problem decomposition for assembly planning
R. Vijaykumar, M. Arbib
Pub Date: 1987-03-01 | DOI: 10.1109/ROBOT.1987.1087744
Automatic planning of an assembly task suitable for execution by a robot system requires analysis of constraints arising from task and object geometry as well as constraints arising from the specific configuration of the robot system. This paper describes our current efforts toward building a high-level planner that plans robot assembly tasks while confining its attention to constraints arising from task and object characteristics. The robot itself is modelled in terms of its basic capabilities, such as grasping, free motion, and certain fine-motion commands. The planner works from a specification of the assembly operations, the objects involved in the operations, and an incomplete geometric model of the objects. The assembly planning problem is viewed as comprising the determination of the positions and orientations of component objects relative to each other and the determination of the robot motion primitives required to achieve these relations. The paper discusses how we decompose the assembly planning problem to reflect this view, the intermediate representations used, and our current implementation, illustrated with an example assembly task.
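As a hedged illustration of the decomposition described above (relative poses plus motion primitives), the sketch below defines a minimal plan representation; the class name, fields, and primitive labels are hypothetical and are not taken from the authors' system.

    from dataclasses import dataclass, field

    # Hypothetical plan representation: each assembly step fixes a target relative
    # pose between two parts and lists the motion primitives that achieve it.
    @dataclass
    class AssemblyStep:
        moving_part: str
        fixed_part: str
        relative_pose: tuple                              # e.g. (x, y, z, roll, pitch, yaw) of moving w.r.t. fixed
        primitives: list = field(default_factory=list)    # e.g. ["GRASP", "MOVE_FREE", "FINE_MOTION"]

    plan = [
        AssemblyStep("peg", "block", (0.0, 0.0, 0.02, 0, 0, 0),
                     ["GRASP peg", "MOVE_FREE above hole", "FINE_MOTION insert"]),
    ]
    for step in plan:
        print(step.moving_part, "->", step.fixed_part, step.primitives)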
{"title":"Problem decomposition for assembly planning","authors":"R. Vijaykumar, M. Arbib","doi":"10.1109/ROBOT.1987.1087744","DOIUrl":"https://doi.org/10.1109/ROBOT.1987.1087744","url":null,"abstract":"Automatic planning of an assembly task suitable for execution by a robot system requires analysis of constraints arising from task and object geometry as well as constraints arising from the specific configuration of the robot system. This paper describes our current efforts toward building a high-level planner that plans robot assembly tasks while confining its attention to constraints arising from task and object characteristics. The robot itself is modelled in terms of its basic capabilities such as, grasp, free motion and certain fine-motion commands. The planner works from a specification of the assembly operations, the objects involved in the operations, and an incomplete geometric model of the objects. The assembly planning problem is viewed as comprising of the determination of positions and orientations of component objects relative to each other and the determination of the robot motion primitives required to achieve these relations. The paper discusses how we decompose the assembly planning problem to reflect this view, the intermediate representations used, and our current implementations by use of an example assembly task.","PeriodicalId":438447,"journal":{"name":"Proceedings. 1987 IEEE International Conference on Robotics and Automation","volume":"39 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1987-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122882260","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Testing of mobile surveillance robot at a nuclear power plant
J. R. White, H. Harvey, K. A. Farnstrom
Pub Date: 1987-03-01 | DOI: 10.1109/ROBOT.1987.1087965
In-plant testing of a mobile surveillance robot (SURBOT) was performed at the Browns Ferry Nuclear Plant by TVA personnel. The results verified that SURBOT can be used for remote surveillance in 54 separate controlled radiation rooms at the plant. High-quality color video, audio, and other data are collected, digitized by an on-board computer, and transmitted through a cable to the control console for real-time display and videotaping. TVA projects that using SURBOT for surveillance during plant operation will yield annual savings of about 100 person-rem in radiation exposure and $200,000 in operating costs. Based on the successful results of this program, REMOTEC is now commercializing the SURBOT technology on both wheeled and tracked mobile robots for use in nuclear power plants and other hazardous environments.
{"title":"Testing of mobile surveillance robot at a nuclear power plant","authors":"J. R. White, H. Harvey, K. A. Farnstrom","doi":"10.1109/ROBOT.1987.1087965","DOIUrl":"https://doi.org/10.1109/ROBOT.1987.1087965","url":null,"abstract":"In-plant testing of a mobile surveillance robot (SURBOT) was performed at the Browns Ferry Nuclear Plant by TVA personnel. The results verified that SURBOT can be used for remote surveillance in 54 separate controlled radiation rooms at the plant. High-quality color video, audio, and other data are collected, digitized by an on-board computer, and transmitted through a cable to the control console for real-time display and videotaping. TVA projects that the use of SURBOT for surveillance during plant operation will produce annual savings of about 100 person-rem radiation exposure and $200,000 in operating costs. Based on the successful results of this program, REMOTEC is now commercializing the SURBOT technology on both wheeled and tracked mobile robots for use in nuclear power plants and other hazardous environments.","PeriodicalId":438447,"journal":{"name":"Proceedings. 1987 IEEE International Conference on Robotics and Automation","volume":"63 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1987-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121837778","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A color vision system for film thickness determination
S. Parthasarathy, D. Wolf, Evelyn Hu, S. Hackwood, G. Beni
Pub Date: 1987-03-01 | DOI: 10.1109/ROBOT.1987.1087984
The use of color vision as a tool for machine vision provides a powerful means of performing rapid, accurate inspection of microelectronic structures. Since microelectronics fabrication is in large part a thin-film technology, and since thin films have characteristic colors, this approach extends the range of optical analysis possible. We have constructed a color vision system used to measure thin-film dielectric materials. Color matching is performed rapidly (<100 ms) and with a resolution better than 20 Å; so far, the resolution limit has been set only by the samples available for measurement. We have further extended the capability of the system beyond simple color matching to identify truly unknown samples whose thicknesses fall within the range of the original system database. Feedback control of the illumination has been incorporated into the system, and we present data on the effect of shifts in lighting or magnification. Both microscopic and broad-area measurements (for uniformity) can be made.
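The color-matching step can be pictured as a nearest-neighbor lookup of a measured color in a database of films of known thickness; the sketch below shows that generic idea with made-up reference values, not the system's calibration data.

    import numpy as np

    # Hypothetical reference database: (thickness in Angstrom, RGB measured under fixed illumination).
    reference = [
        (1000, (0.82, 0.61, 0.35)),
        (1020, (0.80, 0.63, 0.38)),
        (1040, (0.77, 0.66, 0.42)),
        (1060, (0.74, 0.68, 0.47)),
    ]

    def match_thickness(rgb):
        """Return the database thickness whose stored color is closest to the measurement."""
        rgb = np.asarray(rgb)
        dists = [np.linalg.norm(rgb - np.asarray(ref_rgb)) for _, ref_rgb in reference]
        return reference[int(np.argmin(dists))][0]

    print(match_thickness((0.78, 0.65, 0.41)))   # -> 1040 for this toy database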
{"title":"A color vision system for film thickness determination","authors":"S. Parthasarathy, D. Wolf, Evelyn Hu, S. Hackwood, G. Beni","doi":"10.1109/ROBOT.1987.1087984","DOIUrl":"https://doi.org/10.1109/ROBOT.1987.1087984","url":null,"abstract":"The use of color vision as a tool for machine vision provides a powerful means of performing rapid, accurate inspection of microelectronic structures. Since microelectronics fabrication is in large part a thin film technology, and since thin films have characteristic colors, this approach extends the range of optical analysis possible. We have constructed a color vision system used to measure thin film dielectric materials. Color matching is performed rapidly (<100 msecs) and with resolution better than 20 Å. The resolution limit has been so far set only by the samples available for measurement. We have further extended the capability of the system beyond simple color matching to identify true unknown samples whose thickness fall within the range of the original system database. Feed-back control of the illumination has been incorporated into the system; we present data on the effect of shifts in lighting or magnification. Microscopic, as well as broad area measurements (for uniformity) can be made.","PeriodicalId":438447,"journal":{"name":"Proceedings. 1987 IEEE International Conference on Robotics and Automation","volume":"65 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1987-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116788070","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Region filling using two dimensional grammars
Edward T. Lee
Pub Date: 1987-03-01 | DOI: 10.1109/ROBOT.1987.1087906
Interior-defined regions, flood-fill algorithms, and a simple 4-connected region-filling algorithm are presented together with their properties. An algorithm for region filling using two-dimensional grammars is also presented, together with illustrative examples. The results obtained in this paper may find useful application in intelligent systems, computer graphics, artificial intelligence, expert systems, knowledge engineering, pattern recognition, pictorial databases, and related areas.
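The simple 4-connected region-filling algorithm mentioned above can be sketched as a standard queue-based flood fill; the grammar-based method itself is not reproduced here.

    from collections import deque

    def flood_fill_4(grid, start, new_value):
        """Queue-based 4-connected flood fill of an interior-defined region.
        grid is a list of lists; start is (row, col). Standard algorithm, shown for context."""
        rows, cols = len(grid), len(grid[0])
        r0, c0 = start
        old_value = grid[r0][c0]
        if old_value == new_value:
            return grid
        queue = deque([start])
        while queue:
            r, c = queue.popleft()
            if 0 <= r < rows and 0 <= c < cols and grid[r][c] == old_value:
                grid[r][c] = new_value
                queue.extend([(r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)])
        return grid

    image = [[0, 0, 1, 1],
             [0, 0, 1, 1],
             [1, 1, 1, 0],
             [1, 1, 0, 0]]
    for row in flood_fill_4(image, (0, 0), 2):   # fills only the 4-connected region containing (0, 0)
        print(row)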
{"title":"Region filling using two dimensional grammars","authors":"Edward T. Lee","doi":"10.1109/ROBOT.1987.1087906","DOIUrl":"https://doi.org/10.1109/ROBOT.1987.1087906","url":null,"abstract":"Interior-defined regions, flood-fill algorithms, and a simple 4-connected region filling algorithm are presented together with their properties. An algorithm for region filling using two dimensional grammars is also presented together with illustrative examples. The results obtained in this paper may have useful application in intelligent systems, computer graphics, artificial intelligence, expert systems, knowledge engineering, pattern recognition, pictorial databases and related areas.","PeriodicalId":438447,"journal":{"name":"Proceedings. 1987 IEEE International Conference on Robotics and Automation","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1987-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128289995","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}