InterActor: Speech-driven embodied interactive actor
Tomio Watanabe, M. Okubo, M. Nakashige, R. Danbara
Pub Date: 2002-12-10 | DOI: 10.1109/ROMAN.2002.1045660
Proceedings, 11th IEEE International Workshop on Robot and Human Interactive Communication
A speech-driven embodied interactive actor called InterActor, with the functions of both speaker and listener, is developed to activate human interaction and communication by generating expressive actions and motions coherently related to speech input. InterActor is the electronic-media version of the physical interaction robot InterRobot for robot-mediated communication support, freeing it from hardware restrictions. Using InterActor, the concept of a speech-driven embodied interaction system is proposed for sharing human interaction through the entrainment between human speech and InterActor's motions in remote communication. A prototype of the system is developed, and sensory evaluation and behavioral analysis of human communication through InterActor demonstrate the effectiveness of the system. Actual applications of InterActor to human interfaces are also demonstrated.
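The general idea of deriving listener motion timing from the on/off pattern of speech can be sketched as a simple moving-average (MA) filter over a binary speech-activity signal. This is a toy illustration only; the function name, weights, and thresholding scheme are placeholders, not the authors' published model.

```python
def nod_signal(speech_on, weights):
    """Toy listener-motion generator: estimate a nodding-likelihood
    signal from the binary on/off pattern of speech input using a
    moving-average (MA) model.

    speech_on -- sequence of 0/1 flags (speech detected per frame)
    weights   -- MA coefficients (illustrative values, not the paper's)

    Returns one likelihood value per frame; a nod would be triggered
    when the value crosses some threshold.
    """
    out = []
    for i in range(len(speech_on)):
        acc = 0.0
        for j, w in enumerate(weights):
            if i - j >= 0:
                acc += w * speech_on[i - j]
        out.append(acc)
    return out
```

Sustained speech drives the likelihood toward the sum of the weights, so nods cluster after bursts of speech rather than firing on isolated frames.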
Improving adaptive skin color segmentation by incorporating results from face detection
J. Fritsch, S. Lang, A. Kleinehagenbrock, G. Fink, G. Sagerer
Pub Date: 2002-12-10 | DOI: 10.1109/ROMAN.2002.1045645
The visual tracking of human faces is a basic functionality needed for human-machine interfaces. This paper describes an approach that explores the combined use of adaptive skin color segmentation and face detection for improved face tracking on a mobile robot. To cope with inhomogeneous lighting within a single image, the color of each tracked image region is modeled with an individual, unimodal Gaussian. Face detection is performed locally on all segmented skin-colored regions. If a face is detected, the appropriate color model is updated with the image pixels in an elliptical area around the face position. Updating is restricted to pixels that are contained in a global skin color distribution obtained off-line. The presented method allows us to track faces that undergo changes in lighting conditions while at the same time providing information about the attention of the user, i.e. whether the user looks at the robot. This forms the basis for developing more sophisticated human-machine interfaces capable of dealing with unrestricted environments.
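The per-region update step described above — refitting a unimodal Gaussian from pixels around a detected face, restricted to pixels the global offline skin distribution also accepts — can be sketched as follows. The function names, color space, and blending factor are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def update_color_model(mean, cov, pixels, global_skin_mask_fn, alpha=0.1):
    """Adapt a unimodal Gaussian color model from face-region pixels.

    mean, cov           -- current model parameters (e.g. in a 2-D
                           chromaticity space)
    pixels              -- (N, 2) color samples from an elliptical
                           region around the detected face
    global_skin_mask_fn -- predicate selecting pixels that lie inside
                           the global skin color distribution (off-line)
    alpha               -- blending factor for gradual adaptation
    """
    valid = pixels[global_skin_mask_fn(pixels)]
    if len(valid) < 2:
        return mean, cov  # too few trustworthy pixels; keep the old model
    new_mean = valid.mean(axis=0)
    new_cov = np.cov(valid, rowvar=False)
    # Blend old and new statistics so lighting changes are tracked
    # smoothly instead of overwriting the model in one step
    mean = (1 - alpha) * mean + alpha * new_mean
    cov = (1 - alpha) * cov + alpha * new_cov
    return mean, cov
```

Restricting the update to globally plausible skin pixels is what keeps the adaptive model from drifting onto the background when the face ellipse overlaps non-skin areas.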
Evaluation of multi-robot control and monitoring performance
B. Trouvain, Hans-Ludwig Wolf
Pub Date: 2002-12-10 | DOI: 10.1109/ROMAN.2002.1045607
Designing a human-machine interface for a mobile, semi-autonomous multi-robot system is a challenging task. The requirements range from operating in a real-time environment and facilitating asynchronous command execution to supporting the operator in dividing his monitoring and control resources among multiple robots. The experiment presented in this paper is the first in our effort to develop a human-multi-robot system. Its aim is to provide empirical data that let us measure the effectiveness of the interface and support design decisions as development progresses. Additional data are gathered to explore how effectively operators can manage multiple independently acting mobile robots simultaneously. The central task of this simulation-based experiment is to navigate the robots to various "inspection points" to perform an "inspection". The task is performed by a single operator with 2, 4, and 8 robots in two different environments.
Dynamic manipulation of strings for housekeeping robots
M. Hashimoto, T. Ichikawa
Pub Date: 2002-12-10 | DOI: 10.1109/ROMAN.2002.1045650
Manipulation of deformable objects is an important subject for housekeeping robots. Dynamic manipulation of a string is studied in this paper as an example. The string is modeled as a rigid-body linkage with passive joints in two-dimensional space. The joint stiffness and viscosity parameters are identified experimentally using a high-speed video camera. A simulation of the dynamic manipulation is performed based on optimal control theory to obtain the desired trajectory of the manipulator. The validity of the proposed method is shown by experimental results of the dynamic manipulation.
Socially interactive robots. Why our current beliefs about them still work
Colin T. Schmidt
Pub Date: 2002-12-10 | DOI: 10.1109/ROMAN.2002.1045681
Discussion about applying scientific knowledge in robotics to build people-helpers is widespread. The issue addressed here is philosophically poignant: that of robots that are 'people'. It is currently popular to speak about robots and the image of Man. Behind this lurks the dialogical mind and questions about its artificial existence. Without intending to defend or refute the discourse in favour of 'recreating' Man, a less familiar question is brought forth: given that we are capable of creating a man (constructing a robot-person), what would the consequences be, and would we be satisfied with such technology? This thorny topic questions the entire knowledge foundation on which strong AI/robotics is positioned. The author argues for improved monitoring of technological progress and thus favours 'soft' (weak) implementation techniques.
An error detection model for ultrasonic sensor evaluation on autonomous mobile systems
D. Bank
Pub Date: 2002-12-10 | DOI: 10.1109/ROMAN.2002.1045637
This paper presents an approach to safe navigation of autonomous mobile systems within partially known or unknown dynamic environments. In this context, faulty, maladjusted, or otherwise influenced sensors must be recognized (error detection), and adequate measures for failure correction (error recovery) must be taken. As a general basis for monitoring the state of environmental sensors, a so-called "error detection model" was created, consisting of sub-models for data from laser range finders and ultrasonic sensors. With the aid of these models, different kinds of redundancy can be exploited, and consistency and plausibility checks can be carried out. The error detection model serves for failure recognition based on environment modeling and on hypotheses for expected sensor readings.
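A minimal example of the kind of plausibility check described — comparing the range reading predicted from the environment model against the actual sensor reading — might look like this. The tolerance values, return labels, and function name are illustrative assumptions, not the paper's model.

```python
def detect_range_fault(expected, measured, tolerance=0.2, max_range=5.0):
    """Flag an ultrasonic reading that disagrees with the reading
    predicted from the environment model.

    expected  -- range (m) predicted from the map / environment model
    measured  -- actual sensor reading (m)
    tolerance -- allowed absolute deviation (m) before flagging
    max_range -- sensor's maximum range; a reading at max_range often
                 means "no echo received" rather than a true distance
    """
    if measured >= max_range and expected < max_range - tolerance:
        return "missing_echo"          # obstacle expected but not seen
    if abs(measured - expected) > tolerance:
        return "inconsistent_reading"  # possible fault or unmodeled object
    return "ok"
```

In a full system such single-sensor checks would be combined with cross-sensor redundancy, e.g. comparing overlapping laser and ultrasonic readings before declaring a fault.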
Utilization of distributed cameras in environment for actions of mobile manipulator
K. Komoriya, T. Kotoku, K. Ohba
Pub Date: 2002-12-10 | DOI: 10.1109/ROMAN.2002.1045638
A mobile manipulator will be a practical robot in our daily-life environment. For the robot to move around in this environment, accurate localization is sometimes necessary, and distributed cameras can be used for this purpose. Cameras are embedded in the environment and transmit the images they capture. When the robot approaches a camera, it receives image data that includes its own figure. Through model matching or motion analysis of the captured image, the robot estimates its position relative to the camera. Combining this relative position with the absolute location of the camera, the robot can localize itself more accurately than with a conventional dead-reckoning system. This paper describes a method for localizing the robot using distributed cameras, together with basic experimental results.
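Composing the camera's known world pose with the robot pose estimated relative to that camera is a standard planar-transform step; a minimal sketch follows, assuming 2-D poses as (x, y, theta). The function name and the planar simplification are ours, not the paper's.

```python
import math

def localize_from_camera(cam_pose, rel_pose):
    """Combine a camera's absolute pose with the robot pose estimated
    relative to that camera (both as (x, y, theta) tuples).

    cam_pose -- camera pose in the world frame (known in advance)
    rel_pose -- robot pose in the camera frame, e.g. obtained by model
                matching on the camera's image
    Returns the robot pose in the world frame.
    """
    cx, cy, cth = cam_pose
    rx, ry, rth = rel_pose
    # Rotate the relative offset into the world frame, then translate
    wx = cx + rx * math.cos(cth) - ry * math.sin(cth)
    wy = cy + rx * math.sin(cth) + ry * math.cos(cth)
    # Wrap the heading to [-pi, pi)
    wth = (cth + rth + math.pi) % (2 * math.pi) - math.pi
    return wx, wy, wth
```

The resulting absolute fix can then be fused with (or used to reset) the robot's dead-reckoning estimate whenever the robot passes through a camera's field of view.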
A progressive SAW tactile display on a PC mouse button
M. Takasaki, T. Nara, T. Mizuno
Pub Date: 2002-12-10 | DOI: 10.1109/ROMAN.2002.1045590
We have developed a tactile display using surface acoustic waves (SAW). Pulse-modulated driving voltages excite a temporal distribution of shear force in a pad on the surface of the SAW substrate; two transducers on the surface excite alternating shear force. This force can be perceived as tactile sensation by mechanoreceptors in the skin of the finger. To enhance the force, the force transformer was improved. The display was installed on a computer mouse button. Reproduction of human tactile sensation, such as the roughness felt when rubbing a solid surface, was demonstrated using the computer screen and the mouse, and the mouse displayed tactile sensation successfully. Differences in roughness were tested with separate experimental equipment.
Evaluation method of man-machine system based on muscular characteristic
T. Koyama, T. Tanaka, K. Tanaka, M. Feng
Pub Date: 2002-12-10 | DOI: 10.1109/ROMAN.2002.1045600
In order to provide quality nursing care for bed-bound and disabled people without causing physical and mental stress to either patients or nurses, we have been developing a prototype of a human-assisting robotic system, referred to as HARO (Human-Assisting RObot), which is worn by an operator to amplify his or her power. This paper proposes a new method for evaluating the assisting ability of a man-machine system based on muscular characteristics, body mechanics, and a link model of the operator.
Motion coordination in dynamic environments: reaching a moving goal while avoiding moving obstacles
B. Kluge, Dirk Bank, Erwin Prassler
Pub Date: 2002-12-10 | DOI: 10.1109/ROMAN.2002.1045673
In this paper we describe the problem of coordinating the motion of a mobile robot with a moving guide in dynamic, continuously changing environments, and we present an approach to this problem based on velocity obstacles. As a test application, the approach has been implemented on a robotic wheelchair, enabling it to accompany a person through the concourse of a railway station or a pedestrian area.
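The core test behind a velocity-obstacle planner — would a candidate robot velocity lead to collision with a moving obstacle within some time horizon — can be sketched as below. The point-robot/disc-obstacle simplification, sampling-based check, and names are our assumptions for illustration, not the paper's implementation (which would typically use the analytic cone construction).

```python
import math

def in_velocity_obstacle(robot_pos, cand_vel, obs_pos, obs_vel,
                         combined_radius, horizon=5.0, dt=0.05):
    """Return True if moving at cand_vel collides with the obstacle
    (modeled as a disc of combined_radius around obs_pos) within the
    time horizon, assuming the obstacle keeps its current velocity.

    The check samples the *relative* motion: in the obstacle's frame
    the robot moves in a straight line at the relative velocity.
    """
    rx = robot_pos[0] - obs_pos[0]
    ry = robot_pos[1] - obs_pos[1]
    vx = cand_vel[0] - obs_vel[0]
    vy = cand_vel[1] - obs_vel[1]
    t = 0.0
    while t <= horizon:
        if math.hypot(rx + vx * t, ry + vy * t) <= combined_radius:
            return True
        t += dt
    return False
```

A planner would evaluate this predicate for a set of candidate velocities and pick, among the collision-free ones, the velocity that best tracks the moving guide.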