Pub Date: 2011-11-01 | DOI: 10.1109/URAI.2011.6145907
J. Kwak, C. Kang
External influences from various sources contaminate measurement data in a motion control system. Much effort has been devoted to removing or reducing these influences in order to obtain more accurate data. In this paper, we present a recipe for reducing external influences: integrating two sets of data acquired by two opto-electronic devices. The recipe is verified by experimental demonstrations on an up-down motion control system.
Title: Integration of two opto-electronic devices for data acquisition reducing external influences
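The abstract does not state how the two data sets are integrated. As a purely hypothetical illustration of combining two opto-electronic readings so that a disturbance affecting either sensor is attenuated, an inverse-variance weighted fusion might look like:

```python
def fuse(a, b, var_a, var_b):
    """Inverse-variance weighted fusion of two sensor readings.

    The combined estimate has lower variance than either input, so a
    disturbance contaminating one sensor is attenuated. (Hypothetical
    illustration; the paper's actual integration recipe is not specified.)
    """
    w_a = 1.0 / var_a
    w_b = 1.0 / var_b
    fused = (w_a * a + w_b * b) / (w_a + w_b)
    fused_var = 1.0 / (w_a + w_b)
    return fused, fused_var

# Equal-variance sensors: fusion reduces to a plain average with half the variance.
value, var = fuse(10.2, 9.8, var_a=0.04, var_b=0.04)
```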
Pub Date: 2011-11-01 | DOI: 10.1109/URAI.2011.6146030
S. Divakar
This paper presents a way to build a cell-phone-controlled rover with a rocker-bogie suspension and a scooping arm. The rover can switch between autonomous and manual control states. It carries a wireless camera that transmits video to a nearby laptop with a range of 100 feet, and an ultrasonic sensor mounted on a B0 motor so that the sonar can rotate in place to detect and avoid obstacles on all three sides. It also houses a Samsung 3200 mobile phone and an ATmega32 board with six onboard drivers, so a human operator can control it from a cell phone using the DTMF technique. The scooping arm is attached to the rear of the rover, and a B0 motor drives the scooping operations. The rover moves straight until the sonar detects an obstacle. The sonar then turns left to check for obstacles; if the left is clear, the rover turns left. Otherwise the sonar checks the right; if the right is clear, the rover turns right. If all three sides are blocked, the scooping arm scoops rock samples onto the body of the rover, the rover turns 180 degrees, and the process repeats. If at any point the human operator watching the rover's video feed wishes to take control, he can call the cell phone stationed on the rover, control all of its actions, and switch back when done.
Title: Cell phone controlled rocker-bogie suspension type rover with a scooping arm
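The avoidance behaviour described above (go straight until blocked, prefer left, then right, otherwise scoop and reverse) can be sketched as a single decision function. This is an illustrative reconstruction of the described logic, not the authors' firmware:

```python
def decide(front_blocked, left_blocked, right_blocked):
    """One step of the rover's obstacle-avoidance policy as described:
    go straight until blocked, prefer turning left, then right; if boxed
    in on all three sides, scoop a sample and turn 180 degrees.
    (Reconstruction of the abstract's behaviour; names are illustrative.)
    """
    if not front_blocked:
        return "forward"
    if not left_blocked:
        return "turn_left"
    if not right_blocked:
        return "turn_right"
    return "scoop_and_reverse"

# Boxed in on all three sides: collect a sample and turn around.
action = decide(front_blocked=True, left_blocked=True, right_blocked=True)
```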
Pub Date: 2011-11-01 | DOI: 10.1109/URAI.2011.6145902
R. R. Igorevich, E. P. Ismoilovich, D. Min
This paper explores the interaction design and real-time implementation of behavioral synchronization between a human and a humanoid robot. Human postures are captured as a 3D skeleton using a Kinect sensor, from which the 3D coordinates of individual joints can be derived. For the behavioral synchronization, a primitive humanoid robot with 16 servo motors is used. The human's joint coordinates are extracted from the standing posture and, using the mechanism suggested in this paper, converted and mapped to the humanoid robot. The overall architecture was designed and implemented in an expandable way, which makes it possible to control the humanoid robot locally as well as remotely. The suggested behavioral synchronization method is implemented and validated. Time-delay issues are also considered as a significant topic in the implementation: behavior synchronization of human and robot incurs a total time delay of 400 milliseconds, of which 50% is due to the humanoid's internal latency. Additionally, the suggested architecture provides another mode in which the user can employ alternative controllers.
Title: Behavioral synchronization of human and humanoid robot
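The paper's exact conversion mechanism is not given in the abstract. A common approach is to compute each joint angle from three Kinect joint positions and map it linearly onto a servo command range; a minimal sketch under those assumptions:

```python
import math

def joint_angle(a, b, c):
    """Angle at joint b formed by segments b->a and b->c, in radians.
    a, b, c are 3D Kinect joint coordinates (illustrative helper)."""
    v1 = [a[i] - b[i] for i in range(3)]
    v2 = [c[i] - b[i] for i in range(3)]
    dot = sum(x * y for x, y in zip(v1, v2))
    n1 = math.sqrt(sum(x * x for x in v1))
    n2 = math.sqrt(sum(x * x for x in v2))
    return math.acos(dot / (n1 * n2))

def to_servo(angle, lo=0.0, hi=math.pi, ticks=1023):
    """Linearly map an angle onto an integer servo command range
    (the 0-1023 range is an assumption, not from the paper)."""
    angle = min(max(angle, lo), hi)
    return round((angle - lo) / (hi - lo) * ticks)

# A right angle at the elbow maps to the middle of the servo range.
elbow = joint_angle((1, 0, 0), (0, 0, 0), (0, 1, 0))
command = to_servo(elbow)
```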
Pub Date: 2011-11-01 | DOI: 10.1109/URAI.2011.6145978
M. Ryoo, J. Joung, Wonpil Yu
In this work-in-progress paper, we present an efficient methodology for scale-adaptive recognition of objects. We introduce a new object recognition approach that detects an object in a scene while probabilistically predicting its visually missing features. The idea is to enable better recognition by considering the fact that object features may go undetected depending on the situation (e.g., distance and occlusion). A probabilistic voting-based methodology is developed.
Title: Compensating for visually missing features: Scale adaptive recognition of objects using probabilistic voting
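A toy version of voting-based recognition that tolerates missing features (illustrative only; not the paper's formulation): each detected feature votes for the object location implied by its model offset, so confidence degrades gracefully when a feature is occluded instead of causing outright rejection.

```python
from collections import Counter

def vote_for_object(detected, model_offsets, threshold=0.5):
    """Toy voting-based detector. Each detected feature casts a vote for
    the object center implied by its model offset; the object is accepted
    if the vote mass is a sufficient fraction of the full feature model.
    (Hypothetical sketch; the paper's probabilistic model is richer.)
    """
    votes = Counter()
    for name, pos in detected.items():
        if name in model_offsets:
            off = model_offsets[name]
            votes[(pos[0] - off[0], pos[1] - off[1])] += 1
    if not votes:
        return None, 0.0
    center, count = votes.most_common(1)[0]
    confidence = count / len(model_offsets)
    return (center, confidence) if confidence >= threshold else (None, confidence)

# Hypothetical bicycle-like model; the "handle" feature is occluded.
model = {"wheel": (0, -2), "handle": (0, 3), "seat": (0, 1)}
detected = {"wheel": (5, 3), "seat": (5, 6)}
center, conf = vote_for_object(detected, model)
```

Two of three features agree on the same center, so the object is still accepted, just with reduced confidence.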
Pub Date: 2011-11-01 | DOI: 10.1109/URAI.2011.6145916
Ji-Hyun Jung, Yoo-Seon Bang
As part of a national policy aimed at advancing early childhood education, two types of intelligent robot platforms authenticated by KIST (Korea Institute of Science and Technology) have been disseminated to kindergarten classes all over the country. The effectiveness, usability, and capabilities of the R-learning content should be reviewed and verified through classroom practice. It is also very important to check whether those authenticated robots have been playing adequate roles using high-quality educational content. The purpose of this study is to present the best R-learning practices developed by teachers in kindergarten classes, and to provide basic information useful for the continued development of R-learning content and teaching-learning models. Two cases were drawn from the best R-learning practices and depicted in detail; both were devised through teachers' voluntary involvement in an R-learning study group after robots were offered to their classes. If R-learning can be used easily in everyday classes, that alone is inspiring and encouraging, but it would be much more beneficial to fully understand the content and functions of R-learning and create qualified, differentiated R-learning classes. It is suggested that, to train teachers and to develop and apply educational programs using R-learning at each phase, there needs to be continued interest in and support for R-learning study group activities.
Title: A study of the use of R-learning content in kindergartens
Pub Date: 2011-11-01 | DOI: 10.1109/URAI.2011.6145893
Won-Sup Kim
Engineers and users differ slightly in their attitudes toward robots. Many service robots have been developed, and designers are concerned with the appearance of the service robot; the situation is similar in today's smart-product development process. This paper introduces the design of a new type of home service robot and proposes the design approach needed to make service robots successful products. The focus is on the appearance of the design, to support an ongoing relationship between users and the product. Appropriate user stimulation will build a good relationship between users and service robots.
Title: Robotized products
Pub Date: 2011-11-01 | DOI: 10.1109/URAI.2011.6145881
Sang-Mun Lee, Kyoung-Don Lee, Heung-Ki Min, Tae-Sung Noh, Sung-Tae Kim, Jeong-Woo Lee
This paper is a parameter study of the grasping characteristics of a humanoid robot hand that uses a spherical four-bar linkage as each finger joint. The hand has one thumb and three fingers, each of which has two spherical four-bar linkages. It is 280 mm long and weighs less than 2.5 kg. In total the hand has 13 degrees of freedom: the thumb has 4 and each finger has 3. The spherical four-bar linkage was specially designed to rotate like a human finger joint, so the hand can perform dexterous motions like a human hand as well as relatively powerful grasps. The two parameters studied here are the diameter of the cylinder to be grasped and the type of grasp. The contact forces at the finger joints were calculated for four cases. Numerical analysis was carried out in a position-control mode using RecurDyn with MATLAB/Simulink.
Title: Parameter study on the grasping characteristics of the humanoid robot hand with spherical four bar linkages
Pub Date: 2011-11-01 | DOI: 10.1109/URAI.2011.6145975
Jaemin Byun, Junyoung Sung, Myungchan Roh, Sunghoon Kim
This paper presents a strategy to detect and track curbs, the stone borders that separate road from non-road, using an LRF (Laser Range Finder) with a particle filter. Our approach to finding the position of a curb involves two stages. The first stage detects curbs in the laser scanner data by recognizing their geometric shape. The second stage estimates and tracks the curb with a particle filter applied to the laser scans. Experiments were carried out with an autonomous vehicle on a real structured road, and the results show the promising performance of the presented method.
Title: Autonomous driving through Curb detection and tracking
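The second stage can be illustrated with a minimal one-dimensional particle filter tracking a curb's lateral position from noisy range readings. The paper's filter operates on full LRF scans and geometric curb features, so this is only a sketch of the predict-weight-resample cycle:

```python
import math
import random

def track_curb(measurements, n_particles=500, motion_noise=0.05,
               meas_noise=0.1, seed=0):
    """Minimal 1-D particle filter for a curb's lateral position.
    (Illustrative sketch; noise levels and the 1-D state are assumptions.)
    """
    rng = random.Random(seed)
    particles = [rng.uniform(-1.0, 1.0) for _ in range(n_particles)]
    estimates = []
    for z in measurements:
        # Predict: diffuse particles with motion noise.
        particles = [p + rng.gauss(0.0, motion_noise) for p in particles]
        # Weight: Gaussian likelihood of the range measurement.
        weights = [math.exp(-0.5 * ((z - p) / meas_noise) ** 2)
                   for p in particles]
        total = sum(weights) or 1.0
        weights = [w / total for w in weights]
        # Estimate: weighted mean of the particle set.
        estimates.append(sum(w * p for w, p in zip(weights, particles)))
        # Resample: draw a new particle set proportional to the weights.
        particles = rng.choices(particles, weights=weights, k=n_particles)
    return estimates

# Five noisy-free readings of a curb 0.5 m to the side.
est = track_curb([0.5, 0.5, 0.5, 0.5, 0.5])
```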
Pub Date: 2011-11-01 | DOI: 10.1109/URAI.2011.6172974
K. Varadarajan, Ishaan Gupta, M. Vincze
Grasping by Components (GBC) is a very important part of any scalable and holistic grasping system that abstracts point cloud object data to work with arbitrary shapes and no a priori data. The superquadric representation is a suitable parametric method for representing and manipulating point cloud data. Most superquadric-based grasp hypothesis generation methods classify the parametric shapes into one of several simple shapes with a priori established grasp hypotheses. Such methods are suitable for simple scenarios, but for a holistic and scalable grasping system, direct grasp hypothesis generation from the superquadric representation is crucial. In this paper, we present an algorithm that directly estimates grasp points and approach vectors from superquadric parameters. We also present results for a number of complex superquadric shapes and show that they are in line with grasp hypotheses conventionally generated by humans.
Title: Grasp hypothesis generation for parametric object 3D point cloud models
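The paper derives grasp hypotheses from the full superquadric parameter set. As a simplified, hypothetical heuristic, one can place antipodal grasp points across the shortest semi-axis of a fitted superquadric, with the approach vector along that axis:

```python
def grasp_from_superquadric(a1, a2, a3):
    """Hypothetical heuristic: antipodal grasp points across the shortest
    of the superquadric's three semi-axes, approach vector along that axis.
    (Not the paper's algorithm, which also uses the shape exponents.)
    """
    axes = {0: a1, 1: a2, 2: a3}
    k = min(axes, key=axes.get)  # shortest axis is easiest to span
    p1 = [0.0, 0.0, 0.0]
    p2 = [0.0, 0.0, 0.0]
    p1[k] = axes[k]              # contact point on the positive side
    p2[k] = -axes[k]             # antipodal contact on the negative side
    approach = [0.0, 0.0, 0.0]
    approach[k] = -1.0           # move toward the surface along the grasp axis
    width = 2 * axes[k]          # required gripper opening
    return p1, p2, approach, width

# Semi-axes in metres for an elongated, box-like object (made-up values).
p1, p2, approach, width = grasp_from_superquadric(0.10, 0.03, 0.05)
```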
Pub Date: 2011-11-01 | DOI: 10.1109/URAI.2011.6145965
S. Chi, Young-Jo Cho, Jaeyeon Lee, H. Yoon, Do-Hyung Kim, Jaehong Kim, T. Hori, Miki Sato
This paper is based on our extensive surveys of human-robot interaction methodologies and implementations currently used in robotic products and research projects in Japan and Korea. The specification defines a framework that can handle the messages and data exchanged between human-robot interaction service components and service applications.
Title: Revised robotic interaction service (RoIS) framework