Flapping frequency and resonant frequency of insect wings
Pub Date: 2013-12-02 | DOI: 10.1109/URAI.2013.6677463
N. Ha, N. Goo
In this study, we experimentally investigated the relationship between wingbeat frequency and resonant frequency in 30 individuals of eight insect species from five orders: Odonata (Sympetrum flaveolum), Lepidoptera (Pieris rapae, Plusia gamma and Ochlodes), Hymenoptera (Xylocopa pubescens and Bombus rupestric), Hemiptera (Tibicen linnei) and Coleoptera (Allomyrina dichotoma). We found that wingbeat frequency is not strongly related to resonant frequency; in other words, insects have not evolved to flap at their wings' structural resonant frequency. This contradicts the general conclusion of other reports, namely that insects flap at their wings' resonant frequency to take advantage of passive deformation and thereby save energy.
{"title":"Flapping frequency and resonant frequency of insect wings","authors":"N. Ha, N. Goo","doi":"10.1109/URAI.2013.6677463","DOIUrl":"https://doi.org/10.1109/URAI.2013.6677463","url":null,"abstract":"In this study, we experimentally studied the relationship between wingbeat frequency and resonant frequency of 30 individuals of eight insect species from five families: Odonata (Sympetrum flaveolum), Lepidoptera (Pieris rapae, Plusia gamma and Ochlodes), Hymenoptera (Xylocopa pubescens and Bombus rupestric), Hemiptera (Tibicen linnei) and Coleoptera (Allomyrina dichotoma). We found that wingbeat frequency does not have a strong relation with resonance frequency: in other words, insects have not been evolved sufficiently to flap at their wings' structural resonant frequency. This contradicts the general conclusion of other reports-that insects flap at their wings' resonant frequency to take advantage of passive deformation to save energy.","PeriodicalId":431699,"journal":{"name":"2013 10th International Conference on Ubiquitous Robots and Ambient Intelligence (URAI)","volume":"70 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-12-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130841576","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A study on methods for improving the straightness of the intelligent walker to move on slope
Pub Date: 2013-12-02 | DOI: 10.1109/URAI.2013.6677332
Won-Young Lee, Seung-Hyun Lee, Mun-Suck Jang, Eung-Hyuk Lee
This paper proposes a straightness-enhancement algorithm for slope driving of an intelligent walker. While driving on a slope, the walker tends to drift off track because of external forces arising from the robot's weight and the grade of the slope. To compensate for this, we use a controller that estimates the external forces from the slope of the road surface and adjusts the motor output accordingly. In addition, by comparing the target rotational angular velocity commanded by the user with the robot's actual angular velocity, the algorithm applies a weight to each drive shaft. When the proposed correction controller was applied to the intelligent walker, the uncompensated runs diverged from the intended path, whereas with the slope-calibration algorithm the maximum deviation stayed within 5 cm, allowing safe driving, and the rate of change of the deviation stabilized after 1 m, with no further growth.
{"title":"A study on methods for improving the straightness of the intelligent walker to move on slope","authors":"Won-Young Lee, Seung-Hyun Lee, Mun-Suck Jang, Eung-Hyuk Lee","doi":"10.1109/URAI.2013.6677332","DOIUrl":"https://doi.org/10.1109/URAI.2013.6677332","url":null,"abstract":"This paper suggests linearity enhancement algorithm for slope driving of Intelligent Walker. Intelligent Walker happens to get off track due to external forces from robot's weight and the degree of the slope while slope driving. In order to compensate this, this research used the controller that estimates the external forces according to the slope of road surface and adjusts it to the motor output. Also, through comparisons between targeted rotational angular velocity which the user inputs and its velocity of the robot, algorithm was applied which applies a weight to each shaft. As a result of applying the proposed correction controller to Intelligent Walker, it diverges in case of non-compensation experiments that deviates when moving, but it case of applying the ramp calibration algorithm, the deviation distance at max was within 5cm that it keeps safe driving, and change rate of deviation distance was also stabilized after 1m where no more changes occurred.","PeriodicalId":431699,"journal":{"name":"2013 10th International Conference on Ubiquitous Robots and Ambient Intelligence (URAI)","volume":"20 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-12-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127771859","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
3D pose and target position estimation for a quadruped walking robot
Pub Date: 2013-12-02 | DOI: 10.1109/URAI.2013.6677313
Kuk Cho, Seung-Ho Baeg, Sangdeok Park
This paper describes a 3D pose estimation method for a quadruped robot and a target tracking method for its navigation. The estimated 3D pose is a key resource for walking-robot operation and is applied to two components: the robot's walking control and its navigation. The estimated robot pose can be used to compensate sensor data such as camera and lidar measurements. The estimated target is used as part of a leader-following system in a GPS-denied environment. In this paper, we present a 3D pose estimation method for the robot and target tracking for leader-following navigation.
{"title":"3D pose and target position estimation for a quadruped walking robot","authors":"Kuk Cho, Seung-Ho Baeg, Sangdeok Park","doi":"10.1109/URAI.2013.6677313","DOIUrl":"https://doi.org/10.1109/URAI.2013.6677313","url":null,"abstract":"This paper describes a 3D pose estimation method for a quadruped robot and a target tracking method for its navigation. The estimated 3D pose is a key resource for walking robot operation. The pose is applied to two components: the robot's walking control and navigation. The estimated robot pose can be used to compensate sensor data such as camera and lidar. The estimated target is used as part of a leader-following system in a GPS-denied environment. In this paper, we show a 3D pose estimation method for the robot for and target tracking for leader-following navigation.","PeriodicalId":431699,"journal":{"name":"2013 10th International Conference on Ubiquitous Robots and Ambient Intelligence (URAI)","volume":"2 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-12-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125514325","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Human balance control ability for affinitive personal vehicle
Pub Date: 2013-12-02 | DOI: 10.1109/URAI.2013.6677323
Shimon Ajisaka, T. Kubota, H. Hashimoto
Personal vehicles have recently received considerable attention as a way to expand individual mobility and support a low-carbon society, and several small vehicles have already been developed. However, a new approach to designing and controlling personal vehicles is required, because the environments in which they operate, such as mixed traffic, differ from those of general vehicles. In such environments, vehicles should have a close relationship with their drivers, which requires them to be smaller and lighter. In addition, the driver's psychological and physiological state should be considered in vehicle control, but few methods exist to measure and estimate it. The authors define such vehicles as Affinitive Personal Vehicles and attempt to estimate the driver's balance control ability. In this study, the authors focus on the vehicle's posture and the driver's body sway to observe the driver's state and gain insight into balance control ability.
{"title":"Human balance control ability for affinitive personal vehicle","authors":"Shimon Ajisaka, T. Kubota, H. Hashimoto","doi":"10.1109/URAI.2013.6677323","DOIUrl":"https://doi.org/10.1109/URAI.2013.6677323","url":null,"abstract":"Recently personal vehicles have received a lot of attention to expand our mobility for low carbon society and diversification of individual mobility. So far several small vehicles have been developed. However a new thesis of designing and controlling personal vehicles is required, because there are some difference between general vehicle-used environment and personal vehicle-used environment like mixed traffic. In such new kinds of environment, vehicles should have close relationship with drivers. To have closer relationship, personal vehicles must be smaller and lighter. In addition, psychological and biological status should be considered to control vehicles, but there is few method to measure and estimate such status. The authors define such personal vehicles as Affinitive Personal Vehicle and try to estimate balance control ability of driver. In this study, the authors focus on vehicle's posture and body sway of a driver to observe driver's status and get some knowledge on balance control ability.","PeriodicalId":431699,"journal":{"name":"2013 10th International Conference on Ubiquitous Robots and Ambient Intelligence (URAI)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-12-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129043673","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
GPU-based collision detection for sampling-based motion planning
Pub Date: 2013-12-02 | DOI: 10.1109/URAI.2013.6677345
Jaeshik Yoon, Jae-Han Park, M. Baeg
This paper presents a GPU-based collision detection method that accelerates collision queries for sampling-based motion planning. The approach uses many-core GPUs: to exploit them, both the kinematics and the collision checks are computed on the GPU. The experimental results indicate that this approach can be roughly ten times faster than a CPU implementation.
{"title":"GPU-based collision detection for sampling-based motion planning","authors":"Jaeshik Yoon, Jae-Han Park, M. Baeg","doi":"10.1109/URAI.2013.6677345","DOIUrl":"https://doi.org/10.1109/URAI.2013.6677345","url":null,"abstract":"This paper presents GPU-based collision detection method that accelerates collision queries for sampling-based motion planning. This approach uses many-core GPUs. To take advantage of a many-core GPU, kinematic and collision detection is calculated by the GPU. The experimental results indicate that this approach can result in a ten-fold faster performance than when using a CPU.","PeriodicalId":431699,"journal":{"name":"2013 10th International Conference on Ubiquitous Robots and Ambient Intelligence (URAI)","volume":"79 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-12-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126823581","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Multiple objects recognition for industrial robot applications
Pub Date: 2013-12-02 | DOI: 10.1109/URAI.2013.6677361
Kyekyung Kim, Sangseung Kang, Jaehong Kim, Jaeyeon Lee, Joongbae Kim, Jinho Kim
Vision-based object recognition has been studied intensively because of its many application fields, especially manufacturing processes in industrial robot applications. It remains challenging, however, because of illumination effects, diverse object materials, atypical object shapes, and so on. In this paper, we propose multiple-object recognition that includes objects with complex shapes. The target objects have varied characteristics, such as reflective surfaces wrapped in plastic or flexible shapes. We developed object segmentation using a backlight, pose estimation by maximal-axis detection, and object recognition with a neural network. We evaluated the recognition performance on an ETRI database acquired under various lighting conditions.
{"title":"Multiple objects recognition for industrial robot applications","authors":"Kyekyung Kim, Sangseung Kang, Jaehong Kim, Jaeyeon Lee, Joongbae Kim, Jinho Kim","doi":"10.1109/URAI.2013.6677361","DOIUrl":"https://doi.org/10.1109/URAI.2013.6677361","url":null,"abstract":"Vision-based object recognition has been studied intensively because of many application fields, especially, manufacturing process in industrial robot application. But it has been challenged due to illumination effect, diverse material object, atypical shape object, etc. In this paper, multiple object recognition including complex shape object has been proposed. The object is consisted of variable characteristic, which has reflection material surface wrapped by plastic or flexible shape. Object segmentation using back light and pose estimation by maximal axis detection, and object recognition by NN have developed. We have evaluated recognition performance on database of ETRI, which has acquired under various lighting conditions.","PeriodicalId":431699,"journal":{"name":"2013 10th International Conference on Ubiquitous Robots and Ambient Intelligence (URAI)","volume":"60 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-12-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126925028","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Public signs detection in subway station images
Pub Date: 2013-12-02 | DOI: 10.1109/URAI.2013.6677399
Kyu-Dae Ban, H. Yoon, Jaehong Kim
It is difficult to find one's way in indoor environments such as subway stations and airport passenger terminals. Detecting and recognizing public signs, however, makes it possible to find the way to a destination. The proposed system can help visually impaired people or mobile robots in unfamiliar places. We propose a method for detecting public signs in natural images. In particular, we deal with several signs found in subway stations, such as numbers, tickets, elevators, escalators, phones, toilets, and the arrows giving the direction to those places.
{"title":"Public signs detection in subway station images","authors":"Kyu-Dae Ban, H. Yoon, Jaehong Kim","doi":"10.1109/URAI.2013.6677399","DOIUrl":"https://doi.org/10.1109/URAI.2013.6677399","url":null,"abstract":"It's difficult to find location in indoor environments such as subway station and airport passenger terminal. However, the detection and recognition of public sign make it possible to find out the place to go. Proposed system can help visually impaired persons or mobile robots in unfamiliar places. We propose the detection method of public signs in natural images. Especially, we dealt with several signs in subway station such as numbers, arrows, ticket, elevator, escalator, phone, toilet, and arrows giving the direction of those places.","PeriodicalId":431699,"journal":{"name":"2013 10th International Conference on Ubiquitous Robots and Ambient Intelligence (URAI)","volume":"42 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-12-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127093700","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Elbow detection for localization of a mobile robot inside pipeline using laser pointers
Pub Date: 2013-12-02 | DOI: 10.1109/URAI.2013.6677474
Chaehyeuk Lee, Da-Ni Joo, G. Kim, Byung soo Kim, Gwang hoon Lee, Soon‐Geul Lee
Robots for pipe inspection have been studied for a long time, and many mobile mechanisms have been proposed to perform inspection tasks inside pipelines. Localization is an important factor for an in-pipe robot to operate autonomously. However, because of the unique characteristics of the in-pipe environment, sensors that normally perform well for localization, such as GPS and beacons, cannot be used. In this paper, a localization method based on elbow detection is presented. Using image processing of the projected laser light together with the angular velocity from an IMU, the robot's odometry module determines whether it is in a straight pipe or an elbow and minimizes the orientation integration error. The experimental environment consisted of two straight pipes and an elbow, and a pipe map was constructed as the result.
{"title":"Elbow detection for localization of a mobile robot inside pipeline using laser pointers","authors":"Chaehyeuk Lee, Da-Ni Joo, G. Kim, Byung soo Kim, Gwang hoon Lee, Soon‐Geul Lee","doi":"10.1109/URAI.2013.6677474","DOIUrl":"https://doi.org/10.1109/URAI.2013.6677474","url":null,"abstract":"Robots for pipe inspection have been studied for a long time and many mobile mechanisms have been proposed to achieve the inspection task within pipelines. Localization is an important factor for inpipe robot to perform a successful autonomous operation. However, because of its unique characteristics of inpipe condition, sensors which have good performance in localization, like GPS and beacon, cannot be used. In this paper, a localization using elbow detection is presented. With the laser light image processing and the angular velocity of IMU, the odometry module of robot determines whether it is on straight pipe or elbow, and minimizes the integration error in orientation. The experiment environment has consisted of two straight pipes and an elbow, and the pipe map has been constructed as the result.","PeriodicalId":431699,"journal":{"name":"2013 10th International Conference on Ubiquitous Robots and Ambient Intelligence (URAI)","volume":"72 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-12-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127357686","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
The modified inverse kinematics on variable crank for the XENMA pelvis rehabilitation robot
Pub Date: 2013-12-02 | DOI: 10.1109/URAI.2013.6677453
Chan-Soon Lim, W. Lee, Jeong-Yean Yang, D. Kwon
We are developing the XENMA pelvis rehabilitation robot for gait motion, which can generate periodic motion with variable crank structures. We focus on the trigonometric nature of pelvis motion during gait. The robot is designed for the patient to ride, and it controls the pelvis with two crank-slider mechanisms and a variable crank structure whose crank length can be adjusted. We also design modified inverse kinematics methods for the variable crank structure, with both a geometric solution and a resolved motion rate control solution. Simulations of the modified IK method show that our solution is valid and yields better tracking performance.
{"title":"The modified inverse kinematics on variable crank for the XENMA pelvis rehabilitation robot","authors":"Chan-Soon Lim, W. Lee, Jeong-Yean Yang, D. Kwon","doi":"10.1109/URAI.2013.6677453","DOIUrl":"https://doi.org/10.1109/URAI.2013.6677453","url":null,"abstract":"We are developing a XENMA pelvis rehabilitation robot for the gait motion which can generate periodic motion with specific variable crank structures. We are focused on trigonometric property of pelvis motion on gait. We designed robot for patient to ride it, and control his pelvis with two crank-slider mechanisms and variable crank structure that can control its crank length. We also designed modified inverse kinematics methods for a variable crank structures with both geometrical and resolved motion rate control solutions. Modified IK method is simulated and result shows our solution is valid and makes better following performance.","PeriodicalId":431699,"journal":{"name":"2013 10th International Conference on Ubiquitous Robots and Ambient Intelligence (URAI)","volume":"3 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-12-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121861503","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Development of joint torque sensor and calibration method for robot finger
Pub Date: 2013-12-02 | DOI: 10.1109/URAI.2013.6677502
Tae-Keun Kim, Dong Yeop Kim, D. H. Cha, Seunz-vun Choi, Bone-Seck Kim, Jung-Hoon Hwang, Chang-Woo Park
This paper discusses a torque sensor for robot hands along with a novel calibration method for it. Recently, there has been increasing interest in service robots that can manipulate various objects to assist humans. A dexterous robot hand with precise sensors can greatly enhance the performance of such robots. In this study, we propose a joint torque sensor for robot hands and a calibration method that allows users to calibrate the robot hand easily without disassembling it.
{"title":"Development of joint torque sensor and calibration method for robot finger","authors":"Tae-Keun Kim, Dong Yeop Kim, D. H. Cha, Seunz-vun Choi, Bone-Seck Kim, Jung-Hoon Hwang, Chang-Woo Park","doi":"10.1109/URAI.2013.6677502","DOIUrl":"https://doi.org/10.1109/URAI.2013.6677502","url":null,"abstract":"This paper discusses a torque sensor for robot hands along with its novel calibration method. Recently, there has been increasing interest in service robots which can manipulate various objects to assist humans. A dexterous robot with precise sensors can greatly enhance the performance of such robots. In this study, we propose a joint torque sensor for robot hands and a calibration method, which allows the users to easily calibrate the robot hand without disassembling it.","PeriodicalId":431699,"journal":{"name":"2013 10th International Conference on Ubiquitous Robots and Ambient Intelligence (URAI)","volume":"2 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-12-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126728563","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}