Real-time prediction of fall and collision of tracked vehicle for remote-control support
Pub Date : 2010-12-01, DOI: 10.1109/SII.2010.5708298
Ken Sakurada, Shihoko Suzuki, K. Ohno, E. Takeuchi, S. Tadokoro, Akihiko Hata, Naoki Miyahara, K. Higashi
This paper describes a new method that predicts falls and collisions in real time to support remote control of a tracked vehicle with sub-tracks. A tracked vehicle has a high ability to traverse rough terrain, but it is difficult for a remote operator to control the vehicle's direction and speed. We therefore propose a path evaluation system based on measuring the terrain shape around the vehicle. In this system, candidate paths are generated from operator inputs and terrain information. To evaluate the traversability of each path, we estimate the pose of the robot along the path and its contact points with the ground, and then choose the combination of translational and rotational velocity.
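The abstract only outlines the evaluation loop; below is a minimal sketch of how such a candidate-path evaluator could look, assuming a kinematic arc rollout, invented roll/pitch limits, and a hypothetical `terrain.fit_pose` helper that settles the vehicle on a heightmap. The paper's actual pose and contact-point estimation is not specified here.

```python
import numpy as np

MAX_ROLL = np.radians(30.0)    # assumed roll limit before a fall is predicted
MAX_PITCH = np.radians(35.0)   # assumed pitch limit

def generate_candidate_paths(v_cmd, w_cmd, n=7, horizon=2.0, dt=0.1):
    """Roll out candidate arcs around the operator's commanded velocity
    (v_cmd, w_cmd) as sequences of planar (x, y, yaw) states."""
    paths = []
    for w in np.linspace(w_cmd - 0.3, w_cmd + 0.3, n):
        x = y = yaw = 0.0
        states = []
        for _ in range(int(horizon / dt)):
            x += v_cmd * np.cos(yaw) * dt
            y += v_cmd * np.sin(yaw) * dt
            yaw += w * dt
            states.append((x, y, yaw))
        paths.append(((v_cmd, w), states))
    return paths

def evaluate_path(states, terrain):
    """Score a path by the worst roll/pitch of the vehicle posed on the
    terrain; return -inf if a fall is predicted anywhere along it."""
    worst = 0.0
    for x, y, yaw in states:
        roll, pitch = terrain.fit_pose(x, y, yaw)  # hypothetical helper
        if abs(roll) > MAX_ROLL or abs(pitch) > MAX_PITCH:
            return float("-inf")
        worst = max(worst, abs(roll), abs(pitch))
    return -worst

def select_velocity(v_cmd, w_cmd, terrain):
    """Pick the admissible (v, w) closest to level driving; stop if every
    candidate path predicts a fall."""
    vw, states = max(generate_candidate_paths(v_cmd, w_cmd),
                     key=lambda p: evaluate_path(p[1], terrain))
    return vw if evaluate_path(states, terrain) > float("-inf") else (0.0, 0.0)
```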
{"title":"Real-time prediction of fall and collision of tracked vehicle for remote-control support","authors":"Ken Sakurada, Shihoko Suzuki, K. Ohno, E. Takeuchi, S. Tadokoro, Akihiko Hata, Naoki Miyahara, K. Higashi","doi":"10.1109/SII.2010.5708298","DOIUrl":"https://doi.org/10.1109/SII.2010.5708298","url":null,"abstract":"This thesis describes a new method that in real time predicts fall and collision in order to support remote control of a tracked vehicle with sub-tracks. A tracked vehicle has high ability of getting over rough terrain. However, it is difficult for an operator at a remote place to control the vehicle's moving direction and speed. Hence, we propose a new path evaluation system based on the measurement of environmental shapes around the vehicle. In this system, the candidate paths are generated by operator inputs and terrain information. For evaluating the traversability of the path, we estimate the pose of the robot on the path and contact points with the ground. Then, the combination of translational and rotational velocity is chosen.","PeriodicalId":334652,"journal":{"name":"2010 IEEE/SICE International Symposium on System Integration","volume":"15 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"117185010","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Manipulation of an irregularly shaped object by two mobile robots
Pub Date : 2010-12-01, DOI: 10.1109/SII.2010.5708328
Zhaojia Liu, Hiromasa Kamogawa, J. Ota
Fast transition from a stable initial state to a stable handling state is important when multiple mobile robots manipulate and transport a heavy, bulky object. In this paper, a cooperative system consisting of two mobile robots is designed to realize fast transition: a gripper robot grasps and lifts the object from one side to provide enough space for a lifter robot to lift it. Fast transition can be formulated as an optimization problem, and we propose an algorithm that realizes it on the designed cooperative system. The experimental results illustrate the validity of the proposed method.
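The abstract states only that the transition is posed as an optimization problem, without giving variables or constraints. As a purely illustrative sketch, one could minimize the sequential transition time over invented decision variables (lift height, approach distance) subject to a toy stability-margin constraint; every quantity below is an assumption, not the paper's formulation.

```python
import numpy as np
from scipy.optimize import minimize

LIFT_SPEED = 0.05    # m/s, assumed gripper lift speed
DRIVE_SPEED = 0.20   # m/s, assumed lifter approach speed

def transition_time(x):
    """x = [lift_height, approach_distance]; total time if the lift and
    the lifter's approach run in sequence."""
    lift_height, approach = x
    return lift_height / LIFT_SPEED + approach / DRIVE_SPEED

def stability_margin(x):
    """Toy tip-over margin: lifting higher shrinks the margin, a longer
    approach run-up grows it; must stay positive for a stable state."""
    lift_height, approach = x
    return 0.15 - 0.5 * lift_height + 0.1 * approach

res = minimize(
    transition_time,
    x0=np.array([0.10, 0.30]),
    bounds=[(0.05, 0.20), (0.10, 0.50)],
    constraints=[{"type": "ineq", "fun": stability_margin}],
)
print("lift height, approach distance:", res.x)
```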
{"title":"Manipulation of an irregularly shaped object by two mobile robots","authors":"Zhaojia Liu, Hiromasa Kamogawa, J. Ota","doi":"10.1109/SII.2010.5708328","DOIUrl":"https://doi.org/10.1109/SII.2010.5708328","url":null,"abstract":"Fast transition from a stable initial state to a stable handling state is important when multiple mobile robots manipulate and transport a heavy and bulky object. In this paper, a cooperative system consisting of two mobile robots was designed to realize fast transition. A gripper robot grasps and lifts an object from one side to provide enough space for a lifter robot to lift the object. Fast transition can be formulated as an optimization problem. We propose an algorithm to realize fast transition based on the cooperative system designed. The experimental results illustrate the validity of the proposed method.","PeriodicalId":334652,"journal":{"name":"2010 IEEE/SICE International Symposium on System Integration","volume":"280 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115425540","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Accelerometer detection in a camera view based on feature point tracking
Pub Date : 2010-12-01, DOI: 10.1109/SII.2010.5708367
Y. Maki, S. Kagami, K. Hashimoto
We have been working on detecting an object carrying an accelerometer among many moving objects in a camera view. Our previous method relied on knowledge of the specific appearance of the moving objects, such as their colors, which limited its practical use. In this paper, to reduce this dependency on appearance knowledge as much as possible, we apply our method to natural feature points detected in the camera view. To handle numerous feature points in real time, we propose a method to speed up the computation. We also investigate a method to cope with natural features that are not tracked continuously. Experimental results show that the proposed method is successfully applied to natural feature points detected and tracked in real time.
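The core idea, matching an accelerometer's signal against image motion, can be illustrated with a simple correlation test: differentiate each feature track twice and pick the track whose acceleration profile best matches the sensor. This is a minimal sketch under that assumption; the paper's actual matching criterion and speed-up are not given in the abstract.

```python
import numpy as np

def accel_from_track(track_xy, dt):
    """Second finite difference of an image trajectory (N x 2 array) as a
    proxy for the tracked object's acceleration magnitude."""
    acc = np.diff(np.asarray(track_xy, dtype=float), n=2, axis=0) / dt**2
    return np.linalg.norm(acc, axis=1)

def match_accelerometer(tracks, accel_magnitude, dt):
    """Return the index of the feature track whose acceleration profile
    correlates best with the accelerometer magnitude signal, plus the
    correlation score; tracks with fewer than 5 samples are skipped."""
    best_idx, best_corr = -1, -np.inf
    for i, track in enumerate(tracks):
        a = accel_from_track(track, dt)
        n = min(len(a), len(accel_magnitude))
        if n < 3:
            continue
        c = np.corrcoef(a[:n], accel_magnitude[:n])[0, 1]
        if c > best_corr:
            best_idx, best_corr = i, c
    return best_idx, best_corr
```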
{"title":"Accelerometer detection in a camera view based on feature point tracking","authors":"Y. Maki, S. Kagami, K. Hashimoto","doi":"10.1109/SII.2010.5708367","DOIUrl":"https://doi.org/10.1109/SII.2010.5708367","url":null,"abstract":"We have been working on detecting an object with an accelerometer from many moving objects in a camera view. Our previous method relied on knowledges of specific appearance of the moving objects such as colors, which limited the practical use. In this paper, in order to reduce this dependency on the knowledges of object appearance as much as possible, we apply our method to natural feature points detected in the camera view. To deal with numerous feature points in real time, we propose a method to speed up the computation. We also investigate a method to cope with natural features that are not always tracked continuously. Experimental results show that the proposed method is successfully applied to natural feature points detected and tracked in real time.","PeriodicalId":334652,"journal":{"name":"2010 IEEE/SICE International Symposium on System Integration","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124703582","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Influence evaluation of wheel surface profile on traversability of planetary rovers
Pub Date : 2010-12-01, DOI: 10.1109/SII.2010.5708303
M. Sutoh, Tsuyoshi Ito, K. Nagatani, Kazuya Yoshida
Planetary rovers play a significant role in surface exploration of the Moon and Mars. However, because of wheel slippage, the wheels of planetary rovers can get stuck in loose soil, and the exploration mission may fail as a result. To reduce slippage and increase the wheels' drawbar pull, the wheels of planetary rovers typically have parallel fins, called lugs, on their surface. In this study, we conducted experiments using two-wheeled testbeds in a sandbox to quantify the influence of lugs on the traversability of planetary rovers. In this paper, we report the results of these experiments and discuss that influence.
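The abstract does not define its slippage metric. In wheel-terrain mechanics the conventional quantity is the longitudinal slip ratio, shown below as a hedged reference point; whether the paper uses exactly this definition is an assumption.

```python
def slip_ratio(v_actual, wheel_radius, omega):
    """Conventional longitudinal slip ratio s = 1 - v / (r * omega), where
    v is the measured travel speed and r*omega the wheel surface speed.
    s = 0 means no slip; s -> 1 means the wheel spins in place. (Assumed
    here; the paper does not state which slip metric it uses.)"""
    circumferential = wheel_radius * omega
    if circumferential == 0.0:
        return 0.0
    return 1.0 - v_actual / circumferential

# Example: a 0.10 m radius wheel at 2 rad/s advancing at 0.15 m/s
print(slip_ratio(0.15, 0.10, 2.0))  # 0.25 -> 25% slip
```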
{"title":"Influence evaluation of wheel surface profile on traversability of planetary rovers","authors":"M. Sutoh, Tsuyoshi Ito, K. Nagatani, Kazuya Yoshida","doi":"10.1109/SII.2010.5708303","DOIUrl":"https://doi.org/10.1109/SII.2010.5708303","url":null,"abstract":"Planetary rovers play a significant role in surface explorations on the Moon and/or Mars. However, because of wheel slippage, the wheels of planetary rovers can get stuck in loose soil, and the exploration mission may fail because of this situation. To avoid slippage and increase the wheels' drawbar pull, the wheels of planetary rovers typically have parallel fins called lugs on their surface. In this study, we conducted experiments using two-wheeled testbeds in a sandbox to provide a quantitative confirmation of the influence of lugs on the traversability of planetary rovers. In this paper, we report the results of the experiments, and discuss the influence of lugs on the traversability of planetary rovers.","PeriodicalId":334652,"journal":{"name":"2010 IEEE/SICE International Symposium on System Integration","volume":"22 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121553621","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
3D measurement of a surface point using a high-speed projector-camera system for augmented reality games
Pub Date : 2010-12-01, DOI: 10.1109/SII.2010.5708306
Kota Toma, S. Kagami, K. Hashimoto
In this paper, we describe high-speed measurement of the 3D position and surface normal at a specified point of interest using a high-speed projector-camera system. A cross-line pattern is adaptively projected onto the point of interest; the pattern reflected from the object surface is captured by the camera and analyzed to obtain the position and normal. The experimental results show that the estimated normal directions were correct and stable, apart from occasional large errors due to instantaneous detection failures. Building on this measurement system, we describe a game system that achieves real-time interaction with an unstructured 3D real environment.
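One standard way to recover a normal from points triangulated along the two arms of a projected cross is a least-squares plane fit; a minimal sketch follows, assuming the 3D points are already triangulated (the paper's exact analysis of the reflected pattern is not given in the abstract).

```python
import numpy as np

def surface_normal(points_3d):
    """Least-squares plane fit to 3D points sampled along the two arms of
    the projected cross: the normal is the right singular vector of the
    centered point cloud with the smallest singular value."""
    pts = np.asarray(points_3d, dtype=float)
    centered = pts - pts.mean(axis=0)
    _, _, vt = np.linalg.svd(centered)
    normal = vt[-1]
    return normal / np.linalg.norm(normal)

# Example: noisy points on the plane z = 0.5x + 0.2y
rng = np.random.default_rng(0)
xy = rng.uniform(-1, 1, size=(50, 2))
z = 0.5 * xy[:, 0] + 0.2 * xy[:, 1] + rng.normal(0, 1e-3, 50)
print(surface_normal(np.column_stack([xy, z])))  # ~ +/-(0.44, 0.18, -0.88)
```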
{"title":"3D measurement of a surface point using a high-speed projector-camera system for augmented reality games","authors":"Kota Toma, S. Kagami, K. Hashimoto","doi":"10.1109/SII.2010.5708306","DOIUrl":"https://doi.org/10.1109/SII.2010.5708306","url":null,"abstract":"In this paper, we describe high-speed 3D position and normal measurement of a specified point of interest using a high-speed projector-camera systems. A special cross line pattern is adaptively projected onto the point of interest and the reflected pattern on an object surface is captured by the camera and analyzed to obtain the position and normal information. The experimental results show that the estimated normal directions were apparently correct and stable except for occasional large errors due to instantaneous detection failures. Based on this measurement system, we describe a game system by which real-time interaction with non-structured 3D real environment is achieved.","PeriodicalId":334652,"journal":{"name":"2010 IEEE/SICE International Symposium on System Integration","volume":"1941 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128024615","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Gesture-world environment technology for mobile manipulation
Pub Date : 2010-12-01, DOI: 10.1109/SII.2010.5708323
K. Hoshino, Takuya Kasahara, Naoki Igo, Motomasa Tomida, T. Mukai, Kinji Nishi, Hajime Kotani
The aim of this paper is to propose technology that allows people to control robots through everyday gestures, without wearable sensors or controllers. The proposed hand pose estimation reduces the number of image features per data set to 64, which makes the construction of a large-scale database possible. This also makes it possible to estimate the 3D hand poses of unspecified users, with their individual differences, without sacrificing estimation accuracy. Specifically, the system involves the advance construction of a large database comprising three elements: hand joint information including the wrist, low-order proportion information on the hand images indicating the rough hand shape, and hand pose data consisting of 64 image features per data set. To estimate a hand pose, the system first performs coarse screening to select similar data sets from the database based on three hand proportions of the input image, and then performs a detailed search for the data set most similar to the input image based on the 64 image features. Using subjects with varying hand poses, we performed joint angle estimation with a database of 750,000 hand pose data sets, achieving roughly the same average estimation error as our previous system, about 2 degrees. However, the standard deviation of the estimation error was smaller than in our previous system of roughly 30,000 data sets: down from 26.91 to 14.57 degrees for the index finger PIP joint and from 15.77 to 10.28 degrees for the thumb. We were thus able to confirm an improvement in estimation accuracy, even for unspecified users. Furthermore, the processing speed, on a notebook PC of normal specifications with a compact high-speed camera, was about 80 fps or more, including image capture, hand pose estimation, CG rendering, and robot control based on the estimation result.
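The two-stage coarse-to-fine lookup described above maps naturally onto a short sketch: screen by the 3-D proportion vector, then pick the nearest neighbor in the 64-D feature space. The distance metrics and candidate count `k` below are assumptions; the abstract does not specify them.

```python
import numpy as np

def estimate_pose(query_props, query_feats, db_props, db_feats, db_joints, k=200):
    """Coarse-to-fine database search in the spirit of the paper:
    (1) screen the database by distance in the 3-D hand-proportion space,
    (2) among the k screened candidates, return the joint angles of the
    nearest neighbor in the 64-D image-feature space.
    Shapes: db_props (N, 3), db_feats (N, 64), db_joints (N, n_joints)."""
    coarse = np.linalg.norm(db_props - query_props, axis=1)
    candidates = np.argsort(coarse)[:k]                        # stage 1
    fine = np.linalg.norm(db_feats[candidates] - query_feats, axis=1)
    best = candidates[np.argmin(fine)]                         # stage 2
    return db_joints[best]
```

Screening on 3 numbers before touching the 64-D features is what makes a 750,000-entry database searchable at 80 fps: the fine comparison runs on only k candidates instead of the full database.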
{"title":"Gesture-world environment technology for mobile manipulation","authors":"K. Hoshino, Takuya Kasahara, Naoki Igo, Motomasa Tomida, T. Mukai, Kinji Nishi, Hajime Kotani","doi":"10.1109/SII.2010.5708323","DOIUrl":"https://doi.org/10.1109/SII.2010.5708323","url":null,"abstract":"The aim of this paper is to propose the technology to allow people to control robots by means of everyday gestures without using sensors or controllers. The hand pose estimation we propose reduces the number of image features per data set to 64, which makes the construction of a large-scale database possible. This has also made it possible to estimate the 3D hand poses of unspecified users with individual differences without sacrificing estimation accuracy. Specifically, the system we propose involved the construction in advance of a large database comprising three elements: hand joint information including the wrist, low-order proportional information on the hand images to indicate the rough hand shape, and hand pose data comprised of 64 image features per data set. To estimate a hand pose, the system first performs coarse screening to select similar data sets from the database based on the three hand proportions of the input image, and then performed a detailed search to find the data set most similar to the input images based on 64 image features. Using subjects with varying hand poses, we performed joint angle estimation using our hand pose estimation system comprised of 750,000 hand pose data sets, achieving roughly the same average estimation error as our previous system, about 2 degrees. However, the standard deviation of the estimation error was smaller than in our previous system having roughly 30,000 data sets: down from 26.91 degrees to 14.57 degrees for the index finger PIP joint and from 15.77 degrees to 10.28 degrees for the thumb. We were thus able to confirm an improvement in estimation accuracy, even for unspecified users. Further, the processing speed, using a notebook PC of normal specifications and a compact high-speed camera, was about 80 fps or more, including image capture, hand pose estimation, and CG rendering and robot control of the estimation result.","PeriodicalId":334652,"journal":{"name":"2010 IEEE/SICE International Symposium on System Integration","volume":"13 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128366689","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
3D-microfluidic device to remove zona pellucida fabricated by Mask-less exposure technology
Pub Date : 2010-12-01, DOI: 10.1109/SII.2010.5708335
Y. Yamanishi, T. Nakano, Yu Sawada, K. Itoga, T. Okano, F. Arai
This paper presents a novel method of three-dimensional fabrication using mask-less exposure equipment, and a three-dimensional microfluidic application for cell manipulation. The grayscale data directly controls the height of the exposed photoresist without any mask. A three-dimensional microchannel with heights ranging from 0 to 200 μm was fabricated simply, using the low-cost exposure system. We succeeded in removing the zona pellucida of oocytes passed through the 3D microchannel, whose cross section narrows gradually along the path so as to apply mechanical stimuli to the oocyte surface from every direction. This microfluidic chip enables effective, high-throughput peeling of oocytes without damaging them.
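The key fabrication idea is that gray level encodes resist height. As a toy illustration only: a linear gray-to-height calibration over the 0–200 μm range stated above. Real resist response is nonlinear, and the paper's calibration is not given, so this mapping is an assumption.

```python
import numpy as np

def grayscale_mask(height_map_um, max_height_um=200.0, levels=256):
    """Map a target resist height field (micrometers) to 8-bit gray levels
    for a mask-less exposure system, assuming a linear height-vs-gray
    response calibrated so full exposure yields max_height_um. An actual
    process would substitute a measured calibration curve here."""
    h = np.clip(height_map_um, 0.0, max_height_um)
    return np.round(h / max_height_um * (levels - 1)).astype(np.uint8)

# Example: a channel whose depth tapers linearly from 200 um to 50 um
taper = np.linspace(200.0, 50.0, 5)
print(grayscale_mask(taper))  # [255 207 159 112  64]
```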
{"title":"3D-microfluidic device to remove zona pellucida fabricated by Mask-less exposure technology","authors":"Y. Yamanishi, T. Nakano, Yu Sawada, K. Itoga, T. Okano, F. Arai","doi":"10.1109/SII.2010.5708335","DOIUrl":"https://doi.org/10.1109/SII.2010.5708335","url":null,"abstract":"This paper presents a novel method of three-dimensional fabrication using Mask-less exposure equipment and a three dimensional microfluidic application for the cell manipulation. The grayscale data can directly control the height of the photoresist to be exposed without using any mask. Three-dimensional microchannel was successfully fabricated simply by using the low cost exposure system with the height range of 0–200 μm. We have succeeded in removing the zona pellucida of oocyte passing through the 3D-microchannel whose cross section is gradually restricted along the path to provide mechanical stimuli on the surface of the oocyte in every direction. This microfluidic chip contributes to the effective high throughput of the peeled oocyte without damaging them.","PeriodicalId":334652,"journal":{"name":"2010 IEEE/SICE International Symposium on System Integration","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128435336","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Vision-based human state estimation to control an intelligent passive walker
Pub Date : 2010-12-01, DOI: 10.1109/SII.2010.5708316
S. Taghvaei, Y. Hirata, K. Kosuge
The motion of a passive-type intelligent walker is controlled based on visual estimation of the motion state of the user. The controlled walker supports the user while standing up and prevents falls. The visual motion analysis detects and tracks the user's upper body parts and localizes them in 3D using a stereo vision approach. From these data, the user's state is estimated as one of seated, standing up, falling down, or walking. The controller activates the servo brakes in the sitting, standing, and falling states to support the user, ensuring both comfort and safety. The estimation methods are evaluated experimentally on a passive-type intelligent walker, referred to as the "RT Walker", equipped with servo brakes.
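The state-to-brake policy described above is easy to make concrete. Below is a minimal sketch; the torso-height and velocity thresholds are invented for illustration, since the paper's classifier (built on stereo-tracked upper-body positions) is not specified in the abstract.

```python
from enum import Enum

class UserState(Enum):
    SEATED = "seated"
    STANDING_UP = "standing_up"
    WALKING = "walking"
    FALLING = "falling"

def classify(torso_height_m, torso_velocity_mps):
    """Toy state classifier; all thresholds are assumptions, not the
    paper's values."""
    if torso_velocity_mps < -0.8:          # fast downward torso motion
        return UserState.FALLING
    if torso_height_m < 0.9:               # torso low: seated or rising
        if abs(torso_velocity_mps) < 0.1:
            return UserState.SEATED
        return UserState.STANDING_UP
    return UserState.WALKING

def brakes_engaged(state):
    """Engage the servo brakes whenever the user needs support; release
    only for normal walking, matching the policy in the abstract."""
    return state in (UserState.SEATED, UserState.STANDING_UP, UserState.FALLING)
```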
{"title":"Vision-based human state estimation to control an intelligent passive walker","authors":"S. Taghvaei, Y. Hirata, K. Kosuge","doi":"10.1109/SII.2010.5708316","DOIUrl":"https://doi.org/10.1109/SII.2010.5708316","url":null,"abstract":"The motion of a passive-type intelligent walker is controlled based on visual estimation of the motion state of the user. The controlled walker would be a support for standing up and prevent the user from falling down. The visual motion analysis detects and tracks the human upper body parts and localizes them in 3D using a stereovision approach. Using this data the user's state is estimated to be whether seated, standing up, falling down or walking. The controller activates the servo brakes in sitting, standing and falling situations as a support to insure both comfort and safety of the user. Estimation methods are experimented with a passive-type intelligent walker referred to as “RT Walker”, equipped with servo brakes.","PeriodicalId":334652,"journal":{"name":"2010 IEEE/SICE International Symposium on System Integration","volume":"19 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129714461","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Development of a human type legged robot with roller skates
Pub Date : 2010-12-01, DOI: 10.1109/SII.2010.5708312
K. Itabashi, M. Kumagai
Some walking robots combine wheels with their legs. Among these, robots with passive wheels generate propulsive force through specially designed periodic leg motions. The authors previously proposed a special axle mechanism that changes its curvature to track a designed path for propulsion. The mechanism demonstrated not only straight-line motion but also curved and pivoting motions unique to the method. However, that robot lacked the stiffness needed for further quantitative investigation, so a new bipedal walking robot was developed for this work. The developed robot performed roller walking with the designed forward movement. The roller-walk method for a biped robot is described briefly, followed by the design and implementation of the robot. The use of Bézier curves for the motion trajectory is also introduced. Experimental results are described and shown in an accompanying video.
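Bézier curves suit leg trajectories because the curve interpolates its first and last control points with continuous derivatives in between. A minimal evaluator via de Casteljau's algorithm is sketched below; the control points are hypothetical, as the paper's trajectory parameters are not given in the abstract.

```python
import numpy as np

def bezier(control_points, t):
    """Evaluate a Bezier curve at t in [0, 1] by de Casteljau's algorithm;
    control_points is an (n+1, d) array for a degree-n curve in d dims."""
    pts = np.asarray(control_points, dtype=float)
    while len(pts) > 1:
        pts = (1.0 - t) * pts[:-1] + t * pts[1:]   # repeated interpolation
    return pts[0]

# Example: a cubic foot path in the sagittal (x, z) plane, in meters
ctrl = np.array([[0.0, 0.0], [0.1, 0.05], [0.3, 0.05], [0.4, 0.0]])
trajectory = np.array([bezier(ctrl, t) for t in np.linspace(0.0, 1.0, 20)])
```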
{"title":"Development of a human type legged robot with roller skates","authors":"K. Itabashi, M. Kumagai","doi":"10.1109/SII.2010.5708312","DOIUrl":"https://doi.org/10.1109/SII.2010.5708312","url":null,"abstract":"There are types of walking robots using wheels with their legs. Among those, the robots with passive wheels generate propulsive force by specially designed periodic leg motions. The authors had proposed to use a special axle mechanism that can change its curvature to track a designed path for propulsion. The mechanism showed not only straightforward motion but also curved motion and pivoting motion that is unique to the method. However, the robot did not have enough stiffness for further quantitative investigation. Therefore a new bipedal walking robot was developed for the work. The developed robot could perform the roller walking with the designed forward movement. The method of the roller walk of a biped robot is described briefly followed by the design and implementation of the robot. An idea to use Bézier curve for motion trajectory is also introduced. Experimental results are also described and shown in an accompanied video.","PeriodicalId":334652,"journal":{"name":"2010 IEEE/SICE International Symposium on System Integration","volume":"9 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123989811","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Autonomous flight of small helicopter with real-time camera calibration
Pub Date : 2010-12-01, DOI: 10.1109/SII.2010.5708332
Takanori Matsukawa, S. Arai, K. Hashimoto
We propose a real-time camera calibration method for autonomous flight of a small helicopter. Our purpose is to control a small helicopter automatically using cameras fixed on the ground. We use calibrated cameras, uncalibrated cameras, and a small helicopter that carries no sensors. The proposed method finds correspondences between image features in the two images of a calibrated camera and an uncalibrated camera, and estimates the extrinsic parameters of the cameras with a particle filter in real time. We evaluate the utility of the proposed method experimentally. In flight experiments, we compare the proposed method to real-time calibration by typical bundle adjustment with a Gauss-Newton method. The proposed method achieves autonomous flight of the small helicopter in situations where it cannot be achieved with typical bundle adjustment using a Gauss-Newton method.
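To make the particle-filter idea concrete, here is a generic sampling-importance-resampling step over a 6-DOF extrinsic pose. The random-walk motion model, Gaussian likelihood, and `observe_error` stand-in (reprojection error of matched features for a candidate pose) are assumptions; the paper's actual filter design is not detailed in the abstract.

```python
import numpy as np

rng = np.random.default_rng(1)

def particle_filter_step(particles, weights, observe_error,
                         motion_noise=0.01, obs_sigma=2.0):
    """One SIR particle-filter step over extrinsic parameters.
    particles: (N, 6) array of [tx, ty, tz, rx, ry, rz] hypotheses.
    observe_error(pose): reprojection error for that pose (stand-in)."""
    # 1. Diffuse particles: random-walk model for a quasi-static camera rig.
    particles = particles + rng.normal(0.0, motion_noise, particles.shape)
    # 2. Reweight by a Gaussian likelihood in the reprojection error.
    errors = np.array([observe_error(p) for p in particles])
    weights = weights * np.exp(-0.5 * (errors / obs_sigma) ** 2)
    weights = weights / weights.sum()
    # 3. Resample when the effective sample size collapses.
    if 1.0 / np.sum(weights ** 2) < len(particles) / 2:
        idx = rng.choice(len(particles), size=len(particles), p=weights)
        particles = particles[idx]
        weights = np.full(len(particles), 1.0 / len(particles))
    return particles, weights
```

The weighted particle mean then serves as the extrinsic estimate each frame, which is what makes the approach usable in real time, unlike batch bundle adjustment.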
{"title":"Autonomous flight of small helicopter with real-time camera calibration","authors":"Takanori Matsukawa, S. Arai, K. Hashimoto","doi":"10.1109/SII.2010.5708332","DOIUrl":"https://doi.org/10.1109/SII.2010.5708332","url":null,"abstract":"We propose a real-time camera calibration method for an autonomous flight of a small helicopter. Our purpose is to control a small helicopter automatically by using cameras fixed on the ground. We use calibrated cameras, un-calibrated cameras, and the small helicopter that is not attached with any sensors. The proposed method finds correspondences between image features in the two images of a calibrated camera and an un-calibrated camera, and estimates the extrinsic parameters of cameras using a particle filter in real time. We evaluate a utility of the proposed method by experiments. We compare real-time calibration by typical bundle adjustment with a Gauss Newton method to the proposed method in the experiment of the small helicopter flight. The autonomous flight of the small helicopter can be achieved by the proposed method in the situation that a flight of a helicopter can not be achieved with typical bundle adjustment with a Gauss Newton method.","PeriodicalId":334652,"journal":{"name":"2010 IEEE/SICE International Symposium on System Integration","volume":"20 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124581066","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}