
Latest publications: 2010 IEEE/SICE International Symposium on System Integration

Real-time prediction of fall and collision of tracked vehicle for remote-control support
Pub Date : 2010-12-01 DOI: 10.1109/SII.2010.5708298
Ken Sakurada, Shihoko Suzuki, K. Ohno, E. Takeuchi, S. Tadokoro, Akihiko Hata, Naoki Miyahara, K. Higashi
This paper describes a new method that predicts falls and collisions in real time in order to support remote control of a tracked vehicle with sub-tracks. A tracked vehicle has a high ability to traverse rough terrain. However, it is difficult for a remote operator to control the vehicle's moving direction and speed. Hence, we propose a new path evaluation system based on measurement of the environmental shapes around the vehicle. In this system, candidate paths are generated from operator inputs and terrain information. To evaluate the traversability of a path, we estimate the pose of the robot on the path and its contact points with the ground. Then, the combination of translational and rotational velocity is chosen.
Citations: 2
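The path-evaluation loop described in the abstract lends itself to a short illustration. The sketch below is not the authors' implementation: it rolls out hypothetical candidate (v, ω) commands with a unicycle model over a grid height map, rejects candidates whose predicted pitch or roll exceeds an assumed tilt limit, and keeps the admissible pair closest to the operator's command. The motion model, grid resolution, and thresholds are all assumptions.

```python
# Minimal sketch of candidate-path evaluation over a height map (illustrative only).
import math
import numpy as np

def rollout(v, w, dt=0.2, steps=15):
    """Integrate a simple unicycle model to get (x, y, yaw) samples along an arc."""
    x = y = yaw = 0.0
    poses = []
    for _ in range(steps):
        x += v * math.cos(yaw) * dt
        y += v * math.sin(yaw) * dt
        yaw += w * dt
        poses.append((x, y, yaw))
    return poses

def terrain_angles(height_map, cell, x, y, yaw):
    """Estimate pitch/roll from the local slope of a grid height map (assumption)."""
    i = int(np.clip(int(y / cell), 1, height_map.shape[0] - 2))
    j = int(np.clip(int(x / cell), 1, height_map.shape[1] - 2))
    dzdx = (height_map[i, j + 1] - height_map[i, j - 1]) / (2 * cell)
    dzdy = (height_map[i + 1, j] - height_map[i - 1, j]) / (2 * cell)
    pitch = math.atan(dzdx * math.cos(yaw) + dzdy * math.sin(yaw))
    roll = math.atan(-dzdx * math.sin(yaw) + dzdy * math.cos(yaw))
    return pitch, roll

def choose_command(v_cmd, w_cmd, height_map, cell=0.1, max_tilt=math.radians(30)):
    """Pick the safe (v, w) candidate closest to the operator's command."""
    best, best_cost = (0.0, 0.0), float("inf")
    for v in np.linspace(0.0, v_cmd, 5):
        for w in np.linspace(w_cmd - 0.5, w_cmd + 0.5, 5):
            tilts = [terrain_angles(height_map, cell, x, y, yaw)
                     for x, y, yaw in rollout(v, w)]
            if any(abs(p) > max_tilt or abs(r) > max_tilt for p, r in tilts):
                continue  # predicted fall: reject this candidate
            cost = (v - v_cmd) ** 2 + (w - w_cmd) ** 2
            if cost < best_cost:
                best, best_cost = (v, w), cost
    return best

if __name__ == "__main__":
    flat = np.zeros((100, 100))
    print(choose_command(0.5, 0.2, flat))   # flat ground: the command itself is safe
```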
Manipulation of an irregularly shaped object by two mobile robots
Pub Date : 2010-12-01 DOI: 10.1109/SII.2010.5708328
Zhaojia Liu, Hiromasa Kamogawa, J. Ota
Fast transition from a stable initial state to a stable handling state is important when multiple mobile robots manipulate and transport a heavy and bulky object. In this paper, a cooperative system consisting of two mobile robots was designed to realize fast transition. A gripper robot grasps and lifts the object from one side to provide enough space for a lifter robot to lift it. Fast transition can be formulated as an optimization problem. We propose an algorithm to realize fast transition based on the designed cooperative system. The experimental results illustrate the validity of the proposed method.
Citations: 6
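As a toy illustration of casting the transition as an optimization problem, the sketch below minimizes a hypothetical transition-time cost over two made-up design parameters (lift height and approach offset) subject to a placeholder stability constraint; none of these quantities or models come from the paper.

```python
# Toy grid-search formulation of "fast transition" as constrained minimization.
import itertools

def transition_time(lift_height, approach_offset):
    """Hypothetical cost: higher lifts and longer approaches take longer."""
    return 2.0 * lift_height + 1.5 * abs(approach_offset) + 1.0

def is_stable(lift_height, approach_offset):
    """Hypothetical constraint standing in for a real stability margin."""
    return lift_height < 0.25 and abs(approach_offset) < 0.15

def plan_fast_transition():
    heights = [h / 100 for h in range(5, 30, 5)]     # 0.05 .. 0.25 m
    offsets = [o / 100 for o in range(-20, 21, 5)]   # -0.20 .. 0.20 m
    feasible = [(h, o) for h, o in itertools.product(heights, offsets)
                if is_stable(h, o)]
    return min(feasible, key=lambda p: transition_time(*p))

if __name__ == "__main__":
    print(plan_fast_transition())   # best (lift_height, approach_offset)
```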
Accelerometer detection in a camera view based on feature point tracking
Pub Date : 2010-12-01 DOI: 10.1109/SII.2010.5708367
Y. Maki, S. Kagami, K. Hashimoto
We have been working on detecting an object equipped with an accelerometer from among many moving objects in a camera view. Our previous method relied on knowledge of the specific appearance of the moving objects, such as their colors, which limited its practical use. In this paper, in order to reduce this dependency on knowledge of object appearance as much as possible, we apply our method to natural feature points detected in the camera view. To handle numerous feature points in real time, we propose a method to speed up the computation. We also investigate a method to cope with natural features that are not always tracked continuously. Experimental results show that the proposed method is successfully applied to natural feature points detected and tracked in real time.
Citations: 18
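One plausible way to realize the matching the abstract describes, sketched under simplifying assumptions, is to correlate the accelerometer's magnitude signal with the second difference of each tracked feature trajectory and pick the best-correlated feature. Tracks with missing samples are simply skipped here, which is a cruder treatment than the paper's handling of discontinuous tracks.

```python
# Illustrative matching of an accelerometer signal to one of many feature tracks.
import numpy as np

def image_acceleration(track):
    """Second difference of a (T, 2) pixel trajectory -> acceleration magnitudes."""
    return np.linalg.norm(np.diff(track, n=2, axis=0), axis=1)

def match_accelerometer(tracks, accel_magnitude):
    """Return (index, correlation) of the track best matching the sensor signal."""
    best_idx, best_corr = -1, -np.inf
    for idx, track in enumerate(tracks):
        if np.isnan(track).any():          # feature was lost at some point: skip it
            continue
        a = image_acceleration(track)
        n = min(len(a), len(accel_magnitude))
        if n < 3 or a[:n].std() == 0:
            continue
        corr = np.corrcoef(a[:n], accel_magnitude[:n])[0, 1]
        if corr > best_corr:
            best_idx, best_corr = idx, corr
    return best_idx, best_corr

if __name__ == "__main__":
    t = np.linspace(0, 2 * np.pi, 100)
    shaken = np.stack([20 * np.cos(3 * t), np.zeros_like(t)], axis=1)
    still = np.stack([t, t], axis=1)                     # constant-velocity feature
    sensor = image_acceleration(shaken)                  # stand-in for the IMU signal
    print(match_accelerometer([still, shaken], sensor))  # -> (1, ~1.0)
```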
Influence evaluation of wheel surface profile on traversability of planetary rovers
Pub Date : 2010-12-01 DOI: 10.1109/SII.2010.5708303
M. Sutoh, Tsuyoshi Ito, K. Nagatani, Kazuya Yoshida
Planetary rovers play a significant role in surface exploration on the Moon and Mars. However, because of wheel slippage, the wheels of planetary rovers can get stuck in loose soil, and the exploration mission may fail as a result. To avoid slippage and increase the wheels' drawbar pull, the wheels of planetary rovers typically have parallel fins called lugs on their surface. In this study, we conducted experiments using two-wheeled testbeds in a sandbox to provide quantitative confirmation of the influence of lugs on the traversability of planetary rovers. In this paper, we report the results of the experiments and discuss the influence of lugs on the traversability of planetary rovers.
Citations: 2
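The experiments revolve around wheel slippage; a standard way to quantify it (a common definition in rover terramechanics, not necessarily the exact metric used in the paper) is the slip ratio computed from wheel angular velocity and measured travel speed, as in the snippet below.

```python
# Slip ratio under the usual driving-mode definition (illustrative values).
def slip_ratio(travel_speed, wheel_radius, wheel_angular_velocity):
    """s = 1 - v / (r * omega); 0 = no slip, 1 = spinning in place."""
    circumferential = wheel_radius * wheel_angular_velocity
    if circumferential == 0.0:
        return 0.0
    return 1.0 - travel_speed / circumferential

if __name__ == "__main__":
    # e.g. a 0.10 m radius lugged wheel turning at 2 rad/s but advancing 0.12 m/s
    print(round(slip_ratio(0.12, 0.10, 2.0), 2))   # -> 0.4
```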
3D measurement of a surface point using a high-speed projector-camera system for augmented reality games
Pub Date : 2010-12-01 DOI: 10.1109/SII.2010.5708306
Kota Toma, S. Kagami, K. Hashimoto
In this paper, we describe high-speed 3D position and normal measurement of a specified point of interest using a high-speed projector-camera system. A special cross-line pattern is adaptively projected onto the point of interest, and the pattern reflected from the object surface is captured by the camera and analyzed to obtain the position and normal information. The experimental results show that the estimated normal directions were correct and stable, except for occasional large errors due to instantaneous detection failures. Based on this measurement system, we describe a game system that achieves real-time interaction with a non-structured 3D real environment.
Citations: 0
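One step of such a pipeline can be illustrated in isolation: once the reflected cross-line pattern has been triangulated into 3D points around the point of interest, the local surface normal can be recovered by fitting a plane to those samples. The sketch below assumes the triangulation has already been done and uses an SVD plane fit, which is a generic choice rather than the paper's exact procedure.

```python
# Surface-normal recovery by least-squares plane fitting (generic illustration).
import numpy as np

def fit_plane_normal(points_3d):
    """points_3d: (N, 3) surface samples -> unit normal of the best-fit plane."""
    centered = points_3d - points_3d.mean(axis=0)
    # Smallest right singular vector = direction of least variance = plane normal.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    normal = vt[-1]
    return normal / np.linalg.norm(normal)

if __name__ == "__main__":
    # Synthetic samples on the plane z = 0.5x + 0.2y with a little noise.
    rng = np.random.default_rng(0)
    xy = rng.uniform(-1, 1, size=(200, 2))
    z = 0.5 * xy[:, 0] + 0.2 * xy[:, 1] + rng.normal(0, 1e-3, 200)
    pts = np.column_stack([xy, z])
    print(fit_plane_normal(pts))   # ~ +/-(0.5, 0.2, -1) normalized
```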
Gesture-world environment technology for mobile manipulation
Pub Date : 2010-12-01 DOI: 10.1109/SII.2010.5708323
K. Hoshino, Takuya Kasahara, Naoki Igo, Motomasa Tomida, T. Mukai, Kinji Nishi, Hajime Kotani
The aim of this paper is to propose technology that allows people to control robots by means of everyday gestures, without using sensors or controllers. The hand pose estimation we propose reduces the number of image features per data set to 64, which makes the construction of a large-scale database possible. This also makes it possible to estimate the 3D hand poses of unspecified users with individual differences without sacrificing estimation accuracy. Specifically, the proposed system involves the advance construction of a large database comprising three elements: hand joint information including the wrist, low-order proportional information on the hand images indicating the rough hand shape, and hand pose data consisting of 64 image features per data set. To estimate a hand pose, the system first performs coarse screening to select similar data sets from the database based on the three hand proportions of the input image, and then performs a detailed search to find the data set most similar to the input image based on the 64 image features. Using subjects with varying hand poses, we performed joint angle estimation with our hand pose estimation system comprising 750,000 hand pose data sets, achieving roughly the same average estimation error as our previous system, about 2 degrees. However, the standard deviation of the estimation error was smaller than in our previous system of roughly 30,000 data sets: down from 26.91 degrees to 14.57 degrees for the index finger PIP joint, and from 15.77 degrees to 10.28 degrees for the thumb. We were thus able to confirm an improvement in estimation accuracy, even for unspecified users. Furthermore, the processing speed, using a notebook PC of normal specifications and a compact high-speed camera, was about 80 fps or more, including image capture, hand pose estimation, CG rendering, and robot control based on the estimation result.
Citations: 0
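The two-stage lookup described above (coarse screening on three hand proportions, then a nearest-neighbour search over 64 image features) can be sketched as follows; the database here is random placeholder data rather than the authors' 750,000-entry set, and the screening tolerance is an assumption.

```python
# Two-stage database lookup: coarse proportion screening, then nearest neighbour.
import numpy as np

def estimate_pose(query_props, query_feat, db_props, db_feats, db_joint_angles,
                  screen_tol=0.1):
    """Return the joint angles of the most similar database entry."""
    # Stage 1: keep entries whose three proportions are all close to the query.
    mask = np.all(np.abs(db_props - query_props) < screen_tol, axis=1)
    idx = np.flatnonzero(mask)
    if idx.size == 0:                      # fall back to the full database
        idx = np.arange(len(db_feats))
    # Stage 2: nearest neighbour on the 64 image features among survivors.
    d = np.linalg.norm(db_feats[idx] - query_feat, axis=1)
    best = idx[np.argmin(d)]
    return db_joint_angles[best]

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    n = 10000
    props = rng.uniform(0, 1, (n, 3))
    feats = rng.uniform(0, 1, (n, 64))
    angles = rng.uniform(-90, 90, (n, 20))   # hypothetical joint-angle vectors
    print(estimate_pose(props[42], feats[42], props, feats, angles)[:3])
```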
3D-microfluidic device to remove zona pellucida fabricated by Mask-less exposure technology
Pub Date : 2010-12-01 DOI: 10.1109/SII.2010.5708335
Y. Yamanishi, T. Nakano, Yu Sawada, K. Itoga, T. Okano, F. Arai
This paper presents a novel method of three-dimensional fabrication using mask-less exposure equipment, and a three-dimensional microfluidic application for cell manipulation. The grayscale data directly control the height of the exposed photoresist without using any mask. A three-dimensional microchannel with a height range of 0–200 μm was successfully fabricated simply by using the low-cost exposure system. We succeeded in removing the zona pellucida of an oocyte passing through the 3D microchannel, whose cross section is gradually restricted along the path to apply mechanical stimuli to the surface of the oocyte in every direction. This microfluidic chip enables effective, high-throughput peeling of oocytes without damaging them.
Citations: 1
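The grayscale-to-height idea can be illustrated with a toy mapping: the exposure pattern is a grayscale image whose pixel values set the developed photoresist height. The sketch assumes a linear response over the 0–200 μm range purely for illustration; real resist dose-response curves are generally nonlinear and would need calibration.

```python
# Toy conversion of a target height profile into 8-bit grayscale exposure values.
import numpy as np

def height_to_grayscale(height_um, max_height_um=200.0):
    """Map a target height map (um) to 8-bit grayscale values, assuming linearity."""
    h = np.clip(height_um, 0.0, max_height_um)
    return np.round(255.0 * h / max_height_um).astype(np.uint8)

if __name__ == "__main__":
    # A channel whose ceiling tapers from 200 um down to 50 um along its length,
    # gradually restricting the cross section as described in the abstract.
    length, width = 500, 100
    taper = np.linspace(200.0, 50.0, length)
    channel = np.tile(taper[:, None], (1, width))
    gray = height_to_grayscale(channel)
    print(gray[0, 0], gray[-1, 0])   # -> 255 64
```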
Vision-based human state estimation to control an intelligent passive walker
Pub Date : 2010-12-01 DOI: 10.1109/SII.2010.5708316
S. Taghvaei, Y. Hirata, K. Kosuge
The motion of a passive-type intelligent walker is controlled based on visual estimation of the motion state of the user. The controlled walker supports the user in standing up and prevents the user from falling down. The visual motion analysis detects and tracks the human upper-body parts and localizes them in 3D using a stereovision approach. Using these data, the user's state is estimated as seated, standing up, falling down, or walking. The controller activates the servo brakes in the sitting, standing, and falling situations as a support, to ensure both the comfort and safety of the user. The estimation methods are tested with a passive-type intelligent walker referred to as the “RT Walker”, equipped with servo brakes.
Citations: 18
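A heavily simplified version of the control idea, with made-up thresholds and a rule-based classifier standing in for the paper's estimator, is sketched below: the user's state is classified from tracked upper-body data and the servo brakes are engaged in the non-walking states.

```python
# Rule-based stand-in for the vision-based state estimator and brake logic.
from dataclasses import dataclass

@dataclass
class UpperBody:
    torso_height: float      # m, from the stereo tracker
    torso_velocity_z: float  # m/s, vertical velocity (negative = moving down)
    forward_speed: float     # m/s, horizontal speed toward the walker

def classify_state(b: UpperBody) -> str:
    if b.torso_velocity_z < -0.8:
        return "falling"
    if b.torso_height < 0.9:
        return "seated"
    if abs(b.forward_speed) < 0.05 and abs(b.torso_velocity_z) > 0.2:
        return "standing_up"
    return "walking"

def brake_command(state: str) -> bool:
    """Engage the servo brakes whenever the user is not simply walking."""
    return state in ("seated", "standing_up", "falling")

if __name__ == "__main__":
    for b in [UpperBody(1.4, 0.0, 0.5), UpperBody(1.2, -1.2, 0.1)]:
        s = classify_state(b)
        print(s, "-> brakes on" if brake_command(s) else "-> brakes off")
```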
Development of a human type legged robot with roller skates
Pub Date : 2010-12-01 DOI: 10.1109/SII.2010.5708312
K. Itabashi, M. Kumagai
There are walking robots that use wheels on their legs. Among these, robots with passive wheels generate propulsive force through specially designed periodic leg motions. The authors previously proposed a special axle mechanism that can change its curvature to track a designed path for propulsion. The mechanism showed not only straight motion but also curved motion and pivoting motion, which is unique to the method. However, that robot did not have enough stiffness for further quantitative investigation. Therefore, a new bipedal walking robot was developed for this work. The developed robot could perform roller walking with the designed forward movement. The roller-walk method for a biped robot is described briefly, followed by the design and implementation of the robot. The idea of using a Bézier curve for the motion trajectory is also introduced. Experimental results are described and shown in an accompanying video.
Citations: 6
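Since the abstract mentions using a Bézier curve for the motion trajectory, a generic de Casteljau evaluation is shown below; the control points are arbitrary illustrative values, not taken from the robot.

```python
# Generic Bezier curve evaluation (de Casteljau's algorithm) for a 2D trajectory.
def bezier(points, t):
    """Evaluate a Bezier curve of any degree at t in [0, 1]."""
    pts = [tuple(p) for p in points]
    while len(pts) > 1:
        pts = [tuple((1 - t) * a + t * b for a, b in zip(p, q))
               for p, q in zip(pts, pts[1:])]
    return pts[0]

if __name__ == "__main__":
    # A smooth sideways stroke in the horizontal plane (x, y in metres).
    control = [(0.0, 0.0), (0.1, 0.15), (0.3, 0.15), (0.4, 0.0)]
    path = [bezier(control, i / 10) for i in range(11)]
    for x, y in path[::5]:
        print(f"x={x:.3f}  y={y:.3f}")
```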
Autonomous flight of small helicopter with real-time camera calibration
Pub Date : 2010-12-01 DOI: 10.1109/SII.2010.5708332
Takanori Matsukawa, S. Arai, K. Hashimoto
We propose a real-time camera calibration method for autonomous flight of a small helicopter. Our purpose is to control a small helicopter automatically by using cameras fixed on the ground. We use calibrated cameras, un-calibrated cameras, and a small helicopter that carries no sensors. The proposed method finds correspondences between image features in the two images of a calibrated camera and an un-calibrated camera, and estimates the extrinsic parameters of the cameras in real time using a particle filter. We evaluate the utility of the proposed method through experiments. In a small-helicopter flight experiment, we compare real-time calibration by typical bundle adjustment with a Gauss-Newton method to the proposed method. The proposed method achieves autonomous flight of the small helicopter in a situation where flight cannot be achieved with typical bundle adjustment using a Gauss-Newton method.
Citations: 1
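A reduced-dimension sketch of the particle-filter idea follows: particles hypothesize the un-calibrated camera's extrinsics (only yaw plus translation here, for brevity), are weighted by the reprojection error of feature correspondences, and are resampled each frame. The pinhole intrinsics, noise levels, and pose parameterization are illustrative assumptions, not the paper's.

```python
# Simplified particle filter over camera extrinsics, weighted by reprojection error.
import numpy as np

FX = FY = 500.0
CX = CY = 320.0        # assumed intrinsics of the un-calibrated camera

def project(points_w, yaw, t):
    """Project world points into a camera at translation t with yaw about its z axis."""
    c, s = np.cos(yaw), np.sin(yaw)
    R = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
    p_cam = (points_w - t) @ R            # equivalent to applying R^T to (p_w - t)
    u = FX * p_cam[:, 0] / p_cam[:, 2] + CX
    v = FY * p_cam[:, 1] / p_cam[:, 2] + CY
    return np.stack([u, v], axis=1)

def particle_filter_step(particles, points_w, observed_uv, rng, sigma_px=5.0):
    """particles: (N, 4) rows of [yaw, tx, ty, tz]; returns resampled particles."""
    particles = particles + rng.normal(0, 0.01, particles.shape)   # random walk
    errs = np.array([np.mean(np.linalg.norm(
        project(points_w, p[0], p[1:]) - observed_uv, axis=1)) for p in particles])
    w = np.exp(-0.5 * (errs / sigma_px) ** 2) + 1e-300   # guard against all-zero weights
    w /= w.sum()
    idx = rng.choice(len(particles), size=len(particles), p=w)
    return particles[idx]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    pts = rng.uniform([-1, -1, 4], [1, 1, 6], (30, 3))   # 3D features in front of camera
    true_yaw, true_t = 0.1, np.array([0.2, -0.1, 0.0])
    obs = project(pts, true_yaw, true_t)
    particles = rng.normal(0, 0.3, (500, 4))
    for _ in range(30):
        particles = particle_filter_step(particles, pts, obs, rng)
    print(particles.mean(axis=0))   # should drift toward [0.1, 0.2, -0.1, 0.0]
```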