
Latest publications from the 2010 IEEE/SICE International Symposium on System Integration

Accelerometer detection in a camera view based on feature point tracking
Pub Date : 2010-12-01 DOI: 10.1109/SII.2010.5708367
Y. Maki, S. Kagami, K. Hashimoto
We have been working on detecting an object equipped with an accelerometer from among many moving objects in a camera view. Our previous method relied on knowledge of the specific appearance of the moving objects, such as their colors, which limited its practical use. In this paper, in order to reduce this dependency on knowledge of object appearance as much as possible, we apply our method to natural feature points detected in the camera view. To deal with numerous feature points in real time, we propose a method to speed up the computation. We also investigate a method to cope with natural features that are not always tracked continuously. Experimental results show that the proposed method is successfully applied to natural feature points detected and tracked in real time.
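The paper's algorithm is not given in the abstract; as an illustrative sketch of the matching idea, the Python snippet below correlates the accelerometer's magnitude signal with the image-space acceleration of each tracked feature point and picks the best-matching track. The function names and the assumption that the two signals are already synchronized, resampled to the camera rate, and equal in length are mine, not the authors'.

```python
import numpy as np

def image_acceleration(track, dt):
    """Second-order finite difference of a 2-D feature track (N x 2 pixels)."""
    return np.diff(track, n=2, axis=0) / dt**2

def match_accelerometer_to_tracks(accel_mag, tracks, dt):
    """Pick the feature track whose image-space acceleration magnitude
    correlates best with the accelerometer magnitude signal.

    accel_mag : 1-D array of accelerometer magnitudes, already synchronized,
                resampled to the camera frame rate and trimmed to N-2 samples.
    tracks    : list of (N x 2) arrays of tracked pixel positions.
    dt        : camera frame interval in seconds.
    """
    best_idx, best_corr = -1, -np.inf
    for i, track in enumerate(tracks):
        mag = np.linalg.norm(image_acceleration(np.asarray(track, float), dt), axis=1)
        if mag.std() == 0.0 or accel_mag.std() == 0.0:
            continue  # constant signals have no defined correlation
        corr = np.corrcoef(accel_mag, mag)[0, 1]
        if corr > best_corr:
            best_idx, best_corr = i, corr
    return best_idx, best_corr
```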
Cited by: 18
Influence evaluation of wheel surface profile on traversability of planetary rovers
Pub Date : 2010-12-01 DOI: 10.1109/SII.2010.5708303
M. Sutoh, Tsuyoshi Ito, K. Nagatani, Kazuya Yoshida
Planetary rovers play a significant role in surface explorations on the Moon and Mars. However, wheel slippage can cause the wheels of planetary rovers to get stuck in loose soil, which may lead to failure of the exploration mission. To avoid slippage and increase the wheels' drawbar pull, the wheels of planetary rovers typically have parallel fins, called lugs, on their surface. In this study, we conducted experiments using two-wheeled testbeds in a sandbox to provide a quantitative confirmation of the influence of lugs on the traversability of planetary rovers. In this paper, we report the results of the experiments and discuss the influence of lugs on the traversability of planetary rovers.
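For background only (not taken from the paper), wheel slippage in this kind of study is commonly quantified by the standard longitudinal slip ratio, which a helper like the following could compute.

```python
def slip_ratio(wheel_radius_m, wheel_angular_velocity_rad_s, travel_speed_m_s):
    """Standard longitudinal slip ratio for a driving wheel:
    s = 1 - v / (r * omega); s = 0 means no slip, s -> 1 means the wheel
    spins almost in place (e.g. stuck in loose soil)."""
    circumferential_speed = wheel_radius_m * wheel_angular_velocity_rad_s
    if circumferential_speed == 0.0:
        return 0.0  # wheel not driven; slip undefined, return 0 by convention
    return 1.0 - travel_speed_m_s / circumferential_speed
```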
Cited by: 2
Vision-based human state estimation to control an intelligent passive walker
Pub Date : 2010-12-01 DOI: 10.1109/SII.2010.5708316
S. Taghvaei, Y. Hirata, K. Kosuge
The motion of a passive-type intelligent walker is controlled based on visual estimation of the motion state of the user. The controlled walker supports the user in standing up and prevents the user from falling down. The visual motion analysis detects and tracks the human upper-body parts and localizes them in 3D using a stereovision approach. Using these data, the user's state is estimated as seated, standing up, falling down, or walking. The controller activates the servo brakes in the sitting, standing and falling situations as a support, to ensure both comfort and safety of the user. The estimation methods are tested on a passive-type intelligent walker referred to as the “RT Walker”, equipped with servo brakes.
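The abstract does not give the classification rules, so the following is merely a toy rule-based sketch of how a seated / standing-up / falling / walking decision and the resulting brake command might look; every threshold is an invented placeholder.

```python
def classify_user_state(torso_height_m, torso_vertical_velocity_m_s, horizontal_speed_m_s):
    """Toy rule-based classifier over the tracked upper-body position.
    All thresholds below are invented placeholders, not values from the paper."""
    if torso_vertical_velocity_m_s < -0.8:   # fast downward motion
        return "falling"
    if torso_height_m < 0.9:                 # torso still low
        return "seated"
    if torso_vertical_velocity_m_s > 0.2:    # rising toward standing height
        return "standing_up"
    if horizontal_speed_m_s > 0.2:
        return "walking"
    return "standing"

def brakes_engaged(state):
    """Engage the servo brakes whenever the walker must not roll away."""
    return state in ("seated", "standing_up", "falling")
```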
Cited by: 18
3D measurement of a surface point using a high-speed projector-camera system for augmented reality games
Pub Date : 2010-12-01 DOI: 10.1109/SII.2010.5708306
Kota Toma, S. Kagami, K. Hashimoto
In this paper, we describe high-speed 3D position and normal measurement of a specified point of interest using a high-speed projector-camera system. A special cross-line pattern is adaptively projected onto the point of interest, and the reflected pattern on the object surface is captured by the camera and analyzed to obtain the position and normal information. The experimental results show that the estimated normal directions were apparently correct and stable, except for occasional large errors due to instantaneous detection failures. Based on this measurement system, we describe a game system in which real-time interaction with a non-structured 3D real environment is achieved.
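One generic way to obtain a point and normal from a projected cross-line pattern, assuming the two line segments have already been triangulated into 3D, is to fit a direction to each line and take their cross product, as in the sketch below; this is an assumed reconstruction step, not the authors' exact procedure.

```python
import numpy as np

def principal_direction(points_xyz):
    """Dominant 3-D direction of a point set (N x 3), via SVD of the centered data."""
    centered = points_xyz - points_xyz.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return vt[0]

def surface_point_and_normal(line_a_xyz, line_b_xyz):
    """Estimate the surface point and normal from triangulated 3-D points
    lying on the two projected lines of the cross pattern."""
    d_a = principal_direction(np.asarray(line_a_xyz, float))
    d_b = principal_direction(np.asarray(line_b_xyz, float))
    normal = np.cross(d_a, d_b)
    normal /= np.linalg.norm(normal)
    point = np.vstack([line_a_xyz, line_b_xyz]).mean(axis=0)
    return point, normal
```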
Cited by: 0
3D-microfluidic device to remove zona pellucida fabricated by Mask-less exposure technology
Pub Date : 2010-12-01 DOI: 10.1109/SII.2010.5708335
Y. Yamanishi, T. Nakano, Yu Sawada, K. Itoga, T. Okano, F. Arai
This paper presents a novel method of three-dimensional fabrication using mask-less exposure equipment and a three-dimensional microfluidic application for cell manipulation. The grayscale data can directly control the height of the photoresist to be exposed without using any mask. A three-dimensional microchannel was successfully fabricated simply by using the low-cost exposure system, with a height range of 0–200 μm. We have succeeded in removing the zona pellucida of oocytes passing through the 3D microchannel, whose cross section is gradually restricted along the path to provide mechanical stimuli on the surface of the oocyte in every direction. This microfluidic chip enables effective high-throughput processing of the peeled oocytes without damaging them.
Cited by: 1
Attitude control system of micro satellite RISING-2
Pub Date : 2010-12-01 DOI: 10.1109/SII.2010.5708354
Kazufumi Fukuda, T. Nakano, Y. Sakamoto, T. Kuwahara, Kazuya Yoshida, Y. Takahashi
This paper summarizes the attitude control system of the 50-kg micro satellite RISING-2, which is now under development by Tohoku University and Hokkaido University. The main mission of RISING-2 is Earth surface observation with 5-m resolution using a Cassegrain telescope with a 10-cm diameter and a 1-m focal length. Accurate attitude control capability, with direction errors of less than 0.1 deg and angular velocity errors of less than 0.02 deg/s, is required to realize this observation. In addition, because the power consumption of the science units is larger than expected, the actuators must be operated with sufficiently low power. The attitude control system realizes 3-axis stabilization for the observation by means of star sensors, gyro sensors, sun attitude sensors and reaction wheels. In this paper, the attitude control law of RISING-2 is analyzed to keep the power of the reaction wheels under the limit. The simulation is based on component specifications and also includes noise data of the components under development. The simulation results show that the pointing error is less than 0.1 deg most of the time with the RISING-2 attitude control system.
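The abstract does not spell out the control law; as a generic illustration only, a per-axis PD attitude controller with torque saturation (standing in for the reaction-wheel power limit) might look like the sketch below. The gains, units, and torque limit are placeholder assumptions, not RISING-2 parameters.

```python
import numpy as np

def reaction_wheel_torque(attitude_error_rad, angular_velocity_rad_s,
                          kp=0.02, kd=0.2, torque_limit_nm=2e-3):
    """Per-axis PD attitude controller with torque saturation standing in for
    the reaction-wheel power limit.  Gains and the limit are placeholders,
    not RISING-2 parameters."""
    torque = (-kp * np.asarray(attitude_error_rad)
              - kd * np.asarray(angular_velocity_rad_s))
    return np.clip(torque, -torque_limit_nm, torque_limit_nm)
```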
Cited by: 13
Gesture-world environment technology for mobile manipulation
Pub Date : 2010-12-01 DOI: 10.1109/SII.2010.5708323
K. Hoshino, Takuya Kasahara, Naoki Igo, Motomasa Tomida, T. Mukai, Kinji Nishi, Hajime Kotani
The aim of this paper is to propose a technology that allows people to control robots by means of everyday gestures, without using sensors or controllers. The hand pose estimation we propose reduces the number of image features per data set to 64, which makes the construction of a large-scale database possible. This has also made it possible to estimate the 3D hand poses of unspecified users with individual differences without sacrificing estimation accuracy. Specifically, the proposed system involves the advance construction of a large database comprising three elements: hand joint information including the wrist, low-order proportional information on the hand images indicating the rough hand shape, and hand pose data comprising 64 image features per data set. To estimate a hand pose, the system first performs coarse screening to select similar data sets from the database based on the three hand proportions of the input image, and then performs a detailed search to find the data set most similar to the input image based on the 64 image features. Using subjects with varying hand poses, we performed joint angle estimation with our hand pose estimation system comprising 750,000 hand pose data sets, achieving roughly the same average estimation error as our previous system, about 2 degrees. However, the standard deviation of the estimation error was smaller than in our previous system of roughly 30,000 data sets: down from 26.91 degrees to 14.57 degrees for the index finger PIP joint, and from 15.77 degrees to 10.28 degrees for the thumb. We were thus able to confirm an improvement in estimation accuracy, even for unspecified users. Further, the processing speed, using a notebook PC of normal specifications and a compact high-speed camera, was about 80 fps or more, including image capture, hand pose estimation, CG rendering of the estimation result, and robot control.
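The coarse-then-fine database lookup described above can be sketched as a two-stage nearest-neighbour search; the array names and the candidate count in the snippet are illustrative assumptions, not details from the paper.

```python
import numpy as np

def estimate_joint_angles(query_proportions, query_features,
                          db_proportions, db_features, db_joint_angles,
                          n_candidates=1000):
    """Two-stage nearest-neighbour lookup.
    db_proportions : (M, 3)  three hand-proportion values per data set
    db_features    : (M, 64) image-feature vectors
    db_joint_angles: (M, J)  joint angles stored with each data set
    """
    # 1) Coarse screening on the three hand proportions.
    coarse_dist = np.linalg.norm(db_proportions - query_proportions, axis=1)
    candidates = np.argsort(coarse_dist)[:n_candidates]
    # 2) Detailed search on the 64 image features among the candidates.
    fine_dist = np.linalg.norm(db_features[candidates] - query_features, axis=1)
    best = candidates[np.argmin(fine_dist)]
    return db_joint_angles[best]
```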
Cited by: 0
A grasp criterion for robot hands considering multiple aspects of tasks and hand mechanisms
Pub Date : 2010-12-01 DOI: 10.1109/SII.2010.5708344
M. Sato, Seiji Sugiyama, T. Yoshikawa
In this paper, a task-oriented grasp criterion for robot hands is discussed. The objectives of grasp criteria proposed in the past are stable grasping and tasks that require adequate forces and moments on the grasped object. These criteria do not consider other parameters, such as the positions and velocities of the grasped object, or the mechanical properties of robot hands. We propose a task-oriented grasp criterion that evaluates the feasibility of the task while taking hand mechanisms into account. Our proposed method finds the grasping position by considering how efficiently the object's positions, velocities, and forces can be realized. As a result, this criterion can detect grasping positions similar to those chosen by humans.
Cited by: 1
Development of a human type legged robot with roller skates
Pub Date : 2010-12-01 DOI: 10.1109/SII.2010.5708312
K. Itabashi, M. Kumagai
There are walking robots that use wheels together with their legs. Among those, robots with passive wheels generate propulsive force by specially designed periodic leg motions. The authors previously proposed a special axle mechanism that can change its curvature to track a designed path for propulsion. The mechanism showed not only straight motion but also curved motion and pivoting motion, which is unique to this method. However, that robot did not have enough stiffness for further quantitative investigation. Therefore, a new bipedal walking robot was developed for this work. The developed robot could perform roller walking with the designed forward movement. The roller-walk method for a biped robot is described briefly, followed by the design and implementation of the robot. The idea of using a Bézier curve for the motion trajectory is also introduced. Experimental results are described and shown in an accompanying video.
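Since the abstract mentions using a Bézier curve for the motion trajectory, a small generic evaluator is sketched below; the example control points are made up and do not come from the paper.

```python
import numpy as np
from math import comb

def bezier_point(control_points, t):
    """Evaluate an n-th order Bézier curve at parameter t in [0, 1].
    control_points: (n+1) x d array of control points."""
    pts = np.asarray(control_points, dtype=float)
    n = len(pts) - 1
    weights = np.array([comb(n, k) * (1.0 - t) ** (n - k) * t ** k
                        for k in range(n + 1)])
    return weights @ pts

# Example: one lateral sweep of a skate over a gait cycle (made-up control points).
sweep = [bezier_point([[0.00, 0.10], [0.05, 0.20], [0.10, 0.20], [0.15, 0.10]], t)
         for t in np.linspace(0.0, 1.0, 50)]
```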
Cited by: 6
Autonomous flight of small helicopter with real-time camera calibration
Pub Date : 2010-12-01 DOI: 10.1109/SII.2010.5708332
Takanori Matsukawa, S. Arai, K. Hashimoto
We propose a real-time camera calibration method for the autonomous flight of a small helicopter. Our purpose is to control a small helicopter automatically by using cameras fixed on the ground. We use calibrated cameras, un-calibrated cameras, and a small helicopter that carries no sensors. The proposed method finds correspondences between image features in the two images of a calibrated camera and an un-calibrated camera, and estimates the extrinsic parameters of the cameras using a particle filter in real time. We evaluate the utility of the proposed method by experiments. In the small-helicopter flight experiment, we compare the proposed method with real-time calibration by typical bundle adjustment with a Gauss-Newton method. The proposed method achieves autonomous flight of the small helicopter in situations where flight cannot be achieved with typical bundle adjustment with a Gauss-Newton method.
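A bare-bones sketch of one particle-filter step over the un-calibrated camera's extrinsic parameters is given below; the 6-vector pose parameterisation, the hypothetical `project` callback, and the noise levels are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

def particle_filter_step(particles, project, observed_pixels,
                         process_noise=0.01, measurement_sigma_px=2.0):
    """One predict-weight-resample step over the un-calibrated camera's
    extrinsic parameters (each particle is a 6-vector: 3 rotation, 3 translation).

    project(pose) -> (K, 2) predicted pixel positions of the K features matched
    between the calibrated and un-calibrated views (hypothetical callback
    supplied by the caller)."""
    n = len(particles)
    # Predict: random-walk motion model on the pose parameters.
    particles = particles + np.random.normal(0.0, process_noise, particles.shape)
    # Weight: Gaussian likelihood of the total reprojection error.
    weights = np.empty(n)
    for i, pose in enumerate(particles):
        err = np.linalg.norm(project(pose) - observed_pixels)
        weights[i] = np.exp(-0.5 * (err / measurement_sigma_px) ** 2)
    weights /= weights.sum()  # assumes at least one particle fits reasonably
    # Resample (multinomial, to keep the sketch short).
    keep = np.random.choice(n, size=n, p=weights)
    return particles[keep]
```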
Cited by: 1