
Latest publications: 2015 IEEE International Conference on Technologies for Practical Robot Applications (TePRA)

Loading of hanging trolleys on overhead conveyor with industrial robots
Torstein A. Myhre, A. Transeth, O. Egeland
Handling moving objects with robot manipulators is a challenging task as it involves tracking objects with high accuracy. An industrial application of this type is the loading and unloading of objects on an overhead conveyor. A robotic solution to this problem is presented in this paper, where we describe a method for the interaction of an industrial robot and a free-swinging object. Our approach is based on visual tracking using particle filtering, where the equations of motion of the object are included in the filtering algorithm. The first contribution of this paper is that the Fisher information matrix is used to quantify the information content of each image feature. In particular, the Fisher information matrix is used to construct a weighted likelihood function. This improves the robustness of the tracking algorithm significantly compared to the standard approach based on an unweighted likelihood function. The second contribution of this paper is that we detect occluded image features and avoid using these features in the calculation of the likelihood function. This further improves the quality of the likelihood function. We demonstrate the improved performance of the proposed method in experiments involving the automatic loading of trolleys hanging from a moving overhead conveyor.
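The authors' implementation is not reproduced here; purely as an illustration of the general idea, a bootstrap particle filter whose likelihood weights each image feature individually (the role the abstract assigns to the Fisher information) might be sketched as follows. The pendulum motion model, the single "angle" feature, and all names are illustrative assumptions, not the paper's code.

```python
import numpy as np

rng = np.random.default_rng(0)

def pendulum_step(particles, dt=0.02, g=9.81, L=1.0, q=0.05):
    # Propagate particles [theta, omega] through the equations of motion
    # of a swinging object (simple pendulum), with process noise q.
    theta, omega = particles[:, 0], particles[:, 1]
    omega = omega - (g / L) * np.sin(theta) * dt + q * rng.standard_normal(len(theta))
    theta = theta + omega * dt
    return np.column_stack([theta, omega])

def weighted_log_likelihood(particles, z, feature_weights, sigma=0.05):
    # z: observed image features; feature_weights: per-feature weights,
    # e.g. derived from each feature's information content. Here the
    # predicted feature is simply the swing angle itself.
    pred = particles[:, 0:1]
    resid = (z[None, :] - pred) / sigma
    return -(0.5 * resid**2 * feature_weights[None, :]).sum(axis=1)

def pf_update(particles, z, feature_weights):
    # Weight particles by the weighted likelihood, then resample.
    logw = weighted_log_likelihood(particles, z, feature_weights)
    w = np.exp(logw - logw.max())
    w /= w.sum()
    idx = rng.choice(len(particles), size=len(particles), p=w)
    return particles[idx]
```

Down-weighting (or, as in the paper's second contribution, dropping) unreliable features changes only `feature_weights`, leaving the rest of the filter untouched.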
DOI: 10.1109/TePRA.2015.7219677 · Published 2015-05-11
Citations: 5
A virtual demonstrator environment for robot imitation learning
Di-Wei Huang, Garrett E. Katz, J. Langsfeld, R. Gentili, J. Reggia
To support studies in robot imitation learning, this paper presents a software platform, SMILE (Simulator for Maryland Imitation Learning Environment), specifically targeting tasks in which exact human motions are not critical. We hypothesize that in this class of tasks, object behaviors are far more important than human behaviors, and thus one can significantly reduce complexity by not processing human motions at all. As such, SMILE simulates a virtual environment where a human demonstrator can manipulate objects using GUI controls without body parts being visible to a robot in the same environment. Imitation learning is therefore based on the behaviors of manipulated objects only. A simple Matlab interface for programming a simulated robot is also provided in SMILE, along with an XML interface for initializing objects in the virtual environment. SMILE lowers the barriers for studying robot imitation learning by (1) simplifying learning by making the human demonstrator a virtual presence and (2) eliminating the immediate need to purchase special equipment for motion capture.
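SMILE's actual XML schema is not given in the abstract; the snippet below only illustrates the general pattern of initializing virtual-environment objects from an XML description. The `<scene>`/`<object>` schema and all attribute names are invented for this sketch.

```python
import xml.etree.ElementTree as ET

# Hypothetical scene description; SMILE's real schema may differ.
SCENE = """
<scene>
  <object name="block_a" shape="box" x="0.1" y="0.2" z="0.0"/>
  <object name="block_b" shape="box" x="0.3" y="0.2" z="0.0"/>
</scene>
"""

def load_objects(xml_text):
    # Parse the scene and return {name: (shape, x, y, z)} for each object.
    root = ET.fromstring(xml_text)
    return {o.get("name"): (o.get("shape"),
                            float(o.get("x")), float(o.get("y")), float(o.get("z")))
            for o in root.findall("object")}
```

Keeping object state declarative like this is what lets imitation learning operate on object behaviors alone, without any model of the demonstrator's body.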
DOI: 10.1109/TePRA.2015.7219691 · Published 2015-05-11
Citations: 16
TDMEC, a new measure for evaluating the image quality of color images acquired in vision systems
A. Samani, K. Panetta, S. Agaian
In robotic imaging systems, images are often subject to additive Gaussian noise and additive noise in the color components during image acquisition. These distortions can arise from poor illumination, excessive temperatures, or electronic circuit noise. Imaging sensors required to perform real-time enhancement of images best suited to the human visual system often need parameter selection and optimization. This is achieved by using a quality metric for image enhancement. Most image quality assessment algorithms require parameter selection of their own to best assess the image quality. Some measures require a reference image to be used alongside the test image for comparison. In this article, we introduce a no-parameter, no-reference metric that can determine the most visually pleasing image for human visual perception. Our proposed metric is algorithm-independent, so it can be utilized with a variety of enhancement algorithms. Measures of enhancement can be categorized as either spatial-domain or transform-domain based. In this article, we present a DCT transform-domain measure of enhancement to evaluate color images impacted by additive noise during image acquisition in robotics applications. Unlike spatial-domain measures of enhancement, our proposed measure is independent of image attributes and does not require parameter selection. The proposed measure is applicable to compressed and non-compressed images. This measure could be used as an enhancement metric for different image enhancement methods for both grayscale and color images.
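The exact definition of TDMEC is in the paper itself; the sketch below only illustrates the generic ingredient of a DCT transform-domain measure: an orthonormal 2-D DCT-II followed by a high-frequency energy ratio. The `hf_energy_ratio` statistic is an assumed stand-in, not the authors' metric.

```python
import numpy as np

def dct_matrix(n):
    # Orthonormal DCT-II basis matrix (rows are frequency vectors).
    j = np.arange(n)[:, None]
    k = np.arange(n)[None, :]
    C = np.cos(np.pi * (2 * k + 1) * j / (2 * n))
    C[0, :] *= np.sqrt(1.0 / n)
    C[1:, :] *= np.sqrt(2.0 / n)
    return C

def dct2(block):
    # Separable 2-D DCT of a square block.
    C = dct_matrix(block.shape[0])
    return C @ block @ C.T

def hf_energy_ratio(block):
    # Fraction of spectral energy in the high-frequency half of the DCT
    # plane; additive noise pushes this ratio up, smooth content keeps
    # it low. A toy no-reference, no-parameter statistic.
    D = dct2(block)
    n = D.shape[0]
    hf = np.add.outer(np.arange(n), np.arange(n)) >= n
    return (D[hf] ** 2).sum() / (D ** 2).sum()
```

Because the statistic is computed directly from DCT coefficients, it applies equally to DCT-compressed and uncompressed images, mirroring the claim in the abstract.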
DOI: 10.1109/TePRA.2015.7219662 · Published 2015-05-11
Citations: 8
Service robots: An industrial perspective
Sang Choi, Gregory F. Rossano, George Zhang, T. Fuhlbrigge
This paper presents an overview of commercialized robotic products in the industrial service, maintenance and repair sectors. General facts about industrial service are briefly described, and we then focus on four specific applications: motor/generator inspection, solar panel inspection/cleaning, tank inspection and pipe inspection. For each application, the service process characteristics, operational details, technical challenges and requirements are described. Robotic solutions and commercialized products for each application area are introduced and detailed, with their special features and specifications.
DOI: 10.1109/TePRA.2015.7219679 · Published 2015-05-11
Citations: 3
A parallel manipulator for mobile manipulating UAVs
T. Danko, Kenneth Chaney, P. Oh
Manipulating objects using arms mounted on unmanned aerial vehicles (UAVs) is attractive because UAVs can access many locations that are otherwise inaccessible to traditional mobile manipulation platforms such as ground vehicles. Most previous efforts seeking to coordinate the combined manipulator-UAV system have focused on using a manipulator to extend the UAV's reach and assume that both the UAV and manipulator can reliably reach commanded goal poses. This work accepts the reality that state-of-the-art UAV positioning precision is not high enough to reliably perform simple tasks such as grasping objects. A 6-degree-of-freedom parallel manipulator is used to robustly maintain precise end-effector positions despite host-UAV perturbations. A description of a unique parallel manipulator that has very little moving mass and is easily stowed below a quadrotor UAV is presented, along with flight-test results and an analytical comparison to a serial manipulator.
DOI: 10.1109/TePRA.2015.7219682 · Published 2015-05-11
Citations: 59
Trajectory optimization of robotic suturing
Der-Lin Chow, W. Newman
This paper presents progress towards autonomous knot-tying in Robotic Assisted Minimally Invasive Surgery. While successful demonstrations of robotic knot-tying have been achieved, objective comparisons of competing approaches have been lacking. In this presentation we describe how to score a proposed procedure in terms of speed and volume. Applying the scoring metric has motivated an improved procedure for knot-tying, as well as a pathway to automated discovery for trajectory optimizations.
DOI: 10.1109/TePRA.2015.7219672 · Published 2015-05-11
Citations: 8
Autonomous wall cutting with an Atlas humanoid robot
Zheng-Hao Chong, Robert T. W. Hung, Kit-Hang Lee, Weijia Wang, T. Ng, W. Newman
Autonomous wall cutting is described using an Atlas humanoid robot. An integrated wall-cutting skill is presented, which only requires an operator to issue supervisory-level commands to prescribe a desired cutting path, leading to autonomous cutting.
DOI: 10.1109/TePRA.2015.7219673 · Published 2015-05-11
Citations: 7
Design of fast walking with one- versus two-at-a-time swing leg motions for RoboSimian
Katie Byl, M. Byl
This paper presents two prototype fast walking gaits for the quadruped robot RoboSimian, along with experimental results for each. The first gait uses a statically stable one-at-a-time swing-leg crawl. The second gait uses a two-at-a-time swing-leg motion, which requires deliberate planning of the zero-moment point (ZMP) to balance the robot on a narrow support base. Of particular focus is the development of practical means to exploit the fact that RoboSimian is high-dimensional, with seven actuators per limb, as a way of partially overcoming the low joint-velocity limits at each joint. For both gaits, we use an inverse kinematics (IK) table that has been designed to maximize the reachable workspace of each limb while minimizing joint velocities during end-effector motions. Even with the simplification provided by the use of IK solutions, there is still a wide range of variables left open in the design of each gait. We discuss these and present practical methodologies for parameterizing and subsequently deriving approximate time-optimal solutions for each gait type, subject to the joint-velocity limits of the robot and to real-world requirements for safety margins in maintaining adequate balance. Results show that careful choice of parameters for each of the gaits improves their respective walking speeds significantly. Finally, we compare the fastest achievable walking speeds of each gait and find they are nearly equivalent, given current performance limits of the robot.
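The ZMP balance condition the abstract relies on can be illustrated with the standard cart-table (linear inverted pendulum) model: for a point mass at constant height, the ZMP shifts away from the center of mass in proportion to its horizontal acceleration, and balance requires the ZMP to stay inside the support region. This is a generic textbook sketch, not RoboSimian's planner; the height and support-interval values are assumptions.

```python
def zmp_x(x_com, xdd_com, z_com=0.6, g=9.81):
    # Cart-table model: ZMP along the walking direction for a point
    # mass at constant height z_com with horizontal acceleration xdd_com.
    return x_com - (z_com / g) * xdd_com

def zmp_inside_support(x_com, xdd_com, x_min, x_max, z_com=0.6):
    # Balance check: the ZMP must lie inside the support polygon,
    # reduced here to a 1-D interval [x_min, x_max].
    return x_min <= zmp_x(x_com, xdd_com, z_com=z_com) <= x_max
```

A narrow two-legged support base shrinks `[x_min, x_max]`, which is why the two-at-a-time gait needs deliberate ZMP planning while the one-at-a-time crawl can rely on static stability.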
DOI: 10.1109/TePRA.2015.7219688 · Published 2015-05-11
Citations: 8
Autonomous vehicles for remote sample collection in difficult conditions: Enabling remote sample collection by marine biologists
A. Bennett, Victoria L. Preston, Jay Woo, Shivali Chandra, Devynn Diggins, Riley Chapman, Zhecan Wang, M. Rush, L. Lye, Mindy Tieu, Silas Hughes, Iain Kerr, A. Wee
Rapidly dropping costs and increasing capabilities of robotic systems are creating unprecedented opportunities for the world of scientific research. Remote sample collection in conditions that were once impossible due to expense, location, timing, or risk is now becoming a reality. Of particular interest in marine biological research is removing additional stressors, in the form of humans and equipment, from whale monitoring. In a partnership between Olin College of Engineering and Ocean Alliance, a multirotor unmanned air vehicle (UAV) named SnotBot is being developed to enable marine biologists to collect observational data and biological samples from living whales in a less intrusive and more effective way. Tests conducted in the Gulf of Mexico in Summer 2014 demonstrated that SnotBot may not be an irritant to the whales under study with respect to the noise and downdraft generated by the UAV [1]. The results from those field tests are being used to apply for research permits to collect samples from real whales. Until formal authorization to operate over whales is granted, controlled testing at Olin College and in the Gloucester Harbor of Massachusetts Bay is being conducted to characterize the vehicles and develop autonomy. Beyond cetacean/whale research, the ability to collect physical samples in difficult or sensitive locations, as demonstrated by SnotBot, has far-reaching applications in environmental monitoring, aerial surveying, and diagnosis of transient events.
DOI: 10.1109/TePRA.2015.7219660 · Published 2015-05-11
Citations: 8
Autonomous convoy driving by night: The vehicle tracking system
C. Fries, Hans-Joachim Wünsche
Previous publications from our institute describe a robust vehicle tracking system for daylight conditions. This paper presents an improved vehicle tracking system that is able to detect and track the convoy leader at twilight and at night as well. The primary sensor equipment consists of a daytime camera, a LiDAR and an inertial navigation system. An expansion with a thermal and a low-light camera was necessary to be robust against all illumination conditions. The system is capable of precisely estimating the relative 3D position and orientation, the velocity and the steering angle of a convoy leader in real time. This makes it possible to follow the convoy leader's track. Another novelty is the coupling of a Kalman filter with a particle filter for higher stability and accuracy in vehicle tracking. The tracking system showed excellent functionality while driving more than 50 km fully autonomously in urban and unstructured environments at night.
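The paper couples a Kalman filter with a particle filter; as a minimal illustration of the Kalman half of such a tracker, here is a generic constant-velocity filter step for one coordinate of the leader vehicle. The state layout and the noise parameters are assumptions for this sketch, not the authors' values.

```python
import numpy as np

def kf_step(x, P, z, dt=0.1, q=0.01, r=0.04):
    # One predict/update cycle of a constant-velocity Kalman filter.
    # State x = [position, velocity]; z is a (1,) position measurement.
    F = np.array([[1.0, dt], [0.0, 1.0]])               # motion model
    Q = q * np.array([[dt**3 / 3, dt**2 / 2],
                      [dt**2 / 2, dt]])                 # process noise
    H = np.array([[1.0, 0.0]])                          # measure position only
    x = F @ x                                           # predict
    P = F @ P @ F.T + Q
    y = z - H @ x                                       # innovation
    S = H @ P @ H.T + r
    K = P @ H.T / S                                     # Kalman gain
    x = x + (K * y).ravel()                             # update
    P = (np.eye(2) - K @ H) @ P
    return x, P
```

In a coupled scheme along the lines the abstract describes, a smooth estimate like this one can stabilize the proposal or output of a particle filter, which in turn handles the multimodal ambiguity of visual detection.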
{"title":"Autonomous convoy driving by night: The vehicle tracking system","authors":"C. Fries, Hans-Joachim Wünsche","doi":"10.1109/TePRA.2015.7219675","DOIUrl":"https://doi.org/10.1109/TePRA.2015.7219675","url":null,"abstract":"Previous publications of our institute describe a robust vehicle tracking system for daylight conditions. This paper presents an improved vehicle tracking system which is able to detect and track the convoy leader also by twilight and night. The primary sensor equipment consists of a daytime camera, a LiDAR and an inertial navigation system. An expansion with a thermal and a lowlight camera was necessary to be robust against any illumination conditions. The system is capable of estimating the relative 3D position and orientation, the velocity and the steering angle of a convoy leader precisely in real-time. This makes it possible to follow the convoy leader's track. Another novelty is coupling a Kalman filter with a particle filter for higher stability and accuracy in vehicle tracking. The tracking system shows excellent functionality while driving more than 50km fully autonomously in urban- and unstructured environments at night.","PeriodicalId":325788,"journal":{"name":"2015 IEEE International Conference on Technologies for Practical Robot Applications (TePRA)","volume":"121 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-05-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114567931","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 13
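The abstract above mentions coupling a Kalman filter with a particle filter for vehicle tracking. The paper does not spell out its coupling scheme here, so the following is only a minimal toy sketch of one plausible arrangement: a particle filter localizes a lead vehicle from noisy position measurements, and a Kalman filter with a constant-velocity model smooths the particle-filter output and feeds its velocity estimate back into particle propagation. The 1D motion model, noise levels, and multinomial resampling are all illustrative assumptions, not the paper's actual design.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy scenario (assumed, not from the paper): a lead vehicle moving at
# constant velocity in 1D, observed through a noisy position sensor.
dt = 0.1
true_v = 2.0
true_x = 0.0

# Particle filter over the leader's position.
N = 500
particles = rng.normal(0.0, 1.0, N)

# Kalman filter over [position, velocity] with a constant-velocity model.
F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition
H = np.array([[1.0, 0.0]])              # we observe position only
Q = np.diag([1e-3, 1e-2])               # process noise
R = np.array([[0.05]])                  # noise of the PF-derived measurement
x = np.array([0.0, 0.0])                # state estimate [pos, vel]
P = np.eye(2)

for step in range(100):
    true_x += true_v * dt
    z = true_x + rng.normal(0.0, 0.2)   # noisy sensor reading

    # Particle filter: propagate with the KF's velocity estimate (the
    # coupling), weight against the measurement, resample.
    particles += x[1] * dt + rng.normal(0.0, 0.1, N)
    w = np.exp(-0.5 * ((z - particles) / 0.2) ** 2)
    w /= w.sum()
    pf_pos = np.dot(w, particles)       # PF position estimate
    particles = particles[rng.choice(N, size=N, p=w)]

    # Kalman filter: predict, then correct with the PF output as measurement.
    x = F @ x
    P = F @ P @ F.T + Q
    y = pf_pos - H @ x                  # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P

print(f"estimated pos {x[0]:.2f} (true {true_x:.2f}), "
      f"estimated vel {x[1]:.2f} (true {true_v:.2f})")
```

The feedback loop is the point of the sketch: the Kalman filter's velocity estimate improves particle propagation, and the particle filter's position estimate in turn serves as the Kalman filter's measurement, which is one way such a coupling can add stability over either filter alone.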
Journal
2015 IEEE International Conference on Technologies for Practical Robot Applications (TePRA)