Human Robot Interaction and Control: Translating Diagrams into an Intuitive Augmented Reality Approach

E. Lakshantha, S. Egerton
2014 International Conference on Intelligent Environments
DOI: 10.1109/IE.2014.24
Published: 2014-06-30
Citations: 2

Abstract

Robots will play a vital role in our future personal spaces and we need to provide ways for robots and humans to interact with each other in a way that is natural, intuitive, descriptive and unambiguous. In this paper we introduce a framework that enables a human and robot to naturally interact and communicate with each other using the idea of diagrams. The diagrams are formed by a connected set of object markers placed in the environment. These markers can either be physically present in the environment or virtually present using marker-less technology. This paper presents a marker-less method for globally persistent markers. Diagrams are formed by connecting the objects together enabling the user to easily interact and control the robot in complex ways. We report on a proof-of-concept implementation of our framework and show how the framework can be used to program a robot to carry out navigation and action tasks within an environment.
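The abstract describes diagrams as connected sets of object markers that a robot can follow. As a minimal sketch of that idea, the snippet below models a diagram as a small directed graph of markers and walks its edges to produce a waypoint sequence for navigation. All names here (`Marker`, `Diagram`, `to_waypoints`) are illustrative assumptions, not the paper's actual API.

```python
from dataclasses import dataclass

@dataclass
class Marker:
    """An object marker placed (physically or virtually) in the environment."""
    name: str
    x: float
    y: float

class Diagram:
    """A diagram: a connected set of markers with directed edges between them."""
    def __init__(self):
        self.markers = {}
        self.edges = []  # (from_name, to_name) pairs drawn by the user

    def add_marker(self, marker):
        self.markers[marker.name] = marker

    def connect(self, a, b):
        self.edges.append((a, b))

    def to_waypoints(self, start):
        """Walk the edges from `start`, returning (x, y) waypoints for the robot."""
        adjacency = dict(self.edges)
        path = [self.markers[start]]
        current = start
        while current in adjacency:
            current = adjacency[current]
            path.append(self.markers[current])
        return [(m.x, m.y) for m in path]

# Example: a user "draws" a route by connecting three markers in sequence.
d = Diagram()
d.add_marker(Marker("door", 0.0, 0.0))
d.add_marker(Marker("table", 2.0, 1.0))
d.add_marker(Marker("charger", 4.0, 0.5))
d.connect("door", "table")
d.connect("table", "charger")
print(d.to_waypoints("door"))  # [(0.0, 0.0), (2.0, 1.0), (4.0, 0.5)]
```

In the paper's framework the markers would come from the environment (physical tags or marker-less detections) rather than being declared in code, but the same graph-walk idea turns a user-drawn diagram into an ordered navigation task.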