A socially assistive robot that can interpret affective body language during one-on-one human-robot interactions

D. McColl, G. Nejat
Journal: International Journal of Biomechatronics and Biomedical Robotics
DOI: 10.1504/IJBBR.2012.049594
Published: 2012-10-10
Citations: 0

Abstract

Socially assistive robots can engage in assistive human-robot interactions (HRI) by providing rehabilitation of cognitive, social, and physical abilities after a stroke, an accident, or a diagnosis of a social, developmental, or cognitive disorder. However, a number of research issues need to be addressed in order to design such robots. In this paper, we address one main challenge in the development of intelligent socially assistive robots: a robot's ability to identify human non-verbal communication during assistive interactions. Namely, we present a unique non-contact, automated, sensory-based approach for the identification and categorisation of human upper body language in order to determine how accessible a person is to a robot during natural real-time HRI. This classification will allow a robot to effectively determine its own reactive task-driven behaviour during assistive interactions. The types of interactions envisioned include providing reminders, health monitoring, and social and cognitive therapies. Preliminary experiments show the potential of integrating the proposed body language recognition and classification technique into a socially assistive robot participating in HRI scenarios.
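The abstract does not detail how upper-body language is mapped to accessibility levels or how those levels drive the robot's reactive behaviour. As a purely hypothetical sketch (not the authors' method), the pipeline it describes can be illustrated as a rule-based mapping from coarse pose features to an accessibility level, which then selects a behaviour; the feature names, thresholds, and behaviour labels below are illustrative assumptions only.

```python
# Hypothetical sketch: mapping coarse upper-body pose cues to an
# "accessibility" level, which then selects a reactive robot behaviour.
# Feature names and thresholds are illustrative, not from the paper.

def classify_accessibility(trunk_lean_deg, arms_open):
    """Return a coarse accessibility level from upper-body cues.

    trunk_lean_deg: trunk lean relative to the robot, in degrees
                    (positive = leaning toward, negative = leaning away).
    arms_open: True if the arms are open/uncrossed, False if crossed.
    """
    if trunk_lean_deg > 5 and arms_open:
        return "high"    # leaning in with an open posture
    if trunk_lean_deg < -5 and not arms_open:
        return "low"     # leaning away with a closed posture
    return "medium"      # mixed or neutral cues


def select_behaviour(level):
    """Pick a reactive task-driven behaviour for each accessibility level."""
    return {
        "high": "engage: continue the assistive activity",
        "medium": "prompt: offer encouragement and re-engage",
        "low": "withdraw: pause and give the person space",
    }[level]


level = classify_accessibility(trunk_lean_deg=10.0, arms_open=True)
print(level, "->", select_behaviour(level))  # high -> engage: ...
```

In a real system the pose features would come from a non-contact sensor (e.g. a depth camera) and the classifier would be learned rather than hand-written; the sketch only shows the structure of the classify-then-react loop described in the abstract.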