Where are we?: evaluating the current rendering fidelity of mobile audio augmented reality systems

Florian Heller, Jayan Jevanesan, P. Dietrich, Jan O. Borchers
{"title":"Where are we?: evaluating the current rendering fidelity of mobile audio augmented reality systems","authors":"Florian Heller, Jayan Jevanesan, P. Dietrich, Jan O. Borchers","doi":"10.1145/2935334.2935365","DOIUrl":null,"url":null,"abstract":"Mobile audio augmented reality systems (MAARS) simulate virtual audio sources in a physical space via headphones. While 20 years ago, these required expensive sensing and rendering equipment, the necessary technology has become widely available. Smartphones have become capable to run high-fidelity spatial audio rendering algorithms, and modern sensors can provide rich data to the rendering process. Combined, these constitute an inexpensive, powerful platform for audio augmented reality. We evaluated the practical limitations of currently available off-the-shelf hardware using a voice sample in a lab experiment. State of the art motion sensors provide multiple degrees of freedom, including pitch and roll angles instead of yaw only. Since our rendering algorithm is also capable of including this richer sensor data in terms of source elevation, we also measured its impact on sound localization. Results show that mobile audio augmented reality systems achieve the same horizontal resolution as stationary systems. We found that including pitch and roll angles did not significantly improve the users' localization performance.","PeriodicalId":420843,"journal":{"name":"Proceedings of the 18th International Conference on Human-Computer Interaction with Mobile Devices and Services","volume":"37 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2016-09-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"16","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 18th International Conference on Human-Computer Interaction with Mobile Devices and Services","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/2935334.2935365","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 16

Abstract

Mobile audio augmented reality systems (MAARS) simulate virtual audio sources in a physical space via headphones. While such systems required expensive sensing and rendering equipment 20 years ago, the necessary technology has since become widely available. Smartphones are now capable of running high-fidelity spatial audio rendering algorithms, and modern sensors can provide rich data to the rendering process. Combined, these constitute an inexpensive, powerful platform for audio augmented reality. We evaluated the practical limitations of currently available off-the-shelf hardware using a voice sample in a lab experiment. State-of-the-art motion sensors provide multiple degrees of freedom, reporting pitch and roll angles in addition to yaw. Since our rendering algorithm can incorporate this richer sensor data as source elevation, we also measured its impact on sound localization. Results show that mobile audio augmented reality systems achieve the same horizontal resolution as stationary systems. We found that including pitch and roll angles did not significantly improve the users' localization performance.
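The abstract describes feeding yaw, pitch, and roll from the phone's motion sensors into a spatial rendering algorithm that can also place sources at an elevation. As a minimal sketch (not the authors' implementation), the snippet below shows how a listener's head orientation and a virtual source position could be converted into the head-relative azimuth and elevation that a binaural renderer would consume; the coordinate convention, rotation order, and function names are assumptions chosen for illustration.

```python
# Sketch only: map head orientation + source position to azimuth/elevation.
# Assumed convention: x forward, y left, z up; angles in radians unless noted.
import math

def rotation_matrix(yaw, pitch, roll):
    """Rotation matrix for intrinsic yaw (Z), pitch (Y), roll (X)."""
    cy, sy = math.cos(yaw), math.sin(yaw)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cr, sr = math.cos(roll), math.sin(roll)
    # R = Rz(yaw) @ Ry(pitch) @ Rx(roll)
    return [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp,     cp * sr,                cp * cr],
    ]

def source_direction(listener_pos, yaw, pitch, roll, source_pos):
    """Return (azimuth, elevation) of the source in the head frame, in degrees."""
    # Vector from listener to source in world coordinates.
    d = [source_pos[i] - listener_pos[i] for i in range(3)]
    # Rotate into the head frame: multiply by the transpose (inverse) of R.
    R = rotation_matrix(yaw, pitch, roll)
    local = [sum(R[j][i] * d[j] for j in range(3)) for i in range(3)]
    x, y, z = local
    azimuth = math.degrees(math.atan2(y, x))                    # positive = to the left
    elevation = math.degrees(math.atan2(z, math.hypot(x, y)))   # positive = above
    return azimuth, elevation

# Example: source 2 m ahead and 1 m above a listener facing straight ahead.
print(source_direction((0, 0, 0), 0.0, 0.0, 0.0, (2.0, 0.0, 1.0)))
```

With the identity orientation and a source 2 m ahead and 1 m above, the sketch yields an azimuth of 0° and an elevation of roughly 27°; as the listener turns or tilts their head, both angles change, which is the kind of sensor-driven update whose effect on localization the paper evaluates.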