FAAST: The Flexible Action and Articulated Skeleton Toolkit

Evan A. Suma, B. Lange, A. Rizzo, D. Krum, M. Bolas
{"title":"FAAST: The Flexible Action and Articulated Skeleton Toolkit","authors":"Evan A. Suma, B. Lange, A. Rizzo, D. Krum, M. Bolas","doi":"10.1109/VR.2011.5759491","DOIUrl":null,"url":null,"abstract":"The Flexible Action and Articulated Skeleton Toolkit (FAAST) is middleware to facilitate integration of full-body control with virtual reality applications and video games using OpenNI-compliant depth sensors (currently the PrimeSensor and the Microsoft Kinect). FAAST incorporates a VRPN server for streaming the user's skeleton joints over a network, which provides a convenient interface for custom virtual reality applications and games. This body pose information can be used for goals such as realistically puppeting a virtual avatar or controlling an on-screen mouse cursor. Additionally, the toolkit also provides a configurable input emulator that detects human actions and binds them to virtual mouse and keyboard commands, which are sent to the actively selected window. Thus, FAAST can enable natural interaction for existing off-the-shelf video games that were not explicitly developed to support input from motion sensors. The actions and input bindings are configurable at run-time, allowing the user to customize the controls and sensitivity to adjust for individual body types and preferences. In the future, we plan to substantially expand FAAST's action lexicon, provide support for recording and training custom gestures, and incorporate real-time head tracking using computer vision techniques.","PeriodicalId":346701,"journal":{"name":"2011 IEEE Virtual Reality Conference","volume":"13 2","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2011-03-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"238","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2011 IEEE Virtual Reality Conference","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/VR.2011.5759491","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 238

Abstract

The Flexible Action and Articulated Skeleton Toolkit (FAAST) is middleware to facilitate integration of full-body control with virtual reality applications and video games using OpenNI-compliant depth sensors (currently the PrimeSensor and the Microsoft Kinect). FAAST incorporates a VRPN server for streaming the user's skeleton joints over a network, which provides a convenient interface for custom virtual reality applications and games. This body pose information can be used for goals such as realistically puppeting a virtual avatar or controlling an on-screen mouse cursor. Additionally, the toolkit also provides a configurable input emulator that detects human actions and binds them to virtual mouse and keyboard commands, which are sent to the actively selected window. Thus, FAAST can enable natural interaction for existing off-the-shelf video games that were not explicitly developed to support input from motion sensors. The actions and input bindings are configurable at run-time, allowing the user to customize the controls and sensitivity to adjust for individual body types and preferences. In the future, we plan to substantially expand FAAST's action lexicon, provide support for recording and training custom gestures, and incorporate real-time head tracking using computer vision techniques.
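To illustrate how a custom application might consume FAAST's skeleton stream, the sketch below uses the standard VRPN C++ client API (vrpn_Tracker_Remote) to subscribe to tracker reports, treating each sensor index as one skeleton joint. This is a minimal sketch, not FAAST's own code: the device name "Tracker0@localhost" and the sensor-to-joint mapping are assumptions for illustration and depend on how the FAAST server is configured.

#include <vrpn_Tracker.h>
#include <vrpn_Shared.h>
#include <cstdio>

// Called for every pose report; each sensor index is assumed to
// correspond to one skeleton joint streamed by the FAAST server.
void VRPN_CALLBACK handle_joint(void* /*userData*/, const vrpn_TRACKERCB t)
{
    std::printf("joint %d  position = (%.3f, %.3f, %.3f)\n",
                (int)t.sensor, t.pos[0], t.pos[1], t.pos[2]);
}

int main()
{
    // "Tracker0@localhost" is an illustrative device name; the actual
    // tracker name, host, and port depend on the FAAST server settings.
    vrpn_Tracker_Remote tracker("Tracker0@localhost");
    tracker.register_change_handler(nullptr, handle_joint);

    for (;;) {
        tracker.mainloop();   // dispatch incoming skeleton reports
        vrpn_SleepMsecs(1);   // yield briefly to avoid busy-waiting
    }
    return 0;
}

In a real client, the callback would typically update an avatar's joint transforms or feed a cursor-control mapping rather than print to the console.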