Echo: Analyzing Gameplay Sessions by Reconstructing Them From Recorded Data

Daniel MacCormick, Loutfouz Zaman
DOI: 10.1145/3410404.3414254
Venue: Proceedings of the Annual Symposium on Computer-Human Interaction in Play (ACM SIGCHI CHI PLAY)
Publication date: 2020-11-02
Citations: 3

Abstract

Games user research (GUR) is centered on ensuring games deliver the experience that their designers intended. GUR researchers frequently make use of playtesting to evaluate games. This often requires watching back hours of video footage after the session to ensure that they did not miss anything important. Analytics have been used to help improve this process, providing visualizations of the underlying gameplay data. Yet, many of these game analytics tools provide static visualizations which do not accurately capture the dynamic aspects of modern video games. To address this problem, we have created Echo, a tool that uses gameplay data to reconstruct the original session with in-game assets, instead of abstracting them away. Echo has been designed to help bridge the gap between static gameplay data representation and video footage, with the goal of providing the best of both. A user study revealed that participants found Echo less frustrating to use compared to videos for gameplay analysis and also ranked it higher for efficiency, among others. It revealed that participants felt less cognitive load when using Echo as well. Qualitative results were also promising as participants employed several distinct workflows while using Echo. We received numerous suggestions for building upon the current state of the tool, including support for multiple viewports, live annotations, and visible gameplay metrics.
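The abstract describes reconstructing a play session from logged gameplay data rather than scrubbing through video. A minimal sketch of that general record-and-replay idea (not the authors' actual implementation; all names here are hypothetical) is to log timestamped entity states during play and then query the scene at any point on a timeline, interpolating between samples:

```python
# Hypothetical sketch of record-and-reconstruct: log timestamped entity
# states during play, then rebuild an entity's state at any session time.
from bisect import bisect_right
from dataclasses import dataclass

@dataclass
class Sample:
    t: float  # session time in seconds
    x: float  # entity position (1D here for brevity)

class ReplayTrack:
    """Recorded samples for one entity, queryable at any time."""

    def __init__(self) -> None:
        self.samples: list[Sample] = []

    def record(self, t: float, x: float) -> None:
        # Assumes samples arrive in increasing time order.
        self.samples.append(Sample(t, x))

    def state_at(self, t: float) -> float:
        """Linearly interpolate the entity's position at time t."""
        times = [s.t for s in self.samples]
        i = bisect_right(times, t)
        if i == 0:
            return self.samples[0].x       # before first sample
        if i == len(self.samples):
            return self.samples[-1].x      # after last sample
        a, b = self.samples[i - 1], self.samples[i]
        f = (t - a.t) / (b.t - a.t)
        return a.x + f * (b.x - a.x)

track = ReplayTrack()
track.record(0.0, 0.0)
track.record(1.0, 10.0)
track.record(2.0, 10.0)
print(track.state_at(0.5))  # 5.0: halfway between the first two samples
```

Scrubbing a timeline in a tool like the one described then reduces to calling `state_at` for every tracked entity at the chosen time, rather than seeking through video footage.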