Searching Bug Instances in Gameplay Video Repositories

IF 1.7 · JCR Q3 (Computer Science, Artificial Intelligence) · CAS Region 4 (Computer Science) · IEEE Transactions on Games · Pub Date: 2024-01-17 · DOI: 10.1109/TG.2024.3355285
Mohammad Reza Taesiri;Finlay Macklon;Sarra Habchi;Cor-Paul Bezemer
IEEE Transactions on Games, vol. 16, no. 3, pp. 697–710. Listed at: https://ieeexplore.ieee.org/document/10402100/
Citations: 0

Abstract

Gameplay videos offer valuable insights into player interactions and game responses, particularly data about game bugs. Despite the abundance of gameplay videos online, extracting useful information remains a challenge. This article introduces a method for searching and extracting relevant videos from extensive video repositories using English text queries. Our approach requires no external information, like video metadata; it solely depends on video content. Leveraging the zero-shot transfer capabilities of the Contrastive Language–Image Pretraining (CLIP) model, our approach does not require any data labeling or training. To evaluate our approach, we present the GamePhysics dataset, comprising 26 954 videos from 1873 games that were collected from the GamePhysics section on the Reddit website. Our approach shows promising results in our extensive analysis of simple and compound queries, indicating that our method is useful for detecting objects and events in gameplay videos. Moreover, we assess the effectiveness of our method by analyzing a carefully annotated dataset of 220 gameplay videos. The results of our study demonstrate the potential of our approach for applications, such as the creation of a video search tool tailored to identifying video game bugs, which could greatly benefit quality assurance teams in finding and reproducing bugs.
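The retrieval idea described in the abstract — embedding video frames and an English text query into a shared space, then ranking videos by frame–query similarity — can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes frame embeddings have already been produced by a CLIP image encoder and the query by the matching text encoder; the toy 3-d vectors below are stand-ins for real embeddings.

```python
import numpy as np

def rank_videos_by_query(query_emb, video_frame_embs):
    """Rank videos by the best cosine similarity between the text-query
    embedding and any single frame embedding of each video. Zero-shot in
    the sense of the abstract: no labels or training, only precomputed
    embeddings from a pretrained encoder."""
    q = query_emb / np.linalg.norm(query_emb)
    scores = {}
    for video_id, frames in video_frame_embs.items():
        # Normalize each frame embedding, then score the video by its
        # best-matching frame (a bug event may appear in only a few frames).
        f = frames / np.linalg.norm(frames, axis=1, keepdims=True)
        scores[video_id] = float((f @ q).max())
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Toy usage with made-up 3-d embeddings (real CLIP embeddings are 512-d+).
query = np.array([1.0, 0.0, 0.0])            # e.g. encoded "a car flying in the air"
videos = {
    "clip_a": np.array([[0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]),
    "clip_b": np.array([[0.9, 0.1, 0.0]]),   # contains a near-matching frame
}
ranked = rank_videos_by_query(query, videos)
```

Scoring by the single best frame, rather than an average over all frames, is one plausible design choice here: a physics glitch is typically visible only briefly, so averaging would dilute the signal.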
Source journal: IEEE Transactions on Games (Engineering: Electrical and Electronic Engineering)
CiteScore: 4.60 · Self-citation rate: 8.70% · Annual articles: 87