Mining user activity as a context source for search and retrieval

Zhengwei Qiu, A. Doherty, C. Gurrin, A. Smeaton
{"title":"挖掘用户活动作为搜索和检索的上下文源","authors":"Zhengwei Qiu, A. Doherty, C. Gurrin, A. Smeaton","doi":"10.1109/STAIR.2011.5995782","DOIUrl":null,"url":null,"abstract":"Nowadays in information retrieval it is generally accepted that if we can better understand the context of searchers then this could help the search process, either at indexing time by including more metadata or at retrieval time by better modelling the user needs. In this work we explore how activity recognition from tri-axial accelerometers can be employed to model a user's activity as a means of enabling context-aware information retrieval. In this paper we discuss how we can gather user activity automatically as a context source from a wearable mobile device and we evaluate the accuracy of our proposed user activity recognition algorithm. Our technique can recognise four kinds of activities which can be used to model part of an individual's current context. We discuss promising experimental results, possible approaches to improve our algorithms, and the impact of this work in modelling user context toward enhanced search and retrieval.","PeriodicalId":376671,"journal":{"name":"2011 International Conference on Semantic Technology and Information Retrieval","volume":"12 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2011-06-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"13","resultStr":"{\"title\":\"Mining user activity as a context source for search and retrieval\",\"authors\":\"Zhengwei Qiu, A. Doherty, C. Gurrin, A. Smeaton\",\"doi\":\"10.1109/STAIR.2011.5995782\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Nowadays in information retrieval it is generally accepted that if we can better understand the context of searchers then this could help the search process, either at indexing time by including more metadata or at retrieval time by better modelling the user needs. In this work we explore how activity recognition from tri-axial accelerometers can be employed to model a user's activity as a means of enabling context-aware information retrieval. In this paper we discuss how we can gather user activity automatically as a context source from a wearable mobile device and we evaluate the accuracy of our proposed user activity recognition algorithm. Our technique can recognise four kinds of activities which can be used to model part of an individual's current context. 
We discuss promising experimental results, possible approaches to improve our algorithms, and the impact of this work in modelling user context toward enhanced search and retrieval.\",\"PeriodicalId\":376671,\"journal\":{\"name\":\"2011 International Conference on Semantic Technology and Information Retrieval\",\"volume\":\"12 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2011-06-28\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"13\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2011 International Conference on Semantic Technology and Information Retrieval\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/STAIR.2011.5995782\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2011 International Conference on Semantic Technology and Information Retrieval","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/STAIR.2011.5995782","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 13

Abstract

Nowadays in information retrieval it is generally accepted that if we can better understand the context of searchers then this could help the search process, either at indexing time by including more metadata or at retrieval time by better modelling the user needs. In this work we explore how activity recognition from tri-axial accelerometers can be employed to model a user's activity as a means of enabling context-aware information retrieval. In this paper we discuss how we can gather user activity automatically as a context source from a wearable mobile device and we evaluate the accuracy of our proposed user activity recognition algorithm. Our technique can recognise four kinds of activities which can be used to model part of an individual's current context. We discuss promising experimental results, possible approaches to improve our algorithms, and the impact of this work in modelling user context toward enhanced search and retrieval.
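The abstract states that four kinds of activities are recognised from tri-axial accelerometer data, but it does not describe the features or the classifier used. The sketch below is therefore purely illustrative: it assumes fixed-length sliding windows, simple per-axis statistics, and a k-nearest-neighbour classifier as one plausible realisation. The window size, the four placeholder labels, and all function names are invented for this example and are not taken from the paper.

```python
# Illustrative sketch only, not the authors' method: windowed statistics over
# tri-axial accelerometer samples classified with k-nearest neighbours.
import math
from collections import Counter

WINDOW = 128  # samples per window (assumed, not from the paper)
LABELS = ["activity_1", "activity_2", "activity_3", "activity_4"]  # placeholders


def features(window):
    """Compute simple per-axis statistics for one window of (x, y, z) samples."""
    feats = []
    for axis in range(3):
        vals = [s[axis] for s in window]
        mean = sum(vals) / len(vals)
        var = sum((v - mean) ** 2 for v in vals) / len(vals)
        feats.extend([mean, math.sqrt(var)])  # per-axis mean and std. deviation
    # Signal magnitude area, a common accelerometer feature.
    sma = sum(abs(x) + abs(y) + abs(z) for x, y, z in window) / len(window)
    feats.append(sma)
    return feats


def knn_predict(train, query, k=5):
    """Label a feature vector by majority vote among its k nearest neighbours.

    `train` is a list of (feature_vector, label) pairs from labelled windows.
    """
    dists = sorted(
        (sum((a - b) ** 2 for a, b in zip(f, query)), label) for f, label in train
    )
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]


def recognise(samples, train):
    """Slide a fixed window over raw (x, y, z) samples and label each window."""
    out = []
    for start in range(0, len(samples) - WINDOW + 1, WINDOW):
        out.append(knn_predict(train, features(samples[start : start + WINDOW])))
    return out
```

The resulting per-window label stream could then be attached to captured items as extra metadata at indexing time, or matched against the searcher's current activity at query time to filter or re-rank results, which is the kind of context-aware use of activity the abstract points toward.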