{"title":"Study of Relevance and Effort across Devices","authors":"Manisha Verma, Emine Yilmaz, Nick Craswell","doi":"10.1145/3176349.3176888","DOIUrl":null,"url":null,"abstract":"Relevance judgments are essential for designing information retrieval systems. Traditionally, judgments have been gathered via desktop interfaces. However, with the rise in popularity of smaller devices for information access, it has become imperative to investigate whether desktop based judgments are different from mobile judgments. Recently, user effort and document usefulness have also emerged as important dimensions to optimize and evaluate information retrieval systems. Since existing work is limited to desktops, it remains to be seen how these judgments are affected by user»s search device. In this paper, we address these shortcomings by collecting and analyzing relevance, usefulness and effort judgments on mobiles and desktops. Analysis of these judgments shows high agreement rate between desktop and mobile judges for relevance, followed by usefulness and findability. We also found that desktop judges are likely to spend more time and examine non-relevant/not-useful/difficult documents in greater depth compared to mobile judges. Based on our findings, we suggest that relevance judgments should be gathered via desktops and effort judgments should be collected on each device independently.","PeriodicalId":198379,"journal":{"name":"Proceedings of the 2018 Conference on Human Information Interaction & Retrieval","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2018-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 2018 Conference on Human Information Interaction & Retrieval","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3176349.3176888","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 2
Abstract
Relevance judgments are essential for designing information retrieval systems. Traditionally, judgments have been gathered via desktop interfaces. However, with the rise in popularity of smaller devices for information access, it has become imperative to investigate whether desktop-based judgments differ from mobile judgments. Recently, user effort and document usefulness have also emerged as important dimensions for optimizing and evaluating information retrieval systems. Since existing work is limited to desktops, it remains to be seen how these judgments are affected by the user's search device. In this paper, we address these shortcomings by collecting and analyzing relevance, usefulness and effort judgments on mobiles and desktops. Analysis of these judgments shows a high agreement rate between desktop and mobile judges for relevance, followed by usefulness and findability. We also found that desktop judges are likely to spend more time and examine non-relevant, not-useful or difficult documents in greater depth than mobile judges. Based on our findings, we suggest that relevance judgments be gathered via desktops, while effort judgments be collected on each device independently.
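The abstract reports agreement rates between desktop and mobile judges without naming the statistic used. As a minimal sketch only, assuming pairwise agreement is measured with a chance-corrected statistic such as Cohen's kappa (an assumption, not the paper's stated method), the computation could look like the following; the example labels are hypothetical placeholders, not data from the study.

```python
# Illustrative sketch: chance-corrected pairwise agreement (Cohen's kappa)
# between two sets of judgments. The metric choice and the labels below are
# assumptions for illustration; the paper does not specify either here.
from collections import Counter

def cohens_kappa(labels_a, labels_b):
    """Observed agreement corrected for chance agreement between two judges."""
    assert len(labels_a) == len(labels_b) and labels_a
    n = len(labels_a)
    # Proportion of items on which the two judges gave the same label.
    observed = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    # Expected agreement if both judges labeled independently at their own rates.
    counts_a, counts_b = Counter(labels_a), Counter(labels_b)
    expected = sum(counts_a[c] * counts_b[c] for c in counts_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical binary relevance grades (0 = non-relevant, 1 = relevant)
# for the same documents judged on desktop vs. mobile.
desktop = [1, 1, 0, 1, 0, 1, 1, 0]
mobile  = [1, 1, 0, 1, 1, 1, 0, 0]
print(f"kappa = {cohens_kappa(desktop, mobile):.2f}")
```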