{"title":"基于启发式评价和基于问题的评分的信息可视化评价","authors":"Marti A. Hearst, Paul Laskowski, Luis Silva","doi":"10.1145/2858036.2858280","DOIUrl":null,"url":null,"abstract":"In an instructional setting it can be difficult to accurately assess the quality of information visualizations of several variables. Instead of a standard design critique, an alternative is to ask potential readers of the chart to answer questions about it. A controlled study with 47 participants shows a good correlation between aggregated novice heuristic evaluation scores and results of answering questions about the data, suggesting that the two forms of assessment can be complementary. Using both metrics in parallel can yield further benefits; discrepancies between them may reveal incorrect application of heuristics or other issues.","PeriodicalId":169608,"journal":{"name":"Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems","volume":" 3","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2016-05-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"31","resultStr":"{\"title\":\"Evaluating Information Visualization via the Interplay of Heuristic Evaluation and Question-Based Scoring\",\"authors\":\"Marti A. Hearst, Paul Laskowski, Luis Silva\",\"doi\":\"10.1145/2858036.2858280\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"In an instructional setting it can be difficult to accurately assess the quality of information visualizations of several variables. Instead of a standard design critique, an alternative is to ask potential readers of the chart to answer questions about it. A controlled study with 47 participants shows a good correlation between aggregated novice heuristic evaluation scores and results of answering questions about the data, suggesting that the two forms of assessment can be complementary. 
Using both metrics in parallel can yield further benefits; discrepancies between them may reveal incorrect application of heuristics or other issues.\",\"PeriodicalId\":169608,\"journal\":{\"name\":\"Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems\",\"volume\":\" 3\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2016-05-07\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"31\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/2858036.2858280\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/2858036.2858280","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
In an instructional setting it can be difficult to accurately assess the quality of information visualizations of several variables. Instead of a standard design critique, an alternative is to ask potential readers of the chart to answer questions about it. A controlled study with 47 participants shows a good correlation between aggregated novice heuristic evaluation scores and results of answering questions about the data, suggesting that the two forms of assessment can be complementary. Using both metrics in parallel can yield further benefits; discrepancies between them may reveal incorrect application of heuristics or other issues.
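The comparison the abstract describes — aggregating novice heuristic-evaluation scores per chart and correlating them with readers' question-answering accuracy — can be sketched in a few lines. This is a minimal illustration, not the paper's actual analysis code; the visualization names, rating scale, and accuracy values below are hypothetical.

```python
from statistics import mean

# Hypothetical data: for each visualization, heuristic-evaluation ratings
# from several novice evaluators, and the fraction of comprehension
# questions about the chart's data that readers answered correctly.
heuristic_ratings = {
    "viz_a": [7, 8, 6, 9],
    "viz_b": [4, 5, 3, 4],
    "viz_c": [8, 9, 9, 7],
}
question_accuracy = {"viz_a": 0.82, "viz_b": 0.55, "viz_c": 0.90}

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

vizzes = sorted(heuristic_ratings)
agg_scores = [mean(heuristic_ratings[v]) for v in vizzes]  # aggregate per chart
accuracies = [question_accuracy[v] for v in vizzes]
r = pearson(agg_scores, accuracies)
print(f"correlation between aggregated heuristic scores and accuracy: r = {r:.2f}")

# In the spirit of the paper's suggestion, a chart where the two metrics
# strongly disagree is a candidate for closer inspection: the discrepancy
# may indicate incorrect application of heuristics or other issues.
```

The key point is that each metric is computed independently per visualization, so the per-chart residuals (not just the overall correlation) carry diagnostic information.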