PREVis: Perceived Readability Evaluation for Visualizations

Anne-Flore Cabouat, Tingying He, Petra Isenberg, Tobias Isenberg
{"title":"PREVis: Perceived Readability Evaluation for Visualizations","authors":"Anne-Flore Cabouat;Tingying He;Petra Isenberg;Tobias Isenberg","doi":"10.1109/TVCG.2024.3456318","DOIUrl":null,"url":null,"abstract":"We developed and validated an instrument to measure the perceived readability in data visualization: PREVis. Researchers and practitioners can easily use this instrument as part of their evaluations to compare the perceived readability of different visual data representations. Our instrument can complement results from controlled experiments on user task performance or provide additional data during in-depth qualitative work such as design iterations when developing a new technique. Although readability is recognized as an essential quality of data visualizations, so far there has not been a unified definition of the construct in the context of visual representations. As a result, researchers often lack guidance for determining how to ask people to rate their perceived readability of a visualization. To address this issue, we engaged in a rigorous process to develop the first validated instrument targeted at the subjective readability of visual data representations. Our final instrument consists of 11 items across 4 dimensions: understandability, layout clarity, readability of data values, and readability of data patterns. We provide the questionnaire as a document with implementation guidelines on osf.io/9cg8j. Beyond this instrument, we contribute a discussion of how researchers have previously assessed visualization readability, and an analysis of the factors underlying perceived readability in visual data representations.","PeriodicalId":94035,"journal":{"name":"IEEE transactions on visualization and computer graphics","volume":"31 1","pages":"1083-1093"},"PeriodicalIF":0.0000,"publicationDate":"2024-09-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE transactions on visualization and computer graphics","FirstCategoryId":"1085","ListUrlMain":"https://ieeexplore.ieee.org/document/10681245/","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

We developed and validated an instrument to measure the perceived readability in data visualization: PREVis. Researchers and practitioners can easily use this instrument as part of their evaluations to compare the perceived readability of different visual data representations. Our instrument can complement results from controlled experiments on user task performance or provide additional data during in-depth qualitative work such as design iterations when developing a new technique. Although readability is recognized as an essential quality of data visualizations, so far there has not been a unified definition of the construct in the context of visual representations. As a result, researchers often lack guidance for determining how to ask people to rate their perceived readability of a visualization. To address this issue, we engaged in a rigorous process to develop the first validated instrument targeted at the subjective readability of visual data representations. Our final instrument consists of 11 items across 4 dimensions: understandability, layout clarity, readability of data values, and readability of data patterns. We provide the questionnaire as a document with implementation guidelines on osf.io/9cg8j. Beyond this instrument, we contribute a discussion of how researchers have previously assessed visualization readability, and an analysis of the factors underlying perceived readability in visual data representations.
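To illustrate how ratings from such an instrument could be aggregated in practice, the sketch below averages one participant's per-item responses into the four dimension scores named in the abstract. The item identifiers, the 3/3/3/2 grouping, and the 1–7 agreement scale are assumptions for illustration only; the abstract states only that the instrument has 11 items across 4 dimensions, and the validated item wording and scoring guidance are in the questionnaire document at osf.io/9cg8j.

```python
import statistics

# Hypothetical item identifiers and grouping: the abstract only states that
# PREVis has 11 items across these 4 dimensions. The validated wording and
# grouping are in the authors' questionnaire document at osf.io/9cg8j.
DIMENSIONS = {
    "understandability": ["easy_to_understand", "obvious_meaning", "clear_representation"],
    "layout clarity": ["organized_layout", "uncluttered", "clear_overview"],
    "readability of data values": ["find_values", "identify_values", "retrieve_information"],
    "readability of data patterns": ["patterns_visible", "trends_easy_to_see"],
}

def previs_dimension_scores(ratings: dict[str, float]) -> dict[str, float]:
    """Average one participant's item ratings within each dimension.

    `ratings` maps item identifiers to responses on an assumed 1-7
    agreement scale (1 = strongly disagree, 7 = strongly agree).
    """
    return {
        dimension: statistics.mean(ratings[item] for item in items)
        for dimension, items in DIMENSIONS.items()
    }

# Example: one participant's made-up ratings for a single visualization.
participant = {
    "easy_to_understand": 6, "obvious_meaning": 7, "clear_representation": 6,
    "organized_layout": 5, "uncluttered": 4, "clear_overview": 5,
    "find_values": 7, "identify_values": 6, "retrieve_information": 6,
    "patterns_visible": 5, "trends_easy_to_see": 6,
}
print(previs_dimension_scores(participant))
# e.g. {'understandability': 6.33..., 'layout clarity': 4.66..., ...}
```

Keeping one score per dimension, rather than a single grand mean over all 11 items, preserves the four-dimensional structure the abstract describes and lets different visual representations be compared dimension by dimension.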