Evaluation of Depth of Field for depth perception in DVR

A. Grosset, Mathias Schott, Georges-Pierre Bonneau, C. Hansen
DOI: 10.1109/PacificVis.2013.6596131
Published in: 2013 IEEE Pacific Visualization Symposium (PacificVis)
Publication date: 2013-09-12
Citations: 25

Abstract

In this paper we present a user study on the use of Depth of Field for depth perception in Direct Volume Rendering. Direct Volume Rendering with Phong shading and perspective projection is used as the baseline. Depth of Field is then added to see its impact on the correct perception of ordinal depth. Accuracy and response time are used as the metrics to evaluate the usefulness of Depth of Field. The on-site user study has two parts: static and dynamic. Eye tracking is used to monitor the gaze of the subjects. From our results we see that, though Depth of Field does not act as a proper depth cue in all conditions, it can be used to reinforce the perception of which feature is in front of the other. The best results (high accuracy and fast response time) for correct perception of ordinal depth occur when the front feature (of the two features users were asked to choose between) is in focus and perspective projection is used.
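The abstract does not describe how the Depth of Field effect is computed. As background only, a minimal sketch of the thin-lens model commonly used for depth-of-field rendering: the blur applied to a sample is driven by its circle-of-confusion diameter, which is zero at the focus distance and grows as the sample moves away from it. All names and parameter values below are our own illustration, not taken from the paper.

```python
def circle_of_confusion(z, focus_dist, focal_len, aperture):
    """Diameter of the blur disc for a point at depth z (thin-lens model).

    z          -- depth of the sample along the view ray (world units)
    focus_dist -- depth of the focal plane (the "in focus" feature)
    focal_len  -- focal length of the virtual lens
    aperture   -- aperture (lens) diameter; larger means stronger blur
    """
    # Standard thin-lens circle of confusion:
    #   c = aperture * focal_len * |z - focus_dist| / (z * (focus_dist - focal_len))
    return aperture * focal_len * abs(z - focus_dist) / (z * (focus_dist - focal_len))


# A point on the focal plane is perfectly sharp ...
assert circle_of_confusion(2.0, 2.0, 0.05, 0.01) == 0.0
# ... while points in front of or behind it receive a nonzero blur radius.
print(circle_of_confusion(1.0, 2.0, 0.05, 0.01) > 0.0)  # True
print(circle_of_confusion(4.0, 2.0, 0.05, 0.01) > 0.0)  # True
```

In a volume renderer this diameter would typically set the radius of a blur kernel applied per depth slice, so that focusing on the front feature, as in the study's best-performing condition, leaves it sharp while blurring features behind it.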