{"title":"Intuitive volume rendering on mobile devices","authors":"Yuchen Xin, H. Wong","doi":"10.1109/CISP-BMEI.2016.7852799","DOIUrl":null,"url":null,"abstract":"Nowadays, mobile devices, virtual reality and augment reality technologies are developing faster and faster. With a variety of equipments, people are no longer only using PC to handle tasks. Some traditional system frameworks are migrating to these new technology areas, and direct volume rendering is one of them. In this paper, we propose a real-time and intuitive volume data exploration framework on mobile devices. Our framework introduces a direct-touch transfer function design method that is able for the user to pick voxels directly on a volume rendered image. With one-pass shader, user interaction and volume rendering can be handled efficiently in real-time. The user only needs to learn a few related knowledge to explore a volume data and get its rendered image. As a result, the time cost of transfer function design for direct volume rendering is significantly reduced. Our framework is implemented with OpenGL ES 3.0 and GLSL shader. Experimental results show the advantages of our framework. Researchers, and even general users, can easily obtain volume rendered images of volume data.","PeriodicalId":275095,"journal":{"name":"2016 9th International Congress on Image and Signal Processing, BioMedical Engineering and Informatics (CISP-BMEI)","volume":"14 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2016-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2016 9th International Congress on Image and Signal Processing, BioMedical Engineering and Informatics (CISP-BMEI)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/CISP-BMEI.2016.7852799","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 2
Abstract
Mobile devices, virtual reality, and augmented reality technologies are developing rapidly. With this variety of equipment, people no longer rely solely on PCs to handle their tasks. Traditional system frameworks are migrating to these new platforms, and direct volume rendering is one of them. In this paper, we propose a real-time, intuitive volume data exploration framework for mobile devices. The framework introduces a direct-touch transfer function design method that lets the user pick voxels directly on the volume-rendered image. With a one-pass shader, user interaction and volume rendering are handled efficiently in real time. The user needs only a little background knowledge to explore a volume dataset and obtain its rendered image, so the time cost of transfer function design for direct volume rendering is significantly reduced. The framework is implemented with OpenGL ES 3.0 and GLSL shaders. Experimental results show the advantages of the framework: researchers, and even general users, can easily obtain volume-rendered images of volume data.
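The abstract describes single-pass GPU ray casting driven by a transfer function, implemented with OpenGL ES 3.0 and GLSL. Below is a minimal fragment-shader sketch of that general technique, not the paper's actual shader; the uniform and varying names (uVolume, uTransferFunc, uCameraPos, uStepCount, vEntryPos) and the early-termination threshold are assumptions made for illustration.

```glsl
#version 300 es
// Sketch of one-pass volume ray casting in GLSL ES 3.00 (assumed names, not the paper's code).
precision highp float;
precision highp sampler3D;

uniform sampler3D uVolume;        // scalar volume stored in the red channel
uniform sampler2D uTransferFunc;  // 1D transfer function packed as a 1xN 2D texture
uniform vec3 uCameraPos;          // camera position in volume texture space [0,1]^3
uniform int uStepCount;           // number of samples along each ray

in vec3 vEntryPos;                // ray entry point on the bounding box, in [0,1]^3
out vec4 fragColor;

void main() {
    vec3 rayDir = normalize(vEntryPos - uCameraPos);
    float stepSize = 1.732 / float(uStepCount);   // unit-cube diagonal / sample count
    vec3 pos = vEntryPos;
    vec4 accum = vec4(0.0);                       // front-to-back accumulated color/opacity

    for (int i = 0; i < uStepCount; ++i) {
        float density = texture(uVolume, pos).r;
        // Transfer function maps density to color and opacity.
        vec4 s = texture(uTransferFunc, vec2(density, 0.5));
        // Front-to-back compositing.
        accum.rgb += (1.0 - accum.a) * s.a * s.rgb;
        accum.a   += (1.0 - accum.a) * s.a;

        pos += rayDir * stepSize;
        // Stop when the pixel is nearly opaque or the ray leaves the volume.
        if (accum.a >= 0.95 ||
            any(lessThan(pos, vec3(0.0))) || any(greaterThan(pos, vec3(1.0))))
            break;
    }
    fragColor = accum;
}
```

Because the whole traversal runs in one fragment-shader pass, a touch position can be mapped to the first voxel whose opacity contribution exceeds a threshold along the corresponding ray, which is the kind of direct voxel picking the abstract refers to.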