An efficient hybrid ray tracing and rasterizer architecture for mobile GPU
Won-Jong Lee, S. Hwang, Youngsam Shin, Jeong-Joon Yoo, Soojung Ryu
We present a bandwidth- and energy-efficient hybrid ray tracing and rasterization architecture for tile-based mobile GPUs. To successfully commercialize a mobile System on Chip (SoC) that includes a ray tracing hardware solution, effective integration with an OpenGL ES-based rasterizer is indispensable for performance and compatibility reasons. The traditional rasterizer-ray tracing hybrid approach has therefore recently been revisited to achieve this goal. The key to hybrid rendering is to reflect the fundamental principle of tile-based rendering when integrating ray tracing hardware into the mobile GPU. Consequently, we propose a new architecture for hybrid rendering that combines three new features: an extended tile binning unit, tile prefetch, and per-tile power control. Simulation results show that our architecture is a potentially versatile solution for future mobile GPUs in low-energy devices: it provides up to 31.7% better G-buffer bandwidth utilization and up to 2.18 times better performance per unit energy than a ray-tracing-hardware-only solution.
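The abstract names three tile-level mechanisms but not how they interact. As a purely illustrative assumption about how such a pipeline could be organized, and not the paper's hardware design, the Python sketch below shows an extended binning pass that flags tiles containing ray-traced effects, a per-tile loop that prefetches the next tile's data, and power gating of the ray tracing unit for tiles that do not need it; all class, function, and parameter names are hypothetical.

```python
# Hypothetical sketch of a tile-based hybrid scheduler; names and structure
# are illustrative assumptions, not the paper's actual hardware design.
from dataclasses import dataclass, field

TILE_SIZE = 16  # pixels per tile edge (assumed)

@dataclass
class TileBin:
    primitives: list = field(default_factory=list)   # geometry binned to this tile
    needs_ray_tracing: bool = False                   # extended-binning flag

def bin_scene(draws, num_tiles_x, num_tiles_y):
    """Extended tile binning: besides geometry, record per tile whether any
    draw call requests a ray-traced effect (reflections, shadows, ...)."""
    bins = [[TileBin() for _ in range(num_tiles_x)] for _ in range(num_tiles_y)]
    for draw in draws:
        for ty, tx in draw["covered_tiles"]:
            b = bins[ty][tx]
            b.primitives.append(draw["id"])
            if draw.get("ray_traced_effect", False):
                b.needs_ray_tracing = True
    return bins

def render_frame(bins, prefetch, rasterize_tile, trace_tile, set_rt_power):
    """Per-tile loop: prefetch the next tile's data while the current tile is
    processed, and power-gate the ray tracing unit for tiles that skip it."""
    order = [(ty, tx, b) for ty, row in enumerate(bins) for tx, b in enumerate(row)]
    for i, (ty, tx, b) in enumerate(order):
        if i + 1 < len(order):
            prefetch(order[i + 1][2].primitives)      # tile prefetch (hide DRAM latency)
        set_rt_power(on=b.needs_ray_tracing)          # per-tile power control
        gbuffer = rasterize_tile(ty, tx, b.primitives)
        if b.needs_ray_tracing:
            trace_tile(ty, tx, gbuffer)               # hybrid pass reads the on-chip G-buffer
```

Keeping each tile's G-buffer in on-chip memory instead of writing a full-frame G-buffer to DRAM is presumably where the reported G-buffer bandwidth savings come from.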
DOI: https://doi.org/10.1145/2818427.2818442
Twech: a mobile platform to search and share visuo-tactile experiences
Nobuhisa Hanamitsu, Haruki Nakamura, M. Nakatani, K. Minamizawa
When using social networks, users often upload digital media to capture their experiences, including videos or photos of meals, landscapes, gatherings of friends, and so on. These events are captured with a camera, recorded with a microphone, or archived with a video camera, and such media provide visual, audible, or combined audio-visual experiences. Until now, however, it has not been possible to share the corresponding haptic experiences. If haptic experiences could be shared, the sensory feedback would be compelling and easy to understand, enabling a more complete first-person experience.
DOI: https://doi.org/10.1145/2818427.2818461
Augmented reality using high fidelity spherical panorama with HDRI
Zi Siang See, M. Billinghurst, A. Cheok
This paper presents an experimental method and apparatus for producing spherical panoramas with high dynamic range imaging (HDRI). Our method is optimized for providing high-fidelity, image-based environment recognition for augmented reality (AR) on mobile devices. Previous studies have shown that a pre-produced panorama image can be used to enable tracking for mobile AR applications. However, there has been little research on the qualities a source panorama image needs in order to create high-fidelity AR experiences. Panorama production faces several challenges that can lead to inaccurately reproduced images in which virtual graphics cannot be correctly registered in the AR scene, including parallax error between the multiple photographs, difficulty at the nadir angle, and limited dynamic range. For mobile AR, we developed an HDRI method that requires only a single acquisition, extending the dynamic range from the digital negative. This approach, which minimizes acquisition time, is applied to the multiple angles needed to reconstruct an accurately reproduced spherical panorama with sufficient luminance.
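The abstract states that the dynamic range is extended from a single digital negative per acquisition, but gives no pipeline details. The sketch below is a rough illustration of that general idea only, not the authors' method: it expands one linear RAW exposure into synthetic brackets and merges them into an HDR radiance map, with the exposure stops, hat-shaped weighting, and function names all assumed for the example.

```python
# Illustrative sketch only (not the paper's pipeline): expand a single linear
# RAW capture into synthetic exposure brackets and merge them into an HDR
# radiance map. Exposure stops, weighting, and names are assumptions.
import numpy as np

def synthetic_brackets(raw_linear, stops=(-2.0, 0.0, 2.0)):
    """Scale linear RAW data by +/- EV stops and clip to the display range,
    mimicking a bracketed capture derived from one digital negative."""
    return [np.clip(raw_linear * (2.0 ** ev), 0.0, 1.0) for ev in stops]

def merge_hdr(brackets, stops=(-2.0, 0.0, 2.0)):
    """Weighted average in linear radiance; mid-tones get the highest weight
    (hat-shaped weighting), which suppresses clipped shadows and highlights."""
    acc = np.zeros_like(brackets[0])
    w_acc = np.zeros_like(brackets[0])
    for img, ev in zip(brackets, stops):
        w = 1.0 - 2.0 * np.abs(img - 0.5)        # hat weight in [0, 1]
        acc += w * img / (2.0 ** ev)             # undo the synthetic exposure
        w_acc += w
    return acc / np.maximum(w_acc, 1e-6)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    raw = rng.uniform(0.0, 1.0, (480, 640, 3))   # stand-in for demosaiced RAW data
    hdr = merge_hdr(synthetic_brackets(raw))
    print(hdr.shape, float(hdr.min()), float(hdr.max()))
```

In a real pipeline each capture angle would start from demosaiced RAW data rather than the random stand-in array, and the merged radiance maps would then be stitched into the spherical panorama.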
DOI: https://doi.org/10.1145/2818427.2818445
SMASH: synchronization media of athletes and spectator through haptic
Marie-Stephanie Iekura, H. Hayakawa, Keisuke Onoda, Yoichi Kamiyama, K. Minamizawa, M. Inami
What could we do if we were able to feel another person's experience in real time? SMASH is a system that conveys the sports experience of a person in a remote location to spectators at the stadium and to television audiences in real time. For example, while watching a game, a spectator holding a hand-held device with a built-in actuator can feel the athlete's heartbeat and the tactile sensations the player experiences, such as shooting a ball, taking steps, or smashing a shuttle. Watching sports from the player's viewpoint with an HMD is said to be the experience closest to being the player, but it isolates the user from other spectators. By feeling the player's sensations in the palm, a spectator can feel closer to the player while still watching the game with others and sharing their emotions. Used with television broadcasts and at the stadium, the system should provide different degrees of synchronization with the athlete depending on the situation. With the Tokyo 2020 Olympic and Paralympic Games approaching, new systems for watching sports are expected to emerge that use not only tactile information but also extensions of any of the human senses.
DOI: https://doi.org/10.1145/2818427.2818439
Tile-based path rendering for mobile device
Jeong-Joon Yoo, J. D. Lee, Sundeep Krishnadasan, Won-Jong Lee, J. Brothers, Soojung Ryu
In this paper, we present a tile-based path rendering scheme that provides fast rendering on mobile devices. Because legacy path rendering schemes are memory- or compute-intensive, they do not deliver sufficient performance (fps) on mobile devices. To achieve acceptable performance, we propose a tile-based approach to path rendering on mobile devices. The design goal of our scheme is twofold: 1) minimize memory I/O, and 2) minimize computation. Because our scheme effectively reduces memory I/O and computation simultaneously, it achieves acceptably high path rendering performance on mobile devices.
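The abstract gives the two design goals but not the mechanism. The sketch below illustrates the general kind of tile-based path filling those goals suggest, not the paper's scheme: edges are binned to tile rows so each tile touches only the edges that can affect it, and coverage is resolved with an even-odd scanline fill inside the tile; the tile size, data layout, and function names are assumptions.

```python
# Minimal sketch of tile-based path filling with the even-odd rule. This
# illustrates the general tile-binning idea, not the paper's scheme; the tile
# size and data layout are assumptions.
import numpy as np

TILE = 32  # tile edge in pixels (assumed)

def bin_edges(edges, height):
    """Assign each path edge ((x0, y0), (x1, y1)) to the tile rows it crosses,
    so each tile row only touches the edges that can affect its pixels."""
    rows = {}
    for (x0, y0), (x1, y1) in edges:
        lo, hi = sorted((y0, y1))
        first = max(int(lo) // TILE, 0)
        last = min(int(hi) // TILE, (height - 1) // TILE)
        for ty in range(first, last + 1):
            rows.setdefault(ty, []).append(((x0, y0), (x1, y1)))
    return rows

def fill_path(edges, width, height):
    """Even-odd scanline fill processed one tile row at a time, so the working
    set (edge list plus coverage) stays small enough to keep on chip."""
    mask = np.zeros((height, width), dtype=np.uint8)
    for ty, tile_edges in bin_edges(edges, height).items():
        for y in range(ty * TILE, min((ty + 1) * TILE, height)):
            yc = y + 0.5  # sample at the pixel-row centre
            xs = sorted(x0 + (yc - y0) * (x1 - x0) / (y1 - y0)
                        for (x0, y0), (x1, y1) in tile_edges
                        if min(y0, y1) <= yc < max(y0, y1))
            for a, b in zip(xs[0::2], xs[1::2]):   # even-odd pairs of crossings
                mask[y, max(int(a), 0):min(int(b) + 1, width)] = 255
    return mask

if __name__ == "__main__":
    square = [((10, 10), (90, 10)), ((90, 10), (90, 90)),
              ((90, 90), (10, 90)), ((10, 90), (10, 10))]
    print(int(fill_path(square, 100, 100).sum()) // 255)  # filled pixel count
```

Binning keeps both the per-tile edge list and the coverage buffer small enough to stay on chip, which is the kind of reduction in memory I/O and per-pixel computation the stated design goals describe.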
DOI: https://doi.org/10.1145/2818427.2818449
The internet of (showbiz) things: scalability issues in deploying and supporting networked multimedia experience
Michela Ledwidge, Andrew Burrell
Mod is a small studio creating interactive stories and immersive experiences. The bespoke nature of our work coupled with small budgets has required ingenuity around the use of mobile graphics and interactive applications. We examine operational challenges and lessons learned across three exhibitions that demonstrate iterative development and pitfalls in creating a more playful "plug-and-play" Internet of Things ecosystem for creative multimedia experience.
DOI: https://doi.org/10.1145/2818427.2818432
Proceedings: SIGGRAPH Asia 2015 Mobile Graphics and Interactive Applications, November 2015. DOI: https://doi.org/10.1145/2818427