
Proceedings of the Workshop on Virtual Reality and Augmented Reality Network: Latest Publications

VR Video Conferencing over Named Data Networks
Pub Date: 2017-08-11 DOI: 10.1145/3097895.3097897
Liyang Zhang, S. O. Amin, C. Westphal
We propose a VR video conferencing system over named data networks (NDN). The system is designed to support real-time, multi-party streaming and playback of 360 degree video on a web player. A centralized architecture is used, with a signaling server to coordinate multiple participants. To meet the real-time requirement, a protocol featuring prefetching is used for producer-consumer communication. Together with NDN's native support for multicast, this design is expected to better support large amounts of data streaming between multiple users. As a proof of concept, a prototype of the system is implemented with one-way real-time 360 video streaming. Experiments show that seamless streaming and interactive playback of 360 video can be achieved with low latency. Therefore, the proposed system has the potential to provide an immersive VR experience for real-time multi-party video conferencing.
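The prefetching idea described in the abstract can be sketched as a small consumer loop that keeps a window of requests outstanding for upcoming frame segments. Everything below is an illustrative assumption: the hierarchical name layout, the window size, and the `express_interest` placeholder, which stands in for whatever NDN client library the producer and consumer actually use.

```python
from collections import deque

PREFETCH_WINDOW = 4  # number of frame segments requested ahead of playback (assumed value)

def express_interest(name: str) -> bytes:
    """Placeholder for an NDN Interest/Data exchange; returns the Data payload."""
    raise NotImplementedError("wire this to your NDN forwarder / client library")

def frame_name(room: str, producer: str, seq: int) -> str:
    # Hypothetical hierarchical naming scheme for per-frame segments.
    return f"/conference/{room}/{producer}/frames/{seq}"

def consume(room: str, producer: str, start_seq: int = 0):
    """Pull-based consumer loop that prefetches a window of upcoming segments."""
    next_to_request = start_seq
    pending = deque()
    while True:
        # Keep PREFETCH_WINDOW requests outstanding so Data arrives before playback needs it.
        while len(pending) < PREFETCH_WINDOW:
            pending.append((next_to_request, frame_name(room, producer, next_to_request)))
            next_to_request += 1
        seq, name = pending.popleft()
        payload = express_interest(name)   # blocks until Data (or timeout) in this sketch
        yield seq, payload                 # hand the segment to the 360 degree web player
```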
Citations: 16
VR is on the Edge: How to Deliver 360° Videos in Mobile Networks
Pub Date: 2017-08-11 DOI: 10.1145/3097895.3097901
Simone Mangiante, G. Klas, Amit Navon, G. Zhuang, Ran Ju, M. F. Silva
VR/AR is rapidly progressing towards enterprise and end customers with the promise of bringing immersive experience to numerous applications. Soon it will target smartphones from the cloud and 360° video delivery will need unprecedented requirements for ultra-low latency and ultra-high throughput to mobile networks. Latest developments in NFV and Mobile Edge Computing reveal already the potential to enable VR streaming in cellular networks and to pave the way towards 5G and next stages in VR technology. In this paper we present a Field Of View (FOV) rendering solution at the edge of a mobile network, designed to optimize the bandwidth and latency required by VR 360° video streaming. Preliminary test results show the immediate benefits in bandwidth saving this approach can provide and generate new directions for VR/AR network research.
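A minimal sketch of the bandwidth saving behind edge-side FOV rendering: instead of shipping the whole equirectangular frame, the edge cuts out only the region around the user's current yaw/pitch (here with a simple rectangular crop rather than a proper perspective reprojection). The function names and the 100°x100° default viewport are assumptions for illustration, not values from the paper.

```python
import numpy as np

def crop_fov(frame: np.ndarray, yaw_deg: float, pitch_deg: float,
             h_fov_deg: float = 100.0, v_fov_deg: float = 100.0) -> np.ndarray:
    """Cut the viewport region out of an equirectangular frame (H x W x 3).

    yaw in [-180, 180) maps to the horizontal axis, pitch in [-90, 90] to the
    vertical axis. A real edge renderer would reproject to a perspective view;
    a plain crop is enough to show the bandwidth reduction.
    """
    h, w = frame.shape[:2]
    cx = int((yaw_deg + 180.0) / 360.0 * w) % w
    cy = int((90.0 - pitch_deg) / 180.0 * h)
    half_w = int(h_fov_deg / 360.0 * w / 2)
    half_h = int(v_fov_deg / 180.0 * h / 2)
    rows = np.clip(np.arange(cy - half_h, cy + half_h), 0, h - 1)
    cols = np.arange(cx - half_w, cx + half_w) % w  # wrap around the 360° seam
    return frame[np.ix_(rows, cols)]

# Example: a 4K equirectangular frame shrinks to roughly (100/360)*(100/180) ~ 15% of the pixels.
frame = np.zeros((2048, 4096, 3), dtype=np.uint8)
viewport = crop_fov(frame, yaw_deg=30.0, pitch_deg=-10.0)
print(viewport.shape)
```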
Citations: 166
Prioritized Buffer Control in Two-tier 360 Video Streaming
Pub Date: 2017-08-11 DOI: 10.1145/3097895.3097898
Fanyi Duanmu, Eymen Kurdoglu, S. A. Hosseini, Yong Liu, Yao Wang
360 degree video compression and streaming is one of the key components of Virtual Reality (VR) applications. In 360 video streaming, a user may freely navigate through the captured 3D environment by changing her desired viewing direction. Only a small portion of the entire 360 degree video is watched at any time. Streaming the entire 360 degree raw video is therefore unnecessary and bandwidth-consuming. On the other hand, streaming only the video in the predicted user's view direction will introduce streaming discontinuity whenever the prediction is wrong. In this work, a two-tier 360 video streaming framework with prioritized buffer control is proposed to effectively accommodate the dynamics in both network bandwidth and viewing direction. Through simulations driven by real network bandwidth and viewing direction traces, we demonstrate that the proposed framework can significantly outperform conventional 360 video streaming solutions.
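The prioritized buffer control of a two-tier design can be summarized in a few lines: the base tier (low-quality full 360° view, protecting against prediction errors and bandwidth drops) is filled up to a long target buffer before any bandwidth is spent on the enhancement tier (high-quality chunks for the predicted viewport). The tier names and threshold values below are illustrative assumptions, not the parameters used in the paper.

```python
from dataclasses import dataclass

@dataclass
class TierState:
    buffered_s: float   # seconds of video currently buffered for this tier
    target_s: float     # target buffer length for this tier

def next_request(base: TierState, enhancement: TierState) -> str:
    """Pick the tier to download from next under strict base-tier priority."""
    if base.buffered_s < base.target_s:
        return "base"          # always protect the low-quality full-view buffer first
    if enhancement.buffered_s < enhancement.target_s:
        return "enhancement"   # then prefetch high-quality chunks for the predicted FOV
    return "idle"              # both buffers are at target; wait before requesting more

# Illustrative targets: a long base-tier buffer absorbs bandwidth drops, a short
# enhancement-tier buffer keeps high-quality chunks close to the playback point,
# where view prediction is most accurate.
base = TierState(buffered_s=8.0, target_s=10.0)
enh = TierState(buffered_s=1.0, target_s=2.0)
print(next_request(base, enh))  # -> "base"
```

The asymmetry in the two targets is the whole point of the prioritization: the base tier trades quality for robustness, the enhancement tier trades robustness for quality where prediction is still reliable.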
Citations: 44
On the Networking Challenges of Mobile Augmented Reality
Pub Date: 2017-08-11 DOI: 10.1145/3097895.3097900
Wenxiao Zhang, B. Han, P. Hui
In this paper, we conduct a reality check for Augmented Reality (AR) on mobile devices. We dissect and measure the cloud-offloading feature for computation-intensive visual tasks of two popular commercial AR systems. Our key finding is that their cloud-based recognition is still not mature and not optimized for latency, data usage and energy consumption. In order to identify the opportunities for further improving the Quality of Experience (QoE) for mobile AR, we break down the end-to-end latency of the pipeline for typical cloud-based mobile AR and pinpoint the dominating components in the critical path.
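A toy version of the latency breakdown mentioned above is shown below. The stage names and numbers are invented for illustration, not measurements from the paper, but the same bookkeeping identifies which stage dominates the critical path of an offloaded AR recognition request.

```python
# Illustrative (made-up) per-stage latencies for one offloaded AR recognition request, in ms.
pipeline_ms = {
    "camera capture": 33.0,
    "preprocessing / encoding": 25.0,
    "uplink transfer": 120.0,
    "server-side recognition": 90.0,
    "downlink transfer": 30.0,
    "annotation rendering": 16.0,
}

end_to_end = sum(pipeline_ms.values())
for stage, ms in sorted(pipeline_ms.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{stage:28s} {ms:6.1f} ms  ({ms / end_to_end:5.1%})")
print(f"{'end-to-end':28s} {end_to_end:6.1f} ms")
```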
Citations: 71
Ultra Wide View Based Panoramic VR Streaming
Pub Date: 2017-08-11 DOI: 10.1145/3097895.3097899
Ran Ju, Jun He, Fengxin Sun, Jin Li, Feng Li, Jirong Zhu, Lei Han
Online VR streaming faces great challenges such as the high throughput and real-time interaction requirements. In this paper, we propose a novel ultra-wide-view based method to stream high quality VR over the Internet at low bandwidth and little computation cost. First, we only transmit the region the user is looking at instead of the full 360° view to save bandwidth. To achieve this goal, we split the source VR into small grid videos in advance. The grid videos are able to reconstruct any view flexibly at the user end. Second, based on the fact that users generally interact at low speed, we expand the view that the user requested to meet the real-time interaction requirement. Besides, a low-resolution full-view stream is supplied to handle exceptional cases such as high-speed view changes. We test our solution in an experimental network. The results show remarkable bandwidth savings of over 60% on average at little computation cost while supplying the same quality of experience as local VR.
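A minimal sketch of the grid-video idea, assuming the equirectangular source is pre-split into a ROWS x COLS grid of tile streams: the client requests only the tiles that intersect the user's view, expanded by a margin so that slow head motion stays inside the already-delivered region. The grid dimensions, FOV, and margin below are assumptions for illustration.

```python
ROWS, COLS = 6, 12  # illustrative grid: each tile covers 30° x 30° of the sphere

def tiles_for_view(yaw_deg: float, pitch_deg: float,
                   h_fov_deg: float = 100.0, v_fov_deg: float = 100.0,
                   margin_deg: float = 30.0) -> set[tuple[int, int]]:
    """Return (row, col) indices of grid tiles covering the expanded view."""
    half_w = (h_fov_deg + 2 * margin_deg) / 2
    half_h = (v_fov_deg + 2 * margin_deg) / 2
    tile_w, tile_h = 360.0 / COLS, 180.0 / ROWS
    lon_lo, lon_hi = yaw_deg - half_w, yaw_deg + half_w
    lat_lo = max(-90.0, pitch_deg - half_h)
    lat_hi = min(90.0, pitch_deg + half_h)
    col_lo, col_hi = int(lon_lo // tile_w), int(lon_hi // tile_w)
    row_lo = int((lat_lo + 90.0) // tile_h)
    row_hi = int(min(lat_hi + 90.0, 179.999) // tile_h)
    tiles = set()
    for col in range(col_lo, col_hi + 1):
        for row in range(row_lo, row_hi + 1):
            tiles.add((row, col % COLS))  # wrap columns across the 360° seam
    return tiles

needed = tiles_for_view(yaw_deg=0.0, pitch_deg=0.0)
print(len(needed), "of", ROWS * COLS, "tiles requested")
```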
Citations: 27
Delivering Deep Learning to Mobile Devices via Offloading
Pub Date: 2017-08-11 DOI: 10.1145/3097895.3097903
Xukan Ran, Haoliang Chen, Zhenming Liu, Jiasi Chen
Deep learning has the potential to make Augmented Reality (AR) devices smarter, but few AR apps use such technology today because it is compute-intensive, and front-end devices cannot deliver sufficient compute power. We propose a distributed framework that ties together front-end devices with more powerful back-end "helpers" that allow deep learning to be executed locally or to be offloaded. This framework should be able to intelligently use current estimates of network conditions and back-end server loads, in conjunction with the application's requirements, to determine an optimal strategy. This work reports our preliminary investigation in implementing such a framework, in which the front-end is assumed to be smartphones. Our specific contributions include: (1) development of an Android application that performs real-time object detection, either locally on the smartphone or remotely on a server; and (2) characterization of the tradeoffs between object detection accuracy, latency, and battery drain, based on the system parameters of video resolution, deep learning model size, and offloading decision.
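As described, the framework chooses between local and offloaded inference from current estimates of network conditions and server load. A minimal decision rule in that spirit is sketched below; the function names, numbers, and the simple additive latency model are assumptions for illustration, not the paper's strategy.

```python
def estimate_offload_ms(frame_bytes: int, uplink_mbps: float,
                        server_queue_ms: float, server_infer_ms: float,
                        result_bytes: int = 2_000, downlink_mbps: float = 20.0) -> float:
    """Rough end-to-end latency of offloading one frame to a back-end helper."""
    up_ms = frame_bytes * 8 / (uplink_mbps * 1e6) * 1e3
    down_ms = result_bytes * 8 / (downlink_mbps * 1e6) * 1e3
    return up_ms + server_queue_ms + server_infer_ms + down_ms

def choose_execution(frame_bytes: int, uplink_mbps: float,
                     server_queue_ms: float,
                     local_infer_ms: float = 400.0,   # assumed on-device model latency
                     server_infer_ms: float = 40.0) -> str:
    """Offload when the estimated remote path beats on-device inference."""
    remote_ms = estimate_offload_ms(frame_bytes, uplink_mbps, server_queue_ms, server_infer_ms)
    return "offload" if remote_ms < local_infer_ms else "local"

# Example: a 100 kB encoded frame over a 10 Mbps uplink with a lightly loaded server.
print(choose_execution(frame_bytes=100_000, uplink_mbps=10.0, server_queue_ms=20.0))
```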
Citations: 63
Characterization of 360-degree Videos
Pub Date: 2017-08-11 DOI: 10.1145/3097895.3097896
Shahryar Afzal, Jiasi Chen, K. Ramakrishnan
Online streaming of Virtual Reality and 360° videos is rapidly growing, as more and more major content providers and news outlets adopt the format to enrich the user experience. We characterize 360° videos by examining several thousand YouTube videos across more than a dozen categories. 360° videos, at first sight, seem to pose a challenge for the network to stream because of their substantially higher bit rates and larger number of resolutions. However, a careful examination of video characteristics reveals that there are significant opportunities for reducing the actual bit rate delivered to client devices based on the user's field of view. We study the bit rate and the motion in 360° videos, and compare them against regular videos by investigating several important metrics. We find that 360° videos are less variable in terms of bit rate, and have less motion than regular videos. Our expectation is that variability in the bit rates due to the motion of the camera in regular videos (or switching between cameras) is now translated to responsiveness requirements for end to end 360° streaming architectures.
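Two of the metrics discussed above can be computed with a few lines of numpy: per-segment bitrate variability and a simple inter-frame motion score. The sketch below takes bitrate traces and grayscale frames as inputs and is only an illustration of the kind of measurement involved, not the paper's exact methodology; the example traces are invented.

```python
import numpy as np

def bitrate_variability(segment_bitrates_kbps: np.ndarray) -> float:
    """Coefficient of variation of per-segment bitrate: lower means a steadier encoding."""
    return float(np.std(segment_bitrates_kbps) / np.mean(segment_bitrates_kbps))

def motion_score(prev_frame: np.ndarray, next_frame: np.ndarray) -> float:
    """Mean absolute luminance difference between consecutive frames (0-255 grayscale)."""
    return float(np.mean(np.abs(next_frame.astype(np.int16) - prev_frame.astype(np.int16))))

# Illustrative comparison: a steadier 360° trace vs. a more variable regular-video trace.
vr_trace = np.array([14800, 15100, 15000, 14950, 15200], dtype=float)   # kbps
regular_trace = np.array([3200, 5400, 2100, 6100, 2800], dtype=float)   # kbps
print(bitrate_variability(vr_trace), bitrate_variability(regular_trace))
```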
Citations: 59
VR/AR Immersive Communication: Caching, Edge Computing, and Transmission Trade-Offs
Pub Date: 2017-08-11 DOI: 10.1145/3097895.3097902
Jacob Chakareski
We study the delivery of 360°-navigable videos to 5G VR/AR wireless clients in future cooperative multi-cellular systems. A collection of small-cell base stations interconnected via back-haul links share their caching and computing resources to maximize the aggregate reward they earn by serving 360° videos requested by VR/AR wireless clients. We design an efficient representation method to construct the 360° videos such that they only deliver the remote scene viewpoint content genuinely needed by the VR/AR user, thereby overcoming the present highly inefficient approach of sending a bulky 360° video, whose major part comprises scene information never accessed by the user. Moreover, we design an optimization framework that allows the base stations to select cooperative caching/rendering/streaming strategies that maximize the aggregate reward they earn when serving the users, for the given caching/computational resources at each base station. We formulate the problem of interest as integer programming, show its NP-hardness, and derive a fully-polynomial-time approximation solution with strong performance guarantees. Our advances demonstrate orders of magnitude operational efficiency gains over state-of-the-art caching and 360° video representation mechanisms and are very promising. This is a first-of-its-kind study to explore fundamental trade-offs between caching, computing, and communication for emerging VR/AR applications of broad societal impact.
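The caching side of such an optimization has the flavor of a knapsack: each 360° video has a storage cost and an expected reward if served from the local cache. The sketch below solves only that simplified single-base-station sub-problem with a standard 0/1 knapsack dynamic program; the paper's actual formulation jointly covers caching, rendering, and streaming across cooperating cells, and all numbers here are invented.

```python
def cache_selection(sizes_gb: list[int], rewards: list[float], capacity_gb: int) -> list[int]:
    """0/1 knapsack DP: pick the set of videos to cache that maximizes total reward."""
    n = len(sizes_gb)
    best = [[0.0] * (capacity_gb + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        for c in range(capacity_gb + 1):
            best[i][c] = best[i - 1][c]
            if sizes_gb[i - 1] <= c:
                best[i][c] = max(best[i][c], best[i - 1][c - sizes_gb[i - 1]] + rewards[i - 1])
    # Trace back which videos were selected.
    chosen, c = [], capacity_gb
    for i in range(n, 0, -1):
        if best[i][c] != best[i - 1][c]:
            chosen.append(i - 1)
            c -= sizes_gb[i - 1]
    return sorted(chosen)

# Illustrative instance: 4 videos competing for 20 GB of edge cache.
print(cache_selection(sizes_gb=[8, 6, 10, 5], rewards=[4.0, 3.5, 5.0, 2.0], capacity_gb=20))
```

A pseudo-polynomial DP like this is exact for one cache; the joint multi-cell problem in the paper is NP-hard, which is why the authors resort to a fully-polynomial-time approximation scheme instead.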
Citations: 88
Proceedings of the Workshop on Virtual Reality and Augmented Reality Network
{"title":"Proceedings of the Workshop on Virtual Reality and Augmented Reality Network","authors":"","doi":"10.1145/3097895","DOIUrl":"https://doi.org/10.1145/3097895","url":null,"abstract":"","PeriodicalId":270981,"journal":{"name":"Proceedings of the Workshop on Virtual Reality and Augmented Reality Network","volume":"19 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123780366","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0