
Latest publications in ACM SIGGRAPH 2020 Courses

Introduction to the Vulkan computer graphics API
Pub Date : 2020-08-17 DOI: 10.1145/3388769.3407508
M. Bailey
Vulkan is better at keeping the GPU busy than OpenGL is. OpenGL drivers need to do a lot of CPU work before handing work off to the GPU. Vulkan lets you get more power from the GPU card you already have.
Citations: 0
Understanding AR inside and out --- Part One: a solid grounding
Pub Date : 2020-08-17 DOI: 10.1145/3388769.3407539
Course Volumes, M. Billinghurst, Wan-Chun Ma
{"title":"Understanding AR inside and out --- Part One: a solid grounding","authors":"Course Volumes, M. Billinghurst, Wan-Chun Ma","doi":"10.1145/3388769.3407539","DOIUrl":"https://doi.org/10.1145/3388769.3407539","url":null,"abstract":"","PeriodicalId":167147,"journal":{"name":"ACM SIGGRAPH 2020 Courses","volume":"26 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-08-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122936009","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Virtual hands in VR: motion capture, synthesis, and perception
Pub Date : 2020-08-17 DOI: 10.1145/3388769.3407494
S. Jörg, Yuting Ye, Michael Neff, Franziska Mueller, V. Zordan
We use our hands every day: to grasp a cup of coffee, write text on a keyboard, or signal that we are about to say something important. We use our hands to interact with our environment and to help us communicate with each other without thinking about it. Wouldn't it be great to be able to do the same in virtual reality? However, accurate hand motions are not trivial to capture. In this course, we present the current state of the art when it comes to virtual hands. Starting with current examples for controlling and depicting hands in virtual reality (VR), we dive into the latest methods and technologies to capture hand motions. As hands can currently not be captured in every situation and as constraints stopping us from intersecting with objects are typically not available in VR, we present research on how to synthesize hand motions and simulate grasping motions. Finally, we provide an overview of our knowledge of how virtual hands are being perceived, resulting in practical tips on how to represent and handle virtual hands. Our goals are (a) to present a broad state of the art of the current usage of hands in VR, (b) to provide more in-depth knowledge about the functioning of current hand motion tracking and hand motion synthesis methods, (c) to give insights on our perception of hand motions in VR and how to use those insights when developing new applications, and finally (d) to identify gaps in knowledge that might be investigated next. While the focus of this course is on VR, many parts also apply to augmented reality, mixed reality, and character animation in general, and some content originates from these areas.
Citations: 7
Eye-based interaction in graphical systems: 20 years later gaze applications, analytics, & interaction
Pub Date : 2020-08-17 DOI: 10.1145/3388769.3407492
A. Duchowski
The course starts with an overview of eye-tracking applications, distinguishing eye movement analysis from synthesis in virtual reality, games, and other venues including mobile eye tracking. The focus is on four forms of applications: diagnostic (off-line measurement), active (selection, look to shoot), passive (foveated rendering, a.k.a. gaze-contingent displays), and expressive (gaze synthesis). The course covers basic eye movement analytics, e.g., fixation count and dwell time within AOIs, as well as advanced analysis using ambient/focal attention modeling. The course concludes with an overview and a demo of how to build an interactive application using Python.
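The abstract mentions basic eye movement analytics such as fixation count and dwell time within AOIs. As a rough illustration of what such a computation looks like (not code from the course itself), here is a minimal Python sketch that counts fixations and accumulates dwell time inside a rectangular AOI; the Fixation record and the sample data are hypothetical.

from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Fixation:
    x: float          # horizontal gaze position (pixels)
    y: float          # vertical gaze position (pixels)
    duration: float   # fixation duration (seconds)

def aoi_stats(fixations: List[Fixation],
              aoi: Tuple[float, float, float, float]) -> Tuple[int, float]:
    """Return (fixation count, total dwell time in seconds) for a
    rectangular AOI given as (left, top, right, bottom)."""
    left, top, right, bottom = aoi
    count = 0
    dwell = 0.0
    for f in fixations:
        if left <= f.x <= right and top <= f.y <= bottom:
            count += 1
            dwell += f.duration
    return count, dwell

if __name__ == "__main__":
    # Hypothetical fixations; a real pipeline would first derive fixations
    # from raw gaze samples with an event-detection algorithm.
    fixations = [
        Fixation(120, 240, 0.25),
        Fixation(400, 300, 0.18),
        Fixation(130, 250, 0.32),
    ]
    aoi = (100, 200, 200, 300)   # left, top, right, bottom in pixels
    count, dwell = aoi_stats(fixations, aoi)
    print(f"Fixations in AOI: {count}, dwell time: {dwell:.2f} s")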
Citations: 4