Performance-based Expressive Character Animation

Deepali Aneja
{"title":"Performance-based Expressive Character Animation","authors":"Deepali Aneja","doi":"10.1145/3332167.3356880","DOIUrl":null,"url":null,"abstract":"For decades, animation has been a popular storytelling technique. Traditional tools for creating animations are labor-intensive requiring animators to painstakingly draw frames and motion curves by hand. An alternative workflow is to equip animators with direct real-time control over digital characters via performance, which offers a more immediate and efficient way to create animation. Even when using these existing expression transfer and lip sync methods, producing convincing facial animation in real-time is a challenging task. In this position paper, I describe my past and proposed future research in developing interactive systems for perceptually-valid expression retargeting from humans to stylized characters, real-time lip sync for 2D animation, and building an expressive style aligned embodied conversational agent.","PeriodicalId":322598,"journal":{"name":"Adjunct Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology","volume":"14 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2019-10-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Adjunct Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3332167.3356880","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

For decades, animation has been a popular storytelling technique. Traditional tools for creating animations are labor-intensive, requiring animators to painstakingly draw frames and motion curves by hand. An alternative workflow is to give animators direct real-time control over digital characters via performance, which offers a more immediate and efficient way to create animation. Yet even with existing expression transfer and lip-sync methods, producing convincing facial animation in real time remains challenging. In this position paper, I describe my past and proposed future research on developing interactive systems for perceptually valid expression retargeting from humans to stylized characters, real-time lip sync for 2D animation, and building an expressive, style-aligned embodied conversational agent.