T. Chong, Hijung Valentina Shin, Deepali Aneja, T. Igarashi
{"title":"SoundToons:基于范例的交互式音频驱动动画精灵创作","authors":"T. Chong, Hijung Valentina Shin, Deepali Aneja, T. Igarashi","doi":"10.1145/3581641.3584047","DOIUrl":null,"url":null,"abstract":"Animations can come to life when they are synchronized with relevant sounds. Yet, synchronizing animations to audio requires tedious key-framing or programming, which is difficult for novice creators. There are existing tools that support audio-driven live animation, but they focus primarily on speech and have little or no support for non-speech sounds. We present SoundToons, an exemplar-based authoring tool for interactive, audio-driven animation focusing on non-speech sounds. Our tool enables novice creators to author live animations to a wide variety of non-speech sounds, such as clapping and instrumental music. We support two types of audio interactions: (1) discrete interaction, which triggers animations when a discrete sound event is detected, and (2) continuous, which synchronizes an animation to continuous audio parameters. By employing an exemplar-based iterative authoring approach, we empower novice creators to design and quickly refine interactive animations. User evaluations demonstrate that novice users can author and perform live audio-driven animation intuitively. Moreover, compared to other input modalities such as trackpads or foot pedals, users preferred using audio as an intuitive way to drive animation.","PeriodicalId":118159,"journal":{"name":"Proceedings of the 28th International Conference on Intelligent User Interfaces","volume":"61 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-03-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"SoundToons: Exemplar-Based Authoring of Interactive Audio-Driven Animation Sprites\",\"authors\":\"T. Chong, Hijung Valentina Shin, Deepali Aneja, T. 
Igarashi\",\"doi\":\"10.1145/3581641.3584047\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Animations can come to life when they are synchronized with relevant sounds. Yet, synchronizing animations to audio requires tedious key-framing or programming, which is difficult for novice creators. There are existing tools that support audio-driven live animation, but they focus primarily on speech and have little or no support for non-speech sounds. We present SoundToons, an exemplar-based authoring tool for interactive, audio-driven animation focusing on non-speech sounds. Our tool enables novice creators to author live animations to a wide variety of non-speech sounds, such as clapping and instrumental music. We support two types of audio interactions: (1) discrete interaction, which triggers animations when a discrete sound event is detected, and (2) continuous, which synchronizes an animation to continuous audio parameters. By employing an exemplar-based iterative authoring approach, we empower novice creators to design and quickly refine interactive animations. User evaluations demonstrate that novice users can author and perform live audio-driven animation intuitively. 
Moreover, compared to other input modalities such as trackpads or foot pedals, users preferred using audio as an intuitive way to drive animation.\",\"PeriodicalId\":118159,\"journal\":{\"name\":\"Proceedings of the 28th International Conference on Intelligent User Interfaces\",\"volume\":\"61 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2023-03-27\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings of the 28th International Conference on Intelligent User Interfaces\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/3581641.3584047\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 28th International Conference on Intelligent User Interfaces","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3581641.3584047","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
SoundToons: Exemplar-Based Authoring of Interactive Audio-Driven Animation Sprites
Animations can come to life when they are synchronized with relevant sounds. Yet synchronizing animations to audio requires tedious key-framing or programming, which is difficult for novice creators. Existing tools support audio-driven live animation, but they focus primarily on speech and offer little or no support for non-speech sounds. We present SoundToons, an exemplar-based authoring tool for interactive, audio-driven animation focused on non-speech sounds. Our tool enables novice creators to author live animations for a wide variety of non-speech sounds, such as clapping and instrumental music. We support two types of audio interaction: (1) discrete interaction, which triggers an animation when a discrete sound event is detected, and (2) continuous interaction, which synchronizes an animation to continuous audio parameters. By employing an exemplar-based iterative authoring approach, we empower novice creators to design and quickly refine interactive animations. User evaluations demonstrate that novice users can author and perform live audio-driven animation intuitively. Moreover, compared to other input modalities such as trackpads and foot pedals, users preferred audio as an intuitive way to drive animation.
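The two interaction types described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation; it is a hypothetical illustration assuming simple RMS-based loudness analysis: a discrete interaction fires when loudness crosses a threshold upward (e.g., a clap), while a continuous interaction maps loudness to an animation parameter such as sprite scale.

```python
import numpy as np

def rms_envelope(samples: np.ndarray, frame: int = 512) -> np.ndarray:
    """Per-frame RMS loudness of a mono signal (the continuous audio parameter)."""
    n = len(samples) // frame
    frames = samples[: n * frame].reshape(n, frame)
    return np.sqrt((frames ** 2).mean(axis=1))

def detect_onsets(env: np.ndarray, threshold: float = 0.3) -> list:
    """Discrete interaction: frames where loudness crosses the threshold upward,
    e.g. a clap that should trigger an animation."""
    above = env >= threshold
    return [i for i in range(1, len(above)) if above[i] and not above[i - 1]]

def sprite_scale(env: np.ndarray, lo: float = 1.0, hi: float = 2.0) -> np.ndarray:
    """Continuous interaction: map normalized loudness to a sprite scale factor."""
    peak = env.max()
    norm = env / peak if peak > 0 else env
    return lo + norm * (hi - lo)

# Toy signal: silence followed by a loud burst.
signal = np.concatenate([np.zeros(1024), np.ones(1024)])
env = rms_envelope(signal)
print(detect_onsets(env))   # one upward crossing at the start of the burst
print(sprite_scale(env))    # scale stays at 1.0 during silence, grows to 2.0
```

The function names and the thresholding scheme are assumptions for illustration; the paper's actual system uses exemplar-based matching rather than a fixed threshold.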