{"title":"Capturing kinetic wave demonstrations for sound control","authors":"J. Granzow, Matias Vilaplana, Anil Çamci","doi":"10.1145/3411109.3411150","DOIUrl":null,"url":null,"abstract":"In musical acoustics, wave propagation, reflection, phase inversion, and boundary conditions can be hard to conceptualize. Physical kinetic wave demonstrations offer visible and tangible experiences of wave behavior and facilitate active learning. We implement such kinetic demonstrations, a long spring and a Shive machine, using contemporary fabrication techniques. Furthermore, we employ motion capture (MoCap) technology to transform these kinetic assemblies into audio controllers. Time-varying coordinates of Mo-Cap markers integrated into the assemblies are mapped to audio parameters, closing a multi-sensory loop where visual analogues of acoustic phenomena are in turn used to control digital audio. The project leads to a pedagogical practice where fabrication and sensing technologies are used to reconstitute demonstrations for the eye as controllers for the ear.","PeriodicalId":368424,"journal":{"name":"Proceedings of the 15th International Audio Mostly Conference","volume":"30 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-09-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 15th International Audio Mostly Conference","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3411109.3411150","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0
Abstract
In musical acoustics, wave propagation, reflection, phase inversion, and boundary conditions can be hard to conceptualize. Physical kinetic wave demonstrations offer visible and tangible experiences of wave behavior and facilitate active learning. We implement two such kinetic demonstrations, a long spring and a Shive machine, using contemporary fabrication techniques. Furthermore, we employ motion capture (MoCap) technology to transform these kinetic assemblies into audio controllers. Time-varying coordinates of MoCap markers integrated into the assemblies are mapped to audio parameters, closing a multi-sensory loop in which visual analogues of acoustic phenomena are in turn used to control digital audio. The project leads to a pedagogical practice in which fabrication and sensing technologies are used to reconstitute demonstrations for the eye as controllers for the ear.
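The core mechanism the abstract describes — mapping the time-varying coordinates of MoCap markers to audio parameters — can be sketched as a simple parameter-mapping function. This is a minimal illustration, not the authors' implementation: the marker displacement range, the choice of a linear mapping, and the target frequency and amplitude ranges are all assumptions.

```python
# Hypothetical sketch of mapping a MoCap marker's vertical displacement to
# synthesis parameters; ranges and the linear mapping are illustrative only.

def map_range(value, in_min, in_max, out_min, out_max):
    """Linearly rescale value from [in_min, in_max] to [out_min, out_max]."""
    t = (value - in_min) / (in_max - in_min)
    t = min(max(t, 0.0), 1.0)  # clamp to the input range
    return out_min + t * (out_max - out_min)

def marker_to_audio_params(y_mm):
    """Map a marker's vertical displacement (mm) to (frequency Hz, amplitude).

    A wave crest passing the marker raises its y-coordinate, which here
    drives oscillator frequency; the displacement magnitude drives amplitude.
    """
    freq = map_range(y_mm, -100.0, 100.0, 110.0, 880.0)
    amp = map_range(abs(y_mm), 0.0, 100.0, 0.0, 1.0)
    return freq, amp

# A marker at rest (y = 0) yields the mid frequency and zero amplitude.
freq, amp = marker_to_audio_params(0.0)
```

In practice such a mapping would run once per MoCap frame, streaming the resulting parameters to a synthesizer (e.g., over OSC), so that the visible wave motion directly shapes the sound.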