Kouyou Otsu, Hidekazu Takahashi, Hisato Fukuda, Yoshinori Kobayashi, Y. Kuno
{"title":"使用现场表演者的多模态反馈增强音乐会体验","authors":"Kouyou Otsu, Hidekazu Takahashi, Hisato Fukuda, Yoshinori Kobayashi, Y. Kuno","doi":"10.1109/HSI.2017.8005047","DOIUrl":null,"url":null,"abstract":"In this paper, we aim to enhance the interaction between the performer and the audience in live idol performances. We propose a system for converting the movements of individual members of an idol group into vibrations and their voices into light on handheld devices for the audience. Specifically, for each performer, the system acquires data on movement and voice magnitudes via an acceleration sensor attached to the right wrist and microphone. The obtained data is then converted into motor vibrations and lights from an LED. The receiving devices for the audience members come in the form of a pen light or doll. A prototype system was made to collect acceleration data and voice magnitude data measurements for our experiments with an idol group in Japan to verify whether the performer's movements and singing voice could be correctly measured during real live performance conditions. We developed a program to present the strength of the movements and singing voice corresponding to one of the members as vibrations and lights based on the information of the recorded data. Then, an experiment was conducted for eight subjects that observed the performance. We found that seven out of eight subjects could identify the idol performer with corresponding vibrations and lighting from the device.","PeriodicalId":355011,"journal":{"name":"2017 10th International Conference on Human System Interactions (HSI)","volume":"8 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2017-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"Enhanced concert experience using multimodal feedback from live performers\",\"authors\":\"Kouyou Otsu, Hidekazu Takahashi, Hisato Fukuda, Yoshinori Kobayashi, Y. Kuno\",\"doi\":\"10.1109/HSI.2017.8005047\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"In this paper, we aim to enhance the interaction between the performer and the audience in live idol performances. We propose a system for converting the movements of individual members of an idol group into vibrations and their voices into light on handheld devices for the audience. Specifically, for each performer, the system acquires data on movement and voice magnitudes via an acceleration sensor attached to the right wrist and microphone. The obtained data is then converted into motor vibrations and lights from an LED. The receiving devices for the audience members come in the form of a pen light or doll. A prototype system was made to collect acceleration data and voice magnitude data measurements for our experiments with an idol group in Japan to verify whether the performer's movements and singing voice could be correctly measured during real live performance conditions. We developed a program to present the strength of the movements and singing voice corresponding to one of the members as vibrations and lights based on the information of the recorded data. Then, an experiment was conducted for eight subjects that observed the performance. 
We found that seven out of eight subjects could identify the idol performer with corresponding vibrations and lighting from the device.\",\"PeriodicalId\":355011,\"journal\":{\"name\":\"2017 10th International Conference on Human System Interactions (HSI)\",\"volume\":\"8 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2017-07-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2017 10th International Conference on Human System Interactions (HSI)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/HSI.2017.8005047\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2017 10th International Conference on Human System Interactions (HSI)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/HSI.2017.8005047","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Enhanced concert experience using multimodal feedback from live performers
In this paper, we aim to enhance the interaction between performers and the audience in live idol performances. We propose a system that converts the movements of individual members of an idol group into vibrations, and their voices into light, on handheld devices held by the audience. Specifically, for each performer, the system acquires movement and voice magnitudes via an acceleration sensor attached to the right wrist and a microphone. The acquired data are then converted into motor vibrations and LED light on the audience-side devices, which take the form of a pen light or a doll. We built a prototype system and recorded acceleration and voice magnitude data during experiments with an idol group in Japan to verify whether each performer's movements and singing voice could be measured correctly under real live performance conditions. Based on the recorded data, we developed a program that presents the strength of the movements and singing voice of one selected member as vibrations and light. We then conducted an experiment with eight subjects who observed the performance, and found that seven of the eight subjects could identify the performer corresponding to the vibrations and lighting from the device.
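The abstract does not detail how sensor readings are mapped to actuator output. The minimal Python sketch below illustrates one plausible mapping, under stated assumptions: wrist-acceleration magnitude drives vibration strength and voice level (frame RMS) drives LED brightness, both scaled to an 8-bit PWM duty cycle. The calibration constants, function names, and mocked I/O are illustrative assumptions, not taken from the paper.

```python
# Illustrative sketch (not the authors' code): mapping a performer's wrist
# acceleration and voice level to vibration strength and LED brightness on an
# audience-side device. Sensor/actuator I/O is mocked; constants are assumed.
import math
from typing import Sequence


def acceleration_magnitude(ax: float, ay: float, az: float) -> float:
    """Magnitude of a 3-axis accelerometer sample (in g), with gravity (~1 g) removed."""
    return abs(math.sqrt(ax * ax + ay * ay + az * az) - 1.0)


def rms(samples: Sequence[float]) -> float:
    """Root-mean-square level of an audio frame (a simple voice-magnitude measure)."""
    return math.sqrt(sum(s * s for s in samples) / len(samples)) if samples else 0.0


def to_pwm(value: float, max_value: float) -> int:
    """Clamp a measurement and scale it into an 8-bit PWM duty cycle (0-255)."""
    value = max(0.0, min(value, max_value))
    return int(round(255 * value / max_value))


# Assumed calibration constants: peak movement of ~2 g, peak voice RMS of 0.5.
MAX_ACCEL_G = 2.0
MAX_VOICE_RMS = 0.5


def update_device(accel_sample, audio_frame):
    """One actuation step: movement -> motor vibration, voice -> LED light."""
    vibration = to_pwm(acceleration_magnitude(*accel_sample), MAX_ACCEL_G)
    led = to_pwm(rms(audio_frame), MAX_VOICE_RMS)
    # In a real system these values would be transmitted to the pen light or
    # doll (e.g., over a wireless link); here we just print them.
    print(f"vibration PWM = {vibration:3d}, LED PWM = {led:3d}")


if __name__ == "__main__":
    # Simulated input: a strong dance movement and a moderately loud sung note.
    accel = (0.9, 1.4, 0.6)
    audio = [0.3 * math.sin(2 * math.pi * 440 * t / 8000) for t in range(256)]
    update_device(accel, audio)
```

A per-performer instance of such a mapping, fed by that member's own wrist sensor and microphone channel, would let each audience device reproduce the movements and voice of the specific idol it is paired with, which is the behavior the identification experiment tests.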