{"title":"触摸和激活:使用主动声学传感为现有对象添加交互性","authors":"Makoto Ono, B. Shizuki, J. Tanaka","doi":"10.1145/2501988.2501989","DOIUrl":null,"url":null,"abstract":"In this paper, we present a novel acoustic touch sensing technique called Touch & Activate. It recognizes a rich context of touches including grasp on existing objects by attaching only a vibration speaker and a piezo-electric microphone paired as a sensor. It provides easy hardware configuration for prototyping interactive objects that have touch input capability. We conducted a controlled experiment to measure the accuracy and trade-off between the accuracy and number of training rounds for our technique. From its results, per-user recognition accuracies with five touch gestures for a plastic toy as a simple example and six hand postures for the posture recognition as a complex example were 99.6% and 86.3%, respectively. Walk up user recognition accuracies for the two applications were 97.8% and 71.2%, respectively. Since the results of our experiment showed a promising accuracy for the recognition of touch gestures and hand postures, Touch & Activate should be feasible for prototype interactive objects that have touch input capability.","PeriodicalId":294436,"journal":{"name":"Proceedings of the 26th annual ACM symposium on User interface software and technology","volume":"37 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2013-10-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"156","resultStr":"{\"title\":\"Touch & activate: adding interactivity to existing objects using active acoustic sensing\",\"authors\":\"Makoto Ono, B. Shizuki, J. Tanaka\",\"doi\":\"10.1145/2501988.2501989\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"In this paper, we present a novel acoustic touch sensing technique called Touch & Activate. 
It recognizes a rich context of touches including grasp on existing objects by attaching only a vibration speaker and a piezo-electric microphone paired as a sensor. It provides easy hardware configuration for prototyping interactive objects that have touch input capability. We conducted a controlled experiment to measure the accuracy and trade-off between the accuracy and number of training rounds for our technique. From its results, per-user recognition accuracies with five touch gestures for a plastic toy as a simple example and six hand postures for the posture recognition as a complex example were 99.6% and 86.3%, respectively. Walk up user recognition accuracies for the two applications were 97.8% and 71.2%, respectively. Since the results of our experiment showed a promising accuracy for the recognition of touch gestures and hand postures, Touch & Activate should be feasible for prototype interactive objects that have touch input capability.\",\"PeriodicalId\":294436,\"journal\":{\"name\":\"Proceedings of the 26th annual ACM symposium on User interface software and technology\",\"volume\":\"37 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2013-10-08\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"156\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings of the 26th annual ACM symposium on User interface software and technology\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/2501988.2501989\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 26th annual ACM symposium on User interface software and 
technology","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/2501988.2501989","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Touch & activate: adding interactivity to existing objects using active acoustic sensing
In this paper, we present a novel acoustic touch sensing technique called Touch & Activate. By attaching only a vibration speaker and a piezo-electric microphone, paired as a sensor, it recognizes a rich set of touch interactions on existing objects, including grasps. It thus provides a simple hardware configuration for prototyping interactive objects with touch input capability. We conducted a controlled experiment to measure the recognition accuracy of our technique and the trade-off between accuracy and the number of training rounds. Per-user recognition accuracy was 99.6% for five touch gestures on a plastic toy (a simple example) and 86.3% for six hand postures (a complex example). Walk-up-user accuracies for the two applications were 97.8% and 71.2%, respectively. Since these results show promising accuracy for recognizing touch gestures and hand postures, Touch & Activate should be feasible for prototyping interactive objects with touch input capability.
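The abstract does not specify the features or classifier used, so the following is only a minimal, self-contained sketch of the general idea behind active acoustic touch sensing: an object is excited by a speaker, its acoustic response (which changes when the object is touched) is captured, spectral features are extracted, and a classifier trained per touch state labels new readings. The excitation model, feature extraction (DFT magnitudes), and nearest-centroid classifier here are illustrative assumptions, not the paper's actual pipeline; the simulated signal stands in for a real piezo recording.

```python
import cmath
import math
import random

def dft_mag(signal, n_bins=32):
    # Magnitude-spectrum features of the sensed response.
    # (Hypothetical feature set; the paper's features are not given in the abstract.)
    n = len(signal)
    return [abs(sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))) / n
            for k in range(1, n_bins + 1)]

def simulate_response(damping, n=128, seed=0):
    # Toy stand-in for the piezo recording: the object rings at a resonance
    # whose damping increases when the object is touched or grasped.
    rng = random.Random(seed)
    f = 8  # assumed resonance bin
    return [math.exp(-damping * t / n) * math.sin(2 * math.pi * f * t / n)
            + 0.01 * rng.uniform(-1, 1) for t in range(n)]

def nearest_centroid(train, query):
    # Classify a feature vector by the closest per-label mean (squared distance).
    best, best_d = None, float("inf")
    for label, vecs in train.items():
        centroid = [sum(col) / len(vecs) for col in zip(*vecs)]
        d = sum((a - b) ** 2 for a, b in zip(centroid, query))
        if d < best_d:
            best, best_d = label, d
    return best

# Train on a few readings per touch state, then classify a new reading.
train = {
    "no_touch": [dft_mag(simulate_response(1.0, seed=s)) for s in range(3)],
    "grasp":    [dft_mag(simulate_response(6.0, seed=s)) for s in range(3)],
}
print(nearest_centroid(train, dft_mag(simulate_response(6.0, seed=99))))
```

In real hardware, `simulate_response` would be replaced by playing a swept or broadband excitation through the attached speaker and recording the object's response via the piezo microphone; the trade-off the experiment measures corresponds here to how many training readings per label are collected.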