{"title":"An auditory interface for realtime brainwave similarity in dyads","authors":"R. M. Winters, Stephanie Koziej","doi":"10.1145/3411109.3411147","DOIUrl":null,"url":null,"abstract":"We present a case-study in the development of a\"hyperscanning\" auditory interface that transforms realtime brainwave-similarity between interacting dyads into music. Our instrument extends reality in face-to-face communication with a musical stream reflecting an invisible socio-neurophysiological signal. This instrument contributes to the historical context of brain-computer interfaces (BCIs) applied to art and music, but is unique because it is contingent on the correlation between the brainwaves of the dyad, and because it conveys this information using entirely auditory feedback. We designed the instrument to be i) easy to understand, ii) relatable and iii) pleasant for members of the general public in an exhibition context. We present how this context and user group led to our choice of EEG hardware, inter-brain similarity metric, and our auditory mapping strategy. We discuss our experience following four public exhibitions, as well as future improvements to the instrument design and user experience.","PeriodicalId":368424,"journal":{"name":"Proceedings of the 15th International Audio Mostly Conference","volume":"26 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-09-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 15th International Audio Mostly Conference","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3411109.3411147","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
We present a case-study in the development of a"hyperscanning" auditory interface that transforms realtime brainwave-similarity between interacting dyads into music. Our instrument extends reality in face-to-face communication with a musical stream reflecting an invisible socio-neurophysiological signal. This instrument contributes to the historical context of brain-computer interfaces (BCIs) applied to art and music, but is unique because it is contingent on the correlation between the brainwaves of the dyad, and because it conveys this information using entirely auditory feedback. We designed the instrument to be i) easy to understand, ii) relatable and iii) pleasant for members of the general public in an exhibition context. We present how this context and user group led to our choice of EEG hardware, inter-brain similarity metric, and our auditory mapping strategy. We discuss our experience following four public exhibitions, as well as future improvements to the instrument design and user experience.