SensoryX ’22 Workshop on Multisensory Experiences at ACM IMX ’22
A. Covaci, Estêvão Bissoli Saleme, C. Jost, Joel A. F. Dos Santos, G. Ghinea
Pub Date: 2022-06-22 | DOI: 10.5753/sensoryx.2022.20000
Our interactions with the world are multisensory in nature: the senses move us through spaces, mix with our memories, and are constantly connected by our brains. After focusing only on vision for a long time, the field of human-computer interaction (HCI) has started to meaningfully bring together all our senses in designing interactions for a variety of media. With this workshop, we look at different aspects of multisensory design, from authoring tools to the evaluation of multisensory experiences, with the aim of identifying the current challenges and opportunities of mulsemedia (multiple sensorial media).

Interactions in Multisensory Experiences: Toward a New Timeline Metaphor
C. Jost, Brigitte Le Pévédic, Justin Debloos, G. Uzan
Pub Date: 2022-06-22 | DOI: 10.5753/sensoryx.2022.20003
The PRIM project aims to allow everyone to produce multisensory interactions without programming and without the help of programmers. Taking inspiration from the simplicity of media-editing software, our idea is to investigate how to enrich such tools so that interactions can be integrated by modifying the editing timeline. After arguing that time and the timeline are the major challenges to address before building new software, this paper presents a new timeline metaphor that is easy to understand and that, once completed, would allow users to produce interactive mulsemedia and multisensory exercises. Promising results show that people easily understand and accept this new timeline, which encourages us to continue in this direction.

How to integrate interactions into video editing software?
C. Jost, Brigitte Le Pévédic
Pub Date: 2022-06-22 | DOI: 10.5753/sensoryx.2022.20004
This paper explores an idea for enriching existing mulsemedia editing software, drawing inspiration from music scores, so that it can incorporate actions from the viewer while a mulsemedia presentation is being edited. We reflect on the possibility of doing this with a single timeline, and thus propose to completely reverse the point of view: instead of trying to insert the interaction into the timeline, we propose to cut the media into several parts and insert them into the interaction.

Quantifying Multisensory Immersive Experiences using Wearables: Is (Stimulating) More (Senses) Always Merrier?
Belmir De Jesus Jr., M. Lopes, Marc-Antoine Moinnereau, R. A. Gougeh, Olivier Rosanne, Walter Schubert, Alcyr Oliveira, T. Falk
Pub Date: 2022-06-22 | DOI: 10.5753/sensoryx.2022.20001
Virtual reality applications are on the rise and touching numerous domains, including healthcare, training, and gaming, to name a few. Existing experiences, however, are not fully immersive, as only two senses (audio-visual) are stimulated. To overcome this limitation, olfactory and haptic devices are emerging, thus making multisensory immersive experiences a reality. To date, however, little is known about the impact that each stimulated sense has on the overall experience, as well as on the user’s sense of, e.g., realism, immersion, and engagement. In this pilot study, we aim to answer this question. Using a multisensory pod, sixteen participants were immersed in a 2.5-minute virtual world in which smells, vibroacoustic stimuli, and somatosensory stimuli (i.e., wind and heat) were presented, in addition to 360-degree video and surround sound. Using two wearable devices, we tracked the users’ heart rate, breathing rate, skin temperature, blood volume pulse, and electrodermal activity while they were immersed. In this paper, we report the impact that stimulating different senses had on the users’ overall experience, sense of presence, immersion, realism, flow, cybersickness, and emotional states, both subjectively and objectively, the latter using features extracted from the wearable devices.

An Open-Source Socially Assistive Robot for Multisensory Healthcare Therapies
M. Rocha, Dagoberto Cruz-Sandoval, J. Favela, D. Muchaluat-Saade
Pub Date: 2022-06-22 | DOI: 10.5753/sensoryx.2022.20002
Socially Assistive Robots (SARs) are a class of robots at the intersection of assistive robots and interactive social robots. Besides providing some kind of assistance, SARs can provide user stimuli through interaction with the robot, and they have been explored as aids in different healthcare therapies. This work is based on an open-source SAR called EVA. We have extended EVA’s capabilities with multimodal interaction and integration with light sensory effects. This paper presents our current research and the future steps toward using EVA in healthcare therapies.

Ongoing Challenges of Evaluating Mulsemedia QoE
A. Silveira, Celso A. S. Santos
Pub Date: 2022-06-22 | DOI: 10.5753/sensoryx.2022.20005
Multimedia applications are usually limited to stimulating only two human senses: vision and hearing. Recent studies seek to expand the definition of multimedia applications to include stimuli for the other human senses. To this end, sensory effects that should be triggered in synchrony with the audiovisual content being presented are included in the applications. By including sensory effects in multimedia, we aim to improve the Quality of Experience (QoE) of these mulsemedia environments. Two approaches are typically used to perform QoE evaluations in such environments. The first, more common approach relies on subjective evaluation, i.e., questionnaires, interviews, oral responses, etc. The second, rarer but growing, relies on objective approaches that collect physiological data from the user while they interact with the system being evaluated. Such data may or may not be gathered in real time; it is considered objective because it is "involuntary", that is, it does not result from the user’s intention. This paper addresses both of these methods for evaluating QoE and the respective obstacles each faces when dealing with mulsemedia systems.