Z. Jianzhong, Jia Shan, Cao Peipei, Yang Yuankui, Wang Xunheng
Proceedings. 2005 First International Conference on Neural Interface and Control, 2005. Published 2005-05-26. DOI: 10.1109/ICNIC.2005.1499835
Modeling and application of multimodal affective user interface with multimedia computer sensing
Emotions play an important role in people's everyday lives, so it is desirable to build intelligent computer systems that understand users' emotional states. In this paper we report on our efforts to develop a multimodal affective user interface (MAUI) system for emotion assessment. First, we discuss the importance of emotion research. Second, we introduce the prototype MAUI system built in BABYLAB. The system accepts synchronous inputs from the user through wireless sensors and multimedia computer sensing, spanning several modalities: physiological signals, facial expressions, speech signals, and behavioral data. The physiological signals are collected in real time by wireless remote sensors, while images of facial expressions and continuous video of behavior and gesture are recorded by several cameras attached to different controlling computers. Signals captured simultaneously during experiments are saved so that emotional state can be discerned by processing the sensory modalities over time; collecting all signals synchronously and in real time makes multimodal analysis convenient. Using this system, we then present preliminary results from an exploratory study that estimates emotional state with non-invasive technologies, mapping the measured signals to their corresponding emotions. Finally, we discuss uses of this system in education, where people may benefit from improved satisfaction in learning and training.
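The core mechanism the abstract describes — recording several heterogeneous streams (physiology, face images, speech, behavior) against a common clock so they can be fused later — can be sketched as follows. This is a minimal illustration only, not the authors' implementation; the modality names and sample payloads are assumptions for the example.

```python
import time
from collections import defaultdict


class MultimodalRecorder:
    """Buffers samples from several modalities against one shared clock,
    so streams arriving at different rates can be aligned afterwards."""

    def __init__(self):
        self.start = time.monotonic()
        # modality name -> list of (elapsed_seconds, sample) pairs
        self.streams = defaultdict(list)

    def record(self, modality, sample):
        # Tag every incoming sample with elapsed time on the common clock.
        t = time.monotonic() - self.start
        self.streams[modality].append((t, sample))

    def window(self, modality, t0, t1):
        # Return one modality's samples inside [t0, t1): the basic
        # operation needed to fuse modalities over a shared interval.
        return [(t, s) for (t, s) in self.streams[modality] if t0 <= t < t1]


# Hypothetical usage: three modalities logged against the same clock.
rec = MultimodalRecorder()
rec.record("physiology", {"gsr": 0.41, "heart_rate": 72})
rec.record("face", "frame_0001.jpg")
rec.record("speech", b"\x00\x01")
```

Because every sample carries a timestamp from the same monotonic clock, a later analysis step can pull aligned windows from each stream without the capture devices having to share hardware synchronization.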