Moving away from robotic interactions: Evaluation of empathy, emotion and sentiment expressed and detected by computer systems
N. Gasteiger, Jongyoon Lim, Mehdi Hellou, Bruce A. MacDonald, H. Ahn
2022 31st IEEE International Conference on Robot and Human Interactive Communication (RO-MAN)
DOI: 10.1109/RO-MAN53752.2022.9900559
Abstract
Social robots are often critiqued as being too ‘robotic’ and unemotional. For affective human-robot interaction (HRI), robots must detect sentiment and express emotion and empathy in return. We explored the extent to which people can detect emotions, empathy and sentiment from speech expressed by a computer system, with a focus on changes in prosody (pitch, tone, volume), and how people identify sentiment from written text, compared to a sentiment analyzer. Eighty-nine participants identified empathy, emotion and sentiment from audio and text embedded in a survey. Empathy and sentiment were best expressed in the audio, while emotions were the most difficult to detect (75%, 67% and 42%, respectively). We found moderate agreement (70%) between the sentiment identified by the participants and that identified by the analyzer. There is potential for computer systems to express affect through changes in prosody, as well as to analyze text to identify sentiment. This may help to further develop affective capabilities and appropriate responses in social robots, in order to avoid ‘robotic’ interactions. Future research should explore how to better express negative sentiment and emotions, while leveraging multi-modal approaches to HRI.
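The abstract does not name the sentiment analyzer used in the study. Purely as an illustration of the human-versus-analyzer comparison it describes, the sketch below classifies short texts with NLTK's VADER analyzer and computes simple percent agreement against hypothetical participant labels; the texts, labels, and label thresholds are assumptions, not the paper's method.

```python
# Illustrative sketch only: the paper does not specify its analyzer.
# VADER (NLTK) stands in here; texts and human labels are hypothetical.
import nltk
from nltk.sentiment.vader import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)  # lexicon needed by VADER

def classify(text: str, analyzer: SentimentIntensityAnalyzer) -> str:
    """Map VADER's compound score to a coarse sentiment label
    using the conventional +/-0.05 thresholds."""
    score = analyzer.polarity_scores(text)["compound"]
    if score >= 0.05:
        return "positive"
    if score <= -0.05:
        return "negative"
    return "neutral"

# Hypothetical survey texts with majority labels from human raters.
texts = [
    "I'm so glad you came to visit me today!",
    "I don't want to talk about it anymore.",
    "The appointment is at three o'clock.",
]
human_labels = ["positive", "negative", "neutral"]

analyzer = SentimentIntensityAnalyzer()
machine_labels = [classify(t, analyzer) for t in texts]

# Percent agreement between participants and the analyzer, analogous
# in spirit to the 70% agreement reported in the abstract.
agreement = sum(h == m for h, m in zip(human_labels, machine_labels)) / len(texts)
print(f"Agreement: {agreement:.0%}")
```

Percent agreement is the simplest comparison; a study like this might instead report a chance-corrected statistic such as Cohen's kappa, which the abstract does not specify.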