A Visual Sensing Platform for Robot Teachers
Yuyuan Shi, Yin Chen, Liz Katherine Rincon Ardila, G. Venture, M. Bourguet
Proceedings of the 7th International Conference on Human-Agent Interaction, 2019-09-25
DOI: 10.1145/3349537.3352764
This paper describes our ongoing work to develop a visual sensing platform that can inform a robot teacher about the behaviour and affective state of its student audience. We have developed a multi-student behaviour recognition system, which can detect behaviours such as "listening" to the lecturer, "raising hand", or "sleeping". We have also developed a multi-student affect recognition system which, starting from eight basic emotions detected from facial expressions, can infer higher emotional states relevant to a learning context, such as "interested", "distracted" and "confused". Both systems are being tested with the SoftBank robot Pepper, which can respond to various students' behaviours and emotional states with adapted movements, postures and speech.
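As a rough illustration of the kind of inference the affect recognition system performs, the sketch below maps a distribution of basic-emotion scores to a learning-context state. The abstract does not specify the eight emotion labels, thresholds, or inference method used in the paper, so all names and rules here are illustrative assumptions.

```python
# Hypothetical rule-based mapping from basic-emotion scores to a
# learning-context state. The emotion labels and thresholds are
# assumptions; the paper's actual method is not given in the abstract.

BASIC_EMOTIONS = ["anger", "contempt", "disgust", "fear",
                  "happiness", "neutral", "sadness", "surprise"]

def infer_learning_state(scores: dict) -> str:
    """Map per-student basic-emotion scores (0..1) to a learning state."""
    get = lambda emotion: scores.get(emotion, 0.0)
    if get("happiness") + get("surprise") > 0.5:   # positive engagement
        return "interested"
    if get("sadness") + get("fear") > 0.5:         # negative valence
        return "confused"
    if get("neutral") > 0.7:                       # flat affect
        return "distracted"
    return "neutral"

print(infer_learning_state({"happiness": 0.4, "surprise": 0.3}))  # interested
print(infer_learning_state({"neutral": 0.8}))                     # distracted
```

In a real deployment, such a per-student state would then drive the robot's adaptive response (movement, posture, speech), as described for Pepper above.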