Interactive virtual characters
D. Thalmann, N. Magnenat-Thalmann
International Conference on Societal Automation, 2013-11-19
DOI: 10.1145/2542266.2542277
Abstract
In this tutorial, we will describe both virtual characters and realistic humanoid social robots using the same high-level models. In particular, we will describe:

1. How to capture real-time gestures and facial emotions from real people, how to recognize a specific real person, and how to recognize certain sounds. We will present the state of the art and some new avenues of research.

2. How to model a variety of interactive reactions of virtual humans and social robots (facial expressions, gestures, multiparty dialogue, etc.) depending on the input parameters of the real scene.

3. How to define virtual characters that have emotional behaviour (personality, mood, and emotions), and how to allow them to remember us and have a believable relationship with us. The goal of this part is to give virtual humans and social robots individual, rather than automatic, behaviour. We will explain different methods to identify user actions and how to allow virtual characters to respond to them.

4. How to model long-term and short-term memory, interactions between users and virtual humans based on gaze, and visual attention. We will present the concepts of behavioural animation, group simulation, and intercommunication between virtual humans, social humanoid robots, and real people.

Case studies will be presented from the Being There Centre (see http://imi.ntu.edu.sg/BeingThereCentre/Projects/Pages/Project4.aspx), where autonomous virtual humans and social robots react to actions from real people.
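To make point 3 concrete, the decomposition into fixed personality, slow-moving mood, and short-lived emotions can be sketched as a small state machine. This is a minimal illustrative sketch, not the tutorial's actual model: the trait names (loosely inspired by OCEAN), the appraisal rule, and the decay constants are all assumptions chosen for the example.

```python
from dataclasses import dataclass, field

@dataclass
class EmotionalState:
    """Toy personality/mood/emotion model (all names and rules are illustrative).

    Personality traits are fixed, mood drifts slowly, and discrete
    emotions spike on events and decay back toward the mood baseline.
    """
    # Fixed personality traits in [0, 1] (loosely inspired by OCEAN traits).
    extraversion: float = 0.5
    neuroticism: float = 0.5
    # Slow-moving mood valence in [-1, 1].
    mood: float = 0.0
    # Short-lived emotion intensities, e.g. {"joy": 0.8}.
    emotions: dict = field(default_factory=dict)

    def appraise(self, event_valence: float, intensity: float) -> None:
        """React to an event; a neurotic character reacts more strongly to negatives."""
        gain = 1.0 + self.neuroticism if event_valence < 0 else 1.0
        label = "joy" if event_valence > 0 else "distress"
        self.emotions[label] = min(1.0, self.emotions.get(label, 0.0) + gain * intensity)
        # Mood drifts a little toward the event's valence.
        self.mood = max(-1.0, min(1.0, self.mood + 0.1 * event_valence * intensity))

    def decay(self, dt: float, half_life: float = 5.0) -> None:
        """Exponentially decay emotion intensities over dt seconds; drop faint ones."""
        factor = 0.5 ** (dt / half_life)
        self.emotions = {k: v * factor
                         for k, v in self.emotions.items() if v * factor > 0.01}
```

Separating the three timescales is what keeps behaviour individual rather than automatic: two characters with different trait values produce different reactions to the same event.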
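The long-term/short-term memory split in point 4 can likewise be sketched as a bounded recency buffer plus a consolidated per-person store. Again, this is only an assumed toy design for illustration (the class name, the repetition-based consolidation rule, and the buffer size are not from the tutorial); it shows how a character could "remember us" across sessions.

```python
from collections import deque
from dataclasses import dataclass, field

@dataclass
class CharacterMemory:
    """Toy two-tier memory (illustrative, not the tutorial's actual model).

    Recent observations sit in a bounded short-term buffer; facts observed
    repeatedly are consolidated into long-term memory keyed by person.
    """
    short_term: deque = field(default_factory=lambda: deque(maxlen=10))
    long_term: dict = field(default_factory=dict)   # person -> set of facts
    _counts: dict = field(default_factory=dict)     # (person, fact) -> count

    def observe(self, person: str, fact: str, threshold: int = 2) -> None:
        """Record an observation; consolidate it once seen `threshold` times."""
        self.short_term.append((person, fact))
        key = (person, fact)
        self._counts[key] = self._counts.get(key, 0) + 1
        if self._counts[key] >= threshold:
            self.long_term.setdefault(person, set()).add(fact)

    def recall(self, person: str) -> set:
        """What the character remembers about this person across sessions."""
        return self.long_term.get(person, set())
```

The short-term buffer forgets automatically as new observations push old ones out, while consolidated facts persist, which is the property a believable long-term relationship with a user requires.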