AI update
D. Blank
Appl. Intell., vol. 129, no. 1, p. 8, January 2001
DOI: 10.1145/383824.383827 (https://doi.org/10.1145/383824.383827)
Abstract
As part of research concerning facial expression of emotion, IBM continues a series of studies under Project BlueEyes that attempts to address four major issues:

1. Do emotions occur naturally in Human-Computer Interaction (HCI)? If so, how often, and which emotions?
2. Using the image of a person, can people assess emotions reliably?
3. What information do people use to assess emotions?
4. What HCI stimuli cause what emotion, and what is the user's experience of that emotion?

IBM's first two studies have provided evidence on the first two issues: some affective states (like anxiety and happiness) do occur in HCI, and people can use visual information to assess these states. Of course, those familiar with certain operating systems know that emotions can pop up in HCI every once in a while. But I assume that IBM is talking about visual cues more subtle than users pounding on monitors with their fists. In any event, people can visually detect emotions, and IBM hopes that if people can perform this assessment reliably, a computer can too. To test this hope, IBM has built Pong, a blue-eyed (of course) robo-head. Currently, Pong is a plastic and metal face that sits on a table and watches you with two ping-pong-ball eyes. Once it sees you, it smiles or frowns based on its interpretation of your mood. John Dvorak, computer pundit and AI hypemaster (see page 9), described interacting with Pong as "fascinating and creepy." IBM is apparently conducting further studies. For more information, see www.almaden.ibm.com/cs/blueeyes/.