{"title":"通过接近、自然手势和眼神来增强人类互动能力","authors":"Mihai Bâce","doi":"10.1145/3098279.3119924","DOIUrl":null,"url":null,"abstract":"Nowadays, humans are surrounded by many complex computer systems. When people interact among each other, they use multiple modalities including voice, body posture, hand gestures, facial expressions, or eye gaze. Currently, computers can only understand a small subset of these modalities, but such cues can be captured by an increasing number of wearable devices. This research aims to improve traditional human-human and human-machine interaction by augmenting humans with wearable technology and developing novel user interfaces. More specifically, (i) we investigate and develop systems that enable a group of people in close proximity to interact using in-air hand gestures and facilitate effortless information sharing. Additionally, we focus on (ii) eye gaze which can further enrich the interaction between humans and cyber-physical systems.","PeriodicalId":120153,"journal":{"name":"Proceedings of the 19th International Conference on Human-Computer Interaction with Mobile Devices and Services","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2017-09-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":"{\"title\":\"Augmenting human interaction capabilities with proximity, natural gestures, and eye gaze\",\"authors\":\"Mihai Bâce\",\"doi\":\"10.1145/3098279.3119924\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Nowadays, humans are surrounded by many complex computer systems. When people interact among each other, they use multiple modalities including voice, body posture, hand gestures, facial expressions, or eye gaze. Currently, computers can only understand a small subset of these modalities, but such cues can be captured by an increasing number of wearable devices. This research aims to improve traditional human-human and human-machine interaction by augmenting humans with wearable technology and developing novel user interfaces. More specifically, (i) we investigate and develop systems that enable a group of people in close proximity to interact using in-air hand gestures and facilitate effortless information sharing. 
Additionally, we focus on (ii) eye gaze which can further enrich the interaction between humans and cyber-physical systems.\",\"PeriodicalId\":120153,\"journal\":{\"name\":\"Proceedings of the 19th International Conference on Human-Computer Interaction with Mobile Devices and Services\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2017-09-04\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"3\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings of the 19th International Conference on Human-Computer Interaction with Mobile Devices and Services\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/3098279.3119924\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 19th International Conference on Human-Computer Interaction with Mobile Devices and Services","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3098279.3119924","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Augmenting human interaction capabilities with proximity, natural gestures, and eye gaze
Humans are now surrounded by many complex computer systems. When people interact with one another, they use multiple modalities, such as voice, body posture, hand gestures, facial expressions, and eye gaze. Computers can currently understand only a small subset of these modalities, but such cues can be captured by a growing number of wearable devices. This research aims to improve traditional human-human and human-machine interaction by augmenting humans with wearable technology and developing novel user interfaces. More specifically, (i) we investigate and develop systems that enable a group of people in close proximity to interact using in-air hand gestures and to share information effortlessly. Additionally, we focus on (ii) eye gaze, which can further enrich the interaction between humans and cyber-physical systems.
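To make the idea of wearable-captured in-air gestures more concrete, the following is a minimal, hypothetical sketch and not the author's implementation: it detects a coarse lateral "swipe" from wrist-worn accelerometer samples and treats it as a trigger to share an item with a nearby peer. The sample format, thresholds, and gesture vocabulary are all illustrative assumptions.

```python
# Hypothetical sketch (not the system described in the abstract): detect a short,
# strong lateral acceleration burst from a wrist-worn sensor and interpret it as
# an in-air swipe that triggers sharing with a nearby device.
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class AccelSample:
    t: float  # timestamp in seconds
    x: float  # acceleration along the wrist's lateral axis, in m/s^2


def detect_swipe(samples: List[AccelSample],
                 threshold: float = 8.0,      # assumed burst threshold
                 max_duration: float = 0.5) -> Optional[str]:
    """Return 'left' or 'right' if a brief lateral burst settles within max_duration."""
    for i, s in enumerate(samples):
        if abs(s.x) < threshold:
            continue
        # Found a burst; check that it dies down quickly enough to count as a swipe.
        for later in samples[i + 1:]:
            if later.t - s.t > max_duration:
                break
            if abs(later.x) < threshold / 4:
                return "right" if s.x > 0 else "left"
    return None


if __name__ == "__main__":
    # Synthetic trace: a brief rightward burst followed by rest.
    trace = [AccelSample(0.00, 0.3), AccelSample(0.05, 9.1),
             AccelSample(0.10, 10.4), AccelSample(0.20, 1.2),
             AccelSample(0.30, 0.4)]
    gesture = detect_swipe(trace)
    if gesture is not None:
        print(f"Detected in-air swipe: {gesture} -> share item with nearby peer")
```

In practice, a system of the kind the abstract describes would combine such motion cues with proximity sensing and richer gesture recognition; this sketch only illustrates the basic sensing-to-action step under those stated assumptions.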