{"title":"与卧床病人互动的Kinect 3D传感器的虚拟界面:第一见解","authors":"Vítor H. Carvalho, José Eusébio","doi":"10.4018/ijhisi.294114","DOIUrl":null,"url":null,"abstract":"The human-machine interaction has evolved significantly in the last years, allowing a new range of opportunities for developing solutions for people with physical limitations. Natural user interfaces (NUI) allow bedridden and/or physically disabled people to perform a set of actions trough gestures thus increasing their quality of life and autonomy. This paper presents a solution based on image processing and computer vision using the Kinect 3D sensor for development of applications that recognize gestures made by the human hand. The gestures are then identified by a software application that triggers a set of actions of upmost importance for the bedridden person, for example, trigger the emergency, switch on/off the TV or control the bed slope. It was used a shape matching technique for six gestures recognition, being the final actions activated by the Arduino platform. The results show a success rate of 96%. This system can improve the quality of life and autonomy of bedridden people, being able to be adapted for the specific necessities of an individual subject.","PeriodicalId":101861,"journal":{"name":"Int. J. Heal. Inf. Syst. Informatics","volume":"5 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Virtual Interface With Kinect 3D Sensor for Interaction With Bedridden People: First Insights\",\"authors\":\"Vítor H. Carvalho, José Eusébio\",\"doi\":\"10.4018/ijhisi.294114\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The human-machine interaction has evolved significantly in the last years, allowing a new range of opportunities for developing solutions for people with physical limitations. Natural user interfaces (NUI) allow bedridden and/or physically disabled people to perform a set of actions trough gestures thus increasing their quality of life and autonomy. This paper presents a solution based on image processing and computer vision using the Kinect 3D sensor for development of applications that recognize gestures made by the human hand. The gestures are then identified by a software application that triggers a set of actions of upmost importance for the bedridden person, for example, trigger the emergency, switch on/off the TV or control the bed slope. It was used a shape matching technique for six gestures recognition, being the final actions activated by the Arduino platform. The results show a success rate of 96%. This system can improve the quality of life and autonomy of bedridden people, being able to be adapted for the specific necessities of an individual subject.\",\"PeriodicalId\":101861,\"journal\":{\"name\":\"Int. J. Heal. Inf. Syst. Informatics\",\"volume\":\"5 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2021-10-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Int. J. Heal. Inf. Syst. 
Informatics\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.4018/ijhisi.294114\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Int. J. Heal. Inf. Syst. Informatics","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.4018/ijhisi.294114","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Virtual Interface With Kinect 3D Sensor for Interaction With Bedridden People: First Insights
Human-machine interaction has evolved significantly in recent years, opening a new range of opportunities for developing solutions for people with physical limitations. Natural user interfaces (NUI) allow bedridden and/or physically disabled people to perform a set of actions through gestures, thus increasing their quality of life and autonomy. This paper presents a solution based on image processing and computer vision, using the Kinect 3D sensor, for the development of applications that recognize gestures made by the human hand. The gestures are identified by a software application that triggers a set of actions of utmost importance for a bedridden person, for example, triggering an emergency call, switching the TV on/off, or controlling the bed slope. A shape-matching technique was used to recognize six gestures, with the final actions actuated through an Arduino platform. The results show a success rate of 96%. This system can improve the quality of life and autonomy of bedridden people and can be adapted to the specific needs of an individual subject.
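The abstract describes the pipeline only at a high level. The following is a minimal illustrative sketch of that kind of pipeline, not the authors' implementation: it assumes OpenCV for contour-based shape matching (Hu moments via `cv2.matchShapes`) and pySerial for the Arduino link; the template file names, serial port, command bytes, and threshold are hypothetical placeholders, and an ordinary camera stream stands in for the Kinect depth stream.

```python
# Illustrative sketch of the gesture pipeline described in the abstract:
# 1) segment the hand in each frame, 2) match its contour against stored
# gesture templates, 3) send the matched action code to an Arduino over serial.
# Template files, serial port, command bytes and threshold are assumptions.

import cv2
import serial

# Hypothetical mapping: template image -> one-byte command the Arduino firmware understands.
GESTURE_TEMPLATES = {
    "emergency.png": b"E",   # trigger the emergency call
    "tv_toggle.png": b"T",   # switch the TV on/off
    "bed_up.png":    b"U",   # raise the bed slope
    "bed_down.png":  b"D",   # lower the bed slope
}

MATCH_THRESHOLD = 0.15  # lower matchShapes score = more similar (tuning assumed)

def largest_contour(binary_img):
    """Return the largest external contour, taken here as the hand region."""
    contours, _ = cv2.findContours(binary_img, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    return max(contours, key=cv2.contourArea) if contours else None

def classify(hand_contour, templates):
    """Compare the hand contour against each template using Hu-moment shape matching."""
    best_cmd, best_score = None, float("inf")
    for path, cmd in templates.items():
        tmpl = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
        _, tmpl_bin = cv2.threshold(tmpl, 127, 255, cv2.THRESH_BINARY)
        tmpl_contour = largest_contour(tmpl_bin)
        if tmpl_contour is None:
            continue
        score = cv2.matchShapes(hand_contour, tmpl_contour,
                                cv2.CONTOURS_MATCH_I1, 0.0)
        if score < best_score:
            best_cmd, best_score = cmd, score
    return best_cmd if best_score < MATCH_THRESHOLD else None

def main():
    arduino = serial.Serial("/dev/ttyACM0", 9600, timeout=1)  # port is an assumption
    cap = cv2.VideoCapture(0)  # stands in for the Kinect depth stream
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        _, mask = cv2.threshold(gray, 0, 255,
                                cv2.THRESH_BINARY + cv2.THRESH_OTSU)
        hand = largest_contour(mask)
        if hand is not None:
            cmd = classify(hand, GESTURE_TEMPLATES)
            if cmd is not None:
                arduino.write(cmd)  # Arduino actuates the matched action
        if cv2.waitKey(30) & 0xFF == 27:  # Esc to quit
            break
    cap.release()
    arduino.close()

if __name__ == "__main__":
    main()
```

In this sketch the Arduino side would simply read one byte from serial and drive the corresponding relay or actuator (emergency alert, TV power, bed motor); the depth-based hand segmentation that the Kinect enables is replaced here by a plain intensity threshold for brevity.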