Transforming Healthcare: Leveraging Vision-Based Neural Networks for Smart Home Patient Monitoring
Hicham Gibet Tani, Lamiae Eloutouate, F. Elouaai, M. Bouhorma, Mohamed Walid Hajoub
International Journal of Online and Biomedical Engineering, published 2023-08-01. DOI: 10.3991/ijoe.v19i10.40381
Abstract
Image captioning is a promising technique for remote monitoring of patient behavior, enabling healthcare providers to identify changes in patient routines and conditions. In this study, we explore the use of transformer neural networks for image caption generation from surveillance camera footage captured at one-minute intervals. Our goal is to develop and evaluate a transformer neural network model, trained and tested on the COCO (Common Objects in Context) dataset, for generating captions that describe patient behavior. Furthermore, we compare the proposed approach with a traditional convolutional neural network (CNN) method to highlight its advantages. Our findings demonstrate the potential of transformer neural networks to generate natural language descriptions of patient behavior, which can provide valuable insights for healthcare providers. Such technology can allow for more efficient monitoring of patients, enabling timely interventions when necessary. Moreover, our study highlights the potential of transformer neural networks to identify patterns and trends in patient behavior over time, which can aid in developing personalized healthcare plans.
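For context, the sketch below illustrates the kind of pipeline the abstract describes: grabbing one surveillance frame per minute and passing it to a pretrained image-captioning transformer to produce a natural-language description. It is not the authors' implementation; the OpenCV capture loop, the Hugging Face image-to-text pipeline, and the nlpconnect/vit-gpt2-image-captioning checkpoint are stand-in assumptions used purely for illustration.

```python
# Illustrative sketch only (not the paper's model): caption a surveillance
# frame once per minute with a pretrained vision-encoder-decoder transformer.
# Assumes OpenCV for frame capture and a publicly available ViT+GPT-2
# captioning checkpoint served through the Hugging Face pipeline API.
import time

import cv2
from PIL import Image
from transformers import pipeline

CAPTURE_INTERVAL_SECONDS = 60  # one frame per minute, as described in the abstract

# Pretrained captioning model used as a stand-in for a COCO-trained transformer
captioner = pipeline("image-to-text", model="nlpconnect/vit-gpt2-image-captioning")

camera = cv2.VideoCapture(0)  # hypothetical index for the room's surveillance camera

try:
    while True:
        ok, frame_bgr = camera.read()
        if ok:
            # OpenCV returns BGR arrays; convert to an RGB PIL image for the pipeline
            frame_rgb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB)
            image = Image.fromarray(frame_rgb)
            caption = captioner(image)[0]["generated_text"]
            # A real system would log or forward the caption to care staff;
            # here we simply print it with a timestamp.
            print(f"{time.strftime('%Y-%m-%d %H:%M:%S')} - {caption}")
        time.sleep(CAPTURE_INTERVAL_SECONDS)
finally:
    camera.release()
```

In a deployment such as the one the paper targets, the generated captions would be stored per patient so that changes in routine can be detected over time, rather than printed as in this minimal example.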