A Survey of Conversational Agents and Their Applications for Self-Management of Chronic Conditions
Pub Date: 2023-06-01 | Epub Date: 2023-08-02 | DOI: 10.1109/COMPSAC57700.2023.00162
Min Sook Park, Paramita Basak Upama, Adib Ahmed Anik, Sheikh Iqbal Ahamed, Jake Luo, Shiyu Tian, Masud Rabbani, Hyungkyoung Oh
Conversational agents have gained ground in daily life and in various domains, including healthcare. Chronic condition self-management is one of the promising healthcare areas in which conversational agents show significant potential to help alleviate the burden that chronic conditions place on healthcare. This survey paper introduces and outlines types of conversational agents, their generic architecture and workflow, the technologies used to implement them, and their application to chronic condition self-management.
Proceedings: Annual International Computer Software and Applications Conference (COMPSAC), 2023, pp. 1064-1075. Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10519706/pdf/nihms-1932207.pdf
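As a rough illustration of the generic architecture such surveys outline (language understanding, dialogue state tracking, and response generation), the Python sketch below wires those three stages together for a chronic-condition self-management scenario. The intents, keywords, and replies are hypothetical examples, not components described in the paper.

```python
# Minimal sketch of a generic conversational-agent workflow:
# understand the utterance -> update dialogue state -> generate a response.
# All intents, keywords, and replies are hypothetical illustrations.
from typing import Optional


def first_number(text: str) -> Optional[float]:
    """Return the first numeric token in the utterance, if any."""
    for token in text.replace("/", " ").split():
        try:
            return float(token)
        except ValueError:
            continue
    return None


def understand(utterance: str) -> dict:
    """Tiny keyword-based language-understanding step: intent plus value slot."""
    text = utterance.lower()
    if "glucose" in text or "sugar" in text:
        return {"intent": "report_glucose", "value": first_number(text)}
    if "pressure" in text:
        return {"intent": "report_blood_pressure", "value": first_number(text)}
    return {"intent": "unknown", "value": None}


def respond(state: dict, parsed: dict) -> str:
    """Dialogue management plus template-based response generation."""
    if parsed["intent"] == "report_glucose" and parsed["value"] is not None:
        state.setdefault("glucose_log", []).append(parsed["value"])
        return f"Logged a glucose reading of {parsed['value']:.0f} mg/dL."
    if parsed["intent"] == "report_blood_pressure" and parsed["value"] is not None:
        state.setdefault("bp_log", []).append(parsed["value"])
        return f"Logged a blood pressure reading of {parsed['value']:.0f}."
    return "Sorry, I didn't catch that. You can report a glucose or blood pressure reading."


if __name__ == "__main__":
    session: dict = {}
    print(respond(session, understand("My sugar was 132 this morning")))
```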
Message from the Standing Committee Vice Chairs
Pub Date: 2022-06-01 | DOI: 10.1109/COMPSAC54236.2022.00006
Sheikh Iqbal Ahamed, Mohammad Zulkernine
Proceedings: Annual International Computer Software and Applications Conference (COMPSAC), 2022, p. xl.
Towards Developing a Voice-activated Self-monitoring Application (VoiS) for Adults with Diabetes and Hypertension
Pub Date: 2022-06-01 | Epub Date: 2022-08-10 | DOI: 10.1109/compsac54236.2022.00095
Masud Rabbani, Shiyu Tian, Adib Ahmed Anik, Jake Luo, Min Sook Park, Jeff Whittle, Sheikh Iqbal Ahamed, Hyunkyoung Oh
The integration of motivational strategies and self-management theory with mHealth tools is a promising approach to changing the behavior of patients with chronic disease. In this manuscript, we describe the development and current architecture of a prototype voice-activated self-monitoring application (VoiS) that is based on these theories. Unlike prior mHealth applications that require textual input, the VoiS app relies on the more convenient and adaptable approach of asking users to verbally report markers of diabetes and hypertension control through a smart speaker. The VoiS app can provide real-time feedback based on these markers; thus, it has the potential to serve as a remote, regular source of feedback to support behavior change. To enhance the usability and acceptability of the VoiS application, we will ask a diverse group of patients to use it in real-world settings and provide feedback on their experience. We will use this feedback to optimize the tool's performance so that it can give patients an improved understanding of their chronic conditions. The VoiS app can also facilitate remote sharing of chronic disease control data with healthcare providers, which can improve clinical efficacy and reduce the urgency and frequency of clinical care encounters. Because the VoiS app will be configured for use with multiple platforms, it will be more robust than existing systems with respect to user accessibility and acceptability.
Proceedings: Annual International Computer Software and Applications Conference (COMPSAC), 2022, pp. 512-519. Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9805835/pdf/nihms-1855485.pdf
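To make the real-time feedback step described above concrete, here is a minimal sketch of threshold-based feedback on verbally reported markers of hypertension and diabetes control. The cutoffs are common published reference ranges used purely for illustration; the function names and rules are assumptions, not the VoiS implementation.

```python
# Hypothetical sketch of real-time feedback on self-reported markers of
# hypertension and diabetes control. Thresholds are common reference ranges
# used only for illustration; they are not taken from the VoiS application.

def blood_pressure_feedback(systolic: int, diastolic: int) -> str:
    """Map a spoken blood pressure reading to a short feedback message."""
    if systolic >= 180 or diastolic >= 120:
        return "This reading is very high. Please contact your care team right away."
    if systolic >= 130 or diastolic >= 80:
        return "This reading is above the usual target range. Keep monitoring it."
    return "This reading is within the usual target range. Nice work."


def fasting_glucose_feedback(mg_dl: float) -> str:
    """Map a spoken fasting glucose reading to a short feedback message."""
    if mg_dl < 70:
        return "This glucose value is low. Consider a fast-acting carbohydrate."
    if mg_dl > 130:
        return "This fasting glucose value is above the usual target range."
    return "This fasting glucose value is within the usual target range."


if __name__ == "__main__":
    print(blood_pressure_feedback(142, 88))   # above the usual target range
    print(fasting_glucose_feedback(118))      # within the usual target range
```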
Message from the 2022 Program Chairs-in-Chief
Pub Date: 2022-06-01 | DOI: 10.1109/COMPSAC54236.2022.00007
H. Leong, Sahra Sedigh Sarvestani, Y. Teranishi
Proceedings: Annual International Computer Software and Applications Conference (COMPSAC), 2022, p. xl.
An access control model considering with transitions of access rights based on the blockchain
Pub Date: 2022-01-01 | DOI: 10.1109/COMPSAC54236.2022.00285
H. Kinoshita, T. Morizumi
Proceedings: Annual International Computer Software and Applications Conference (COMPSAC), 2022, pp. 1792-1797.
Pain Action Unit Detection in Critically Ill Patients
Pub Date: 2021-07-01 | Epub Date: 2021-09-09 | DOI: 10.1109/compsac51774.2021.00094
Subhash Nerella, Julie Cupka, Matthew Ruppert, Patrick Tighe, Azra Bihorac, Parisa Rashidi
Existing pain assessment methods in the intensive care unit rely on patient self-report or visual observation by nurses. Patient self-report is subjective and can suffer from poor recall. For non-verbal patients, behavioral pain assessment methods provide limited granularity, are subjective, and place an additional burden on already overworked staff. Previous studies have shown the feasibility of autonomous pain expression assessment by detecting Facial Action Units (AUs). However, previous approaches for detecting facial pain AUs have been limited to controlled environments. In this study, for the first time, we collected and annotated a pain-related AU dataset, Pain-ICU, containing 55,085 images from critically ill adult patients. We evaluated the performance of OpenFace, an open-source facial behavior analysis tool, and the trained AU R-CNN model on our Pain-ICU dataset. Variables such as assisted breathing devices, environmental lighting, and patient orientation with respect to the camera make AU detection harder than in controlled settings. Although OpenFace has shown state-of-the-art results on general-purpose AU detection tasks, it could not accurately detect AUs in our Pain-ICU dataset (F1-score of 0.42). To address this problem, we trained the AU R-CNN model on our Pain-ICU dataset, resulting in a satisfactory average F1-score of 0.77. In this study, we show the feasibility of detecting facial pain AUs in uncontrolled ICU settings.
Proceedings: Annual International Computer Software and Applications Conference (COMPSAC), 2021, pp. 645-651. Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8552410/pdf/nihms-1747870.pdf
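The comparison above is framed in terms of F1-scores averaged over facial Action Units (0.42 for OpenFace versus 0.77 for the retrained AU R-CNN). The sketch below shows one common way to compute such a macro-averaged F1 over per-AU binary labels; the label matrices are random placeholders, not Pain-ICU data.

```python
# Sketch of a macro-averaged F1 evaluation over facial Action Units (AUs).
# The "ground truth" and "predictions" below are random placeholders, not
# data or results from the Pain-ICU study.
import numpy as np
from sklearn.metrics import f1_score


def macro_f1_over_aus(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    """y_true, y_pred: (n_images, n_aus) binary matrices of AU occurrences."""
    per_au = [
        f1_score(y_true[:, k], y_pred[:, k], zero_division=0)
        for k in range(y_true.shape[1])
    ]
    return float(np.mean(per_au))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    truth = rng.integers(0, 2, size=(200, 5))        # 200 images, 5 AUs
    flip = rng.random((200, 5)) > 0.8                # corrupt ~20% of the labels
    preds = np.where(flip, 1 - truth, truth)
    print(f"macro F1: {macro_f1_over_aus(truth, preds):.2f}")
```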
An Adaptively Parameterized Algorithm Estimating Respiratory Rate from a Passive Wearable RFID Smart Garment
Pub Date: 2021-07-01 | Epub Date: 2021-09-09 | DOI: 10.1109/COMPSAC51774.2021.00110
Robert Ross, William M Mongan, Patrick O'Neill, Ilhaan Rasheed, Adam Fontecchio, Genevieve Dion, Kapil R Dandekar
Currently, wired respiratory rate sensors tether patients to a location and can obscure their bodies from medical staff. In addition, current wired respiratory rate sensors are either inaccurate or invasive. Spurred by these deficiencies, we have developed the Bellyband, a less invasive smart garment sensor that uses wireless, passive Radio Frequency Identification (RFID) to detect bio-signals. Although the Bellyband solves many physical problems, it creates a signal processing challenge due to its noisy, quantized signal. Here, we present an algorithm for estimating respiratory rate from the Bellyband. The algorithm uses an adaptively parameterized Savitzky-Golay (SG) filter to smooth the signal. The adaptive parameterization makes the algorithm effective over a wide range of respiratory frequencies, even when the frequencies change sharply. Further, the algorithm is three times faster and three times more accurate than the current Bellyband respiratory rate detection algorithm and is able to run in real time. Using an off-the-shelf respiratory monitor and metronome-synchronized breathing, we gathered 25 sets of data and tested the algorithm against these trials. The algorithm's respiratory rate estimates diverged from ground truth by an average Root Mean Square Error (RMSE) of 4.1 breaths per minute (BPM) over all 25 trials. Further, preliminary results suggest that the algorithm could be made as accurate as, or more accurate than, widely used algorithms that detect the respiratory rate of non-ventilated patients using data from an Electrocardiogram (ECG) or Impedance Plethysmography (IP).
Proceedings: Annual International Computer Software and Applications Conference (COMPSAC), 2021, pp. 774-784. Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8463037/pdf/nihms-1701078.pdf
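The processing chain described above (Savitzky-Golay smoothing with a window tied to the dominant breathing frequency, peak-based rate estimation, and error reported in breaths per minute) can be sketched as follows. The FFT-based window heuristic and all parameter values are illustrative assumptions, not the authors' adaptive parameterization.

```python
# Illustrative sketch: smooth a noisy 1-D respiration signal with a
# Savitzky-Golay filter whose window tracks the dominant breathing frequency,
# then estimate the rate in breaths per minute (BPM) by counting peaks.
# The window heuristic and constants are assumptions for illustration only.
import numpy as np
from scipy.signal import find_peaks, savgol_filter


def estimate_bpm(signal: np.ndarray, fs: float) -> float:
    """Estimate respiratory rate (BPM) from a noisy signal sampled at fs Hz."""
    # Rough dominant frequency from the FFT, used to size the smoothing window.
    spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
    f_dom = max(freqs[np.argmax(spectrum[1:]) + 1], 0.1)  # skip the DC bin
    window = max(int(fs / f_dom) | 1, 5)                  # odd window ~ one breath period
    smoothed = savgol_filter(signal, window_length=window, polyorder=3)
    # Require detected peaks to be at least half a breath period apart.
    peaks, _ = find_peaks(smoothed, distance=max(int(0.5 * fs / f_dom), 1))
    return len(peaks) / (signal.size / fs / 60.0)


if __name__ == "__main__":
    fs, true_bpm, seconds = 20.0, 15.0, 60.0
    t = np.arange(0.0, seconds, 1.0 / fs)
    clean = np.sin(2.0 * np.pi * (true_bpm / 60.0) * t)
    noisy = clean + 0.3 * np.random.default_rng(1).standard_normal(t.size)
    est = estimate_bpm(noisy, fs)
    # With a single synthetic trial the RMSE reduces to the absolute error.
    print(f"estimated {est:.1f} BPM, error {abs(est - true_bpm):.1f} BPM")
```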
Design Scheme of Perceptual Hashing based on Output of CNN for Digital Watermarking
Pub Date: 2021-01-01 | DOI: 10.1109/COMPSAC51774.2021.00189
Zhaoxiong Meng, T. Morizumi, S. Miyata, H. Kinoshita
Proceedings: Annual International Computer Software and Applications Conference (COMPSAC), 2021, pp. 1345-1350.