Advanced Machine Learning Tools to Monitor Biomarkers of Dysphagia: A Wearable Sensor Proof-of-Concept Study.

Digital Biomarkers (Q1, Computer Science). Vol. 5, No. 2, pp. 167-175. Published online 2021-07-27; eCollection 2021-05-01. DOI: 10.1159/000517144. Cited 7 times.

Megan K O'Brien, Olivia K Botonis, Elissa Larkin, Julia Carpenter, Bonnie Martin-Harris, Rachel Maronati, KunHyuck Lee, Leora R Cherney, Brianna Hutchison, Shuai Xu, John A Rogers, Arun Jayaraman

Abstract

Introduction: Difficulty swallowing (dysphagia) occurs frequently in patients with neurological disorders and can lead to aspiration, choking, and malnutrition. Dysphagia is typically diagnosed using costly, invasive imaging procedures or subjective, qualitative bedside examinations. Wearable sensors are a promising alternative for noninvasively and objectively measuring physiological signals relevant to swallowing. An ongoing challenge with this approach is consolidating these complex signals into sensitive, clinically meaningful metrics of swallowing performance. To address this gap, we propose two novel digital monitoring tools that evaluate swallows using wearable sensor data and machine learning.

Methods: Biometric swallowing and respiration signals from wearable, mechano-acoustic sensors were compared between patients with poststroke dysphagia and nondysphagic controls while swallowing foods and liquids of different consistencies, in accordance with the Mann Assessment of Swallowing Ability (MASA). Two machine learning approaches were developed to (1) classify the severity of impairment for each swallow, with model confidence ratings for transparent clinical decision support, and (2) compute a similarity measure of each swallow to nondysphagic performance. Task-specific models were trained using swallow kinematics and respiratory features from 505 swallows (321 from patients and 184 from controls).
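The abstract does not specify the model architectures or the exact kinematic and respiratory features used, so the following is a minimal sketch of the two approaches under stated assumptions: a random-forest severity classifier whose class probabilities serve as confidence ratings, and a similarity score defined as standardized distance to the control-feature centroid. All variable names, feature dimensions, and the severity scale are hypothetical.

```python
# Minimal, illustrative sketch of the two per-swallow analysis approaches.
# The paper's actual models and features are not given in the abstract; the
# classifier choice, similarity definition, and all names below are assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Placeholder per-swallow feature matrices: rows = swallows, columns = kinematic
# and respiratory features (e.g., swallow duration, respiratory-pause timing).
X_patients = rng.normal(loc=1.0, size=(321, 8))   # 321 patient swallows
X_controls = rng.normal(loc=0.0, size=(184, 8))   # 184 control swallows
severity = rng.integers(0, 3, size=321)           # hypothetical 3-level severity labels

# Approach 1: classify impairment severity per swallow and report the winning
# class probability as a confidence rating for transparent decision support.
clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_patients, severity)
probs = clf.predict_proba(X_patients[:1])[0]
label = clf.classes_[np.argmax(probs)]
print(f"predicted severity {label}, confidence {probs.max():.2f}")

# Approach 2: score each swallow's similarity to nondysphagic performance,
# here (as an assumption) via standardized Euclidean distance to the control
# centroid, mapped to (0, 1] where 1 means indistinguishable from controls.
mu, sigma = X_controls.mean(axis=0), X_controls.std(axis=0) + 1e-8

def similarity_to_controls(x: np.ndarray) -> float:
    return 1.0 / (1.0 + float(np.linalg.norm((x - mu) / sigma)))

print(f"similarity of first patient swallow: {similarity_to_controls(X_patients[0]):.3f}")
```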

Results: These models provide sensitive metrics to gauge impairment on a per-swallow basis. Both approaches revealed intrasubject swallow variability and patient-specific changes that were not captured by the MASA alone. Sensor measures encoding respiratory-swallow coordination were important features relating to dysphagia presence and severity. Puree swallows exhibited greater differences from controls than saliva swallows or liquid sips (p < 0.037).
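The abstract does not name the statistical test behind p < 0.037. As one way such a per-consistency comparison could be run, the sketch below applies a nonparametric Mann-Whitney U test to hypothetical similarity-to-control scores grouped by bolus consistency; the data and test choice are assumptions, not the paper's analysis.

```python
# Illustrative only: the test and data below are assumptions, not the paper's analysis.
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(1)

# Hypothetical per-swallow similarity-to-control scores grouped by bolus consistency.
puree_scores = rng.beta(2, 5, size=60)    # farther from control performance on average
saliva_scores = rng.beta(5, 2, size=60)

stat, p_value = mannwhitneyu(puree_scores, saliva_scores, alternative="two-sided")
print(f"U = {stat:.1f}, p = {p_value:.4g}")
```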

Discussion: Developing interpretable tools is critical to optimize the clinical utility of novel, sensor-based measurement techniques. The proof-of-concept models proposed here provide concrete, communicable evidence to track dysphagia recovery over time. With refined training schemes and real-world validation, these tools can be deployed to automatically measure and monitor swallowing in the clinic and community for patients across the impairment spectrum.

Source journal: Digital Biomarkers (Medicine, miscellaneous)
CiteScore: 10.60
Self-citation rate: 0.00%
Articles published: 12
Review time: 23 weeks