SonoMyoNet: A Convolutional Neural Network for Predicting Isometric Force From Highly Sparse Ultrasound Images

IF 3.5 | JCR Q2 (COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE) | CAS Region 3, Computer Science | IEEE Transactions on Human-Machine Systems | Pub Date: 2024-03-13 | DOI: 10.1109/THMS.2024.3389690 | https://ieeexplore.ieee.org/document/10529644/
Anne Tryphosa Kamatham;Meena Alzamani;Allison Dockum;Siddhartha Sikdar;Biswarup Mukherjee
{"title":"SonoMyoNet:从高度稀疏的超声波图像预测等长力的卷积神经网络","authors":"Anne Tryphosa Kamatham;Meena Alzamani;Allison Dockum;Siddhartha Sikdar;Biswarup Mukherjee","doi":"10.1109/THMS.2024.3389690","DOIUrl":null,"url":null,"abstract":"Ultrasound imaging or sonomyography has been found to be a robust modality for measuring muscle activity due to its ability to image deep-seated muscles directly while providing superior spatiotemporal specificity compared with surface electromyography-based techniques. Quantifying the morphological changes during muscle activity involves computationally expensive approaches for tracking muscle anatomical structures or extracting features from brightness-mode (B-mode) images and amplitude-mode signals. This article uses an offline regression convolutional neural network called SonoMyoNet to estimate continuous isometric force from sparse ultrasound scanlines. SonoMyoNet learns features from a few equispaced scanlines selected from B-mode images and utilizes the learned features to estimate continuous isometric force accurately. The performance of SonoMyoNet was evaluated by varying the number of scanlines to simulate the placement of multiple single-element ultrasound transducers in a wearable system. Results showed that SonoMyoNet could accurately predict isometric force with just four scanlines and is immune to speckle noise and shifts in the scanline location. Thus, the proposed network reduces the computational load involved in feature tracking algorithms and estimates muscle force from the global features of sparse ultrasound images.","PeriodicalId":48916,"journal":{"name":"IEEE Transactions on Human-Machine Systems","volume":null,"pages":null},"PeriodicalIF":3.5000,"publicationDate":"2024-03-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"SonoMyoNet: A Convolutional Neural Network for Predicting Isometric Force From Highly Sparse Ultrasound Images\",\"authors\":\"Anne Tryphosa Kamatham;Meena Alzamani;Allison Dockum;Siddhartha Sikdar;Biswarup Mukherjee\",\"doi\":\"10.1109/THMS.2024.3389690\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Ultrasound imaging or sonomyography has been found to be a robust modality for measuring muscle activity due to its ability to image deep-seated muscles directly while providing superior spatiotemporal specificity compared with surface electromyography-based techniques. Quantifying the morphological changes during muscle activity involves computationally expensive approaches for tracking muscle anatomical structures or extracting features from brightness-mode (B-mode) images and amplitude-mode signals. This article uses an offline regression convolutional neural network called SonoMyoNet to estimate continuous isometric force from sparse ultrasound scanlines. SonoMyoNet learns features from a few equispaced scanlines selected from B-mode images and utilizes the learned features to estimate continuous isometric force accurately. The performance of SonoMyoNet was evaluated by varying the number of scanlines to simulate the placement of multiple single-element ultrasound transducers in a wearable system. Results showed that SonoMyoNet could accurately predict isometric force with just four scanlines and is immune to speckle noise and shifts in the scanline location. 
Thus, the proposed network reduces the computational load involved in feature tracking algorithms and estimates muscle force from the global features of sparse ultrasound images.\",\"PeriodicalId\":48916,\"journal\":{\"name\":\"IEEE Transactions on Human-Machine Systems\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":3.5000,\"publicationDate\":\"2024-03-13\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE Transactions on Human-Machine Systems\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/10529644/\",\"RegionNum\":3,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Transactions on Human-Machine Systems","FirstCategoryId":"94","ListUrlMain":"https://ieeexplore.ieee.org/document/10529644/","RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0

Abstract

Ultrasound imaging or sonomyography has been found to be a robust modality for measuring muscle activity due to its ability to image deep-seated muscles directly while providing superior spatiotemporal specificity compared with surface electromyography-based techniques. Quantifying the morphological changes during muscle activity involves computationally expensive approaches for tracking muscle anatomical structures or extracting features from brightness-mode (B-mode) images and amplitude-mode signals. This article uses an offline regression convolutional neural network called SonoMyoNet to estimate continuous isometric force from sparse ultrasound scanlines. SonoMyoNet learns features from a few equispaced scanlines selected from B-mode images and utilizes the learned features to estimate continuous isometric force accurately. The performance of SonoMyoNet was evaluated by varying the number of scanlines to simulate the placement of multiple single-element ultrasound transducers in a wearable system. Results showed that SonoMyoNet could accurately predict isometric force with just four scanlines and is immune to speckle noise and shifts in the scanline location. Thus, the proposed network reduces the computational load involved in feature tracking algorithms and estimates muscle force from the global features of sparse ultrasound images.
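The abstract describes selecting a few equispaced scanlines from full B-mode frames to mimic a sparse wearable array of single-element transducers. No code accompanies the abstract; the following is a minimal NumPy sketch of that selection step, assuming a frame stored as a 2-D array of shape (depth_samples, total_scanlines). The function name select_scanlines and all shapes are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def select_scanlines(b_mode_frame: np.ndarray, num_lines: int = 4) -> np.ndarray:
    """Pick num_lines equispaced scanlines (columns) from a B-mode frame.

    b_mode_frame: 2-D array of shape (depth_samples, total_scanlines).
    Returns an array of shape (depth_samples, num_lines).
    """
    total = b_mode_frame.shape[1]
    # Equispaced column indices across the full aperture, endpoints included.
    idx = np.linspace(0, total - 1, num_lines).astype(int)
    return b_mode_frame[:, idx]

# Example: a simulated 128-scanline frame reduced to the 4 lines the paper
# reports as sufficient for accurate force prediction.
frame = np.random.rand(512, 128).astype(np.float32)
sparse = select_scanlines(frame, num_lines=4)
print(sparse.shape)  # (512, 4)
```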
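For the regression stage, here is a hypothetical PyTorch sketch of a small CNN in the spirit of SonoMyoNet: it maps a stack of sparse scanlines to a scalar isometric force estimate. The class name SparseScanlineRegressor, the layer counts, and the kernel sizes are assumptions chosen for illustration; the published architecture may differ. The multiplicative-noise line is likewise only a crude stand-in for the speckle-robustness tests reported in the paper.

```python
import torch
import torch.nn as nn

class SparseScanlineRegressor(nn.Module):
    """Hypothetical CNN regressor: sparse scanlines -> scalar isometric force.

    Input shape: (batch, 1, depth_samples, num_lines), e.g. (N, 1, 512, 4).
    """
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            # Tall kernels suit the elongated depth-by-scanline input;
            # width-3 kernels with padding 1 preserve the scanline axis.
            nn.Conv2d(1, 16, kernel_size=(7, 3), padding=(3, 1)),
            nn.ReLU(),
            nn.MaxPool2d(kernel_size=(4, 1)),
            nn.Conv2d(16, 32, kernel_size=(5, 1), padding=(2, 0)),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d((8, 1)),  # fixed-size summary of each channel
        )
        self.regressor = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 8, 64),
            nn.ReLU(),
            nn.Linear(64, 1),  # continuous force estimate
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.regressor(self.features(x)).squeeze(-1)

model = SparseScanlineRegressor()
x = torch.randn(8, 1, 512, 4)  # batch of 4-scanline inputs
# Crude robustness probe: multiplicative noise as a stand-in for speckle.
x_noisy = x * (1.0 + 0.1 * torch.randn_like(x))
print(model(x).shape, model(x_noisy).shape)  # torch.Size([8]) torch.Size([8])
```

Training such a model would minimize a regression loss, e.g. mean squared error between predicted and measured isometric force.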
Source journal
IEEE Transactions on Human-Machine Systems
Categories: COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE; COMPUTER SCIENCE, CYBERNETICS
CiteScore: 7.10
Self-citation rate: 11.10%
Annual publications: 136
Journal introduction: The scope of the IEEE Transactions on Human-Machine Systems includes the field of human-machine systems. It covers human systems and human-organizational interactions, including cognitive ergonomics, system test and evaluation, and human information processing concerns in systems and organizations.