KineWheel-DeepLabCut Automated Paw Annotation Using Alternating Stroboscopic UV and White Light Illumination.

eNeuro · IF 2.7 · CAS Zone 3 (Medicine) · JCR Q3 (Neurosciences) · Pub Date: 2024-08-29 · Print Date: 2024-08-01 · DOI: 10.1523/ENEURO.0304-23.2024
Björn Albrecht, Alexej Schatz, Katja Frei, York Winter
{"title":"KineWheel-DeepLabCut 使用交替频闪紫外光和白光照明自动注释爪子。","authors":"Björn Albrecht, Alexej Schatz, Katja Frei, York Winter","doi":"10.1523/ENEURO.0304-23.2024","DOIUrl":null,"url":null,"abstract":"<p><p>Uncovering the relationships between neural circuits, behavior, and neural dysfunction may require rodent pose tracking. While open-source toolkits such as DeepLabCut have revolutionized markerless pose estimation using deep neural networks, the training process still requires human intervention for annotating key points of interest in video data. To further reduce human labor for neural network training, we developed a method that automatically generates annotated image datasets of rodent paw placement in a laboratory setting. It uses invisible but fluorescent markers that become temporarily visible under UV light. Through stroboscopic alternating illumination, adjacent video frames taken at 720 Hz are either UV or white light illuminated. After color filtering the UV-exposed video frames, the UV markings are identified and the paw locations are deterministically mapped. This paw information is then transferred to automatically annotate paw positions in the next white light-exposed frame that is later used for training the neural network. We demonstrate the effectiveness of our method using a KineWheel-DeepLabCut setup for the markerless tracking of the four paws of a harness-fixed mouse running on top of the transparent wheel with mirror. Our automated approach, made available open-source, achieves high-quality position annotations and significantly reduces the need for human involvement in the neural network training process, paving the way for more efficient and streamlined rodent pose tracking in neuroscience research.</p>","PeriodicalId":11617,"journal":{"name":"eNeuro","volume":null,"pages":null},"PeriodicalIF":2.7000,"publicationDate":"2024-08-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11363514/pdf/","citationCount":"0","resultStr":"{\"title\":\"KineWheel-DeepLabCut Automated Paw Annotation Using Alternating Stroboscopic UV and White Light Illumination.\",\"authors\":\"Björn Albrecht, Alexej Schatz, Katja Frei, York Winter\",\"doi\":\"10.1523/ENEURO.0304-23.2024\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><p>Uncovering the relationships between neural circuits, behavior, and neural dysfunction may require rodent pose tracking. While open-source toolkits such as DeepLabCut have revolutionized markerless pose estimation using deep neural networks, the training process still requires human intervention for annotating key points of interest in video data. To further reduce human labor for neural network training, we developed a method that automatically generates annotated image datasets of rodent paw placement in a laboratory setting. It uses invisible but fluorescent markers that become temporarily visible under UV light. Through stroboscopic alternating illumination, adjacent video frames taken at 720 Hz are either UV or white light illuminated. After color filtering the UV-exposed video frames, the UV markings are identified and the paw locations are deterministically mapped. This paw information is then transferred to automatically annotate paw positions in the next white light-exposed frame that is later used for training the neural network. 
We demonstrate the effectiveness of our method using a KineWheel-DeepLabCut setup for the markerless tracking of the four paws of a harness-fixed mouse running on top of the transparent wheel with mirror. Our automated approach, made available open-source, achieves high-quality position annotations and significantly reduces the need for human involvement in the neural network training process, paving the way for more efficient and streamlined rodent pose tracking in neuroscience research.</p>\",\"PeriodicalId\":11617,\"journal\":{\"name\":\"eNeuro\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":2.7000,\"publicationDate\":\"2024-08-29\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11363514/pdf/\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"eNeuro\",\"FirstCategoryId\":\"3\",\"ListUrlMain\":\"https://doi.org/10.1523/ENEURO.0304-23.2024\",\"RegionNum\":3,\"RegionCategory\":\"医学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"2024/8/1 0:00:00\",\"PubModel\":\"Print\",\"JCR\":\"Q3\",\"JCRName\":\"NEUROSCIENCES\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"eNeuro","FirstCategoryId":"3","ListUrlMain":"https://doi.org/10.1523/ENEURO.0304-23.2024","RegionNum":3,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"2024/8/1 0:00:00","PubModel":"Print","JCR":"Q3","JCRName":"NEUROSCIENCES","Score":null,"Total":0}
引用次数: 0

Abstract

Uncovering the relationships between neural circuits, behavior, and neural dysfunction may require rodent pose tracking. While open-source toolkits such as DeepLabCut have revolutionized markerless pose estimation using deep neural networks, the training process still requires human intervention for annotating key points of interest in video data. To further reduce human labor for neural network training, we developed a method that automatically generates annotated image datasets of rodent paw placement in a laboratory setting. It uses invisible but fluorescent markers that become temporarily visible under UV light. Through stroboscopic alternating illumination, adjacent video frames taken at 720 Hz are either UV or white light illuminated. After color filtering the UV-exposed video frames, the UV markings are identified and the paw locations are deterministically mapped. This paw information is then transferred to automatically annotate paw positions in the next white light-exposed frame that is later used for training the neural network. We demonstrate the effectiveness of our method using a KineWheel-DeepLabCut setup for the markerless tracking of the four paws of a harness-fixed mouse running on top of the transparent wheel with mirror. Our automated approach, made available open-source, achieves high-quality position annotations and significantly reduces the need for human involvement in the neural network training process, paving the way for more efficient and streamlined rodent pose tracking in neuroscience research.
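
The annotation-transfer idea described in the abstract can be sketched in a few lines of Python. The fragment below is a minimal illustration only, not the authors' released KineWheel-DeepLabCut code: it assumes the recording starts with a UV-lit frame and that frames strictly alternate between UV and white light, that each paw carries a fluorescent marker of a distinct color, and it uses placeholder HSV ranges, paw names, and an illustrative CSV layout. Each UV frame is color-filtered to locate the marker centroids, and those coordinates are written out as labels for the adjacent white-light frame, which is the frame later used for network training.

# Illustrative sketch only - not the authors' released KineWheel-DeepLabCut code.
# Assumptions: the video starts with a UV-lit frame and frames strictly alternate
# UV/white; each paw carries a fluorescent marker of a distinct color; the HSV
# ranges, paw names, and CSV layout below are placeholders for illustration.
import cv2
import numpy as np

# Assumed mapping from paw label to the HSV color range of its fluorescent marker.
PAW_HSV_RANGES = {
    "front_left":  ((40, 80, 80),  (80, 255, 255)),   # green-ish marker
    "front_right": ((90, 80, 80),  (130, 255, 255)),  # blue-ish marker
    "hind_left":   ((140, 80, 80), (170, 255, 255)),  # magenta-ish marker
    "hind_right":  ((20, 80, 80),  (35, 255, 255)),   # yellow-ish marker
}

def detect_paws(uv_frame_bgr):
    """Color-filter a UV-illuminated frame; return {paw_label: (x, y)} marker centroids."""
    hsv = cv2.cvtColor(uv_frame_bgr, cv2.COLOR_BGR2HSV)
    positions = {}
    for label, (lo, hi) in PAW_HSV_RANGES.items():
        mask = cv2.inRange(hsv, np.array(lo, np.uint8), np.array(hi, np.uint8))
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((3, 3), np.uint8))
        m = cv2.moments(mask)
        if m["m00"] > 0:  # marker visible in this frame
            positions[label] = (m["m10"] / m["m00"], m["m01"] / m["m00"])
    return positions

def auto_annotate(video_path, out_csv="auto_labels.csv"):
    """Walk alternating UV/white-light frame pairs and label each white-light frame."""
    cap = cv2.VideoCapture(video_path)
    rows, frame_idx = [], 0
    while True:
        ok_uv, uv_frame = cap.read()   # UV-illuminated frame
        ok_wl, _ = cap.read()          # adjacent white-light frame (the one to label)
        if not (ok_uv and ok_wl):
            break
        # At 720 Hz adjacent frames are ~1.4 ms apart, so the marker coordinates
        # from the UV frame are reused directly as labels for the white-light frame.
        for label, (x, y) in detect_paws(uv_frame).items():
            rows.append((frame_idx + 1, label, x, y))
        frame_idx += 2
    cap.release()
    with open(out_csv, "w") as f:
        f.write("frame,bodypart,x,y\n")
        for frame, label, x, y in rows:
            f.write(f"{frame},{label},{x:.1f},{y:.1f}\n")

Because adjacent frames at 720 Hz are only about 1.4 ms apart, reusing the UV-frame coordinates for the white-light frame introduces negligible positional error. The resulting CSV would then be converted into the labeled-data format DeepLabCut expects for training; that conversion step is omitted here.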

Source journal: eNeuro
Categories: Neuroscience (General Neuroscience)
CiteScore: 5.00
Self-citation rate: 2.90%
Articles published: 486
Review time: 16 weeks
Journal description: An open-access journal from the Society for Neuroscience, eNeuro publishes high-quality, broad-based, peer-reviewed research focused solely on the field of neuroscience. eNeuro embodies an emerging scientific vision that offers a new experience for authors and readers, all in support of the Society’s mission to advance understanding of the brain and nervous system.