Exploration of Key Point Localization Neural Network Architectures for Y-Maze Behavior Test Automation

Gwanghee Lee, Sangjun Moon, Dasom Choi, Gayeon Kim, Kyoungson Jhang
DOI: 10.5626/jcse.2023.17.3.100
Journal: Journal of Computing Science and Engineering (JCR Q3, Engineering)
Published: 2023-09-30 (Journal Article)
Citations: 0

Abstract

The Y-maze behavioral test is a pivotal tool for assessing the memory and exploratory tendencies of mice in novel environments. A significant aspect of this test involves the continuous tracking and pinpointing of the mouse’s location, a task that can be labor-intensive for human researchers. This study introduced an automated solution to this challenge through camera-based image processing. We argued that key point localization techniques are more effective than object detection methods, given that only a single mouse is involved in the test. Through an experimental comparison of eight distinct neural network architectures, we identified the most effective structures for localizing key points such as the mouse’s nose, body center, and tail base. Our models were designed to predict not only the mouse key points but also the reference points of the Y-maze device, aiming to streamline the analysis process and minimize human intervention. The approach involves the generation of a heatmap using a deep learning neural network structure, followed by the extraction of the key points’ central location from the heatmap using a soft argmax function. The findings of this study provide a practical guide for experimenters in the selection and application of neural network architectures for Y-maze behavioral testing.
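The pipeline described above predicts a heatmap per key point and then extracts a sub-pixel coordinate with a soft argmax. A minimal NumPy sketch of that extraction step is shown below; it is an illustration of the general technique, not the authors' implementation, and the temperature parameter `beta` is an assumption.

```python
import numpy as np

def soft_argmax(heatmap, beta=100.0):
    """Differentiable key point extraction: softmax over the heatmap,
    then the probability-weighted mean of pixel coordinates."""
    h, w = heatmap.shape
    logits = heatmap.reshape(-1) * beta
    logits -= logits.max()            # numerical stability
    p = np.exp(logits)
    p /= p.sum()
    ys, xs = np.mgrid[0:h, 0:w]       # per-pixel row/column indices
    x = float((p * xs.reshape(-1)).sum())
    y = float((p * ys.reshape(-1)).sum())
    return x, y                       # (column, row), sub-pixel

# Example: a heatmap peaking at row 3, column 5
hm = np.zeros((8, 8))
hm[3, 5] = 1.0
print(soft_argmax(hm))
```

Unlike a hard argmax, this expectation over the softmax distribution is differentiable, which is why it can sit at the end of a neural network and be trained end to end.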
Source journal: Journal of Computing Science and Engineering (Engineering, all)
CiteScore: 1.00
Self-citation rate: 0.00%
Articles per year: 11
About the journal: Journal of Computing Science and Engineering (JCSE) is a peer-reviewed quarterly journal that publishes high-quality papers on all aspects of computing science and engineering. The primary objective of JCSE is to be an authoritative international forum for delivering both theoretical and innovative applied research in the field. JCSE publishes original research contributions, surveys, and experimental studies with scientific advances. The scope of JCSE covers all topics related to computing science and engineering, with a special emphasis on the following areas: Embedded Computing, Ubiquitous Computing, Convergence Computing, Green Computing, Smart and Intelligent Computing, Human Computing.