FlightTrackAI: a robust convolutional neural network-based tool for tracking the flight behaviour of Aedes aegypti mosquitoes.

IF 2.9 | CAS Region 3, Multidisciplinary | Q1 MULTIDISCIPLINARY SCIENCES | Royal Society Open Science | Pub Date: 2024-10-02 | eCollection Date: 2024-10-01 | DOI: 10.1098/rsos.240923
Nouman Javed, Adam J López-Denman, Prasad N Paradkar, Asim Bhatti
{"title":"FlightTrackAI:基于卷积神经网络的强健工具,用于追踪埃及伊蚊的飞行行为。","authors":"Nouman Javed, Adam J López-Denman, Prasad N Paradkar, Asim Bhatti","doi":"10.1098/rsos.240923","DOIUrl":null,"url":null,"abstract":"<p><p>Monitoring the flight behaviour of mosquitoes is crucial for assessing their fitness levels and understanding their potential role in disease transmission. Existing methods for tracking mosquito flight behaviour are challenging to implement in laboratory environments, and they also struggle with identity tracking, particularly during occlusions. Here, we introduce FlightTrackAI, a robust convolutional neural network (CNN)-based tool for automatic mosquito flight tracking. FlightTrackAI employs CNN, a multi-object tracking algorithm, and interpolation to track flight behaviour. It automatically processes each video in the input folder without supervision and generates tracked videos with mosquito positions across the frames and trajectory graphs before and after interpolation. FlightTrackAI does not require a sophisticated setup to capture videos; it can perform excellently with videos recorded using standard laboratory cages. FlightTrackAI also offers filtering capabilities to eliminate short-lived objects such as reflections. Validation of FlightTrackAI demonstrated its excellent performance with an average accuracy of 99.9%. The percentage of correctly assigned identities after occlusions exceeded 91%. The data produced by FlightTrackAI can facilitate analysis of various flight-related behaviours, including flight distance and volume coverage during flights. This advancement can help to enhance our understanding of mosquito ecology and behaviour, thereby informing targeted strategies for vector control.</p>","PeriodicalId":21525,"journal":{"name":"Royal Society Open Science","volume":null,"pages":null},"PeriodicalIF":2.9000,"publicationDate":"2024-10-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11444788/pdf/","citationCount":"0","resultStr":"{\"title\":\"FlightTrackAI: a robust convolutional neural network-based tool for tracking the flight behaviour of <i>Aedes aegypti</i> mosquitoes.\",\"authors\":\"Nouman Javed, Adam J López-Denman, Prasad N Paradkar, Asim Bhatti\",\"doi\":\"10.1098/rsos.240923\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><p>Monitoring the flight behaviour of mosquitoes is crucial for assessing their fitness levels and understanding their potential role in disease transmission. Existing methods for tracking mosquito flight behaviour are challenging to implement in laboratory environments, and they also struggle with identity tracking, particularly during occlusions. Here, we introduce FlightTrackAI, a robust convolutional neural network (CNN)-based tool for automatic mosquito flight tracking. FlightTrackAI employs CNN, a multi-object tracking algorithm, and interpolation to track flight behaviour. It automatically processes each video in the input folder without supervision and generates tracked videos with mosquito positions across the frames and trajectory graphs before and after interpolation. FlightTrackAI does not require a sophisticated setup to capture videos; it can perform excellently with videos recorded using standard laboratory cages. FlightTrackAI also offers filtering capabilities to eliminate short-lived objects such as reflections. Validation of FlightTrackAI demonstrated its excellent performance with an average accuracy of 99.9%. 
The percentage of correctly assigned identities after occlusions exceeded 91%. The data produced by FlightTrackAI can facilitate analysis of various flight-related behaviours, including flight distance and volume coverage during flights. This advancement can help to enhance our understanding of mosquito ecology and behaviour, thereby informing targeted strategies for vector control.</p>\",\"PeriodicalId\":21525,\"journal\":{\"name\":\"Royal Society Open Science\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":2.9000,\"publicationDate\":\"2024-10-02\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11444788/pdf/\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Royal Society Open Science\",\"FirstCategoryId\":\"103\",\"ListUrlMain\":\"https://doi.org/10.1098/rsos.240923\",\"RegionNum\":3,\"RegionCategory\":\"综合性期刊\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"2024/10/1 0:00:00\",\"PubModel\":\"eCollection\",\"JCR\":\"Q1\",\"JCRName\":\"MULTIDISCIPLINARY SCIENCES\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Royal Society Open Science","FirstCategoryId":"103","ListUrlMain":"https://doi.org/10.1098/rsos.240923","RegionNum":3,"RegionCategory":"综合性期刊","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"2024/10/1 0:00:00","PubModel":"eCollection","JCR":"Q1","JCRName":"MULTIDISCIPLINARY SCIENCES","Score":null,"Total":0}
Citations: 0

Abstract


Monitoring the flight behaviour of mosquitoes is crucial for assessing their fitness levels and understanding their potential role in disease transmission. Existing methods for tracking mosquito flight behaviour are challenging to implement in laboratory environments, and they also struggle with identity tracking, particularly during occlusions. Here, we introduce FlightTrackAI, a robust convolutional neural network (CNN)-based tool for automatic mosquito flight tracking. FlightTrackAI employs CNN, a multi-object tracking algorithm, and interpolation to track flight behaviour. It automatically processes each video in the input folder without supervision and generates tracked videos with mosquito positions across the frames and trajectory graphs before and after interpolation. FlightTrackAI does not require a sophisticated setup to capture videos; it can perform excellently with videos recorded using standard laboratory cages. FlightTrackAI also offers filtering capabilities to eliminate short-lived objects such as reflections. Validation of FlightTrackAI demonstrated its excellent performance with an average accuracy of 99.9%. The percentage of correctly assigned identities after occlusions exceeded 91%. The data produced by FlightTrackAI can facilitate analysis of various flight-related behaviours, including flight distance and volume coverage during flights. This advancement can help to enhance our understanding of mosquito ecology and behaviour, thereby informing targeted strategies for vector control.
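
The abstract does not give implementation details, but the trajectory post-processing it names (filtering short-lived detections such as reflections, interpolating positions across occlusion gaps, and computing flight distance and volume coverage) can be illustrated generically. The Python sketch below is a minimal illustration under assumed conventions; the data layout ({track_id: {frame: (x, y)}}) and all function names are hypothetical and are not FlightTrackAI's actual API.

```python
# A minimal, illustrative sketch (not FlightTrackAI's code or API) of the
# trajectory post-processing steps named in the abstract: filtering
# short-lived tracks such as reflections, interpolating positions across
# occlusion gaps, and deriving flight distance and volume coverage.
# The {track_id: {frame: (x, y)}} layout and function names are hypothetical.
import numpy as np
from scipy.spatial import ConvexHull


def filter_short_tracks(tracks, min_frames=15):
    """Drop tracks seen in fewer than `min_frames` frames (e.g. reflections)."""
    return {tid: trk for tid, trk in tracks.items() if len(trk) >= min_frames}


def interpolate_gaps(track):
    """Linearly interpolate missing frames in a {frame: (x, y)} track."""
    frames = sorted(track)
    full = np.arange(frames[0], frames[-1] + 1)
    xs = np.interp(full, frames, [track[f][0] for f in frames])
    ys = np.interp(full, frames, [track[f][1] for f in frames])
    return {int(f): (float(x), float(y)) for f, x, y in zip(full, xs, ys)}


def flight_distance(track):
    """Total path length of one track (pixel units unless the scene is calibrated)."""
    pts = np.array([track[f] for f in sorted(track)])
    return float(np.sum(np.linalg.norm(np.diff(pts, axis=0), axis=1)))


def volume_coverage(points_3d):
    """Convex-hull volume of 3-D flight positions (assumes calibrated coordinates)."""
    return float(ConvexHull(np.asarray(points_3d)).volume)
```

In the tool itself, per-frame positions would come from the CNN detector and identities from the multi-object tracker; the sketch only covers the downstream trajectory handling, and thresholds such as min_frames are placeholders.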

Source journal: Royal Society Open Science (Multidisciplinary)
CiteScore: 6.00
Self-citation rate: 0.00%
Articles published: 508
Review time: 14 weeks
Journal description: Royal Society Open Science is a new open journal publishing high-quality original research across the entire range of science on the basis of objective peer review. The journal covers the entire range of science and mathematics and will allow the Society to publish all the high-quality work it receives without the usual restrictions on scope, length or impact.
Latest articles in this journal
Heliconius butterflies use wide-field landscape features, but not individual local landmarks, during spatial learning.
Appreciation of singing and speaking voices is highly idiosyncratic.
A first vocal repertoire characterization of long-finned pilot whales (Globicephala melas) in the Mediterranean Sea: a machine learning approach.
Beyond bigrams: call sequencing in the common marmoset (Callithrix jacchus) vocal system.
Enhancing biodiversity: historical ecology and biogeography of the Santa Catalina Island ground squirrel, Otospermophilus beecheyi nesioticus.