Automating the Surveillance of Mosquito Vectors from Trapped Specimens Using Computer Vision Techniques

M. Minakshi, Pratool Bharti, Willie McClinton, Jamshidbek Mirzakhalov, R. Carney, S. Chellappan
Published in: Proceedings of the 3rd ACM SIGCAS Conference on Computing and Sustainable Societies
DOI: 10.1145/3378393.3402260
Publication date: 2020-05-25
Citations: 10

Abstract

Among all animals, mosquitoes are responsible for the most human deaths worldwide. Interestingly, not all types of mosquitoes spread diseases; rather, only a select few are competent to do so. In the case of any disease outbreak, an important first step is surveillance of vectors (i.e., those mosquitoes capable of spreading diseases). To do this today, public health workers lay several mosquito traps in the area of interest, and hundreds of mosquitoes get trapped. Among these hundreds, taxonomists must identify only the vectors to gauge their density. This process today is manual, requires complex expertise and training, and is based on visual inspection of each trapped specimen under a microscope. It is slow, taxing, and self-limiting. This paper presents an innovative solution to this problem. Our technique assumes the presence of an embedded camera (similar to those in smartphones) that can take pictures of trapped mosquitoes. The techniques proposed here then process these images to automatically classify the genus and species. Our CNN model, based on Inception-ResNet V2 and transfer learning, yielded an overall accuracy of 80% in classifying mosquitoes when trained on 25,867 images of 250 trapped mosquito vector specimens captured via many smartphone cameras. In particular, the accuracy of our model in classifying Aedes aegypti and Anopheles stephensi mosquitoes (both of which are especially deadly vectors) is amongst the highest. We also present important lessons learned and the practical impact of our techniques in this paper.
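The abstract's pipeline (a pretrained Inception-ResNet V2 backbone adapted via transfer learning to mosquito genus/species classes) can be sketched as follows. This is a minimal illustration, not the authors' actual training code: the class count, head layers, and optimizer are assumptions, and `weights=None` is used only to keep the sketch self-contained offline (in practice one would load `weights="imagenet"` and fine-tune).

```python
# Hedged sketch of transfer learning with Inception-ResNet V2 in Keras.
# All hyperparameters here are illustrative assumptions, not the paper's.
import numpy as np
from tensorflow.keras import layers, models
from tensorflow.keras.applications import InceptionResNetV2

NUM_CLASSES = 9  # hypothetical number of genus/species classes

# Backbone as a frozen feature extractor. weights="imagenet" would load
# pretrained features in practice; weights=None avoids a download here.
base = InceptionResNetV2(include_top=False, weights=None,
                         input_shape=(299, 299, 3), pooling="avg")
base.trainable = False  # freeze backbone; train only the new head

model = models.Sequential([
    base,
    layers.Dropout(0.5),                               # regularize the head
    layers.Dense(NUM_CLASSES, activation="softmax"),   # per-class probabilities
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# One smartphone-style image -> a probability distribution over classes.
probs = model.predict(np.zeros((1, 299, 299, 3), dtype="float32"))
print(probs.shape)
```

After freezing the backbone, only the small classification head is trained on the mosquito images; unfreezing the top backbone blocks for a second, low-learning-rate fine-tuning pass is a common refinement of this setup.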