Temporal attention based animal sound classification

Jungmin Kim, Young Lo Lee, Donghyeon Kim, Hanseok Ko
Journal of the Acoustical Society of Korea · Published: 2020-09-01 · DOI: 10.7776/ASK.2020.39.5.406 · Citations: 1

Abstract

To improve the classification accuracy of bird and amphibian sounds, we employ a Gated Linear Unit (GLU) together with self-attention, which encourages the network to extract important features from the data and to single out the relevant, informative frames among all input sequences. The 1-D acoustic signal is first converted to a log-Mel spectrogram; the GLU then suppresses undesirable components in the spectrogram, such as background noise, and the proposed temporal self-attention further improves classification accuracy. The data comprise six bird species and eight amphibian species recorded in the natural environment, including endangered species. The proposed method achieves an accuracy of 91 % on the bird data and 93 % on the amphibian data, an improvement of about 6 % to 7 % over existing algorithms.
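The pipeline described in the abstract (log-Mel spectrogram → GLU gating → temporal self-attention → clip-level prediction) can be illustrated with a minimal numpy sketch. This is not the authors' implementation: the weights are random stand-ins for a trained network, the spectrogram is a random array in place of a real log-Mel feature (which would typically come from something like librosa), and the attention is a plain scaled dot-product over frames.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def glu(x, w_lin, w_gate):
    """Gated Linear Unit: a linear path modulated elementwise by a
    sigmoid gate, letting the network damp noisy feature dimensions."""
    return (x @ w_lin) * sigmoid(x @ w_gate)

def temporal_self_attention(h):
    """Scaled dot-product self-attention over time: each frame is
    re-expressed as a weighted mix of all frames, so informative
    frames contribute more to the clip representation."""
    d = h.shape[-1]
    scores = h @ h.T / np.sqrt(d)                     # (T, T) frame similarities
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)     # softmax over frames
    return weights @ h                                # (T, d) attended frames

T, n_mels, d = 50, 64, 32                    # frames, Mel bins, hidden size (illustrative)
log_mel = rng.standard_normal((T, n_mels))   # stand-in for a real log-Mel spectrogram
w_lin = rng.standard_normal((n_mels, d)) * 0.1   # random stand-ins for learned weights
w_gate = rng.standard_normal((n_mels, d)) * 0.1

h = glu(log_mel, w_lin, w_gate)              # (T, d) gated frame features
attended = temporal_self_attention(h)        # (T, d) attention-reweighted frames
clip_embedding = attended.mean(axis=0)       # (d,) pooled clip-level representation
print(clip_embedding.shape)                  # (32,)
```

In the actual system a classifier head would map `clip_embedding` to the 6 bird or 8 amphibian classes; here the sketch stops at the pooled representation, since the head is an ordinary softmax layer.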