Development of Evolutionary Gravity Neocognitron Neural Network Model for Behavioral Studies in Rodents

Antony Asir Daniel V, Basarikodi K, Suresh S, Nallasivan G, Bhuvanesh A, Milner Paul V

Measurement: Sensors, Volume 33, Article 101194, May 2024. DOI: 10.1016/j.measen.2024.101194
Abstract
Over the past decades, rodent models have played a key role in evaluating several drugs for the treatment of brain diseases. Generally, these tests are performed by recording a video and examining it to produce annotations of the rodents' behavior and activities. However, the video must be reviewed continuously to ensure proper annotation, which adds time complexity and increases human observation error. Conventional techniques for rodent behavioral analysis are not affordable for research purposes owing to their high cost and poor interpretability. To tackle this issue, a new and effective deep learning (DL) technique is introduced to analyze multiclass rodent behaviors in real-time scenarios. First, the video captured from the camera is preprocessed through frame conversion and noise removal. For noise removal, the Butterworth-amended unsharp mask filtering (B_UMF) technique is applied, thereby improving image quality. Finally, the Evolutionary Gravity Neocognitron Neural Network (EGravity-NCNN) model is proposed to classify multiple rodent behaviours using adaptive feature learning. The simulation of the developed method is carried out on the Python platform, and performance measures such as accuracy, precision, and recall are scrutinized and compared with conventional schemes. The developed method achieved an overall accuracy of 97.33 %, a precision of 96.29 %, and a recall of 97.02 % for the classification of rodent behaviours.
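The abstract describes a preprocessing stage that pairs frame conversion with Butterworth-amended unsharp mask filtering before classification. The paper's exact formulation is not given here, so the following is only a minimal Python sketch under stated assumptions: the cutoff frequency, filter order, and sharpening amount are illustrative, the combination of a frequency-domain Butterworth low-pass filter with a standard unsharp mask is an assumed reading of B_UMF, and the input file name rodent_video.mp4 is hypothetical.

    # Minimal sketch of the preprocessing stage (frame conversion + assumed B_UMF).
    import cv2
    import numpy as np

    def butterworth_lowpass(shape, cutoff, order=2):
        """Frequency-domain Butterworth low-pass transfer function."""
        rows, cols = shape
        u = np.arange(rows) - rows / 2
        v = np.arange(cols) - cols / 2
        V, U = np.meshgrid(v, u)
        D = np.sqrt(U**2 + V**2)  # distance from the centre of the spectrum
        return 1.0 / (1.0 + (D / cutoff) ** (2 * order))

    def b_umf(frame, cutoff=40, order=2, amount=1.5):
        """Smooth a frame with a Butterworth low-pass filter, then sharpen it
        with an unsharp mask built from the smoothed image (assumed B_UMF form)."""
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.float32)
        F = np.fft.fftshift(np.fft.fft2(gray))
        H = butterworth_lowpass(gray.shape, cutoff, order)
        smooth = np.real(np.fft.ifft2(np.fft.ifftshift(F * H)))
        sharpened = smooth + amount * (gray - smooth)  # unsharp masking
        return np.clip(sharpened, 0, 255).astype(np.uint8)

    # Frame conversion: read the behaviour video and preprocess each frame.
    cap = cv2.VideoCapture("rodent_video.mp4")  # hypothetical input file
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        clean = b_umf(frame)
        # `clean` would then be passed on to the EGravity-NCNN classifier.
    cap.release()

In this sketch the Butterworth low-pass response both suppresses high-frequency noise and supplies the blurred image for the unsharp mask, which is one plausible way to realise the "Butterworth-amended" variant; the published method may differ in how the two operations are coupled.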