Role of bile acids in early puberty
Alexandra Le Bras
Lab Animal 54(11): 288. Pub Date: 2025-10-30. DOI: 10.1038/s41684-025-01640-2
Automated method to detect early motor dysfunction
Jorge Ferreira
Lab Animal 54(11): 287. Pub Date: 2025-10-30. DOI: 10.1038/s41684-025-01637-x
ATP imaging in mice
Alexandra Le Bras
Lab Animal 54(11): 288. Pub Date: 2025-10-30. DOI: 10.1038/s41684-025-01641-1
Real-time rodent behavior classifier using color-based body segmentation (R2C2)
Jung Ah Jo, Taeyun Park, Suhyun Hwang, Sunwoo Lee, Jihwan Min, Geunhong Park, Minho Song, Jeehyun Kwag, Hwiyoung Kim, Changhyuk Lee, Jeongjin Kim
Lab Animal 54(11): 321–334. Pub Date: 2025-10-27. DOI: 10.1038/s41684-025-01634-0
Animal behavior studies rely heavily on human observers, which makes them prone to observer-specific biases. To mitigate these shortcomings, much research has focused on computerizing the characterization of animal behavior, as automation can yield more reliable and cost-effective behavior quantification. Yet challenges remain in developing end-to-end solutions that let users train custom behavioral classifiers easily, with minimal data and low computational demands. Here we address these challenges with the real-time rodent behavior classifier using color-based body segmentation (R2C2), an algorithm that uses color-based segmentation to track rodent body parts and, consequently, their behaviors. Based on the ‘hue, saturation, value’ (HSV) color differences of fur or exposed skin, R2C2 creates simple white–black color boundaries for each body part, which are then used to discern and track body parts in real time and extract movement-based features. We combined wavelet transform-based tracking with HSV color-based body-part segmentation to substantially reduce computational requirements while minimizing the number of input features needed for classification. Fed these features, our convolutional neural network classifier achieves performance on par with an expert human observer, and R2C2 can differentiate subtle behavioral patterns associated with autism spectrum disorder in mouse models. As R2C2 is a complete, lightweight end-to-end pipeline with a graphical user interface that requires neither end-user programming nor heavy computational resources, it can be easily adopted in conventional neuroscience laboratories. By enabling effective auto-labeling of fine animal actions, R2C2 will facilitate studies aiming to uncover the neural mechanisms driving behavioral modulation.
The authors developed a new behavior classifier using color-based body segmentation that can capture body parts, track body movements and detect subtle behavioral phenotypes in freely moving mice.
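The HSV-thresholding step the abstract describes, turning per-pixel hue/saturation/value differences into white–black (binary) masks per body part, can be sketched roughly as below. The threshold values, the flat pixel-list representation and the `hsv_mask` helper are illustrative assumptions for this note, not details taken from the paper; R2C2 itself operates on video frames.

```python
import colorsys

def hsv_mask(rgb_pixels, h_range, s_min, v_min):
    """Binary mask (1 = inside the HSV window, 0 = outside) for RGB tuples.

    A pixel is kept when its hue falls in h_range and its saturation and
    value each exceed the given minima -- the 'white-black boundary' idea.
    """
    mask = []
    for r, g, b in rgb_pixels:
        h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
        in_hue = h_range[0] <= h <= h_range[1]
        mask.append(1 if in_hue and s >= s_min and v >= v_min else 0)
    return mask

# Toy frame: two reddish "marked" pixels among dark fur pixels.
frame = [(200, 40, 40), (30, 30, 30), (220, 60, 50)]
print(hsv_mask(frame, h_range=(0.0, 0.05), s_min=0.5, v_min=0.5))  # → [1, 0, 1]
```

In practice each body part would get its own HSV window, and the resulting masks feed the real-time tracking and feature-extraction stages described above.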