Investigating the use of odour and colour foraging cues by rosy-faced lovebirds using deep-learning based analysis

Winson King Wai Tsang, Emily Shui Kei Poon, Chris Newman, Christina D. Buesching, Simon Yung Wa Sin

Animal Behaviour, Volume 221, Article 123085 (published 4 February 2025). DOI: 10.1016/j.anbehav.2025.123085
Citations: 0
Abstract
Olfaction and vision can play important roles in optimizing foraging decisions of birds, enabling them to maximize their net rate of energy intake while searching for, handling and consuming food. Parrots have been used extensively in avian cognition research, and some species use olfactory cues to find food. Here we used machine-learning analysis and pose estimation with convolutional neural networks (CNNs) to elucidate the relative importance of visual and olfactory cues for informing foraging decisions in the rosy-faced lovebird, Agapornis roseicollis, as a nontypical model species. In a binary choice experiment, we used markerless body pose tracking to analyse bird response behaviours. Rosy-faced lovebirds quickly learnt to discriminate the feeder provisioned with food by forming an association with visual (red/green papers) but not olfactory (banana/almond odour) cues. When visual cues indicated the provisioned and empty feeders, feeder choice was more successful, hesitation time shorter and interest in the empty feeder significantly lower. Our findings reveal that lovebirds can rapidly learn novel visual cues but not olfactory cues, indicating that vision plays a more important role in their learning and foraging decisions than olfaction.
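To make the described analysis concrete, below is a minimal sketch of how binary-choice metrics such as feeder choice, hesitation time and time spent at the empty feeder might be derived from markerless pose-tracking output (for example, per-frame head keypoints exported by a CNN-based tracker such as DeepLabCut). The file name, column names, frame rate and feeder coordinates are illustrative assumptions, not the authors' actual pipeline.

```python
# Hypothetical sketch: computing binary-choice metrics from markerless
# pose-tracking output (a CSV of per-frame head keypoints with a detection
# likelihood, as produced by CNN-based trackers such as DeepLabCut).
# File name, column names, frame rate and feeder positions are assumptions.
import numpy as np
import pandas as pd

FPS = 30                                              # assumed camera frame rate
FEEDERS = {"left": (120, 340), "right": (520, 340)}   # illustrative feeder centres (px)
CHOICE_RADIUS = 60                                    # px distance counted as "at the feeder"

# Expected columns: frame, head_x, head_y, likelihood
track = pd.read_csv("trial_001_pose.csv")
track = track[track["likelihood"] > 0.9]              # keep confident detections only

def near_feeder(feeder):
    """Boolean mask of frames where the head is within the feeder zone."""
    fx, fy = FEEDERS[feeder]
    return np.hypot(track["head_x"] - fx, track["head_y"] - fy) < CHOICE_RADIUS

def first_frame_at(feeder):
    """Frame number of the first visit to a feeder, or inf if never visited."""
    near = near_feeder(feeder)
    return track.loc[near, "frame"].min() if near.any() else np.inf

first_left, first_right = first_frame_at("left"), first_frame_at("right")

choice = "left" if first_left < first_right else "right"
hesitation_s = min(first_left, first_right) / FPS     # time from trial start to first choice

# Time spent at the non-chosen (empty) feeder, as a proxy for residual interest in it.
other = "right" if choice == "left" else "left"
interest_in_other_s = near_feeder(other).sum() / FPS

print(f"choice={choice}, hesitation={hesitation_s:.2f}s, "
      f"time_at_other_feeder={interest_in_other_s:.2f}s")
```

In a trial where the bird flies straight to the cued feeder, this would yield a short hesitation time and near-zero time at the empty feeder; the same three measures are the ones the abstract reports as differing between the visual and olfactory conditions.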
Journal description:
Growing interest in behavioural biology and the international reputation of Animal Behaviour prompted an expansion to monthly publication in 1989. Animal Behaviour continues to be the journal of choice for biologists, ethologists, psychologists, physiologists, and veterinarians with an interest in the subject.