Fabian Steinbeck, Efstathios Kagioulis, Alex D. M. Dewar, A. Philippides, Thomas Nowotny, Paul Graham
Journal: Adaptive Behavior, 59(5)
DOI: 10.1177/10597123231221312 (https://doi.org/10.1177/10597123231221312)
Publication date: 2024-01-10 (Journal Article)
Journal metrics: Impact Factor 1.2; JCR Q4 (Computer Science, Artificial Intelligence)
Familiarity-taxis: A bilateral approach to view-based snapshot navigation
Many insects use view-based navigation, or snapshot matching, to return to familiar locations or to navigate routes. This relies on egocentric memories being matched to current views of the world. Previous snapshot navigation algorithms have used full panoramic vision to compare memorised images with query images and establish a measure of familiarity, which leads to a recovery of the original heading direction from when the snapshot was taken. Many aspects of insect sensory systems are lateralised, with steering derived from the comparison of left and right signals, like a classic Braitenberg vehicle. Here, we investigate whether view-based route navigation can be implemented using bilateral visual familiarity comparisons. We found that the difference in familiarity between estimates from the left and right fields of view can be used as a steering signal to recover the original heading direction. This finding holds across many different field-of-view sizes and visual resolutions. In insects, steering computations are implemented in a brain region called the Lateral Accessory Lobe (LAL), within the Central Complex. In a simple simulation, we use a spiking neural network (SNN) model of the LAL to provide an existence proof of how bilateral visual familiarity could drive a search for a visually defined goal.
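The core mechanism the abstract describes, comparing visual familiarity between left and right fields of view and steering toward the more familiar side, can be sketched in a few lines. This is an illustrative toy under stated assumptions, not the authors' implementation: here `familiarity` is a simple mean-absolute image difference to a bank of stored snapshots, whereas the paper's familiarity estimates and the SNN steering circuit are more elaborate.

```python
import numpy as np

def familiarity(view, memories):
    """Familiarity of a current view: the negative of the smallest mean
    absolute pixel difference to any stored snapshot, so a perfect match
    scores 0 and poorer matches score lower. (Toy stand-in for the
    paper's familiarity measure.)"""
    return -min(float(np.mean(np.abs(view - m))) for m in memories)

def bilateral_steering(left_view, right_view, left_memories, right_memories, gain=1.0):
    """Braitenberg-style familiarity-taxis: positive output means the left
    field of view is more familiar (turn left), negative means turn right."""
    f_left = familiarity(left_view, left_memories)
    f_right = familiarity(right_view, right_memories)
    return gain * (f_left - f_right)

# Toy demo: the left view exactly matches a stored left snapshot,
# the right view does not, so the agent should turn left (positive signal).
left_mem = [np.zeros((8, 8))]
right_mem = [np.zeros((8, 8))]
steer = bilateral_steering(np.zeros((8, 8)), np.ones((8, 8)), left_mem, right_mem)
print(steer > 0)  # True: steer toward the more familiar (left) side
```

The sign of the left-right familiarity difference acts directly as the steering command, which is what makes the scheme bilateral rather than requiring a full panoramic rotational scan.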
Journal description:
_Adaptive Behavior_ publishes articles on adaptive behaviour in living organisms and autonomous artificial systems. The official journal of the _International Society for Adaptive Behavior_, _Adaptive Behavior_ addresses topics such as perception and motor control, embodied cognition, learning and evolution, neural mechanisms, artificial intelligence, behavioural sequences, motivation and emotion, characterisation of environments, decision making, collective and social behaviour, navigation, foraging, communication and signalling.
Print ISSN: 1059-7123