Simon L. Gay, Edwige Pissaloux, Jean-Paul Jamont
Smart Health, Volume 33, Article 100484
DOI: 10.1016/j.smhl.2024.100484
Published: 2024-04-17 (Journal Article)
Open-access PDF: https://www.sciencedirect.com/science/article/pii/S2352648324000400/pdfft?md5=da26293fcfb66b6f6d83277cab8ac5b3&pid=1-s2.0-S2352648324000400-main.pdf
A bio-inspired model for robust navigation assistive devices
This paper proposes a new implementation, evaluated in a real-world environment, of a bio-inspired predictive navigation model for mobility control, suited especially to the assistance of visually impaired people and to autonomous mobile systems. The model relies on interactions between formal models of three types of neurons identified in the mammalian brain and involved in navigation tasks, namely place cells, grid cells, and head direction cells, to construct a topological model of the environment in the form of a decentralized navigation graph. Previously tested in virtual environments, the model demonstrated a high tolerance to motion drift, making it possible to map large environments without drift correction, as well as robustness to environment changes. The presented implementation is based on a stereoscopic camera and is evaluated on its ability to map an unknown real environment and to guide a person or an autonomous mobile robot through it. The evaluation results confirm the effectiveness of the proposed bio-inspired navigation model in building a path map, and in localizing and guiding a person along that path. The model's predictions remain robust to environment changes and allow traveled distances to be estimated with an error rate below 3% over test paths of up to 100 m. Tests performed on a robotic platform also demonstrated the pertinence of the navigation data produced by this model for guiding an autonomous system. These results open the way toward efficient wearable assistive devices for the independent navigation of visually impaired people.
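The decentralized navigation graph described in the abstract can be pictured as place-cell nodes (each anchored to a sensory signature) linked by edges that record a heading and a traveled distance, in the spirit of head-direction and grid-cell odometry. The sketch below is purely illustrative and not the authors' implementation: the class names, the L2 signature matching, and the breadth-first routing are assumptions chosen to make the idea concrete, assuming signatures are simple feature vectors rather than the paper's stereoscopic-camera descriptors.

```python
import math
from collections import deque

class PlaceNode:
    """A place-cell-like node: a graph vertex anchored to a sensory signature."""
    def __init__(self, node_id, signature):
        self.id = node_id
        self.signature = signature  # e.g. a feature vector derived from the camera
        self.edges = {}             # neighbor_id -> (heading_rad, distance_m)

class NavigationGraph:
    """Toy decentralized topological map: place nodes linked by
    heading/distance edges (head-direction + odometry information)."""
    def __init__(self):
        self.nodes = {}

    def add_place(self, node_id, signature):
        self.nodes[node_id] = PlaceNode(node_id, signature)

    def link(self, a, b, heading_rad, distance_m):
        # Record the traversal in both directions (reversed heading going back).
        self.nodes[a].edges[b] = (heading_rad, distance_m)
        self.nodes[b].edges[a] = ((heading_rad + math.pi) % (2 * math.pi), distance_m)

    def localize(self, signature):
        """Return the id of the node whose stored signature is closest (L2)."""
        return min(self.nodes.values(),
                   key=lambda n: math.dist(n.signature, signature)).id

    def route(self, start, goal):
        """Breadth-first search over the topological graph; returns a node path."""
        parent = {start: None}
        q = deque([start])
        while q:
            cur = q.popleft()
            if cur == goal:
                path = []
                while cur is not None:
                    path.append(cur)
                    cur = parent[cur]
                return path[::-1]
            for nxt in self.nodes[cur].edges:
                if nxt not in parent:
                    parent[nxt] = cur
                    q.append(nxt)
        return None

    def path_length(self, path):
        """Sum stored edge distances; this is how traveled distance estimates
        accumulate (and where odometry error would enter)."""
        return sum(self.nodes[a].edges[b][1] for a, b in zip(path, path[1:]))

# Hypothetical example: three places with 2-D feature signatures.
g = NavigationGraph()
g.add_place("hall", [0.0, 0.0])
g.add_place("door", [1.0, 0.2])
g.add_place("desk", [2.0, 1.0])
g.link("hall", "door", 0.0, 4.5)           # 4.5 m heading 0 rad
g.link("door", "desk", math.pi / 2, 3.0)   # 3.0 m heading pi/2 rad
```

Because the map is purely topological, accumulated metric drift only affects the per-edge distance labels, not the graph's connectivity, which is one intuition for the drift tolerance the abstract reports.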