Computers Helping People with Special Needs: International Conference, ICCHP: Proceedings. Latest Publications
Accessible Point-and-Tap Interaction for Acquiring Detailed Information about Tactile Graphics and 3D Models
Pub Date: 2024-07-01 | Epub Date: 2024-07-05 | DOI: 10.1007/978-3-031-62846-7_30
Andrea Narcisi, Huiying Shen, Dragan Ahmetovic, Sergio Mascetti, James M Coughlan
We have devised a novel "Point-and-Tap" interface that enables people who are blind or visually impaired (BVI) to easily acquire multiple levels of information about tactile graphics and 3D models. The interface uses an iPhone's depth and color cameras to track the user's hands while they interact with a model. When the user points to a feature of interest on the model with their index finger, the system reads aloud basic information about that feature. For additional information, the user lifts their index finger and taps the feature again. This process can be repeated multiple times to access additional levels of information. For instance, tapping once on a region in a tactile map could trigger the name of the region, with subsequent taps eliciting the population, area, climate, etc. No audio labels are triggered unless the user makes a pointing gesture, which allows the user to explore the model freely with one or both hands. Multiple taps can be used to skip through information levels quickly, with each tap interrupting the current utterance. This allows users to reach the desired level of information more quickly than listening to all levels in sequence. Experiments with six BVI participants demonstrate that the approach is practical, easy to learn and effective.
{"title":"Accessible Point-and-Tap Interaction for Acquiring Detailed Information about Tactile Graphics and 3D Models.","authors":"Andrea Narcisi, Huiying Shen, Dragan Ahmetovic, Sergio Mascetti, James M Coughlan","doi":"10.1007/978-3-031-62846-7_30","DOIUrl":"10.1007/978-3-031-62846-7_30","url":null,"abstract":"<p><p>We have devised a novel \"Point-and-Tap\" interface that enables people who are blind or visually impaired (BVI) to easily acquire multiple levels of information about tactile graphics and 3D models. The interface uses an iPhone's depth and color cameras to track the user's hands while they interact with a model. When the user points to a feature of interest on the model with their index finger, the system reads aloud basic information about that feature. For additional information, the user lifts their index finger and taps the feature again. This process can be repeated multiple times to access additional levels of information. For instance, tapping once on a region in a tactile map could trigger the name of the region, with subsequent taps eliciting the population, area, climate, etc. No audio labels are triggered unless the user makes a pointing gesture, which allows the user to explore the model freely with one or both hands. Multiple taps can be used to skip through information levels quickly, with each tap interrupting the current utterance. This allows users to reach the desired level of information more quickly than listening to all levels in sequence. Experiments with six BVI participants demonstrate that the approach is practical, easy to learn and effective.</p>","PeriodicalId":90476,"journal":{"name":"Computers helping people with special needs : ... International Conference, ICCHP ... : proceedings. International Conference on Computers Helping People with Special Needs","volume":"14750 ","pages":"252-259"},"PeriodicalIF":0.0,"publicationDate":"2024-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11338176/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142019823","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Step Length Estimation for Blind Walkers
Pub Date: 2024-07-01 | Epub Date: 2024-07-05 | DOI: 10.1007/978-3-031-62846-7_48
Fatemeh Elyasi, Roberto Manduchi
Wayfinding systems using inertial data recorded from a smartphone carried by the walker have great potential for increasing mobility independence of blind pedestrians. Pedestrian dead-reckoning (PDR) algorithms for localization require estimation of the step length of the walker. Prior work has shown that step length can be reliably predicted by processing the inertial data recorded by the smartphone with a simple machine learning algorithm. However, this prior work only considered sighted walkers, whose gait may be different from that of blind walkers using a long cane or a dog guide. In this work, we show that a step length estimation network trained on data from sighted walkers performs poorly when tested on blind walkers, and that retraining with data from blind walkers can dramatically increase the accuracy of step length prediction.
{"title":"Step Length Estimation for Blind Walkers.","authors":"Fatemeh Elyasi, Roberto Manduchi","doi":"10.1007/978-3-031-62846-7_48","DOIUrl":"10.1007/978-3-031-62846-7_48","url":null,"abstract":"<p><p>Wayfinding systems using inertial data recorded from a smartphone carried by the walker have great potential for increasing mobility independence of blind pedestrians. Pedestrian dead-reckoning (PDR) algorithms for localization require estimation of the step length of the walker. Prior work has shown that step length can be reliably predicted by processing the inertial data recorded by the smartphone with a simple machine learning algorithm. However, this prior work only considered sighted walkers, whose gait may be different from that of blind walkers using a long cane or a dog guide. In this work, we show that a step length estimation network trained on data from sighted walkers performs poorly when tested on blind walkers, and that retraining with data from blind walkers can dramatically increase the accuracy of step length prediction.</p>","PeriodicalId":90476,"journal":{"name":"Computers helping people with special needs : ... International Conference, ICCHP ... : proceedings. International Conference on Computers Helping People with Special Needs","volume":"14750 ","pages":"400-407"},"PeriodicalIF":0.0,"publicationDate":"2024-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11298791/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141895056","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Non-Visual Access to an Interactive 3D Map
Pub Date: 2022-07-01 | DOI: 10.1007/978-3-031-08648-9_29
James M Coughlan, Brandon Biggs, Huiying Shen
Maps are indispensable for helping people learn about unfamiliar environments and plan trips. While tactile (2D) and 3D maps offer non-visual map access to people who are blind or visually impaired (BVI), this access is greatly enhanced by adding interactivity to the maps: when the user points at a feature of interest on the map, the name of the feature and other information about it are read aloud. We explore how the use of an interactive 3D map of a playground, containing over seventy play structures and other features, affects spatial learning and cognition. Specifically, we perform experiments in which four blind participants answer questions about the map to evaluate their grasp of three types of spatial knowledge: landmark, route, and survey. The results of these experiments demonstrate that participants are able to acquire this knowledge, most of which would be inaccessible without the interactivity of the map.
{"title":"Non-Visual Access to an Interactive 3D Map.","authors":"James M Coughlan, Brandon Biggs, Huiying Shen","doi":"10.1007/978-3-031-08648-9_29","DOIUrl":"10.1007/978-3-031-08648-9_29","url":null,"abstract":"<p><p>Maps are indispensable for helping people learn about unfamiliar environments and plan trips. While tactile (2D) and 3D maps offer non-visual map access to people who are blind or visually impaired (BVI), this access is greatly enhanced by adding interactivity to the maps: when the user points at a feature of interest on the map, the name and other information about the feature is read aloud in audio. We explore how the use of an interactive 3D map of a playground, containing over seventy play structures and other features, affects spatial learning and cognition. Specifically, we perform experiments in which four blind participants answer questions about the map to evaluate their grasp of three types of spatial knowledge: landmark, route and survey. The results of these experiments demonstrate that participants are able to acquire this knowledge, most of which would be inaccessible without the interactivity of the map.</p>","PeriodicalId":90476,"journal":{"name":"Computers helping people with special needs : ... International Conference, ICCHP ... : proceedings. International Conference on Computers Helping People with Special Needs","volume":"13341 ","pages":"253-260"},"PeriodicalIF":0.0,"publicationDate":"2022-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9467469/pdf/nihms-1832494.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"40357560","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Computers Helping People with Special Needs: 18th International Conference, ICCHP-AAATE 2022, Lecco, Italy, July 11–15, 2022, Proceedings, Part I
Pub Date: 2022-01-01 | DOI: 10.1007/978-3-031-08648-9
{"title":"Computers Helping People with Special Needs: 18th International Conference, ICCHP-AAATE 2022, Lecco, Italy, July 11–15, 2022, Proceedings, Part I","authors":"","doi":"10.1007/978-3-031-08648-9","DOIUrl":"https://doi.org/10.1007/978-3-031-08648-9","url":null,"abstract":"","PeriodicalId":90476,"journal":{"name":"Computers helping people with special needs : ... International Conference, ICCHP ... : proceedings. International Conference on Computers Helping People with Special Needs","volume":"128 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2022-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"83240433","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Computers Helping People with Special Needs: 18th International Conference, ICCHP-AAATE 2022, Lecco, Italy, July 11–15, 2022, Proceedings, Part II
Pub Date: 2022-01-01 | DOI: 10.1007/978-3-031-08645-8
{"title":"Computers Helping People with Special Needs: 18th International Conference, ICCHP-AAATE 2022, Lecco, Italy, July 11–15, 2022, Proceedings, Part II","authors":"","doi":"10.1007/978-3-031-08645-8","DOIUrl":"https://doi.org/10.1007/978-3-031-08645-8","url":null,"abstract":"","PeriodicalId":90476,"journal":{"name":"Computers helping people with special needs : ... International Conference, ICCHP ... : proceedings. International Conference on Computers Helping People with Special Needs","volume":"23 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2022-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"82541545","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Correction to: Gait Patterns Monitoring Using Instrumented Forearm Crutches
Pub Date: 2020-09-04 | DOI: 10.1007/978-3-030-58805-2_58
Marien Narváez, J. Aranda
{"title":"Correction to: Gait Patterns Monitoring Using Instrumented Forearm Crutches","authors":"Marien Narváez, J. Aranda","doi":"10.1007/978-3-030-58805-2_58","DOIUrl":"https://doi.org/10.1007/978-3-030-58805-2_58","url":null,"abstract":"","PeriodicalId":90476,"journal":{"name":"Computers helping people with special needs : ... International Conference, ICCHP ... : proceedings. International Conference on Computers Helping People with Special Needs","volume":"148 1","pages":"C1 - C1"},"PeriodicalIF":0.0,"publicationDate":"2020-09-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"81020907","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Correction to: Suitable Camera and Rotation Navigation for People with Visual Impairment on Looking for Something Using Object Detection Technique
Pub Date: 2020-09-04 | DOI: 10.1007/978-3-030-58796-3_61
M. Iwamura, Yoshihiko Inoue, Kazunori Minatani, K. Kise
{"title":"Correction to: Suitable Camera and Rotation Navigation for People with Visual Impairment on Looking for Something Using Object Detection Technique","authors":"M. Iwamura, Yoshihiko Inoue, Kazunori Minatani, K. Kise","doi":"10.1007/978-3-030-58796-3_61","DOIUrl":"https://doi.org/10.1007/978-3-030-58796-3_61","url":null,"abstract":"","PeriodicalId":90476,"journal":{"name":"Computers helping people with special needs : ... International Conference, ICCHP ... : proceedings. International Conference on Computers Helping People with Special Needs","volume":"55 1","pages":"C1 - C1"},"PeriodicalIF":0.0,"publicationDate":"2020-09-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"78824672","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Accelerometer-Based Machine Learning Categorization of Body Position in Adult Populations
Pub Date: 2020-09-01 | Epub Date: 2020-09-04 | DOI: 10.1007/978-3-030-58805-2_29
Leighanne Jarvis, Sarah Moninger, Juliessa Pavon, Chandra Throckmorton, Kevin Caves
This manuscript describes a study evaluating classification algorithms, trained on accelerometer data collected from healthy adults and older adults, for classifying posture and movement. Specifically, tests were conducted to 1) compare the performance of one sensor vs. two sensors; 2) examine custom-trained algorithms for classifying a given task; and 3) determine overall classifier accuracy for healthy adults under 55 and older adults (55 or older). Despite the current variety of commercially available platforms, sensors, and analysis software, many do not provide the data granularity needed to characterize all stages of movement. Additionally, some clinicians have expressed concerns regarding the validity of analysis on specialized populations, such as hospitalized older adults. Accurate classification of movement data is important in a clinical setting, as more hospital systems are using sensors to support clinical decision making. We developed custom software and classification algorithms to identify lying, reclining, sitting, standing, and walking. Our algorithm accuracy is 93.2% for healthy adults under 55 and 95% for healthy adults 55 or older for the tasks in our setting. The high accuracy of this approach will aid future investigation into classifying movement in hospitalized older adults. Results from these tests also indicate that researchers and clinicians need to be aware of sensor body position relative to the position on which the algorithm was trained. Additionally, the results suggest that more research is needed to determine whether algorithms trained on one population can accurately classify data from another population.
{"title":"Accelerometer-Based Machine Learning Categorization of Body Position in Adult Populations.","authors":"Leighanne Jarvis, Sarah Moninger, Juliessa Pavon, Chandra Throckmorton, Kevin Caves","doi":"10.1007/978-3-030-58805-2_29","DOIUrl":"https://doi.org/10.1007/978-3-030-58805-2_29","url":null,"abstract":"<p><p>This manuscript describes tests and results of a study to evaluate classification algorithms derived from accelerometer data collected on healthy adults and older adults to better classify posture movements. Specifically, tests were conducted to 1) compare performance of 1 sensor vs. 2 sensors; 2) examine custom trained algorithms to classify for a given task 3) determine overall classifier accuracy for healthy adults under 55 and older adults (55 or older). Despite the current variety of commercially available platforms, sensors, and analysis software, many do not provide the data granularity needed to characterize all stages of movement. Additionally, some clinicians have expressed concerns regarding validity of analysis on specialized populations, such as hospitalized older adults. Accurate classification of movement data is important in a clinical setting as more hospital systems are using sensors to help with clinical decision making. We developed custom software and classification algorithms to identify laying, reclining, sitting, standing, and walking. Our algorithm accuracy is 93.2% for healthy adults under 55 and 95% for healthy older adults over 55 for the tasks in our setting. The high accuracy of this approach will aid future investigation into classifying movement in hospitalized older adults. Results from these tests also indicate that researchers and clinicians need to be aware of sensor body position in relation to where the algorithm used was trained. Additionally, results suggest more research is needed to determine if algorithms trained on one population can accurately be used to classify data from another population.</p>","PeriodicalId":90476,"journal":{"name":"Computers helping people with special needs : ... International Conference, ICCHP ... : proceedings. International Conference on Computers Helping People with Special Needs","volume":"12377 ","pages":"242-249"},"PeriodicalIF":0.0,"publicationDate":"2020-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7548108/pdf/nihms-1634319.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"38480045","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
An Indoor Navigation App using Computer Vision and Sign Recognition
Pub Date: 2020-09-01 | Epub Date: 2020-09-04 | DOI: 10.1007/978-3-030-58796-3_56
Giovanni Fusco, Seyed Ali Cheraghi, Leo Neat, James M Coughlan
Indoor navigation is a major challenge for people with visual impairments, who often lack access to visual cues such as informational signs, landmarks and structural features that people with normal vision rely on for wayfinding. Building on our recent work on a computer vision-based localization approach that runs in real time on a smartphone, we describe an accessible wayfinding iOS app we have created that provides turn-by-turn directions to a desired destination. The localization approach combines dead reckoning obtained using visual-inertial odometry (VIO) with information about the user's location in the environment from informational sign detections and map constraints. We explain how we estimate the user's distance from Exit signs appearing in the image, describe new improvements in the sign detection and range estimation algorithms, and outline our algorithm for determining appropriate turn-by-turn directions.
{"title":"An Indoor Navigation App using Computer Vision and Sign Recognition.","authors":"Giovanni Fusco, Seyed Ali Cheraghi, Leo Neat, James M Coughlan","doi":"10.1007/978-3-030-58796-3_56","DOIUrl":"https://doi.org/10.1007/978-3-030-58796-3_56","url":null,"abstract":"<p><p>Indoor navigation is a major challenge for people with visual impairments, who often lack access to visual cues such as informational signs, landmarks and structural features that people with normal vision rely on for wayfinding. Building on our recent work on a computer vision-based localization approach that runs in real time on a smartphone, we describe an accessible wayfinding iOS app we have created that provides turn-by-turn directions to a desired destination. The localization approach combines dead reckoning obtained using visual-inertial odometry (VIO) with information about the user's location in the environment from informational sign detections and map constraints. We explain how we estimate the user's distance from Exit signs appearing in the image, describe new improvements in the sign detection and range estimation algorithms, and outline our algorithm for determining appropriate turn-by-turn directions.</p>","PeriodicalId":90476,"journal":{"name":"Computers helping people with special needs : ... International Conference, ICCHP ... : proceedings. International Conference on Computers Helping People with Special Needs","volume":"12376 ","pages":"485-494"},"PeriodicalIF":0.0,"publicationDate":"2020-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7703403/pdf/nihms-1645298.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"38664361","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
An Audio-Based 3D Spatial Guidance AR System for Blind Users
Pub Date: 2020-09-01 | Epub Date: 2020-09-04 | DOI: 10.1007/978-3-030-58796-3_55
James M Coughlan, Brandon Biggs, Marc-Aurèle Rivière, Huiying Shen
Augmented reality (AR) has great potential for blind users because it enables a range of applications that provide audio information about specific locations or directions in the user's environment. For instance, the CamIO ("Camera Input-Output") AR app makes physical objects (such as documents, maps, devices and 3D models) accessible to blind and visually impaired persons by providing real-time audio feedback in response to the location on an object that the user is touching (using an inexpensive stylus). An important feature needed by blind users of AR apps such as CamIO is a 3D spatial guidance feature that provides real-time audio feedback to help the user find a desired location on an object. We have devised a simple audio interface to provide verbal guidance towards a target of interest in 3D. The experiment we report with blind participants using this guidance interface demonstrates the feasibility of the approach and its benefit for helping users find locations of interest.
{"title":"An Audio-Based 3D Spatial Guidance AR System for Blind Users.","authors":"James M Coughlan, Brandon Biggs, Marc-Aurèle Rivière, Huiying Shen","doi":"10.1007/978-3-030-58796-3_55","DOIUrl":"https://doi.org/10.1007/978-3-030-58796-3_55","url":null,"abstract":"<p><p>Augmented reality (AR) has great potential for blind users because it enables a range of applications that provide audio information about specific locations or directions in the user's environment. For instance, the CamIO (\"Camera Input-Output\") AR app makes physical objects (such as documents, maps, devices and 3D models) accessible to blind and visually impaired persons by providing real-time audio feedback in response to the location on an object that the user is touching (using an inexpensive stylus). An important feature needed by blind users of AR apps such as CamIO is a 3D spatial guidance feature that provides real-time audio feedback to help the user find a desired location on an object. We have devised a simple audio interface to provide verbal guidance towards a target of interest in 3D. The experiment we report with blind participants using this guidance interface demonstrates the feasibility of the approach and its benefit for helping users find locations of interest.</p>","PeriodicalId":90476,"journal":{"name":"Computers helping people with special needs : ... International Conference, ICCHP ... : proceedings. International Conference on Computers Helping People with Special Needs","volume":"12376 ","pages":"475-484"},"PeriodicalIF":0.0,"publicationDate":"2020-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7676634/pdf/nihms-1645296.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"38632482","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}