Future technology oriented scenarios on e-accessibility
Christos Kouroupetroglou, A. Koumpis, D. Papageorgiou
ASSETS 2011. DOI: 10.1145/2049536.2049590
This paper presents a set of future scenarios as part of our study, which explores and analyzes the relationships between the emerging ICT landscape in the European societal and economic context and the development and provision of e-Accessibility over a ten-year perspective. Part of our study is the development and validation of various scenarios regarding the impact of new technologies on accessibility. This paper presents draft scenarios produced by combining technologies identified by experts as crucial for the future of e-Accessibility.

Design of a bilateral vibrotactile feedback system for lateralization
Bernd Tessendorf, D. Roggen, M. Spuhler, T. Stiefmeier, G. Tröster, T. Grämer, Manuela Feilner, Peter Derleth
ASSETS 2011. DOI: 10.1145/2049536.2049583
We present a bilateral vibrotactile feedback system for accurate lateralization of target angles across the complete 360-degree range. We envision integrating this system into context-aware hearing instruments (HIs) or cochlear implants (CIs) to support users who experience lateralization difficulties. As a foundation for this, it is vital to investigate which kinds of feedback and vibration patterns best support lateralization. Our system makes it possible to evaluate and compare different encoding schemes with respect to resolution, reaction time, intuitiveness, and user dependency. The system provides bilateral vibrotactile feedback, reflecting integration into HIs or CIs worn at both ears, and implements two approaches: Quantized Absolute Heading (QAH) and Continuous Guidance Feedback (CGF). We provide a detailed description of our hardware, which was designed to also be applicable to generic vibrotactile feedback applications.
{"title":"Session details: Mobile and ubiquitious UI","authors":"S. Harper","doi":"10.1145/3253160","DOIUrl":"https://doi.org/10.1145/3253160","url":null,"abstract":"","PeriodicalId":351090,"journal":{"name":"The proceedings of the 13th international ACM SIGACCESS conference on Computers and accessibility","volume":"57 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2011-10-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132141366","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}

Navigation and obstacle avoidance help (NOAH) for older adults with cognitive impairment: a pilot study
P. Viswanathan, J. Little, Alan K. Mackworth, Alex Mihailidis
ASSETS 2011. DOI: 10.1145/2049536.2049546
Many older adults with cognitive impairment are excluded from powered wheelchair use because of safety concerns. This leads to reduced mobility and, in turn, higher dependence on caregivers. In this paper, we describe an intelligent wheelchair that uses computer vision and machine learning methods to provide adaptive navigation assistance to users with cognitive impairment. We demonstrate the performance of the system in a user study with the target population. We show that the collision avoidance module of the system successfully decreases the number of collisions for all participants. We also show that the wayfinding module assists users with memory and vision impairments. We share feedback from the users on various aspects of the intelligent wheelchair system. In addition, we provide our own observations and insights on the target population and their use of intelligent wheelchairs. Finally, we suggest directions for future work.

A tactile-thermal display for haptic exploration of virtual paintings
Victoria E. Hribar, D. Pawluk
ASSETS 2011. DOI: 10.1145/2049536.2049577
To enable individuals who are blind or visually impaired to participate fully in the world around them, it is important to make all environments accessible to them. This includes art museums, which provide opportunities for cultural education and personal interest and enjoyment. Our interest focuses on the portrayal of paintings through refreshable haptic displays generated from their digital representations. As a complement to representing the structural elements (i.e., objects and shapes) in a painting, we believe it is also important to provide a personal experience of the style and expressiveness of the artist. This paper proposes a haptic display and display methods to do so. The haptic display consists of: (1) a pin matrix display for the fingers that relays tactile texture information about brushstrokes, (2) a thermal display onto which the warm-cold spectrum of colors is mapped, and (3) sensing of location within the painting, used to change the tactile and thermal feedback to create contrasts within the painting.

The TaskTracker: assistive technology for task completion
Victoria E. Hribar
ASSETS 2011. DOI: 10.1145/2049536.2049631
A cognitive impairment can restrict the independence an individual has over their own life. Attention deficits commonly affect individuals with cognitive impairments and make completing everyday tasks difficult. While many technologies exist to assist this group with memory storage and retrieval, these individuals could also benefit from technology focused on time management and assistance with task completion. The TaskTracker has been created for this purpose by incorporating several features focused on task completion into a single Android™ smartphone application. A progress bar, alarm reminders, and a motivational message are combined to motivate task completion and time management rather than supporting task memory alone.
{"title":"Session details: Multimedia and TV","authors":"Enrico Pontelli","doi":"10.1145/3253158","DOIUrl":"https://doi.org/10.1145/3253158","url":null,"abstract":"","PeriodicalId":351090,"journal":{"name":"The proceedings of the 13th international ACM SIGACCESS conference on Computers and accessibility","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2011-10-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115289076","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}

Situation-based indoor wayfinding system for the visually impaired
Eunjeong Ko, Jinsun Ju, Eun Yi Kim
ASSETS 2011. DOI: 10.1145/2049536.2049545
This paper presents an indoor wayfinding system to help visually impaired users find their way to a given destination in an unfamiliar environment. The main novelty is the use of the user's situation as the basis both for designing color codes that convey environmental information and for developing the wayfinding system that detects and recognizes those codes. People require different information depending on their situation, so situation-based color codes are designed, including location-specific codes and guide codes. These color codes are affixed at certain locations to provide information to visually impaired users, and their location and meaning are then recognized by the proposed wayfinding system. The system works in three steps: it first recognizes the current situation using a vocabulary tree built on the shape properties of images taken in various situations; next, it detects and recognizes the codes needed for the current situation, based on color and edge information; finally, it provides the user with environmental information and their path through an auditory interface. To assess the validity of the proposed system, we conducted a field test with four visually impaired participants; the results showed that they could find the optimal path in real time with an accuracy of 95%.

Thought cubes: exploring the use of an inexpensive brain-computer interface on a mental rotation task
G. M. Poor, L. Leventhal, Scott Kelley, J. Ringenberg, Samuel D. Jaffee
ASSETS 2011. DOI: 10.1145/2049536.2049612
Brain-computer interfaces (BCIs) allow users to relay information to a computer by capturing reactions to their thoughts via brain waves (or similar measurements). This "new" type of interaction allows users with limited motor control to interact with a computer without a mouse, keyboard, or other physically manipulated interaction device. While this technology is in its infancy, there have been major strides in the area, allowing researchers to investigate potential uses. One of the first such interfaces to reach the commercial market at an affordable price is the Emotiv "EPOC" headset. This paper reports the results of a study exploring use of the EPOC headset.

Brazilian sign language multimedia hangman game: a prototype of an educational and inclusive application
R. Madeo
ASSETS 2011. DOI: 10.1145/2049536.2049623
This paper presents a prototype of an educational and inclusive application: the Brazilian Sign Language Multimedia Hangman Game. This application aims to stimulate people, especially children, deaf or not, to learn a sign language and to help deaf people improve their vocabulary in an oral language. The distinguishing feature of this game is that its input consists of videos of the user performing signs from Brazilian Sign Language corresponding to Latin alphabet letters, recorded through the game's graphical interface. These videos are processed by a computer vision module in order to recognize the letter to which the sign corresponds, using a recognition strategy based on primitives (hand configuration, movement, and orientation), reaching 84.3% accuracy.