Accessible Gesture Typing for Non-Visual Text Entry on Smartphones.

Syed Masum Billah, Yu-Jung Ko, Vikas Ashok, Xiaojun Bi, I V Ramakrishnan
DOI: 10.1145/3290605.3300606
Venue: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI 2019)
Publication date: 2019-05-01
Cited by: 12

Abstract



Gesture typing, entering a word by gliding the finger sequentially from letter to letter, is widely supported on smartphones for sighted users. However, this input paradigm is currently inaccessible to blind users: it is difficult to draw shape gestures on a virtual keyboard without access to key visuals. This paper describes the design of accessible gesture typing, which brings this input paradigm to blind users. To help blind users locate keys, the design incorporates the familiar screen-reader-supported touch exploration that narrates the keys as the user drags a finger across the keyboard. The design allows users to switch seamlessly between exploration and gesture-typing modes by simply lifting the finger. Continuous, touch-exploration-like audio feedback is provided during word-shape construction, helping the user glide toward the key locations that constitute the word. Exploration mode resumes once the word shape is completed. Distinct earcons distinguish gesture-typing mode from touch-exploration mode, thereby avoiding unintended mix-ups. A user study with 14 blind people showed a 35% increase in typing speed, indicative of the promise and potential of gesture typing for non-visual text entry.
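The interaction model the abstract describes, touch exploration versus gesture typing, toggled by lifting the finger and announced with distinct earcons, can be sketched as a small state machine. The following is a hypothetical illustration based only on the abstract, not the paper's actual implementation; all class, method, and event names are invented.

```python
from enum import Enum, auto

class Mode(Enum):
    EXPLORATION = auto()     # screen reader narrates the key under the finger
    GESTURE_TYPING = auto()  # the finger trace builds a word shape

class AccessibleGestureKeyboard:
    """Hypothetical sketch of the mode-switching logic described in the abstract."""

    def __init__(self):
        self.mode = Mode.EXPLORATION
        self.events = []  # stand-in for speech/earcon audio output

    def on_touch_move(self, key):
        if self.mode is Mode.EXPLORATION:
            # Touch exploration: narrate whichever key the finger is over.
            self.events.append(f"speak:{key}")
        else:
            # Gesture typing: continuous audio cue guides the glide direction.
            self.events.append(f"glide-feedback:{key}")

    def on_finger_lift(self):
        # Lifting the finger toggles between the two modes; a distinct
        # earcon marks each transition to avoid unintended mix-ups.
        if self.mode is Mode.EXPLORATION:
            self.mode = Mode.GESTURE_TYPING
            self.events.append("earcon:enter-gesture-typing")
        else:
            # Word shape completed; exploration mode resumes.
            self.mode = Mode.EXPLORATION
            self.events.append("earcon:word-committed")
```

In this sketch, a drag in exploration mode only narrates keys, a lift arms gesture typing, the next drag traces the word shape with glide feedback, and a second lift commits the word and returns to exploration, mirroring the mode flow the abstract outlines.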
