ANFORA: Investigating aural navigation flows on rich architectures
Romisa Rohani Ghahari, D. Bolchini
2011 13th IEEE International Symposium on Web Systems Evolution (WSE), November 18, 2011
DOI: 10.1109/WSE.2011.6081816
People use mobile web applications in a variety of contexts, typically on-the-go, while engaged in other tasks such as walking, jogging, or driving. Conventional visual user interfaces are efficient for supporting quick scanning of a page, but they can easily cause distractions and accidents. This problem is intensified when web information services are richer and highly structured in content and navigation architectures. To support a graceful evolution of web systems from a conventional to an aural experience, we introduce ANFORA (Aural Navigation Flows On Rich Architectures), a framework for designing mobile web systems based on automated, semi-controlled aural navigation flows that the user can listen to while engaged in a secondary activity (e.g., walking). We demonstrate a set of design rules that could govern salient aural interactions with large web architectures. Our approach opens a new paradigm for aural web systems that can complement existing visual interfaces, and it has the potential to inform new technologies, navigation models, design tools, and methods in the area of aural web information access. As a case study, we are applying ANFORA to the domain of web-based news casting.