People use mobile web applications in a variety of contexts, typically on the go while engaged in other tasks such as walking, jogging, or driving. Conventional visual user interfaces are efficient for quickly scanning a page, but they can easily cause distraction and accidents. This problem is intensified when web information services are rich and highly structured in their content and navigation architectures. To support a graceful evolution of web systems from a conventional to an aural experience, we introduce ANFORA (Aural Navigation Flows On Rich Architectures), a framework for designing mobile web systems based on automated, semi-controlled aural navigation flows that users can listen to while engaged in a secondary activity (e.g., walking). We are applying ANFORA to the domain of web-based newscasting.