Providing access to graphical user interfaces—not graphical screens

W. Keith Edwards, Elizabeth Mynatt and Kathryn Stockton


The Mercator project, described in this paper from the first ASSETS conference in 1994, had a significant and lasting impact on access to graphical user interfaces. It was foundational in enabling and setting the direction of screen reader technology for the X Window System, and in opening up opportunities for assistive technology.

This paper is one of the first to raise and tackle the challenge of providing screen reader capabilities for graphical user interfaces. It proposes that translation of the GUI should be done at a semantic, rather than syntactic, level. At the time of writing, screen reader software for GUIs operated by intercepting drawing requests to the screen, which contained only low-level information about the visual display elements. Edwards, Mynatt and Stockton proposed instead to offer descriptions in terms of the operations the GUI allows the user to perform. The work includes several ideas that have proven important and influential in accessibility, including the use of auditory icons to represent different kinds of objects, audio formatting to convey status and other properties, and hierarchical modelling of containment and cause-effect relationships between interface objects. The notion of defining user interfaces at an abstract level to allow for realization in many forms has been a major research thread in accessibility, leading to the development of several standards, and underpinning ongoing efforts to develop personalized user interfaces.
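The "semantic, not syntactic" idea can be illustrated with a toy sketch (this is not Mercator's actual code, and all names here are hypothetical): instead of reading back drawing requests, the screen reader walks an off-screen model of interface objects, each carrying a role, a label, and a state, and speaks descriptions of the operations they afford.

```python
# Toy sketch of a semantic off-screen model: a tree of interface objects
# with roles and states, traversed to produce spoken descriptions.
# All class and function names are illustrative, not from the paper.

from dataclasses import dataclass, field
from typing import List

@dataclass
class UIObject:
    role: str                                   # e.g. "window", "button"
    name: str                                   # accessible label
    state: str = ""                             # e.g. "disabled", "checked"
    children: List["UIObject"] = field(default_factory=list)

def describe(obj: UIObject, depth: int = 0) -> List[str]:
    """Emit one spoken line per object; nesting depth stands in for
    Mercator's hierarchical (containment) navigation."""
    line = f"{'  ' * depth}{obj.role}: {obj.name}"
    if obj.state:
        line += f" ({obj.state})"
    lines = [line]
    for child in obj.children:
        lines.extend(describe(child, depth + 1))
    return lines

# A hypothetical dialog: the reader describes operations the user can
# perform, not the rectangles and glyphs drawn on screen.
dialog = UIObject("window", "Print", children=[
    UIObject("button", "OK"),
    UIObject("button", "Cancel", state="disabled"),
])
for spoken in describe(dialog):
    print(spoken)
```

A screen reader built on such a model can also attach auditory icons per role and vary audio formatting per state, which is where the pixel-intercepting approach had nothing to work with.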

The paper describes Mercator, an implementation of these ideas within the X Window System, and the modifications made to the X Window library to support it. These modifications became part of the standard X Window System release, making it possible for anyone to create a non-visual interface agent that exists externally to both the application and the window system, using only the mechanisms provided by the platform. This laid the foundation for an assistive technology market to build on these capabilities, and opened the door to further research on alternative access mechanisms. The impact of this approach to providing access to graphical user interfaces has been profound. The 1997 release of Microsoft's Active Accessibility (MSAA) took the same approach, exposing user interface control objects to assistive technologies and again opening opportunities for assistive technologies to be developed.




Have graphical user interfaces improved the lives of blind computer users? The simple answer is: not very much. This opportunity has not been realized because current screen reader technology provides access to graphical screens, not graphical interfaces. In this paper, we discuss the historical reasons for this mismatch as well as analyze the contents of graphical user interfaces. Next, we describe one possible way for a blind user to interact with a graphical user interface, independent of its presentation on the screen. We conclude by describing the components of a software architecture which can capture and model a graphical user interface for presentation to a blind computer user.
