How Much Does Expertise Matter? A Barrier Walkthrough Study with Experts and Non-Experts
Yeliz Yesilada, Giorgio Brajnik and Simon Harper
Eleventh International ACM SIGACCESS Conference on Computers and Accessibility (ASSETS 2009)
Pittsburgh, PA, USA, October 26-28, 2009
Manual accessibility evaluation plays an important role in validating the accessibility of Web pages. This role has become increasingly critical with the advent of the Web Content Accessibility Guidelines (WCAG) 2.0 and their reliance on user evaluation to validate certain conformance measures. However, the role of expertise in such evaluations is unknown and has not previously been studied. This paper investigates the interplay between expert and non-expert evaluation by conducting a Barrier Walkthrough (BW) study with 19 expert and 51 non-expert judges. The BW method provides an evaluation framework that can be used to manually assess the accessibility of Web pages for different user groups, including people with motor impairments, hearing impairments, low vision, and cognitive impairments. We conclude that the level of expertise is an important factor in the quality of accessibility evaluation of Web pages. Expert judges spent significantly less time than non-experts; rated themselves as more productive and confident than non-experts; and ranked and rated pages differently for each type of disability. Finally, both the effectiveness (the quality of finding all and only true accessibility problems) and the reliability (the repeatability of the outcomes when the method is used in different contexts) of the expert judges are significantly higher than those of the non-expert judges.