Tabular Results
The following table summarizes the results of these tests. Sixteen of the 40 tests were handled correctly by all the tools. The column labeled OK gives the number of tools that passed each test.
Test | OK | Bobby | InFocus | LIFT | Ramp | WebKing | WebXM |
---|---|---|---|---|---|---|---|
1. Alt-text | 6 | yes | yes | yes | yes | yes | yes |
2. ASCII art as alt-text | 2 | no | no | no | yes | yes | no |
3. Object requires default content | 6 | yes | yes | yes | yes | yes | yes |
4. Image button | 5 | yes | yes | yes | yes | yes | no |
5. Long alt-text | 6 | yes | yes | yes | yes | yes | yes |
6. Image map areas | 6 | yes | yes | yes | yes | yes | yes |
7. Server-side image maps | 6 | yes | yes | yes | yes | yes | yes |
8. Frame titles | 6 | yes | yes | yes | yes | yes | yes |
9. Quality of frame titles | 3 | no | yes | no | yes | yes | no |
10. Input element needs label | 6 | yes | yes | yes | yes | yes | yes |
11. Use of title attribute for form control | 3 | no | yes | no | yes | yes | no |
12. Text intervenes between label and control | 5 | yes | no | yes | yes | yes | yes |
13. Label text from two places | 6 | yes | yes | yes | yes | yes | yes |
14. Invisible GIF holds prompt | 6 | yes | yes | yes | yes | yes | yes |
15. Label matches no control | 6 | yes | yes | yes | yes | yes | yes |
16. Two controls with same id | 3 | no | no | yes | yes | yes | no |
17. Text area needs label | 6 | yes | yes | yes | yes | yes | yes |
18. Select menu needs label | 6 | yes | yes | yes | yes | yes | yes |
19. Inaccessible select menu | 1 | no | no | no | no | yes | no |
20. Empty label | 3 | no | no | no | yes | yes | yes |
21. “Click here” | 4 | yes | no | no | yes | yes | yes |
22. Image link with empty alt-text | 4 | yes | no | yes | yes | no | yes |
23. Image link with spaces for alt-text | 5 | yes | no | yes | yes | yes | yes |
24. Link with text and image with empty alt-text | 3 | no | no | yes | yes | no | yes |
25. “Click here” plus title | 2 | no | no | no | yes | yes | no |
26. Same link text; different URLs | 6 | yes | yes | yes | yes | yes | yes |
27. Page title | 4 | yes | no | yes | yes | no | yes |
28. Adequate page title | 1 | no | no | no | yes | no | no |
29. Skip link | 3 | no | yes | yes | yes | no | no |
30. Headings for skipping | 1 | no | no | no | yes | no | no |
31. Layout table won’t resize | 6 | yes | yes | yes | yes | yes | yes |
32. Data table | 4 | no | yes | yes | yes | yes | no |
33. Layout table with summary | 2 | yes | no | no | yes | no | no |
34. Frame source must be HTML | 5 | yes | yes | no | yes | yes | yes |
35. Blink | 6 | yes | yes | yes | yes | yes | yes |
36. Marquee | 6 | yes | yes | yes | yes | yes | yes |
37. Auto-refresh | 4 | yes | no | yes | yes | yes | no |
38. Keyboard access | 1 | yes | no | no | no | no | no |
39. Headings structure | 4 | yes | yes | no | yes | no | yes |
40. Inline frame title | 5 | yes | no | yes | yes | yes | yes |
Totals | -- | 28 | 23 | 27 | 38 | 31 | 26 |
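For readers who want to check the arithmetic, the per-tool totals and the number of tests passed by every tool can be recomputed from the rows above. The y/n strings below transcribe the table, one string per test, in the column order Bobby, InFocus, LIFT, Ramp, WebKing, WebXM:

```python
# Each string is one row of the results table: y = yes, n = no,
# columns in the order Bobby, InFocus, LIFT, Ramp, WebKing, WebXM.
rows = [
    "yyyyyy", "nnnyyn", "yyyyyy", "yyyyyn", "yyyyyy",  # tests 1-5
    "yyyyyy", "yyyyyy", "yyyyyy", "nynyyn", "yyyyyy",  # tests 6-10
    "nynyyn", "ynyyyy", "yyyyyy", "yyyyyy", "yyyyyy",  # tests 11-15
    "nnyyyn", "yyyyyy", "yyyyyy", "nnnnyn", "nnnyyy",  # tests 16-20
    "ynnyyy", "ynyyny", "ynyyyy", "nnyyny", "nnnyyn",  # tests 21-25
    "yyyyyy", "ynyyny", "nnnynn", "nyyynn", "nnnynn",  # tests 26-30
    "yyyyyy", "nyyyyn", "ynnynn", "yynyyy", "yyyyyy",  # tests 31-35
    "yyyyyy", "ynyyyn", "ynnnnn", "yynyny", "ynyyyy",  # tests 36-40
]
tools = ["Bobby", "InFocus", "LIFT", "Ramp", "WebKing", "WebXM"]

# Sum each tool's column of "yes" results over the 40 tests.
totals = {t: sum(r[i] == "y" for r in rows) for i, t in enumerate(tools)}

# Count the tests that every tool handled correctly.
all_pass = sum(r == "yyyyyy" for r in rows)

print(totals)    # {'Bobby': 28, 'InFocus': 23, 'LIFT': 27, 'Ramp': 38, 'WebKing': 31, 'WebXM': 26}
print(all_pass)  # 16
```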
Summary
As we said at the beginning of this chapter, there are a large number of sophisticated software tools designed to facilitate checking web content for accessibility. The basic products range in price from $50 to over $2,500, and we have also taken a look at two enterprise-level products. Judging from the customer lists on the suppliers’ web sites, it is clear that federal agencies and corporations are buying into the idea of using these tools to test their sites for accessibility.
On our test files, the six tools that we examined performed remarkably similarly. In a recent post to an email list, Glenda Sims of the University of Texas had this to say about one of the tools we evaluated in this chapter:
WebXM is perfect for our decentralized needs. I work on a campus with 1000+ webmasters. Yes, it is like trying to herd cats. WebXM gives us a delicious dashboard that lets me quickly see the “health” of our entire site with a quick overview of which subsites within www.utexas.edu are the best and the worst.
We didn’t even look at the dashboard. The point is that although the tools are similar on the task we undertook, they differ radically in other respects, such as usability, security, integrity, scalability, the nature of their reporting, and the availability of scheduling.
As we have stressed before, these tools are inherently limited in what they can do. Most aspects of web accessibility require some human evaluation, and the best that can be asked of the software tools is that they facilitate the human review process. On the other hand, software accessibility checkers can do something human evaluators cannot: tools can examine dozens (even millions, for some tools) of pages to find missing alt attributes or label elements. Humans are not so good at such exhaustive and tedious examination. Detectable errors like these include some of the most important concerns for accessibility and generally are symptomatic of more serious mistakes.
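The kind of mechanical sweep described above can be sketched in a few lines of Python using the standard library's html.parser. This is only an illustration of the idea, not how any of the reviewed products actually works, and the sample markup is invented:

```python
from html.parser import HTMLParser

class AccessibilityScan(HTMLParser):
    """Flag img elements with no alt attribute and form controls
    that no label element points at via its for attribute."""

    def __init__(self):
        super().__init__()
        self.problems = []        # human-readable findings
        self.control_ids = []     # ids of input/select/textarea elements
        self.labeled_ids = set()  # ids referenced by <label for="...">

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and "alt" not in attrs:
            self.problems.append("img missing alt attribute")
        if tag in ("input", "select", "textarea"):
            # Hidden fields and buttons don't need a label element.
            if attrs.get("type") not in ("hidden", "submit", "button"):
                self.control_ids.append(attrs.get("id"))
        if tag == "label" and "for" in attrs:
            self.labeled_ids.add(attrs["for"])

    def unlabeled_controls(self):
        return [i for i in self.control_ids if i not in self.labeled_ids]

# Hypothetical page fragment: one image with no alt-text, one labeled
# text field, and one text field with no label.
page = """
<img src="logo.gif">
<label for="q">Search</label>
<input type="text" id="q">
<input type="text" id="city">
"""

scan = AccessibilityScan()
scan.feed(page)
print(scan.problems)              # ['img missing alt attribute']
print(scan.unlabeled_controls())  # ['city']
```

A real checker must also cope with labels wrapping their controls, title attributes standing in for labels, and the many other cases in our test files; the point is simply that this kind of exhaustive pattern matching is what software does well and humans do badly.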
An important part of carrying out these tests was the way the tool developers reacted. All were supportive and responsive, which was very reassuring. Of course, some were more responsive than others, and two stand out because they were not only helpful but also made significant changes in their products in response to this process. Parasoft (WebKing) and Deque (Ramp Ascend) made major improvements in their accessibility checking, ending up in second and first place respectively. Ramp went through two versions during this review process, and its current “winning” score of 38 out of 40 is for the newest version, Ramp Ascend 6.0.