This blog post marks the official launch of the “Zero Errors” campaign, a university-wide effort in May 2021 to use accessibility checkers to reduce – or eliminate! – accessibility errors from UW websites, online courses, and other digital resources.
At the UW, we have a wide variety of tools available that can check our digital resources for accessibility. See our Tools and resources page for an annotated list. If you are responsible for one or more websites, online courses, videos, or publicly available digital documents, please take time to learn about – and use – one or more of the available tools.
“Zero Errors” activities
Throughout May, UW-IT Accessible Technology Services (ATS) will be providing support and training, and generally encouraging UW employees to eliminate the measurable accessibility errors in their websites and online courses, and to caption all of their videos. Following is a list of formally scheduled activities.
- Thursday, May 6, 9:00 – 10:00am – Web Council Meeting *
- Thursday, May 6, 11:00 – Noon – Accessible Tech Office Hours
- Monday, May 10, 1:00 – 4:00pm – IT Accessibility Liaisons Meeting *
- Friday, May 14, 10:00 – 11:00am – Web Accessibility & Usability Meetup *
- Thursday, May 20, All Day – Global Accessibility Awareness Day
- Thursday, May 27, 3:00 – 4:00pm – Accessible Tech Webinar Series (on Document Accessibility) *
For details about these and other events, see our Events & Collaboration page.
* = Zero Errors is part of a larger agenda.
The ultimate goal: ZERO errors!
Choose one or more accessibility checkers, and see if you can reduce – or eliminate! – the number of errors they find. The following screenshots are provided for motivation.
Example goal: A perfect score in WAVE
WAVE is an accessibility checker developed by WebAIM at Utah State University. You can check the accessibility of your web pages via the WAVE website, or by using the WAVE browser extension for either Chrome or Firefox.
Example goal: A perfect score in Lighthouse (in Chrome DevTools)
Lighthouse is a web quality auditor that’s integrated into Chrome DevTools. You can use it to check any web page, even those requiring authentication. In addition to checking accessibility, it has audits for performance, SEO, and more. For additional information see the Lighthouse documentation.
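In addition to the DevTools panel, Lighthouse can be run from the command line via its Node.js package. A minimal sketch, assuming Node.js and Chrome are installed; the URL below is a placeholder for one of your own pages, and flags may vary by Lighthouse version:

```shell
# Audit a single page for accessibility only, using the Lighthouse CLI.
# Requires Node.js; npx fetches the lighthouse package if not installed.
npx lighthouse "https://www.example.edu/your-page/" \
  --only-categories=accessibility \
  --output=json \
  --output-path=./a11y-report.json \
  --chrome-flags="--headless"
```

The resulting JSON report includes an accessibility score between 0 and 1 and a list of failed audits, which can be useful for tracking progress toward zero errors across many pages.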
Example goal: A perfect score in axe DevTools (in Chrome)
axe is an accessibility testing toolkit from Deque, an accessibility consultancy. The axe Chrome Extension and axe for Android App are both free. There are also various commercial versions available with additional features, as explained on the axe home page.
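For automated or scripted checks, Deque also publishes a free command-line interface for axe. A minimal sketch, assuming Node.js is installed; the URL is a placeholder, and exact options may differ by version:

```shell
# Run axe-core against a page from the command line.
# Requires Node.js; npx fetches the @axe-core/cli package if not installed.
npx @axe-core/cli "https://www.example.edu/your-page/" \
  --tags wcag2a,wcag2aa \
  --save axe-results.json
```

Restricting the run to WCAG 2.0 A/AA tags keeps the results focused on the success criteria most institutions target, and the saved JSON file makes before-and-after comparisons straightforward.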
Example goal: A perfect score in the Accessibility Report (in Canvas)
All Canvas courses at the UW include an accessibility tool called Ally, which – among other features – generates an accessibility report with a score for the entire course. The report provides various options for getting started with fixing the problems it identifies. For more information about Ally, see our help page on Using Ally in Canvas Courses.
Example goal: 100% of videos captioned, as reported by the YouTube Caption Auditor
YouTube Caption Auditor (YTCA) is a free, open-source tool developed by ATS that helps YouTube channel owners prioritize their video captioning efforts. For the tool itself, see YTCA on GitHub. If you are responsible for a UW-affiliated YouTube channel and would like access to the UW’s hosted YTCA instance, contact Terrill, firstname.lastname@example.org.
How accurate are accessibility checkers?
Automated checkers can only check a subset of accessibility issues, and estimates of what percentage of issues can be checked automatically vary widely. “Passing the checker” doesn’t guarantee that a website, digital document, or online course is accessible. However, it’s a good place to start. Fixing all the issues identified by accessibility checkers will make your digital resources more accessible, even if not necessarily fully accessible.
Also, automated checkers vary widely in the rules they use. If you check your website using four different tools, you will likely get overlapping, but different, results. Similarly, if you check your PDFs using the built-in accessibility checker in Adobe Acrobat, you will likely get different results than the results reported by Ally (the accessibility checker that’s available in Canvas courses at the UW).
These discrepancies don’t mean automated checkers are inaccurate. Each checker provides feedback based on its unique ruleset. The more accessibility checkers you use, the more complete a picture you will have of the accessibility of your digital resources.
Why “Zero Errors”?
According to data collected from the U.S. Department of Education Office for Civil Rights (OCR) Recent Resolution Search, at least 247 postsecondary education institutions have resolved disability discrimination complaints related to the accessibility of their websites and/or online courses. An analysis of a sample of the OCR resolution agreement letters found that in 100% of the complaints, automated web accessibility checkers were used as evidence, either by the person filing the complaint or by OCR to verify the complaint. This underscores the need to work toward eliminating automatically measurable accessibility errors as part of our risk management strategy.
I’ll be sharing more findings from my OCR resolution research at multiple venues in May, including the Web Council meeting on May 6, the Liaisons meeting on May 10, and during my opening remarks on Global Accessibility Awareness Day on May 20. See the “Zero Errors” Activities section at the top of this post for a complete list of related events.
Let us know how you did
If you succeed in reducing your errors (or in captioning your videos), we’d love to know about it, even if you don’t attain a perfect score. Please share your experience in an email to me (Terrill, email@example.com), optionally including one or more screenshots. We recommend capturing screenshots of your scores before and after you fix your accessibility errors.