Testing should gather the data necessary to
understand, prioritize, and fix problems.
List specific issues, with information that is
accurate, descriptive, and actionable.
Not all accessibility issues are equal.
Give priority to fixes that have a high positive impact on users.
High: Users will be unable to perform important tasks or understand important content.
Medium: Users will be able to perform and understand, but with difficulty.
Low: Users will be inconvenienced, but still able to accomplish all tasks.
Knock out easier fixes before more difficult ones.
Gain quick improvements with minimal effort.
Get more accessible in less time.
Accessibility issues that don’t impact anyone aren’t issues.
Prioritize issues found on high traffic pages.
Multiple issues can be caused by the same code.
Some accessibility fixes improve usability for other users.
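As a rough sketch, the triage above can be captured in a simple record and sorted by impact, then effort, then traffic. This is an illustration only; the field names and weighting are assumptions, not a standard scheme.

```typescript
// Illustrative issue record; field names are assumptions, not a standard schema.
type Severity = "high" | "medium" | "low";

interface A11yIssue {
  description: string;      // accurate, descriptive, actionable
  severity: Severity;       // impact on users
  effortHours: number;      // estimated cost of the fix
  monthlyPageViews: number; // traffic of the affected page
}

const severityRank: Record<Severity, number> = { high: 0, medium: 1, low: 2 };

// Order the backlog: highest user impact first, then the cheapest fixes,
// then the most-visited pages.
function prioritize(issues: A11yIssue[]): A11yIssue[] {
  return [...issues].sort(
    (a, b) =>
      severityRank[a.severity] - severityRank[b.severity] ||
      a.effortHours - b.effortHours ||
      b.monthlyPageViews - a.monthlyPageViews
  );
}
```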
A large number of accessibility errors
can be found automatically.
You can reliably catch approximately
25% of accessibility best practices.
Quality of automated testing tools can vary significantly.
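For example, the open-source axe-core engine can be run directly against a page. This is a minimal sketch, assuming axe-core is bundled or loaded on the page; the WCAG tag selection is one possible configuration.

```typescript
import axe from "axe-core";

async function runAutomatedChecks(): Promise<void> {
  // Limit the run to WCAG 2.0 A and AA rules.
  const results = await axe.run(document, {
    runOnly: { type: "tag", values: ["wcag2a", "wcag2aa"] },
  });

  for (const violation of results.violations) {
    // Each violation lists the failing nodes, which keeps the report actionable.
    console.log(violation.id, violation.impact, violation.nodes.length);
  }
}
```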
With manual verification, you can catch another
35% of accessibility best practices.
Can be time-consuming.
Depends on the number and complexity of the features being tested.
It’s possible to find everything, but highly dependent on
the skill of the reviewer and the time allotted.
Use Case: A list of steps defining interactions
between a role and the system.
Provides a close representation of the
system’s real level of accessibility.
Accurate in assessing how severe the problems are.
Caution: Different types, brands, and versions of
Assistive Technology can behave differently.
Often misses the specific code flaws
that cause the system to fail.
Use real people with disabilities.
Find out exactly how easy or hard a system is to use.
Most accessibility testing occurs too late.
Fixes are more expensive.
More legal risk.
Incorporate accessibility testing into your workflow.
Incorporate accessibility testing into
each phase of the project.
Make accessibility part of the acceptance criteria
for each story.
Developers should run automated tests on their code
and check keyboard functionality.
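One way to wire automated checks into a developer test suite is the jest-axe package. This is a hedged sketch assuming Jest with a jsdom environment; the form markup is a placeholder, not a real component.

```typescript
import { axe, toHaveNoViolations } from "jest-axe";

expect.extend(toHaveNoViolations);

test("signup form has no detectable accessibility violations", async () => {
  // In a real suite this would be rendered component output; placeholder markup here.
  document.body.innerHTML = `
    <form>
      <label for="email">Email</label>
      <input id="email" type="email" />
      <button type="submit">Sign up</button>
    </form>
  `;

  const results = await axe(document.body);
  expect(results).toHaveNoViolations();
});
```

Note that automated checks like this cannot confirm keyboard functionality; that still needs a manual pass.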
Include manual testing in the QA process.
A significant number of accessibility errors
are caused by content creators.
Train content creators in basic accessibility best practices.
Test accessibility before publishing the content.
The publishing workflow should include automated testing,
using only pass/fail tests on just the content (see the sketch below).
If possible, have an accessibility expert review
the content after automated testing.
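A pre-publish gate might look like the following sketch: run axe-core against only the authored content fragment and return a simple pass/fail. The function name and the rule list are illustrative assumptions, not a fixed standard.

```typescript
import axe from "axe-core";

async function contentPassesChecks(contentRoot: HTMLElement): Promise<boolean> {
  const results = await axe.run(contentRoot, {
    // Restrict the run to unambiguous, content-level rules such as missing image
    // alt text, empty headings, and links with no accessible name (assumed list).
    runOnly: { type: "rule", values: ["image-alt", "empty-heading", "link-name"] },
  });
  return results.violations.length === 0;
}
```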
Goal: Become compliant faster and cheaper.
The first round of testing should only include
automated testing for pass/fail items.
“You should never pay a human to find errors that can be found through automated testing.” — Karl Groves, Accessibility Guru
After initial problems are fixed, test manually to
uncover issues automated testing couldn’t find.
Save use case scenarios and usability studies for the end.
Any Questions?