The axe team has taken a stand on accessibility support. We’re not going to ignore poor support in browsers and assistive technologies, and neither should you. In this post, we’ll take a look inside the axe team’s effort to ensure broad accessibility testing support.
We get a lot of questions on the axe-core team about why axe reports (or doesn’t report) certain things or when specific development techniques will be supported. From ARIA 1.1 to WCAG 2.1 and beyond, the team at Deque has one criterion in particular for deciding to include a new feature: whether it is supported by a minimum reasonable set of browsers and assistive technologies (ATs). This is known as “accessibility supported” in the Web Content Accessibility Guidelines (WCAG), and it’s a central part of how we build and maintain the axe-core accessibility testing library.
New technology standards emerge all the time in web development. ATs and browsers – also known as user agents – have to be redesigned or modified to support new technologies such as the state-of-the-art roles, states, and properties of ARIA 1.1; Shadow DOM and Web Components; or even SVG accessibility features. Reliance on web technologies that aren’t fully supported means sometimes features won’t work as expected in our users’ browsers or ATs. Users with disabilities can be left behind despite our best efforts to make things accessible (“I’m using this new ARIA feature for accessibility, but I only tested it in one browser!”).
There’s no set number of browsers or ATs required for accessibility support according to the W3C, as any such standard would be a complex moving target that would quickly fall out of date. Instead, web authors should use accessible development techniques that work in all of the major browsers they support, and test with the most popular assistive technologies.
How do we measure accessibility support for axe?
In axe-core, we support a common matrix of modern browsers and assistive technologies so that any technique recommended by the API will work in a variety of user scenarios. If a technique fails on even one notable platform for accessibility (such as Safari with VoiceOver or Firefox with NVDA, both commonly used by people with disabilities), to us that means it is not accessibility supported and therefore is not supported in axe. Our decisions about accessibility support mostly live in the axe-core code for rules and checks: the result for an unsupported technique is typically a violation or a review item.
By limiting ourselves to accessibility support only – which the team evaluates for every relevant change to axe-core – we can reduce our clients’ and colleagues’ need for manual testing in every single browser and AT, a huge time and cost savings. One downside is that this sometimes means holding up adoption of new features, even if they’re outlined in the ARIA spec, because browsers or ATs simply don’t support them yet.
In axe-core rules we can work around some support limitations if a new technology is silently ignored (meaning it falls back to default behavior) or doesn’t negatively impact users (such as `aria-errormessage` with a little help from `aria-describedby`). This isn’t always the case, so we manually test and evaluate each new feature based on current support, available workarounds, and (occasionally) historical knowledge of browser or AT bugs filed in years past, yet still open.
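As a sketch of the kind of workaround described above (the field name, ids, and error copy here are purely illustrative): pointing `aria-errormessage` and `aria-describedby` at the same element means ATs that don’t yet support `aria-errormessage` can still announce the message as a plain description.

```html
<!-- Illustrative only: aria-errormessage conveys the error in ATs that
     support it, while aria-describedby associates the same text as a
     description for ATs that silently ignore aria-errormessage. -->
<label for="email">Email address</label>
<input id="email" type="email" aria-invalid="true"
       aria-errormessage="email-error" aria-describedby="email-error">
<div id="email-error">Please enter a valid email address.</div>
```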
As a widely-adopted testing tool, axe-core has a rather broad set of requirements for accessibility support. On the axe team we often ask ourselves, “do these techniques work in as many places as possible?” We evaluate every change to a rule or check for how it will perform in major AT/browser combos, and we’ve unfortunately had to reject code submissions for techniques that fail in at least one of those.
Your accessibility policy doesn’t 100% match axe-core – what next?
With the number of ways to build accessible websites, it isn’t unusual to have varying policies when it comes to accessibility support. Some organizations impose their own limits, such as barring the `aria-label` property company-wide or greenlighting new ARIA 1.1 features regardless of support. A web application could be limited to specific browsers or closed environments, making broader accessibility support questions irrelevant. Under these many scenarios, it is both normal and expected for the default axe results to differ from a company’s internal accessibility guidelines.
When testing for accessibility with axe and encountering differences in policy, there are a few options for handling it:
- Modify the default axe-core ruleset using the `axe.configure` API.
- Open a GitHub issue to discuss support in axe-core (after searching for related issues first!).
- Level up to WorldSpace Attest and roll out custom rule configurations to all of your team members.
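As a rough sketch of the first option: `axe.configure` accepts a spec that can toggle individual rules. The rule ids below are real axe-core rule names, but which ones you change (if any) depends entirely on your own policy.

```javascript
// Sketch only -- assumes axe-core is already loaded on the page.
// Globally disable a rule that conflicts with an internal policy:
axe.configure({
  rules: [{ id: 'color-contrast', enabled: false }]
});

// Rules can also be toggled for a single run via options:
axe.run(document, { rules: { region: { enabled: false } } })
  .then(results => console.log(results.violations));
```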
Which direction you decide to take depends on your team’s needs and circumstances. Rather than outline it all again here, you can read a whole lot more about axe vs. Attest decision making in my last post, “Moving Beyond axe to WorldSpace Attest”.
To support or not to support…that is a good question
For organizations with their own internal accessibility policies, deciding whether to support techniques with known limitations (such as something not working in Safari with VoiceOver on iOS) is both a difficult judgment call and a moving target.
Sometimes policies for accessibility support are based on a limited set of browser and AT combinations (e.g. “no IE!”). Other times, requirements shake out from agreed-upon company standards. Acceptable limitations can start to creep in for content on private intranets, where users are directed to specific browser versions or ATs (e.g. an Edge case). For sites available to everyone, you should aim to support as many browsers and assistive technologies as you can in a reasonable amount of time.
When teams have to make decisions about accessibility support, their first question is often, “How many AT users are actually using our site?” While you can collect information about which browsers your traffic comes from and whether people are on mobile devices or desktop, there are no analytics for ATs, for privacy reasons. There are, however, a few things you can use to craft an accessibility support policy that reflects your users’ needs:
- Look at your site’s analytics to establish a lower bound for browser versions (IE 9? 10? 11?).
- Cross-reference WebAIM’s latest Screen Reader Survey to learn about the most popular assistive technologies.
- Decide when to discontinue support for old browser or AT versions, and document those decisions on your accessibility statement. (You have an accessibility statement, right?)
You’ll have to make some trade-offs as to which browsers and ATs to support, since including absolutely everything would require infinite time and attention in a world where we’re still fighting to accomplish the basics.
While it’s tempting to support only the latest and greatest browsers and ATs, consider that some users with disabilities can’t upgrade as quickly, whether because they depend on a government subsidy or rely on older computer technology. This tension can be a difficult line to walk, so teams often support the most common combinations for the latest version and one version back:
- NVDA and Firefox on Windows
- JAWS and IE11
- VoiceOver and Safari (macOS and iOS)
- TalkBack and the stock browser on Android
- Dragon NaturallySpeaking and Chrome or IE11
Oftentimes accessibility support is fragmented across browsers and ATs, meaning some features are well supported while others fall short. For example, VoiceOver is a leading screen reader in some ways, such as its support for the `prefers-reduced-motion` media query; but it lacks basic support for other things: SVG accessible name computation and table headers, to name a few. All screen readers have shortcomings, which is usually when we start filing bugs and asking maintainers, “When will this be supported?” We should all do this, as well as +1 existing accessibility issues found in the wild.
How and where to file accessibility bugs: http://whoseline.a11yideas.com/bugs.html
New Technologies & Graceful Degradation
In axe-core, we add support for things like the new ARIA 1.1 roles, states, and properties if they don’t cause problems in older browsers and ATs. But how do we know if something is “accessibility supported”? We test a given technique in our common support matrix and report on the findings so we can decide how to move forward. We ask lots of questions, such as: Does it work as expected? Does it degrade gracefully, meaning is it safely ignored if not supported? If axe-core detects a programmatically-determined guardrail like an `aria-describedby` attribute, can the lacking accessibility support be disregarded?
Sadly, sometimes the lack of support in a major AT can mean that we don’t move forward with a standard technique, such as using multiple roles on the same element (a.k.a. “fallback roles” in ARIA 1.1, which result in no role being announced in JAWS). Until support improves in a browser/AT combination we know that many users with disabilities rely on, we don’t feel comfortable recommending that developers use it.
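For illustration, a fallback role is just an ordered list of tokens in the `role` attribute; an AT is supposed to use the first role it recognizes, but as noted above, some screen readers announce no role at all.

```html
<!-- Illustrative sketch: ARIA 1.1 fallback roles. An AT that doesn't
     know "switch" should fall back to "checkbox" -- but in practice
     some screen readers announce no role here at all. -->
<div role="switch checkbox" tabindex="0" aria-checked="false">
  Enable notifications
</div>
```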
The best way to keep track of accessibility support in axe-core is to look at our open issues or the ARIA supported document on GitHub, or to ask us on Gitter. Chances are good that someone has had the same question as you, and there might be a whole thread on that exact topic detailing historical research and support discussions. After enough time has passed, it can also be reasonable to revisit a feature that previously wasn’t ready for inclusion in axe.
Moving forward
Those of us on the axe team do our best to suggest rules and techniques that work for everyone, but differences in opinion are normal (and unavoidable) when it comes to accessibility support. Heck, that’s true for much of the web development industry – ask anyone about their preferred build tool or linter settings. What matters is how our websites and applications perform for our end-users. Is someone with a disability going to be able to apply for a job with the platform you’re building or excel at their current job?
We have immense power in our industry to move the needle for people with disabilities. No matter how exactly we go about it, our unwavering forward momentum is what will make a difference in the long run. We hope you’ll find the axe ecosystem of accessibility testing tools to be useful. Should you encounter a difference in policy or need additional support, be sure to let us know at any of our available support channels.