
7 ways automated testing can help your web accessibility

Accessibility testing can be time-consuming, but automated tools exist to help you.

Let’s start by getting the elephant in the room out of the way. Automated accessibility testing alone will not make your website 100% accessible.

Always, always combine automated testing with manual auditing. The reason for this is that some WCAG criteria are ambiguous.

For example, automated tests can tell you if alternative text exists, but what they can’t do is understand the context.

If you display an image of a banana, but your alt text says, ‘apple’, a computer cannot determine its context and therefore, whether it passes or fails.

A picture of a banana with the word 'Apple' underneath, to illustrate bad alt text.

That said, a good platform will show you a list of all your alt text and allow you to manually review them fairly quickly.
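To illustrate the split between what a machine can and can't judge, here's a minimal sketch using Python's standard-library `html.parser` (the `AltTextAuditor` class name is ours for illustration, not any real product's API): it can prove an `alt` attribute is missing outright, but the alt text it does find still needs a human eye.

```python
from html.parser import HTMLParser

class AltTextAuditor(HTMLParser):
    """Collects <img> tags with no alt attribute, plus existing alt
    text for manual review. Illustrative sketch, not a product API."""
    def __init__(self):
        super().__init__()
        self.missing_alt = []   # srcs of images with no alt attribute at all
        self.alt_texts = []     # (src, alt) pairs a human should still review

    def handle_starttag(self, tag, attrs):
        if tag != "img":
            return
        attrs = dict(attrs)
        src = attrs.get("src", "(no src)")
        if "alt" in attrs:
            self.alt_texts.append((src, attrs["alt"]))
        else:
            self.missing_alt.append(src)

auditor = AltTextAuditor()
auditor.feed('<img src="banana.jpg" alt="Apple"><img src="logo.png">')
print(auditor.missing_alt)  # ['logo.png'] -- a machine can flag this outright
print(auditor.alt_texts)    # [('banana.jpg', 'Apple')] -- only a human spots the mismatch
```

The missing attribute is an unambiguous WCAG failure; the banana-labelled-'Apple' case is exactly the kind of context check that stays manual.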

What automated testing can do for you is find a broad range of issues quickly, across many thousands of web pages.

It’s scalable, saves you time and money, and helps you find obvious accessibility issues before your users do.

Automated testing allows for consistency, makes your team more efficient, and allows you to report on progress towards your accessibility goals.

Here are 7 ways automated accessibility testing can help your team improve accessibility.

1. Scalability

Your approach to accessibility testing might currently be to undertake a manual audit of selected web pages and themes.

You can’t check for accessibility issues across thousands of web pages manually.

Sure, you can spot-check your theme and template, and maybe find some global issues affecting a lot of pages, but you’ll still miss issues on individual pages.

With an automated test, you can very quickly check your whole site for unambiguous WCAG failures.

It goes beyond simply checking the theme, and includes all your content.

2. Frequency

How often do you ask the manual auditors to review your site? If you said ‘annually’, then that’s probably about average for a public sector web team.

Automated tools give you near-realtime feedback on changes you’re making to your site and themes.

Your content team is likely pushing out new articles and pages daily. People are fallible, so some accessibility issues will inevitably creep into that content. Missing alt text or non-descriptive links can quietly undo your efforts.

3. Efficiency

‘Save time and money with automated accessibility testing.’

That’s a soundbite, but it’s true for a good reason.

You expend a lot of resources tracking down and fixing accessibility issues manually.

If you test your website using a screen reader (and by the way, this is good practice – use VoiceOver (iOS) or TalkBack (Android) on your phone for a good experience), you may find issues on individual pages.

(ASIDE: We have a video demonstration of a mobile screen reader over on our YouTube channel)

But this is a slow process. It might take a few minutes or more to identify some obvious issues with screen reader compatibility.

Multiply that by thousands of pages and you have a real problem.

Automated testing covers a far wider breadth of content in one pass, getting the low-hanging accessibility fruit out of the way.

It also helps the manual auditing team. They can review issues found in automated tools and immediately know where to look for others.

Given the cost of manual auditing, this saves your organization money by removing some relatively straightforward issues from their list.

It does this by reducing your workload considerably and allowing you to focus on issues that can’t be reasonably tested by a computer.

4. Testing on mobile devices

It’s important to understand the WCAG requirements regarding mobile compatibility. It’s also important to use an accessibility testing platform that can test for mobile issues.

Two-dimensional scrolling

When you read text, your eyes track from one side of the screen to the other, then move to the next line to continue where you left off.

It’s a natural flow that requires minimal effort.

It only works if the text appears in one column on a mobile device, though.

Reflow is a WCAG 2.1 success criterion that ensures text appears in this way.

It aims to prevent the need for two-dimensional scrolling. Users should only be able to scroll up and down, or left and right, but not both.

Usually, HTML tables or images that are not constrained by their containers are the cause of content running off to the right of a mobile web page.

This can impact:

  • users with motor issues
  • users with cognitive disabilities

The effort required to navigate your site and remember which line of text you were reading is increased when content scrolls in two dimensions.


Resize text

The ‘Resize Text’ success criterion (WCAG 1.4.4) requires that text can be resized up to 200% without loss of content or functionality. In practice, users must be able to zoom in to any website and increase the text size.

Failures here are usually caused by a developer disabling pinch-to-zoom. All browsers default this to ‘on’, so you’d really have to go out of your way to disable it.

This can impact:

  • users with low vision
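One concrete, unambiguous failure an automated tool can catch here is a viewport meta tag that blocks zooming. A hedged sketch in Python (the function name and regexes are our illustrative heuristics, not a complete implementation):

```python
import re

def zoom_disabled(html: str) -> bool:
    """Heuristic: does a viewport meta tag block pinch-to-zoom?
    (A common cause of WCAG 1.4.4 'Resize Text' failures on mobile.)"""
    tag = re.search(r'<meta[^>]+name\s*=\s*["\']viewport["\'][^>]*>', html, re.I)
    if not tag:
        return False  # no viewport tag, so zooming isn't restricted here
    content = tag.group(0)
    # user-scalable=no disables zoom; maximum-scale=1 caps it at 100%
    return bool(
        "user-scalable=no" in content
        or re.search(r"maximum-scale\s*=\s*1(\.0+)?(?![\d.])", content)
    )

print(zoom_disabled('<meta name="viewport" content="width=device-width, user-scalable=no">'))  # True
```

Because the viewport tag is machine-readable, this check needs no human judgment at all, which makes it ideal for automation.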

5. Readability testing

You should make your content readable. Use shorter sentences where you can.

WCAG’s Readable guideline is split into a number of success criteria.

These guidelines intend to help users understand the content on your website.

Reading level can, to an extent, be checked automatically. It is generally defined as the level of education needed to understand your content.

There are certain formulae that can be employed to check readability.

Silktide uses the SMOG formula, combined with our own proprietary technology that decides which parts of each webpage should be analyzed.
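The published SMOG formula itself is simple to sketch. In the Python below, the syllable counter is a crude vowel-group heuristic of our own, and Silktide's actual page-analysis logic is proprietary and not shown:

```python
import math
import re

def count_syllables(word: str) -> int:
    # Crude heuristic: one syllable per group of consecutive vowels.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def smog_grade(text: str) -> float:
    """McLaughlin's SMOG formula:
    grade = 1.0430 * sqrt(polysyllables * 30 / sentences) + 3.1291
    (Strictly, SMOG is calibrated on samples of 30 sentences.)"""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z]+", text)
    # Polysyllables are words of three or more syllables.
    polysyllables = sum(1 for w in words if count_syllables(w) >= 3)
    return 1.0430 * math.sqrt(polysyllables * 30 / len(sentences)) + 3.1291

print(round(smog_grade("The cat sat. The dog ran. We went home."), 1))
# 3.1 -- the formula's floor when there are no polysyllabic words
```

The output approximates the school grade needed to understand the text, so higher scores flag pages worth rewriting.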

This means you can get a good understanding of problem areas on your site, and consider them for review.

6. Set consistent standards

Automation helps you find issues before your users do.

Let’s say you have a range of content creators working on separate parts of your site.

Automation helps you create consistency everywhere, not just in accessibility but also across the whole user experience.

You should, of course, run the baseline checks defined by WCAG, but you should also go above and beyond what is legally required.

For example, with automated policies, you can set rules for color contrast ratios, or for limiting the length of alt text.
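The color contrast side of such a policy is fully mechanical, because WCAG defines the math exactly. A self-contained sketch (function names are ours; the luminance and ratio formulas are WCAG's):

```python
def relative_luminance(hex_color: str) -> float:
    """WCAG relative luminance for an sRGB hex color like '#1a2b3c'."""
    def channel(c: int) -> float:
        c = c / 255
        # Linearize the gamma-encoded sRGB channel value.
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (int(hex_color.lstrip("#")[i:i + 2], 16) for i in (0, 2, 4))
    return 0.2126 * channel(r) + 0.7152 * channel(g) + 0.0722 * channel(b)

def contrast_ratio(fg: str, bg: str) -> float:
    """WCAG contrast ratio: (lighter + 0.05) / (darker + 0.05).
    WCAG AA requires at least 4.5:1 for normal-sized text."""
    lighter, darker = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)

print(round(contrast_ratio("#000000", "#ffffff"), 1))  # 21.0, the maximum possible
```

A policy engine can run this against every text/background pair it finds and fail anything below your chosen threshold, with no human review needed.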

Remember, accessibility is an extension of user experience and isn’t just about compliance with WCAG.

Another example: avoid ‘Read more’ links where possible, especially if you have multiple such links on the same page. Screen reader users won’t be able to distinguish between link destinations without proper descriptions.

This makes it tricky for a visually-impaired user to navigate your website.

Creating an automated policy that checks every link on every page solves this problem and creates a far better user experience.
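A policy like this can be sketched with nothing more than a standard-library HTML parser. In this Python example, the generic-phrase list is an illustrative starting point you would tune, not a definitive rule set:

```python
from html.parser import HTMLParser

GENERIC_LINK_TEXT = {"read more", "click here", "more", "learn more", "here"}

class LinkTextAuditor(HTMLParser):
    """Flags links whose visible text is too generic to make sense
    when a screen reader lists the page's links out of context."""
    def __init__(self):
        super().__init__()
        self.in_link = False
        self.current = []   # text fragments inside the current <a>
        self.flagged = []   # generic link texts found

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.in_link = True
            self.current = []

    def handle_data(self, data):
        if self.in_link:
            self.current.append(data)

    def handle_endtag(self, tag):
        if tag == "a":
            self.in_link = False
            text = " ".join("".join(self.current).split()).lower()
            if text in GENERIC_LINK_TEXT:
                self.flagged.append(text)

auditor = LinkTextAuditor()
auditor.feed('<a href="/a">Read more</a> <a href="/b">2023 annual report</a>')
print(auditor.flagged)  # ['read more'] -- the descriptive link passes
```

Run across a whole site, a check like this surfaces every generic link in one pass, instead of a screen reader user discovering them one page at a time.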

Finally, automated tools can find things that impact accessibility but aren’t included in WCAG, such as page loading times or spelling errors (including within alt text).

7. Set and report on accessibility goals

Creating a goal to improve accessibility is important, but you need to be able to track your progress towards that goal.

This is especially true if you have multiple websites or multiple sections in a single website that you’re tackling across multiple teams.

If you’re performing manual tests across small sections of your site, you’ll never really be able to get an understanding of how well you’re doing across hundreds or thousands of pages.

If you’re reporting on your progress you can get:

  • an audit trail
  • some accountability
  • tangible improvement metrics

A basic example of this can be found over on our Silktide Index. The data shows how various websites across different sectors are improving accessibility over time.

In conclusion

Yes, some accessibility tools have their weaknesses. You’ll never make a site fully accessible using automated tools alone. That said, one could argue that, given the ambiguity in WCAG, you’d struggle to be fully accessible to AAA standards anyway.

An argument we see against automated tools is ‘Well, they only find 30-40% of the issues in WCAG.’

Our counter-argument is, ‘That’s 30-40% more issues than you’d find without any accessibility tools’.

That obviously sounds quite flippant, but genuinely, we’re all passionate about helping make the web a better place here at Silktide.

We also advocate for a mixture of manual and automated testing to get the best coverage possible.

Just don’t get us started on accessibility overlays.
