A wide range of tools can help you find accessibility issues, from browser plugins to JavaScript libraries that analyze your codebase. Every tool has its own strengths and weaknesses when it comes to catching those issues. Sparkbox’s goal is to research a variety of accessibility tools and empower anyone to run accessibility audits. In this article, I will review PayPal’s Automated Accessibility Testing Tool (AATT) by using it on a testing site that has planned accessibility issues.
Important Information to Note About This Tool
AATT requires the use of a command line interface (CLI), such as Terminal or Windows Command Prompt, for installation and testing. If this is not something you are familiar with, I suggest choosing a different tool, such as one that is available as a browser extension (and would likely suggest a different tool regardless; stay tuned!). You can also follow this tutorial to learn how to get comfortable with the command line, which is a great skill and can be used with other tools such as Git.
There are a few different ways to use AATT, but all of them require the command line. The two main options are:
Cloning the project from GitHub and running it locally on your computer. The AATT documentation provides a command to enter in the command line that will generate a link to a webpage running locally on your computer. From there, you can paste in the URL or HTML you’d like to test.
Integrating it into your project using either the API or Node and JavaScript.
I tested both approaches for this article and will review the results of each.
This review rates the tool in the following areas: Ease of Setup, Ease of Use, Reputable Sources, Accuracy of Feedback, Clarity of Feedback, Cost, and Strengths and Limitations.
Watch a walkthrough of the tool and a short review, or keep scrolling to read the full review. Jump to the conclusion if you are in a hurry.
In order to fully vet accessibility tools like this one, Sparkbox created a demo site with intentional errors. This site is not related to or connected with any live content or organization.
Ease of Setup
Browser: Meets Expectations (2/4)
AATT requires the user to clone the GitHub repository and run a terminal command to use either the browser or code integration versions of the tool. While this may be fairly easy for a developer who is familiar with Git, it creates a barrier for anyone who is not a developer or who is unfamiliar with Git and/or the command line. That issue aside, the documentation for downloading and running the tool locally is short and to the point, describing all the needed commands and steps.
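For reference, the local setup comes down to a handful of terminal commands. The sketch below is a rough outline of my own run-through, not a copy of the official instructions; the exact start command, port, and protocol are listed in the AATT README, so treat the localhost URL shown here as an assumption.

```bash
# Clone the AATT repository and install its dependencies
git clone https://github.com/paypal/AATT.git
cd AATT
npm install

# Start the local server; the README documents the exact command and port.
# In my testing, the tool served a page to open in the browser.
npm start

# Open the local URL it serves (assumed here to be something like
# https://localhost:3000) and paste in the URL or HTML you want to test.
```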
Integration: Areas for Improvement (1/4)
Installing AATT through integration proved to be a bit of a challenge. The documentation provides brief information on several different ways to integrate it into your project, but each explanation is quite limited. It requires Node.js, which you can install with either Homebrew or nvm.
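If Node.js isn’t already installed, either route is a single command (neither is specific to AATT):

```bash
# Install Node.js with Homebrew (macOS)
brew install node

# Or install a long-term-support release with nvm
nvm install --lts
```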
Once AATT is installed, you need to add a JavaScript function in order to run the tool within your project. The documentation gives no information about what parameters you can pass to the function. Only one code example is provided, and I personally could not get it to run if I changed any of the options from what was given in the example. I even looked through the tool’s source code and still could not find a list of additional options to pass to the function. This limited documentation is a significant issue, as there is no easy way of knowing what your options are or what the output is supposed to look like.
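To give a sense of what that integration code looks like, here is a minimal sketch modeled loosely on the single example in the documentation. It assumes the package exports an evaluate function that takes a settings object and returns a promise; the option names below (source, engine, level, output) are my assumptions rather than a documented list, so check the README’s example before relying on any of them.

```js
// Rough sketch of the AATT Node integration (assumed API; see note above).
const { evaluate } = require('aatt');

const settings = {
  // HTML to test (the browser version accepts a URL or raw HTML; the option
  // name "source" is assumed here)
  source: '<html><head><title>Test</title></head><body><img src="cat.jpg"></body></html>',
  engine: 'htmlcs',  // assumed: which bundled engine to run (HTML CodeSniffer here)
  level: 'WCAG2AA',  // assumed: WCAG conformance level
  output: 'json'     // assumed: response format
};

evaluate(settings).then((result) => {
  // In my testing, the result came back as a string rather than parsed JSON
  console.log(result);
});
```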
Ease of Use
Browser: Exceeds Expectations (3/4)
Once installed, this tool has a very simple interface. The webpage it loads contains a single input for a URL and a button to run the tests. There is also an option to paste in HTML rather than providing a URL. The results of the test are then displayed in an easy-to-read table on the same webpage.
Integration: Meets Expectations (2/4)
The main issue with the integration output stems from the lack of documentation mentioned above. Without a list of available options, or really any documentation beyond the one example given, it’s difficult to be sure you are even using the tool correctly. The tool’s description claims that you can use different accessibility testing suites, but without knowing what your options are, you can’t tailor the tool to your needs.
Reputable Sources
Both: Exceeds Expectations (3/4)
This tool is built on Axe, HTML CodeSniffer, and the Google Chrome accessibility developer tools, all of which are based on WCAG 2.1. AATT also explicitly states that it is based on WCAG 2.1, although it relies on those three tools to actually run its tests. Throughout my use of the tool, I did not find any inconsistencies with this claim.
Accuracy of Feedback
Browser: Meets Expectations (2/4)
AATT missed three to four of the accessibility failures on our 23-issue test site that we expect automated testing tools to catch, and exactly which issues are missed depends on which engine (Axe, HTML CodeSniffer, or Google Chrome) you choose to test the site with.
Integration: Fail (0/4)
This tool failed this category to the point that I thought maybe I was using it incorrectly. When using the code integration method of AATT, it caught only two of the 23 errors on the test webpage. It did give a lot of additional notices and suggestions about things to check manually, which was a nice addition, but the actual errors it found were few and far between.
To see all the test site issues that AATT found or missed, skip to Accessibility Issues Found.
Clarity of Feedback
Browser: Meets Expectations (2/4)
The AATT browser tool clearly lists the errors but does not reference the WCAG criteria that explain why each one is an error. It only provides HTML tags as a clue to where an error is located, which makes it virtually impossible to find where the error occurs on the page. It also does not list how many occurrences of an error there are. For example, it will say there is a color contrast error, but where and how many instances remains a mystery.
Integration: Meets Expectations (2/4)
When you run your code with the AATT code integration, it outputs JSON data, which is difficult for human readers to make sense of. I could not get the JSON to parse properly when using the tool within my own project; I had to print out the raw response because what the tool sent back was not correctly formatted JSON. The integration also shares the browser tool’s issues: it does not report how many occurrences of an error there are or give a location beyond a pile of raw HTML. One redeeming quality of the integration is that it does provide links to the WCAG principles that the errors are based on.
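If you hit the same parsing problem, a small defensive wrapper keeps the raw response around when parsing fails. This is a generic sketch, not part of AATT; the sample response string and its field names are made up for illustration.

```js
// Try to parse a tool response as JSON; fall back to the raw string so
// nothing is lost if the response isn't well-formed JSON.
function parseResult(raw) {
  try {
    return { ok: true, data: JSON.parse(raw) };
  } catch (err) {
    return { ok: false, raw: raw, error: err.message };
  }
}

// Example: a response that is not valid JSON (note the trailing comma)
const rawResponse = '{"type": "error", "code": "colourContrast",}';
const parsed = parseResult(rawResponse);

if (parsed.ok) {
  console.log('Parsed issues:', parsed.data);
} else {
  console.warn('Response was not valid JSON; printing it raw instead:');
  console.log(parsed.raw);
}
```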
Cost
Both: Outstanding (4/4)
This tool is 100% free.
Strengths and Limitations
Both: Areas for Improvement (1/4)
The two ways to use this tool that I tested were both lacking in clarity of results and accuracy. They also both require the user to download the code and run it locally, steps that may prove too difficult for users who do not have development experience. There are plenty of easier-to-use, more accurate tools out there that I would recommend before this one.
Conclusion: 👎 Not Recommended for General Use (Browser: 17/28, Integration: 13/28)
After testing PayPal’s Automated Accessibility Testing Tool, we’ve come to the conclusion that it is not a great choice for accessibility testing. The tool requires the user to download the code and use the command line, which creates a barrier to installation and use for anyone unfamiliar with the command line or Git. Additionally, AATT failed to catch a majority of the errors on our testing website, and the errors it did find did not include sufficient information about where they were located or how many occurrences there were.
As mentioned earlier, there are plenty of easier-to-use and more accurate tools that we would encourage you to try instead. Learn more by checking out our accessibility tool review series.
Accessibility Issues Found
Test Site Issue | Tools Should Find | People Should Find | AATT Browser Found | AATT Integration Found |
---|---|---|---|---|
Insufficient contrast in large text | ✅ | — | ✅ | ❌ |
Insufficient contrast in small text | ✅ | — | ✅ | ❌ |
Form labels not associated with inputs | ✅ | — | ✅ | ❌ |
Missing alt attribute | ✅ | — | ✅ | ❌ |
Missing lang attribute on html element | ✅ | — | ❌ | ✅ |
Missing title element | ✅ | — | ❌ | ✅ |
Landmark inside of a landmark | ✅ | — | ✅ | ❌ |
Heading hierarchy is incorrect | ✅ | — | ✅ | ❌ |
Unordered list missing containing ul tag | ✅ | — | ✅ | ❌ |
ID attributes with the same name | ✅ | — | ✅ | ❌ |
Target for click area is less than 44px by 44px | ✅ | — | ❌ | ❌ |
Duplicate link text (lots of “Click Here”s or “Learn More”s) | ✅ | — | ❌ | ❌ |
div soup | ✅ | — | ❌ | ❌ |
Funky tab order | — | ✅ | ✅ | ❌ |
Missing skip-to-content link | — | ✅ | ❌ | ❌ |
Using alt text on decorative images that don’t need it | — | ✅ | ❌ | ❌ |
Alt text with unhelpful text, not relevant, or no text when needed | — | ✅ | ❌ | ❌ |
Page title is not relevant to the content on the page (missing) | — | ✅ | ❌ | ❌ |
Has technical jargon | — | ✅ | ❌ | ❌ |
Using only color to show error and success messages | — | ✅ | ❌ | ❌ |
Removed focus (either on certain objects or the entire site) | — | ✅ | ❌ | ❌ |
Form helper text not associated with inputs | — | ✅ | ❌ | ❌ |
Pop-up that doesn’t close when you tab through it | — | ✅ | ❌ | ❌ |