
Roadmap for Web Accessibility Testing and Remediation

Note: This blog was first published on September 6, 2019, and has been updated to reflect new information and insights.

Takeaway: In October 2023, the W3C released WCAG 2.2 as a Recommendation, which means it should be treated and implemented as a web standard. Our team at Promet Source always aims for our client websites to comply with the latest recommended standards, even when clients and prospects, such as state and local governments, are still working from previous versions.

As IAAP-certified CPACC Ashley Burns said:


Although government and non-government entities can still use WCAG 2.0 and WCAG 2.1, adhering to WCAG 2.2 is the most reliable way to meet ADA requirements and show their commitment to accessibility.


In this post, I explain what our accessibility testing and remediation process entails, which documents you'll receive, why manual accessibility testing matters, and more.


How to do accessibility testing and remediation

The duration of a complete accessibility audit and remediation depends on the website's size. For smaller sites, the entire process can be completed within 1 to 1.5 months. For larger and more complex websites, it can be up to 4 months.

On average, the process takes about two months from start to finish.


[Image: Web accessibility roadmap audit steps]


The Initial Statement

The Initial Statement is a document we create and provide to the client to place on their website.

It indicates that the company is actively working on accessibility improvements and provides contact information for reporting accessibility issues. 

With this Statement, users know right from the get-go about the company's commitment to becoming as accessible and compliant with WCAG standards as possible.


Preliminary audit

From the audits we’ve performed in the past, we learned that it's not feasible to check every page manually.

Instead, we conduct a preliminary audit covering the most significant parts of the site, including top landing pages and at least one page for each content type.

A preliminary audit also lets us focus on the more complex pages that feature elements like slideshows, modals, accordions, and tabbed interfaces. Testing and remediating every single page simply isn't feasible, especially on large websites.

Of course, that would be ideal, but the realistic goal is to identify and correct around 90-95% of the errors.


Automated testing

After compiling this list, we proceed with automated testing, either based on the selected pages or using a tool like DubBot to crawl the site.

The tool attempts to check every page, but as I said above, the point isn't to review every page that has errors. Instead, the focus is on categorizing and addressing the types of errors found across the pages.

Most of the time, correcting an issue in one area results in the same fix being applied across multiple locations, especially with issues like color contrast, focus and hover states, and elements like tab states, for example.
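To illustrate, that categorization step can be sketched as a simple grouping of raw scan output by rule. The `Finding` shape below is a hypothetical simplification for the example, not DubBot's actual export format:

```typescript
// One raw finding from an automated scan (hypothetical shape,
// not the export format of any specific tool)
interface Finding {
  rule: string;     // e.g. "color-contrast", "focus-visible"
  page: string;     // URL where the issue was flagged
  selector: string; // element the issue applies to
}

// Group findings by rule so one fix can be tracked across
// every page and element it affects
function groupByRule(findings: Finding[]): Map<string, Finding[]> {
  const groups = new Map<string, Finding[]>();
  for (const f of findings) {
    const bucket = groups.get(f.rule) ?? [];
    bucket.push(f);
    groups.set(f.rule, bucket);
  }
  return groups;
}
```

Grouped this way, a single "color-contrast" entry in the tracker can point at every page and selector it affects, so one stylesheet fix closes the whole group at once.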

I have also experimented with another tool called BrowserStack, which has its merits but comes with its own set of challenges. At the moment, DubBot is our primary tool for automated testing.

Personally, I have a range of tools installed on my computer and still use them periodically for testing, including Siteimprove, axe, WAVE, and Accessibility Insights. I have a list of accessibility tools with pros and cons so you can see which ones fit your project.


Manual test 1

Next, we log identified errors into our tracking sheet. Then we proceed to manual testing, focusing on the top pages I mentioned earlier.

This phase involves detailed examination of every page we've earmarked for a manual audit. Typically, this starts with the header, footer, and homepage.

After testing the core pages, we proceed to manually test all other individual pages we’ve identified for review.

This involves navigating the site with a keyboard, focusing on Tab-key navigation. We ensure that elements are not only focusable with the correct focus states, but also arranged in a logical order: left to right, top to bottom.
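As a rough illustration of the "logical order" check, the sketch below flags focus stops that jump backwards relative to a left-to-right, top-to-bottom reading order. The element shape and row tolerance are assumptions for the example; in practice we judge this by tabbing through the real page:

```typescript
// A focusable element with its rendered position (hypothetical shape)
interface Focusable {
  id: string;
  x: number; // left offset in px
  y: number; // top offset in px
}

// Given the sequence in which tabbing actually visits elements,
// return the ids that break left-to-right, top-to-bottom order.
// rowTolerance treats small vertical offsets as the same row.
function outOfOrder(tabSequence: Focusable[], rowTolerance = 8): string[] {
  const flagged: string[] = [];
  for (let i = 1; i < tabSequence.length; i++) {
    const prev = tabSequence[i - 1];
    const curr = tabSequence[i];
    const sameRow = Math.abs(curr.y - prev.y) <= rowTolerance;
    if (sameRow ? curr.x < prev.x : curr.y < prev.y) {
      flagged.push(curr.id);
    }
  }
  return flagged;
}
```

A common real-world cause of this failure is a positive `tabindex` value forcing an element out of its natural DOM position.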

For screen reader testing, most of us, being Mac users, rely on VoiceOver, which works quite well. The process involves navigating the site using various methods (by tab, by headings, by inputs, etc.) to identify any issues.

As we come across problems, we document them in our report.


The manual accessibility report

The report/tracker we use was developed three to four years ago by accessibility experts on our team to align with the latest WCAG standards, and we have been using and updating it ever since.

This structure gives us the flexibility to document various types of errors, even when the same issue appears in different forms across the site, like color contrast problems affecting multiple elements such as buttons, headers, footers, and links, leading to numerous errors on a single page.

This approach ensures that our reports are comprehensive, covering each conformance level and detailing why and where on the page each failure occurred.


[Image: Manual accessibility audit template landing page]


You can download the manual accessibility report for free. It’s updated for WCAG 2.2, and we’ll keep updating the file for future versions as well.


Client remediation

After the first round of manual testing, we present our accessibility report to the client, where they can choose to undertake the accessibility remediation by themselves or hire us to do it for them.

Our report includes details on the nature of each error, instructions for correction, and its location to ensure they can follow up effectively.

Clients who choose to perform remediation internally usually take about 2 to 4 weeks to address the identified issues. Once they believe all issues have been rectified, we conduct a second round of testing to confirm the remediation's effectiveness.


Issues with client accessibility remediation

Clients with limited budgets often choose to fix issues internally to save money, but I would recommend otherwise.

We’ve found that those who opt to conduct their accessibility remediation often end up spending more because the process doesn’t proceed smoothly. In our second round of testing, we expect to see all issues resolved. However, we frequently encounter numerous unresolved issues.

We document these in a new report, and then our project managers discuss with the client the need for either a third round of testing or a new contract to address the persistent problems.

Handling both the WCAG audit and remediation on our end simplifies the process, reducing the need for extra testing and extra contracts.


Manual test 2

The second round of manual testing involves repeating all the initial tests to verify that the previously identified errors have been corrected and no longer exist on the selected pages.

If errors persist, we may have to revert to the first phase of manual testing, and the client might need to purchase another package for further remediation.


Final Statement

When everything has been successfully remediated and our second round of manual testing confirms that all is clear, we then prepare a final statement for the client to display on their website.


[Image: Statement of accessibility]


This final statement declares their conformance, detailing the accessibility remediation performed on specified pages (with URLs listed) and reaffirms their ongoing commitment to adhere to WCAG standards.

It also provides contact information for reporting any future accessibility issues.


Statement of Work

Once all the previous steps have been completed, we issue the Statement of Work.


[Image: Statement of work]


This document outlines the scope of work undertaken during our collaboration with the client. It emphasizes their ongoing commitment to monitor and address accessibility issues, including training their staff to maintain these standards.


Statement of Conformance

The final step is a Statement of Conformance, marking the completion of our process and confirming the site's accessibility.

It's designed to inform users of the site's compliance status and the client's continued efforts towards maintaining conformance.

To be clear, the Statement of Conformance is intended to serve as a legal document, though its legal standing can vary.

Ideally, we aim for a full Statement of Conformance, but more often than not we find ourselves issuing a Statement of Partial Conformance.


[Image: Statement of partial conformance]


This is due to the numerous third-party components incorporated into sites that unfortunately don't meet accessibility standards, such as:

  • Inaccessible CAPTCHAs
  • Various Google tools
  • JavaScript issues
  • Third-party calendars
  • iFrames from social media platforms like Instagram

For instance, navigation loops can occur in social media feeds where users get stuck without a clear exit. We document these third-party issues as the reason for partial conformance.

This ends up being the most common document we issue.


Why we perform two types of accessibility tests

As you may have noticed, we perform two types of accessibility testing: automated and manual.

Automated accessibility testing uses tools I've mentioned, such as DubBot, axe, and others. Manual accessibility testing relies on screen readers, keyboards, and other assistive technologies.

There is clearly a lot more effort involved in manual accessibility testing, which is why some organizations with limited budgets choose to just use automated testing tools. I believe this is a big mistake.

I started with a small Drupal firm in Ottawa, Canada, right after graduating from university. Our accessibility testing relied solely on an automated tool.

I left that company around 2017 or 2018, and I realized afterwards that we were actually doing very little in terms of accessibility, and most of the sites we claimed were accessible likely weren't.

We never tested with a keyboard or screen reader, and I didn't know better since it was my first job and I was a junior developer. We always claimed our sites were successful in accessibility, but that probably wasn't the case.

This realization made me want to learn more about accessibility to ensure our sites were genuinely accessible and to remediate others.

When I began working on the accessibility issues for Martin County, I learned a lot from our dedicated team of accessibility experts (the ones who also built the report I linked to earlier). They taught me that tools are just that—tools, and that it takes time and patience to catch errors that can’t be detected by automated accessibility tests.


Changes in the roadmap from WCAG 2.1 to 2.2

The manual testing process remains pretty similar to previous standards. This entails testing with a keyboard, doing device testing, and using a screen reader.

Automated tests capture about 30 to 40% of the errors, far from everything.

Regarding color contrast, automated tools sometimes misidentify issues, especially with text over background images, where they can't accurately sample the surrounding colors. In such cases, it's necessary to manually examine the colors around the text and image edges.
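This is one place where knowing the underlying math helps: WCAG defines contrast as a ratio of relative luminances, so a manually sampled pair of colors can be checked directly. The sketch below implements the WCAG 2.x formula; the function names are mine:

```typescript
type RGB = [number, number, number]; // 0-255 per channel

// Relative luminance per the WCAG 2.x definition
function luminance([r, g, b]: RGB): number {
  const channel = (c: number): number => {
    const s = c / 255;
    return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
  };
  return 0.2126 * channel(r) + 0.7152 * channel(g) + 0.0722 * channel(b);
}

// Contrast ratio (L1 + 0.05) / (L2 + 0.05), lighter luminance on top
function contrastRatio(a: RGB, b: RGB): number {
  const [l1, l2] = [luminance(a), luminance(b)].sort((x, y) => y - x);
  return (l1 + 0.05) / (l2 + 0.05);
}

// WCAG AA thresholds: 4.5:1 for normal text, 3:1 for large text
function passesAA(fg: RGB, bg: RGB, largeText = false): boolean {
  return contrastRatio(fg, bg) >= (largeText ? 3 : 4.5);
}
```

White on black comes out at the maximum 21:1, while the familiar 4.5:1 AA threshold for normal text is what `passesAA` checks against.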

Overall, the approach to testing and remediation hasn't significantly shifted despite the updates.

What changed is the set of issues we test for: WCAG 2.2 introduces nine new success criteria and removes one (4.1.1 Parsing).


Recommendations for accessibility conformance maintenance

For any accessibility client, I recommend setting up an automated tool for continuous monitoring, capable of running tests daily or weekly.

Tools like DubBot can send alerts either daily or weekly, depending on the preference. This approach helps catch a significant portion of potential issues.

If we've initially made the site accessible, it becomes the client's responsibility to maintain that level of conformance going forward.

Issues usually crop up in content, or when individuals add inline styles through the WYSIWYG editor. I recommend, at a minimum, setting up automated monitoring and then conducting a manual review of the top pages at least once or twice a year to ensure ongoing conformance.

Doing these checks more frequently, such as four times a year, could be expensive for many clients. So if you aren't changing your website frequently, a manual audit once or twice a year should be fine.
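For teams without a commercial monitor, even a crude scheduled check beats nothing. The sketch below flags `<img>` tags with no `alt` attribute using a regex over raw HTML; it's a deliberately naive stand-in for what a real crawler like DubBot, or an axe-based scan, does against the parsed DOM:

```typescript
// Naive monitoring rule: find <img> tags with no alt attribute in an
// HTML string. Real tools parse the DOM; this regex sketch only
// illustrates the kind of rule a scheduled scan applies to each page.
function imagesMissingAlt(html: string): string[] {
  const imgs = html.match(/<img\b[^>]*>/gi) ?? [];
  return imgs.filter((tag) => !/\balt\s*=/i.test(tag));
}
```

A cron job that fetches the top pages, runs checks like this, and emails the flagged tags would catch the most common content regressions between manual audits. Note that an empty `alt=""` passes here by design, since it is valid for decorative images.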


Reach out to our team for a comprehensive WCAG 2.2 accessibility audit and remediation

As we like to say here at Promet, working on your website’s accessibility isn’t just a legal requirement—it’s the right thing to do. When you improve your website, you help users of all abilities access products, services, and information they need without trouble.

And that's a good thing.

Looking to get your website audited and remediated? Schedule a 30-minute consultation call with Ashley Burns!