Testing and Monitoring AODA Compliance: A Comprehensive Guide

Tags: AODA · Ontario accessibility · Canadian accessibility law · AODA compliance · BrowseCheck
11 min read

Achieving AODA compliance is only half the battle; maintaining it as your website evolves requires systematic testing and continuous monitoring. Every content update, design change, or new feature introduces potential accessibility barriers. This guide explores effective strategies for testing WCAG 2.0 Level AA compliance and implementing ongoing monitoring to ensure your website remains accessible.

The Testing Challenge

Accessibility testing presents unique challenges compared to other quality assurance processes:

  • Automated tools catch only 30-40% of accessibility issues
  • Context matters: The same code can be accessible or inaccessible depending on usage
  • Subjective interpretation: Some WCAG criteria require human judgment
  • Assistive technology variety: Different screen readers behave differently
  • Constantly evolving content: Today's accessible page may have barriers tomorrow

Effective AODA testing requires combining automated scanning, manual testing, assistive technology evaluation, and user feedback.

The Four Pillars of Accessibility Testing

1. Automated Accessibility Scanning

Automated tools examine code against WCAG 2.0 success criteria, identifying violations programmatically. While limited in scope, they provide fast, consistent baseline checking.

What automated tools catch well:

  • Missing alt text on images
  • Insufficient color contrast
  • Missing form labels
  • Invalid HTML and ARIA
  • Heading hierarchy violations
  • Missing page titles and language attributes
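Several of these checks are mechanical enough to sketch in a few lines. The toy scanner below, built on Python's standard-library HTML parser, flags three of the machine-detectable issues listed above: missing alt attributes, skipped heading levels, and a missing lang attribute. It is an illustration of how rule-based scanning works, not a substitute for a real engine like axe-core, which applies hundreds of rules (the label-association check here, for instance, is far looser than what real tools do).

```python
from html.parser import HTMLParser

class A11yChecker(HTMLParser):
    """Toy scanner: flags a few machine-detectable WCAG issues."""
    def __init__(self):
        super().__init__()
        self.issues = []
        self.last_heading_level = 0

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "html" and "lang" not in attrs:
            self.issues.append("html element missing lang attribute")  # WCAG 3.1.1
        if tag == "img" and "alt" not in attrs:
            self.issues.append("img missing alt attribute")  # WCAG 1.1.1
        # Heading hierarchy: jumping more than one level (h1 -> h3) breaks structure.
        if len(tag) == 2 and tag[0] == "h" and tag[1].isdigit():
            level = int(tag[1])
            if self.last_heading_level and level > self.last_heading_level + 1:
                self.issues.append(
                    f"heading skips from h{self.last_heading_level} to h{level}"
                )
            self.last_heading_level = level

checker = A11yChecker()
checker.feed('<html><h1>Title</h1><h3>Skipped</h3><img src="logo.png"></html>')
print(checker.issues)
```

Note what even this sketch cannot do: it can see that alt text is absent, but never whether the alt text that is present actually describes the image.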

What automated tools miss:

  • Whether alt text is descriptive and meaningful
  • Whether heading text accurately describes content
  • Keyboard interaction patterns in custom widgets
  • Logical tab order and focus management
  • Whether content makes sense to screen reader users
  • Context-dependent accessibility issues

Popular automated testing tools:

  • axe DevTools: Browser extension providing detailed WCAG violation reports
  • WAVE: Visual representation of accessibility issues on pages
  • Lighthouse: Built into Chrome DevTools, includes accessibility audits
  • Pa11y: Command-line tool for CI/CD integration
  • BrowseCheck: Continuous monitoring platform for ongoing compliance scanning

When to use automated tools: Run automated scans during development, before deployment, and continuously on production sites to catch regressions.
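Color contrast is the clearest example of a fully automatable check, because WCAG 2.0 defines an exact formula: each sRGB channel is linearized, combined into a relative luminance, and the ratio (L1 + 0.05) / (L2 + 0.05) must reach 4.5:1 for normal text at Level AA (3:1 for large text). A direct implementation:

```python
def relative_luminance(hex_color: str) -> float:
    """WCAG 2.0 relative luminance of an sRGB color like '#1a2b3c'."""
    r, g, b = (int(hex_color.lstrip("#")[i:i + 2], 16) / 255 for i in (0, 2, 4))

    def linearize(c: float) -> float:
        # Per WCAG 2.0: undo sRGB gamma encoding channel by channel.
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

    r, g, b = linearize(r), linearize(g), linearize(b)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg: str, bg: str) -> float:
    """(L1 + 0.05) / (L2 + 0.05), with the lighter luminance on top."""
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

print(round(contrast_ratio("#000000", "#ffffff"), 1))  # 21.0, the maximum possible
print(contrast_ratio("#767676", "#ffffff") >= 4.5)     # True: about the lightest gray passing AA on white
```

A ratio computed this way either meets the threshold or it doesn't; there is no judgment call, which is why every scanner in the list above checks it reliably.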

2. Manual Accessibility Testing

Manual testing examines accessibility aspects requiring human judgment and context understanding.

Essential manual tests:

Keyboard Navigation Testing

  • Disconnect your mouse and navigate using only keyboard
  • Verify every interactive element is reachable via Tab key
  • Confirm tab order follows logical reading sequence
  • Test that custom widgets (dropdowns, modals, tabs) work with keyboard
  • Ensure no keyboard traps exist
  • Verify visible focus indicators on all interactive elements

Visual Review

  • Examine heading hierarchy for logical structure
  • Verify link text is descriptive out of context
  • Check that error messages are clear and helpful
  • Confirm instructions are provided before complex interactions
  • Verify color isn't the only way information is conveyed
  • Test responsive behavior at different screen sizes

Content Quality Assessment

  • Evaluate whether alt text meaningfully describes images
  • Check video captions for accuracy and completeness
  • Verify form labels are clear and descriptive
  • Confirm error messages explain how to fix problems
  • Test that page titles accurately describe content

When to use manual testing: Include manual accessibility checks in your QA process for all new features and significant content updates.

3. Assistive Technology Testing

Testing with the actual tools people with disabilities use reveals issues automated scans and manual inspection miss.

Screen Reader Testing

Screen readers convert digital content to synthesized speech or refreshable Braille, enabling blind and visually impaired users to access content.

Recommended screen readers for testing:

  • NVDA (Windows, free): Most popular free screen reader
  • JAWS (Windows, paid): Industry standard, widely used professionally
  • VoiceOver (macOS/iOS, built-in): Default for Apple ecosystem users
  • TalkBack (Android, built-in): Default for Android users

Basic screen reader testing:

  • Navigate the page using heading shortcuts
  • Test form completion with only audio feedback
  • Verify custom interactive elements announce their purpose and state
  • Confirm dynamic content updates are announced
  • Check that images have meaningful descriptions
  • Test that tables communicate structure and relationships

You don't need to become a screen reader expert, but understanding basic operation helps identify critical issues.

Keyboard-Only Testing

Beyond a basic Tab-through of the page, test the keys keyboard users actually rely on:

  • Tab and Shift+Tab for navigation
  • Enter and Space for activation
  • Arrow keys for custom widgets
  • Escape to close modals and overlays

Voice Control Testing

Voice control users navigate and interact using speech commands. Test that:

  • Elements have visible labels matching their accessible names
  • Custom controls can be activated via voice
  • Form fields can be identified and filled verbally

4. User Testing with People with Disabilities

Nothing replaces testing with actual users who have disabilities. Their lived experience reveals barriers that even thorough technical testing might miss.

How to conduct accessibility user testing:

  1. Recruit diverse participants: Include people with various disabilities (visual, motor, cognitive, hearing)
  2. Provide realistic tasks: Test common user journeys (account creation, purchasing, content consumption)
  3. Observe without interference: Let users work naturally with their assistive technologies
  4. Ask open-ended questions: "What challenges did you encounter?" not "Did that work?"
  5. Document barriers encountered: Note both what failed and what succeeded

When to conduct user testing: Prioritize user testing for major redesigns, new features, and critical user journeys.

Building a Comprehensive Testing Strategy

Effective AODA compliance testing combines all four approaches in a layered strategy:

During Development

  • Automated scanning in local development environment
  • Component-level testing for custom widgets and interactions
  • Keyboard navigation testing as features are built
  • Code review with accessibility considerations

Before Deployment

  • Comprehensive automated scan of staging environment
  • Manual testing of new features and changed pages
  • Screen reader testing of critical user journeys
  • Contrast verification of any visual changes
  • Regression testing of previously accessible functionality

Post-Deployment

  • Smoke testing of production deployment
  • Continuous automated monitoring for ongoing compliance
  • Periodic comprehensive audits (quarterly or biannually)
  • User feedback monitoring for accessibility complaints

Ongoing

  • Content editor training on accessibility best practices
  • Design system maintenance ensuring accessible components
  • Documentation updates as standards evolve
  • User testing with people with disabilities annually

Common Testing Mistakes to Avoid

Mistake 1: Relying Only on Automated Tools

Why it's problematic: Automated tools catch only 30-40% of issues. Many critical barriers require human judgment.

Better approach: Use automated scanning as your first line of defense, but always supplement with manual and assistive technology testing.

Mistake 2: Testing Only Major Pages

Why it's problematic: Legal complaints often cite barriers on secondary pages, forms, and checkout flows.

Better approach: Test complete user journeys, not just high-traffic landing pages. Include forms, error states, search results, and account management.

Mistake 3: One-Time Testing at Launch

Why it's problematic: Websites change constantly. Today's accessible page can have barriers tomorrow after a CMS update.

Better approach: Implement continuous monitoring and integrate accessibility testing into standard development workflows.

Mistake 4: Testing Without Assistive Technology

Why it's problematic: You can't know if something works for screen reader users without testing with a screen reader.

Better approach: Learn basic screen reader operation and test critical functionality with NVDA or VoiceOver.

Mistake 5: Ignoring Mobile Accessibility

Why it's problematic: AODA applies to mobile experiences too. Mobile introduces unique accessibility considerations.

Better approach: Test on actual mobile devices with built-in accessibility features (VoiceOver, TalkBack, voice control).

Automated Monitoring for Continuous Compliance

Manual testing captures accessibility at a single point in time, but websites change daily. Continuous automated monitoring provides ongoing compliance verification.

Benefits of Continuous Monitoring

Early detection: Catch accessibility regressions immediately after deployment, not weeks later during manual audits.

Coverage: Monitor thousands of pages automatically, catching issues human testers might miss.

Trending: Track accessibility improvements or regressions over time.

Alerting: Receive immediate notifications when critical barriers are introduced.

Evidence: Document ongoing compliance efforts for AODA reporting requirements.

What to Monitor Continuously

  • WCAG 2.0 Level AA violations across your site
  • Color contrast on all text elements
  • Form accessibility including labels and error handling
  • Image alt text presence and quality
  • Heading structure for proper hierarchy
  • Keyboard accessibility of interactive elements
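At its core, continuous monitoring is a diff between scan snapshots: anything in today's scan that wasn't in yesterday's is a regression worth alerting on. The sketch below assumes each violation is keyed by a (page, WCAG criterion, element selector) tuple; real monitoring platforms use richer records, but the comparison logic is the same.

```python
def diff_scans(previous: set, current: set) -> dict:
    """Compare two scan snapshots of (page, wcag_criterion, element) violation keys."""
    return {
        "regressions": current - previous,   # newly introduced -> alert the team
        "resolved": previous - current,      # fixed since the last scan
        "outstanding": current & previous,   # known issues still open
    }

# Hypothetical snapshots from two consecutive daily scans.
previous = {("/checkout", "1.1.1", "img#logo"), ("/login", "3.3.2", "input#email")}
current  = {("/checkout", "1.1.1", "img#logo"), ("/home", "1.4.3", "a.nav-link")}

report = diff_scans(previous, current)
print(sorted(report["regressions"]))  # [('/home', '1.4.3', 'a.nav-link')]
print(sorted(report["resolved"]))     # [('/login', '3.3.2', 'input#email')]
```

Separating regressions from outstanding issues keeps alerts actionable: teams are notified only about what just broke, while the backlog is tracked as a trend.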

Choosing Monitoring Tools

Effective accessibility monitoring tools should:

  • Scan sites regularly (daily or weekly)
  • Provide detailed violation reports with remediation guidance
  • Alert teams immediately when critical issues arise
  • Track trends and progress over time
  • Integrate with development workflows
  • Support large-scale site monitoring

BrowseCheck specializes in continuous accessibility monitoring, automatically scanning websites for WCAG violations and providing actionable reports. This proactive approach ensures accessibility issues are caught and resolved before impacting users or creating compliance risks.

Building Accessibility into Your Workflow

Sustainable AODA compliance requires integrating accessibility into standard processes, not treating it as a separate concern.

Design Phase

  • Include accessibility requirements in design briefs
  • Use accessible color palettes from the start
  • Design focus states for interactive elements
  • Create accessible component documentation
  • Review designs for keyboard navigation patterns

Development Phase

  • Use semantic HTML as foundation
  • Include accessibility in code review checklists
  • Write automated tests for accessibility features
  • Test with keyboard and screen readers during development
  • Run accessibility linters and plugins in IDEs

Content Creation

  • Train content editors on accessibility basics
  • Provide templates with accessible structure
  • Create image alt text guidelines
  • Review content for plain language and clarity
  • Test content with readability tools
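Readability checks can also be scripted. The classic Flesch Reading Ease formula is 206.835 − 1.015 × (words/sentences) − 84.6 × (syllables/words); higher scores mean easier reading. The syllable counter below is a rough vowel-group heuristic, so treat the output as a relative signal for comparing drafts, not an exact score.

```python
import re

def count_syllables(word: str) -> int:
    """Rough heuristic: count vowel groups, minimum one per word."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text: str) -> float:
    """Flesch Reading Ease; scores around 60-70 are roughly plain language."""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 206.835 - 1.015 * (len(words) / sentences) - 84.6 * (syllables / len(words))

plain = "We fixed the form. You can sign up now."
dense = "Subsequent remediation facilitated registration functionality restoration."
print(flesch_reading_ease(plain) > flesch_reading_ease(dense))  # True
```

A check like this works well as a content-workflow gate: flag pages whose score drops sharply below your site's baseline for an editor to review.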

Quality Assurance

  • Include accessibility in QA test plans
  • Maintain accessibility testing devices and tools
  • Document accessibility testing procedures
  • Track accessibility bugs with same priority as functional bugs

Deployment

  • Run accessibility scans before production deployment
  • Include accessibility in release notes
  • Monitor production sites for accessibility regressions
  • Establish rollback procedures for critical accessibility failures

Responding to Accessibility Issues

Even with robust testing, accessibility issues will be discovered. Having a clear response process is essential.

Triage and Prioritization

Not all accessibility issues have equal impact. Prioritize based on:

Severity:

  • Critical: Completely blocks access for users with disabilities
  • Serious: Causes significant difficulty but workarounds exist
  • Moderate: Creates inconvenience but access is possible
  • Minor: Best practice violation with minimal user impact

Reach:

  • How many pages are affected?
  • How many users encounter the issue?
  • How frequently is the component used?

Effort:

  • How difficult is remediation?
  • What resources are required?
  • Can it be batched with related fixes?
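The three factors above can be combined into a simple priority score so triage decisions are consistent across the team. The weights below are illustrative assumptions, not an AODA-mandated formula: severity dominates, reach scales it up, and estimated effort discounts it.

```python
SEVERITY_WEIGHT = {"critical": 8, "serious": 4, "moderate": 2, "minor": 1}

def triage_score(severity: str, pages_affected: int, total_pages: int,
                 effort_days: float) -> float:
    """Higher score = fix first. Illustrative weights, tune for your team."""
    reach = pages_affected / total_pages          # fraction of the site affected
    return SEVERITY_WEIGHT[severity] * (1 + reach) / max(effort_days, 0.5)

# A critical barrier on one checkout page outranks a minor issue on every page.
critical = triage_score("critical", pages_affected=1, total_pages=200, effort_days=2)
minor = triage_score("minor", pages_affected=200, total_pages=200, effort_days=2)
print(critical > minor)  # True
```

Whatever weighting you choose, write it down: a documented rubric keeps a critical checkout barrier from losing out to a batch of easy cosmetic fixes.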

Remediation Workflow

  1. Document the issue: Capture screenshots, steps to reproduce, WCAG criterion violated
  2. Assign ownership: Designate who will fix it
  3. Set timeline: Establish remediation deadline based on severity
  4. Implement fix: Make code or content changes
  5. Test fix: Verify resolution with automated and manual testing
  6. Deploy: Release to production
  7. Verify: Confirm fix works in production environment
  8. Document: Update accessibility statement if needed

Accessibility Statements

AODA doesn't explicitly require accessibility statements, but they're valuable for:

  • Communicating your commitment to accessibility
  • Explaining known limitations and planned fixes
  • Providing contact information for accessibility feedback
  • Documenting compliance efforts

A good accessibility statement includes:

  • Conformance level (WCAG 2.0 Level AA)
  • Known accessibility issues and timelines
  • Feedback mechanisms
  • Date of last review
  • Contact information for accessibility concerns

Measuring Success

Track metrics that demonstrate improving accessibility:

  • Violation trends: Are WCAG violations decreasing over time?
  • Issue resolution time: How quickly are accessibility bugs fixed?
  • Accessibility coverage: What percentage of pages pass automated scans?
  • User feedback: Are accessibility complaints decreasing?
  • Assistive technology compatibility: Do critical journeys work with screen readers?

Conclusion

Testing and monitoring AODA compliance requires ongoing commitment, not one-time effort. Effective accessibility assurance combines automated scanning for baseline checking, manual testing for nuanced evaluation, assistive technology testing for real-world validation, and user testing for lived experience insights.

By integrating accessibility into your standard development workflows and implementing continuous monitoring, you transform accessibility from a compliance checkbox into a quality characteristic embedded in your processes. This proactive approach catches issues early when they're cheapest to fix, maintains ongoing AODA compliance, and ensures your website remains accessible as it evolves.

Ready to implement robust accessibility testing and monitoring? Start by running a comprehensive automated scan to establish your baseline, then layer in manual testing and continuous monitoring to maintain compliance over time. With the right tools and processes, AODA compliance becomes manageable and sustainable.