Testing for Section 508 Compliance: Complete Testing Guide
Section 508 compliance testing requires a comprehensive approach combining automated scanning, manual evaluation, and assistive technology testing. Unlike simple code validation, Section 508 testing evaluates whether federal electronic and information technology is actually usable by people with disabilities. This guide provides a systematic testing methodology for web content, software, documents, and multimedia covered by Section 508.
Section 508 Testing Fundamentals
The Three-Tier Testing Approach
Effective Section 508 testing combines three complementary methods:
Tier 1: Automated Testing (30-40% coverage)
- Fast, scalable, consistent
- Catches technical violations
- Limited to programmatically determinable issues
Tier 2: Manual Testing (40-50% coverage)
- Human judgment for context-dependent issues
- Keyboard navigation, visual review
- Meaningful alt text, logical structure
Tier 3: Assistive Technology Testing (20-30% coverage)
- Real-world validation with screen readers, magnifiers
- Catches issues automated tools and manual review miss
- Validates actual user experience
All three tiers are necessary for comprehensive Section 508 compliance verification.
Testing Scope Determination
Identify what needs testing:
- Web content: public websites, intranets, web applications
- Software: desktop apps, mobile apps, web-based applications
- Documents: PDFs, Word docs, Excel sheets, PowerPoint presentations
- Multimedia: videos, audio content, webinars, training materials
- Hardware: if applicable (kiosks, specialized devices)
Testing Web Content for Section 508
Web content must meet WCAG 2.0 Level AA, which Section 508 incorporates by reference.
Automated Web Testing
Best tools for Section 508/WCAG 2.0 scanning:
axe DevTools: Browser extension providing detailed WCAG violation reports
- Checks: 57+ WCAG 2.0 rules
- Pros: Accurate, detailed guidance, integrates with dev tools
- Usage: Install extension, run scan on each page template
WAVE (WebAIM): Visual representation of accessibility issues
- Checks: WCAG 2.0/Section 508 violations
- Pros: Visual overlay shows exact issue locations
- Usage: Enter URL or use browser extension
Lighthouse: Built into Chrome DevTools
- Checks: Subset of WCAG 2.0 rules
- Pros: Integrated, scores accessibility
- Usage: DevTools > Lighthouse > Accessibility audit
Pa11y: Command-line tool for CI/CD integration
- Checks: WCAG 2.0 rules via HTML CodeSniffer
- Pros: Automation-friendly, batch testing
- Usage:
pa11y https://yoursite.com
BrowseCheck: Continuous monitoring platform
- Checks: WCAG 2.0 compliance across entire site
- Pros: Scheduled scanning, alerts, trend tracking
- Usage: Configure site, receive ongoing compliance reports
Manual Web Testing Procedures
Keyboard Navigation Testing
Objective: Verify all functionality accessible via keyboard alone.
Procedure:
- Disconnect mouse (or ignore it completely)
- Use Tab to navigate forward through interactive elements
- Use Shift+Tab to navigate backward
- Use Enter/Space to activate buttons and links
- Use arrow keys in custom widgets (dropdowns, tabs, sliders)
- Use Esc to close modals and overlays
Check for:
- All interactive elements reachable
- Logical tab order matching visual layout
- Visible focus indicators on all focusable elements
- No keyboard traps (can navigate away from every component)
- Custom widgets support expected keyboard patterns
Common failures:
- <div> styled as a button without tabindex or a keyboard handler
- Focus indicator removed with outline: none and no replacement provided
- Modal dialogs that don't trap focus properly
- Dropdown menus that don't support arrow key navigation
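The first failure above, a clickable <div> with no keyboard affordances, can often be caught statically before manual testing even begins. The following is a minimal, illustrative sketch using Python's standard html.parser; it is not a substitute for a full scanner like axe, which applies these checks far more thoroughly:

```python
from html.parser import HTMLParser

class ClickableDivChecker(HTMLParser):
    """Flag <div>/<span> elements that carry a click handler but lack
    keyboard affordances (tabindex plus role="button") -- the classic
    'clickable div' keyboard-access failure."""

    def __init__(self):
        super().__init__()
        self.violations = []  # (line, column) of each flagged element

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag in ("div", "span") and "onclick" in attrs:
            # A div acting as a button needs both focusability and a role;
            # missing either one is flagged for manual review.
            if "tabindex" not in attrs or attrs.get("role") != "button":
                self.violations.append(self.getpos())

checker = ClickableDivChecker()
checker.feed('<div onclick="save()">Save</div>'
             '<div role="button" tabindex="0" onclick="save()">Save</div>')
print(len(checker.violations))  # 1 -- only the first div is flagged
```

Note that even the passing element still needs a keypress handler for Enter/Space, which a static scan of markup alone cannot verify; that remains a manual test.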
Visual and Structural Review
Heading hierarchy:
- Use browser extension (HeadingsMap, WAVE) to view heading structure
- Verify a single <h1> per page
- Check headings don't skip levels (h1→h2→h3, not h1→h3)
- Confirm headings accurately describe following content
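The mechanical parts of this review (one <h1>, no skipped levels) can be scripted; only the last check, whether headings accurately describe their content, needs human judgment. A minimal sketch with Python's standard html.parser, intended as an illustration rather than a production audit:

```python
from html.parser import HTMLParser

class HeadingAudit(HTMLParser):
    """Collect heading levels in document order and report the two
    mechanical heading-structure problems: multiple (or zero) <h1>
    elements and skipped levels (e.g. h1 -> h3)."""

    def __init__(self):
        super().__init__()
        self.levels = []

    def handle_starttag(self, tag, attrs):
        # html.parser lowercases tag names, so "h1".."h6" match directly
        if len(tag) == 2 and tag[0] == "h" and tag[1].isdigit():
            self.levels.append(int(tag[1]))

    def problems(self):
        issues = []
        if self.levels.count(1) != 1:
            issues.append("expected exactly one <h1>")
        for prev, cur in zip(self.levels, self.levels[1:]):
            if cur > prev + 1:
                issues.append(f"skipped level: h{prev} -> h{cur}")
        return issues

audit = HeadingAudit()
audit.feed("<h1>Guide</h1><h2>Scope</h2><h4>Details</h4>")
print(audit.problems())  # ['skipped level: h2 -> h4']
```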
Color contrast:
- Use contrast checker (WebAIM, Colour Contrast Analyser)
- Check all normal text has 4.5:1 contrast minimum
- Check large text (18pt or 14pt bold+) has 3:1 minimum
- Verify UI components and graphics have 3:1 contrast (WCAG 2.1, recommended for Section 508)
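The 4.5:1 and 3:1 thresholds come from the WCAG 2.0 contrast formula: each color's relative luminance is computed from its sRGB channels, and the ratio is (L_lighter + 0.05) / (L_darker + 0.05). A self-contained sketch of that calculation, so you can see why a checker passes or fails a given pair:

```python
def relative_luminance(hex_color):
    """WCAG 2.0 relative luminance for a #rrggbb color."""
    def channel(c):
        c /= 255
        # sRGB linearization per the WCAG 2.0 definition
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (int(hex_color.lstrip("#")[i:i + 2], 16) for i in (0, 2, 4))
    return 0.2126 * channel(r) + 0.7152 * channel(g) + 0.0722 * channel(b)

def contrast_ratio(fg, bg):
    """WCAG 2.0 contrast ratio: (L_lighter + 0.05) / (L_darker + 0.05)."""
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

print(round(contrast_ratio("#000000", "#ffffff"), 1))   # 21.0 -- maximum possible
print(contrast_ratio("#777777", "#ffffff") >= 4.5)      # False -- ~4.48:1 narrowly fails AA
```

This matches what tools like WebAIM's checker report; #777777 on white is a useful reminder that colors can look acceptable yet still fail the 4.5:1 threshold.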
Color dependency:
- Identify information conveyed by color (errors, required fields, status)
- Verify additional indicators exist (icons, text, patterns)
- Example: Error fields marked red PLUS text "Error:" PLUS icon
Link text:
- Review all links out of context
- Verify each link describes destination
- Flag generic text ("click here," "read more") without context
- Check that identical link text goes to identical destinations
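Flagging the obvious generic phrases can be automated, leaving the reviewer to judge whether the remaining link text actually describes its destination. An illustrative sketch (the GENERIC list here is a hypothetical starter set, not an exhaustive one):

```python
from html.parser import HTMLParser

GENERIC = {"click here", "read more", "more", "here", "learn more"}

class LinkTextAudit(HTMLParser):
    """Collect each link's text so it can be reviewed out of context,
    and flag generic phrases that cannot describe a destination."""

    def __init__(self):
        super().__init__()
        self.in_link = False
        self.current = ""
        self.flagged = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.in_link, self.current = True, ""

    def handle_data(self, data):
        if self.in_link:
            self.current += data

    def handle_endtag(self, tag):
        if tag == "a":
            self.in_link = False
            if self.current.strip().lower() in GENERIC:
                self.flagged.append(self.current.strip())

audit = LinkTextAudit()
audit.feed('<a href="/508">Section 508 testing guide</a> <a href="/x">Click here</a>')
print(audit.flagged)  # ['Click here']
```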
Form Testing
Labels and instructions:
- Verify every input has an associated <label>
- Check labels are properly connected via for/id attributes
- Confirm instructions appear before form fields, not just in placeholders
- Verify required field indicators
Error handling:
- Submit form with invalid data
- Verify errors clearly identified
- Check error messages explain how to fix problem
- Confirm screen readers announce errors (test with ARIA live regions)
Screen Reader Testing for Web
Windows + NVDA (free):
- Download and install NVDA
- Start NVDA (Ctrl+Alt+N)
- Navigate using NVDA commands:
- H: Jump between headings
- K: Jump between links
- F: Jump to form fields
- T: Jump to tables
- Fill out a form without looking at the screen
- Verify custom widgets announce purpose and state
macOS + VoiceOver (built-in):
- Enable VoiceOver (Cmd+F5)
- Navigate using VoiceOver commands:
- VO+Right Arrow: Next item
- VO+Cmd+H: Next heading
- VO+Cmd+L: Next link
- VO+Cmd+J: Next form control
- Test interaction with screen off
- Verify images have meaningful descriptions
What to verify:
- Page title announced when loaded
- Headings create usable page outline
- Images have appropriate alt text
- Links understandable out of context
- Form fields have clear labels
- Errors clearly announced
- Dynamic content updates announced (ARIA live regions)
- Custom controls announce role, state, and value
Testing Software for Section 508
Software testing focuses on platform accessibility API integration and keyboard operability.
Windows Software Testing
Microsoft Accessibility Insights:
- Download Accessibility Insights for Windows
- Run FastPass for quick automated checks
- Run Assessment for comprehensive manual testing
- Use Inspect tool to verify accessible properties exposed
Check for:
- All UI elements expose Name, Role, State via UI Automation or MSAA
- Keyboard access to all functionality
- High contrast mode support
- Screen reader announces all interactive elements
- Focus indicators visible
macOS/iOS Software Testing
Accessibility Inspector (Xcode):
- Open Xcode > Xcode > Open Developer Tool > Accessibility Inspector
- Target your application
- Run Audit for automated checks
- Use Inspection mode to verify properties
VoiceOver testing:
- Enable VoiceOver
- Navigate through all application screens
- Verify all elements properly announced
- Test all user workflows with VoiceOver only
Android Software Testing
Accessibility Scanner:
- Install Accessibility Scanner from Play Store
- Enable Scanner
- Navigate through app
- Review Scanner suggestions
TalkBack testing:
- Enable TalkBack in Settings
- Navigate app with TalkBack
- Verify all elements announced correctly
- Test gestures and navigation
Testing PDFs for Section 508
PDF accessibility is a common Section 508 compliance challenge.
Automated PDF Testing
PDF Accessibility Checker (PAC):
- Download PAC (free from Access-for-All foundation)
- Open PDF in PAC
- Run automatic check
- Review detailed report of issues
Adobe Acrobat Pro Accessibility Checker:
- Open PDF in Acrobat Pro
- Tools > Accessibility > Full Check
- Select accessibility standards
- Review and remediate identified issues
Manual PDF Testing
Tag structure:
- Open Tags panel in Acrobat
- Verify document has tag tree
- Check reading order follows logical sequence
- Verify proper tag types (headings, lists, tables)
Alternative text:
- Check all images have alt text
- Verify alt text is meaningful
- Confirm decorative images marked as artifacts
Form fields:
- Verify all fields have tooltips (accessible names)
- Check field tab order is logical
- Test form completion with keyboard only
Tables:
- Verify tables have headers
- Check header associations correct
- Confirm complex tables use ID/Headers method
Reading order:
- View > Read Out Loud > Activate Read Out Loud
- Listen to entire document
- Verify content reads in logical order
- Check no content is skipped or repeated
Testing Multimedia for Section 508
Caption Testing
Technical verification:
- Captions present for all speech
- Captions synchronized with audio
- Caption file format compatible with player
Quality verification:
- Captions accurate (verbatim or edited for clarity)
- Speaker identification when multiple speakers
- Sound effects described [applause], [music]
- Captions readable (sufficient contrast, appropriate size)
User testing:
- Play video with sound off
- Follow along with captions only
- Verify comprehension possible without audio
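The synchronization check above can be partially automated by sanity-checking cue timings in the caption file. A sketch for WebVTT (assuming full HH:MM:SS.mmm timestamps); note that WebVTT deliberately permits overlapping cues for simultaneous speakers, so overlap here is a review flag, not an automatic failure:

```python
import re

# Matches one WebVTT cue timing line, e.g. "00:00:01.000 --> 00:00:04.000"
CUE = re.compile(r"(\d{2}):(\d{2}):(\d{2})\.(\d{3}) --> (\d{2}):(\d{2}):(\d{2})\.(\d{3})")

def cue_times(vtt_text):
    """Extract (start, end) pairs in seconds from WebVTT cue timing lines."""
    def secs(h, m, s, ms):
        return int(h) * 3600 + int(m) * 60 + int(s) + int(ms) / 1000
    return [(secs(*m.groups()[:4]), secs(*m.groups()[4:])) for m in CUE.finditer(vtt_text)]

def timing_problems(cues):
    """Flag cues that end before they start, or that overlap the previous
    cue -- in single-track captions, both usually indicate a sync error."""
    issues = []
    for i, (start, end) in enumerate(cues):
        if end <= start:
            issues.append(f"cue {i}: non-positive duration")
        if i and start < cues[i - 1][1]:
            issues.append(f"cue {i}: overlaps previous cue")
    return issues

sample = """WEBVTT

00:00:01.000 --> 00:00:04.000
[music] Welcome to the training.

00:00:03.500 --> 00:00:06.000
Speaker 1: Let's begin.
"""
print(timing_problems(cue_times(sample)))  # ['cue 1: overlaps previous cue']
```

Caption accuracy, speaker identification, and sound-effect descriptions still require a human pass with the sound off.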
Audio Description Testing
Presence verification:
- Audio description track exists
- Can be enabled in player
- Synchronized with video
Quality verification:
- Describes relevant visual information
- Doesn't overlap dialogue
- Provides context for visual humor, text on screen, actions
User testing:
- Play video without watching screen
- Follow along with audio description
- Verify comprehension possible without video
Media Player Controls Testing
Keyboard operability:
- All controls accessible via keyboard
- Play/pause, volume, scrubbing all keyboard-operable
- Caption and audio description toggles keyboard-accessible
Screen reader compatibility:
- Controls properly labeled for screen readers
- Current time and duration announced
- Caption and audio description states announced
Creating a Section 508 Test Plan
Test Plan Components
Scope definition:
- What will be tested (URLs, software, documents)
- Which standards apply (web, software, documents, etc.)
- Testing timeline and milestones
Testing methodology:
- Automated tools to be used
- Manual testing procedures
- Assistive technologies for validation
- Browser and platform coverage
Roles and responsibilities:
- Who conducts automated testing
- Who performs manual testing
- Who tests with assistive technology
- Who documents and tracks issues
Success criteria:
- Conformance level target (WCAG 2.0 Level AA for web)
- Acceptable defect rates
- Remediation priorities and timelines
Deliverables:
- Testing reports
- Issue tracking log
- Accessibility conformance report (VPAT)
- Remediation recommendations
Sample Test Plan Template
1. Introduction
- Purpose of testing
- Scope (what's covered)
- Standards (Section 508, WCAG 2.0 Level AA)
2. Testing Approach
- Automated scanning (tools, frequency)
- Manual testing (procedures, checklists)
- Assistive technology testing (tools, scenarios)
3. Test Environment
- Browsers: Chrome, Firefox, Safari, Edge
- Screen readers: NVDA, JAWS, VoiceOver
- Platforms: Windows 10/11, macOS, iOS, Android
- Assistive tech: Screen readers, magnifiers, voice control
4. Test Cases
- [List specific test cases for your application]
- Example: "Login process keyboard accessibility"
- Example: "Form validation error announcement"
5. Defect Classification
- Critical: Blocks access completely
- Major: Significant difficulty, workarounds exist
- Minor: Inconvenience, access still possible
- Best practice: No impact on access, recommended improvement
6. Reporting and Documentation
- Issue tracking system
- Report format
- Remediation tracking
- Final conformance report
7. Schedule
- Testing phases and deadlines
- Remediation windows
- Re-testing schedule
Continuous Monitoring for Section 508
One-time testing isn't sufficient. Websites and software change constantly, introducing new accessibility barriers.
Continuous Testing Strategies
Automated regression testing:
- Run automated scans after each deployment
- Integrate accessibility checks into CI/CD pipelines
- Use tools like Pa11y, axe-core in build processes
Scheduled comprehensive scans:
- Weekly or monthly full-site scans
- Monitor for regressions and new issues
- Track trends over time
BrowseCheck continuous monitoring:
- Automated daily/weekly scans of entire site
- Immediate alerts when new violations detected
- Trend tracking showing improvement or regression
- Compliance dashboards for stakeholder reporting
User feedback channels:
- Accessible feedback form
- Dedicated accessibility email
- Regular user testing with people with disabilities
Documentation and Reporting
Creating Test Reports
Executive summary:
- Overall conformance status
- Critical issues count
- Priority recommendations
Detailed findings:
- Issue description
- Success criterion violated
- Location (URL, page, component)
- Severity/impact
- Remediation guidance
- Screenshots/examples
Conformance statement:
- Standards tested against
- Conformance level claimed
- Known limitations
- Date of evaluation
Accessibility Conformance Report (VPAT)
For federal procurement, create VPAT using official template:
- Download current VPAT template
- Complete product information section
- Evaluate against each applicable criterion
- Mark as: Supports, Partially Supports, Does Not Support, Not Applicable
- Provide explanatory remarks
- Include contact for accessibility questions
Common Testing Mistakes to Avoid
Relying only on automated tools: Catches 30-40% of issues; manual testing essential
Testing only with mouse: Keyboard and screen reader testing reveals different issues
Testing homepage only: Test complete user journeys including forms, checkout, account management
Ignoring mobile: Test responsive designs and mobile apps thoroughly
One-time testing: Continuous monitoring prevents regressions
Testing without assistive technology: Can't verify screen reader compatibility without testing
Skipping user testing: Real users with disabilities find issues tools miss
Conclusion
Section 508 testing requires systematic evaluation using automated tools, manual procedures, and assistive technology validation. Web content must meet WCAG 2.0 Level AA, software must integrate with platform accessibility APIs, documents must be properly tagged, and multimedia must include captions and audio descriptions.
Effective testing combines:
- Automated scanning for technical violations (axe, WAVE, Lighthouse, BrowseCheck)
- Manual testing for context-dependent issues (keyboard navigation, visual review)
- Screen reader testing for actual usability (NVDA, JAWS, VoiceOver, TalkBack)
- PDF-specific testing for document accessibility (PAC, Acrobat)
- Multimedia review for caption and audio description quality
Continuous monitoring prevents regressions and ensures ongoing compliance as technology evolves. Tools like BrowseCheck provide automated daily/weekly scanning with immediate alerts when new issues arise, making Section 508 compliance sustainable over time.
Ready to start Section 508 testing? Begin with automated scans to identify obvious issues, follow with systematic manual keyboard testing, and validate with screen readers. Document findings in a structured test report, prioritize remediation, and implement continuous monitoring to maintain compliance.