
Accessibility Testing

Accessibility testing verifies that applications function correctly for users with disabilities, including those who navigate by keyboard alone, use screen readers, have low vision, or experience colour blindness. You perform accessibility testing through a combination of automated scanning tools that detect programmatic violations and manual testing that evaluates the actual user experience. Automated tools catch approximately 30-40% of accessibility issues; the remainder require human evaluation of context, meaning, and usability.

The Web Content Accessibility Guidelines (WCAG) 2.1 Level AA standard serves as the compliance baseline for most organisations. This standard comprises 50 success criteria across four principles: perceivable, operable, understandable, and robust. Each criterion has specific, testable requirements. Accessibility testing validates conformance against these criteria through systematic evaluation of each interface component.

Prerequisites

Before beginning accessibility testing, ensure the following requirements are met.

| Requirement | Specification | Verification |
| --- | --- | --- |
| axe-core | Version 4.6 or higher | npm list @axe-core/cli or check browser extension version |
| Node.js | Version 18 LTS or higher | node --version |
| NVDA screen reader | Latest stable release | Windows only; download from nvaccess.org |
| VoiceOver | Built into macOS/iOS | Enable in System Preferences > Accessibility |
| Colour contrast analyser | CCA 3.0+ or browser DevTools | Install from TPGi or use Chrome/Firefox DevTools |
| Test accounts | Accounts without mouse-dependent features | Create dedicated accessibility test user |
| Browser DevTools | Chrome 100+ or Firefox 100+ | Check browser version in Help > About |

Install the command-line accessibility scanner:

npm install -g @axe-core/cli pa11y

Verify installation:

axe --version
# Expected: 4.6.0 or higher
pa11y --version
# Expected: 6.2.0 or higher

For CI integration, install the packages as development dependencies in your project:

npm install --save-dev @axe-core/cli @axe-core/playwright axe-html-reporter pa11y-ci

Configure your test browser to disable smooth scrolling and animations, as these interfere with automated testing accuracy:

// playwright.config.js accessibility overrides
use: {
  reducedMotion: 'reduce',
  colorScheme: 'light',
  viewport: { width: 1280, height: 720 }
}

Automated accessibility scanning

Automated scanning identifies violations of programmatically testable WCAG criteria. These tools parse the DOM, evaluate ARIA usage, check colour contrast ratios, and verify that interactive elements have accessible names. Run automated scans against every page and interactive state in your application.

  1. Run axe-core against a single page to establish baseline:
axe https://example.org/app/dashboard --exit

The scanner outputs violations grouped by impact level. A clean scan returns exit code 0:

Running axe-core 4.8.2 in chrome-headless
Violations: 0
Passes: 47
Incomplete: 3
Inapplicable: 28

Any violation causes exit code 1, which fails CI pipelines.

  2. Generate an HTML report for detailed review:
axe https://example.org/app/dashboard --save results.json
npx axe-html-reporter --source results.json --destination accessibility-report.html

The HTML report includes screenshots highlighting each violation, the specific WCAG criterion violated, and remediation guidance. Share this report with developers responsible for fixes.

  3. Scan multiple pages using a sitemap or URL list:
# Create urls.txt with one URL per line
echo "https://example.org/app/dashboard
https://example.org/app/cases
https://example.org/app/reports
https://example.org/app/settings" > urls.txt
# Scan all URLs
pa11y-ci --threshold 0 $(cat urls.txt)

The --threshold flag sets the maximum allowed violations per URL (use --sitemap <url> instead when the site publishes an XML sitemap). Set it to 0 for new projects; existing projects may need higher thresholds during remediation.

  4. Configure axe to test specific WCAG levels and rule sets:
axe https://example.org/app/form \
  --tags wcag2a,wcag2aa,wcag21aa \
  --disable color-contrast \
  --exit

The --tags flag restricts testing to specific standards. The --disable flag excludes rules that produce false positives in your context. Document any disabled rules and the justification.

  5. Integrate automated scanning into Playwright tests for stateful pages:
tests/accessibility.spec.js
import { test, expect } from '@playwright/test';
import AxeBuilder from '@axe-core/playwright';

test('dashboard has no accessibility violations', async ({ page }) => {
  await page.goto('/app/dashboard');
  // Wait for dynamic content to load
  await page.waitForSelector('[data-testid="dashboard-loaded"]');
  const accessibilityScanResults = await new AxeBuilder({ page })
    .withTags(['wcag2a', 'wcag2aa', 'wcag21aa'])
    .analyze();
  expect(accessibilityScanResults.violations).toEqual([]);
});

test('form validation errors are accessible', async ({ page }) => {
  await page.goto('/app/cases/new');
  // Submit empty form to trigger validation
  await page.click('[data-testid="submit-button"]');
  await page.waitForSelector('[role="alert"]');
  const accessibilityScanResults = await new AxeBuilder({ page })
    .include('[data-testid="case-form"]')
    .analyze();
  expect(accessibilityScanResults.violations).toEqual([]);
});

The .include() method restricts scanning to specific page regions. Use this for testing components in isolation or focusing on recently changed areas.
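When a scan does report violations, summarising them by impact makes the failing test output easier to triage than a raw object dump. A minimal sketch, assuming the documented axe-core results shape (each violation carries an id, impact, help text, and a nodes array):

```javascript
// Summarise axe-core violations by impact level for readable test output.
// Expects the `violations` array from an axe-core results object.
function summarizeViolations(violations) {
  const summary = { critical: 0, serious: 0, moderate: 0, minor: 0 };
  const lines = [];
  for (const v of violations) {
    summary[v.impact] += 1;
    // Each line names the rule, its help text, and how many nodes failed.
    lines.push(`[${v.impact}] ${v.id}: ${v.help} (${v.nodes.length} nodes)`);
  }
  return { summary, report: lines.join('\n') };
}

// Example with a hand-written violations array in axe-core's shape:
const { summary, report } = summarizeViolations([
  { id: 'label', impact: 'critical', help: 'Form elements must have labels', nodes: [{}, {}] },
  { id: 'color-contrast', impact: 'serious', help: 'Elements must have sufficient color contrast', nodes: [{}] }
]);
console.log(report);
// summary is { critical: 1, serious: 1, moderate: 0, minor: 0 }
```

Attach the report string to the test failure message so the CI log shows which rules failed without opening the JSON artifact.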

  6. Record baseline metrics for tracking improvement:
axe https://example.org/app/dashboard --save baseline-$(date +%Y%m%d).json

Compare results over time to verify that accessibility improves and no regressions occur. A typical remediation trajectory for a legacy application:

| Week | Critical | Serious | Moderate | Minor | Total |
| --- | --- | --- | --- | --- | --- |
| 0 | 12 | 34 | 18 | 7 | 71 |
| 2 | 3 | 28 | 18 | 7 | 56 |
| 4 | 0 | 15 | 12 | 7 | 34 |
| 6 | 0 | 4 | 8 | 5 | 17 |
| 8 | 0 | 0 | 3 | 4 | 7 |

Keyboard navigation testing

Keyboard accessibility ensures that users who cannot use a mouse can access all functionality. This includes users with motor impairments, users with visual impairments who rely on screen readers, and power users who prefer keyboard navigation. Every interactive element must be reachable via keyboard, and the focus order must follow a logical sequence.

  1. Disconnect your mouse or trackpad. On laptops, place a physical object over the trackpad to prevent accidental use. This forces authentic keyboard-only testing.

  2. Navigate to the page under test and press Tab repeatedly to move through interactive elements. Document the focus order by recording each element in sequence:

Focus Order Test: Dashboard Page
Date: 2024-11-16
Tester: [Name]
Tab 1: Skip to main content link (hidden until focused)
Tab 2: Logo/home link
Tab 3: Dashboard nav item
Tab 4: Cases nav item
Tab 5: Reports nav item
Tab 6: Settings nav item
Tab 7: User menu button
Tab 8: Search input
Tab 9: First dashboard card action
...

The focus order should match the visual reading order: left-to-right, top-to-bottom for LTR languages. Focus must never jump unexpectedly across the page or become trapped in a component.
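Comparing the recorded tab order against the expected visual order can be mechanised; a hypothetical sketch (the element names are illustrative, not taken from the application above):

```javascript
// Return the index of the first deviation between the expected visual
// order and the focus order actually recorded while tabbing, or -1 if
// the two sequences match.
function firstFocusDeviation(expectedOrder, recordedOrder) {
  const length = Math.max(expectedOrder.length, recordedOrder.length);
  for (let i = 0; i < length; i++) {
    if (expectedOrder[i] !== recordedOrder[i]) {
      return i;
    }
  }
  return -1;
}

// The user menu receives focus before the nav items: deviation at index 2.
const expected = ['skip link', 'logo', 'dashboard nav', 'cases nav', 'user menu'];
const recorded = ['skip link', 'logo', 'user menu', 'dashboard nav', 'cases nav'];
console.log(firstFocusDeviation(expected, recorded)); // → 2
```

A non-negative result points at the first Tab press where the order diverges, which is usually where a positive tabindex or misplaced DOM node sits.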

  3. Verify focus visibility on each element. The focused element must have a visible indicator that meets the 3:1 contrast ratio requirement. Default browser focus rings satisfy this requirement; custom focus styles must be verified:
/* Compliant custom focus style */
:focus-visible {
  outline: 3px solid #005fcc;
  outline-offset: 2px;
}

/* Non-compliant: invisible focus */
:focus {
  outline: none; /* VIOLATION: removes focus indicator */
}

Test focus visibility against both light and dark backgrounds if your application supports multiple colour schemes.

  4. Test all interactive components with their expected keyboard patterns:

    | Component | Expected keyboard behaviour |
    | --- | --- |
    | Link | Enter activates |
    | Button | Enter or Space activates |
    | Checkbox | Space toggles |
    | Radio group | Arrow keys move selection, Space selects |
    | Select/dropdown | Enter opens, Arrow keys navigate, Enter selects, Escape closes |
    | Modal dialog | Tab cycles within modal, Escape closes, focus trapped inside |
    | Menu | Arrow keys navigate items, Enter activates, Escape closes |
    | Tab panel | Arrow keys switch tabs, Tab moves into panel content |
    | Accordion | Enter or Space toggles, Arrow keys move between headers |
    | Date picker | Arrow keys navigate dates, Enter selects, Escape closes |
  5. Verify that no keyboard traps exist. A keyboard trap occurs when focus enters a component and cannot exit via Tab or Shift+Tab. Common trap locations include:

    • Modal dialogs without proper focus management
    • Embedded media players
    • Rich text editors
    • Third-party widgets (maps, calendars, chat widgets)
    • Infinite scroll containers

    Test escape routes: Tab, Shift+Tab, Escape, and any documented keyboard shortcuts. If focus cannot exit a component through standard means, the component fails this criterion.

  6. Test skip link functionality. Press Tab immediately after page load. The first focusable element should be a skip link that bypasses navigation:

<!-- Correct implementation -->
<a href="#main-content" class="skip-link">Skip to main content</a>
<!-- The skip link target -->
<main id="main-content" tabindex="-1">
...
</main>

Activate the skip link and verify that focus moves to the main content area, bypassing all navigation elements.

  7. Document findings using a structured format:
KEYBOARD ISSUE: Focus trap in date picker
Location: /app/cases/new - Due date field
Steps to reproduce:
1. Tab to due date field
2. Press Enter to open date picker
3. Attempt to Tab out of date picker
Expected: Focus moves to next field
Actual: Focus cycles within date picker indefinitely
Escape key: Does not close picker
WCAG: 2.1.2 No Keyboard Trap (Level A)
Severity: Critical
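Traps like the date-picker issue above usually come down to the Tab-wrapping arithmetic that a modal must implement (and that a non-modal widget must not). A simplified sketch of the index logic; real implementations must also recompute the focusable list when the DOM changes:

```javascript
// Given the index of the currently focused element inside a modal,
// compute where focus should land after Tab or Shift+Tab, wrapping
// at the ends so focus never escapes the dialog.
function nextFocusIndex(current, count, shiftKey) {
  if (count === 0) return -1; // nothing focusable in the dialog
  return shiftKey
    ? (current - 1 + count) % count // Shift+Tab on the first wraps to the last
    : (current + 1) % count;        // Tab on the last wraps to the first
}

// A modal with 3 focusable elements:
console.log(nextFocusIndex(2, 3, false)); // → 0 (Tab on last wraps to first)
console.log(nextFocusIndex(0, 3, true));  // → 2 (Shift+Tab on first wraps to last)
```

The same helper explains the failure mode: a date picker that applies this wrapping but never releases it on Escape or Tab-out is exactly the trap documented above.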

Screen reader testing

Screen reader testing evaluates how assistive technology interprets and announces page content. Users with visual impairments rely on screen readers to convert visual interfaces into spoken audio. Screen reader behaviour depends on correct semantic HTML, ARIA attributes, and content structure. Automated tools cannot verify that announcements make sense in context; this requires manual testing with actual screen reader software.

  1. Launch NVDA on Windows or enable VoiceOver on macOS. Configure the screen reader for testing:

    NVDA configuration:

Preferences > Settings > Browse Mode:
- Automatic Say All on page load: OFF
- Use screen layout: ON
- Automatic focus mode for focus changes: ON
Preferences > Settings > Document Formatting:
- Report headings: ON
- Report links: ON
- Report lists: ON
- Report landmarks: ON
- Report tables: ON

VoiceOver configuration (macOS):

System Preferences > Accessibility > VoiceOver > Open VoiceOver Utility:
- Web > When loading a new page: Do Nothing
- Verbosity > Default: High
  2. Navigate to the page under test. Listen to the initial page announcement and document what is read:
PAGE: /app/cases/123
Initial announcement:
"Case Details - CaseTrack. Web content. Heading level 1, Case Details.
Landmark: navigation. Landmark: main."
Assessment:
- Page title announced: YES
- Heading structure present: YES
- Landmarks identified: YES

The page title should identify the page content and application. Heading structure must be logical (no skipped levels). Key landmarks (navigation, main, complementary) should be present.

  3. Test heading structure using screen reader navigation. In NVDA, press H to move to the next heading. Document each heading in sequence:
HEADING STRUCTURE: /app/cases/123
H1: Case Details
H2: Beneficiary Information
H3: Contact Details
H3: Household Composition
H2: Case History
H3: Previous Interactions
H3: Notes
H2: Actions
Assessment: Logical hierarchy, no skipped levels

Headings must form a logical outline of the page. Each page should have exactly one H1. Heading levels should not skip (H1 to H3 without H2).
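The outline rules above can be checked mechanically once the heading levels are collected (from a screen reader pass or from the DOM); a minimal sketch:

```javascript
// Validate a heading-level sequence: exactly one H1, and no heading
// may jump more than one level deeper than the heading before it.
function validateHeadingOutline(levels) {
  const problems = [];
  if (levels.filter(l => l === 1).length !== 1) {
    problems.push('page must have exactly one H1');
  }
  for (let i = 1; i < levels.length; i++) {
    if (levels[i] > levels[i - 1] + 1) {
      problems.push(`H${levels[i - 1]} followed by H${levels[i]} skips a level`);
    }
  }
  return problems;
}

// The outline documented above: H1, H2, H3, H3, H2, H3, H3, H2
console.log(validateHeadingOutline([1, 2, 3, 3, 2, 3, 3, 2])); // → []
console.log(validateHeadingOutline([1, 3])); // flags the H1-to-H3 skip
```

In a browser console, the levels array can be gathered with `[...document.querySelectorAll('h1,h2,h3,h4,h5,h6')].map(h => Number(h.tagName[1]))`.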

  4. Test form inputs for accessible labels. Navigate to each form field using Tab and document what the screen reader announces:
FORM: /app/cases/new
Field: First name
Announced: "First name, edit text, required"
Assessment: PASS - Label, role, and required state announced
Field: Case type
Announced: "Dropdown, collapsed"
Assessment: FAIL - No accessible label announced
Issue: Missing aria-label or associated <label>
Field: Notes
Announced: "Edit text"
Assessment: FAIL - No accessible label announced
  5. Test dynamic content announcements. Screen readers must announce content changes that users need to know about. Trigger actions that update page content and verify announcements:
ACTION: Submit form with validation errors
Expected: Error messages announced automatically
ARIA live region implementation:
<div role="alert" aria-live="assertive">
Please correct the following errors:
- First name is required
- Case type is required
</div>
Announcement heard: "Alert. Please correct the following errors.
First name is required. Case type is required."
Assessment: PASS - Errors announced immediately

Content changes that require ARIA live regions include: form validation errors, status messages, search results loading, notifications, and chat messages.

  6. Test data tables for accessibility. Navigate to a table and verify that screen readers announce row and column headers correctly:
TABLE: Case list on /app/cases
<table>
  <thead>
    <tr>
      <th scope="col">Case ID</th>
      <th scope="col">Beneficiary</th>
      <th scope="col">Status</th>
      <th scope="col">Last updated</th>
    </tr>
  </thead>
  <tbody>
    <tr>
      <th scope="row">C-2024-001</th>
      <td>Jane Doe</td>
      <td>Active</td>
      <td>2024-11-15</td>
    </tr>
  </tbody>
</table>
Navigation test (NVDA Table mode - Ctrl+Alt+Arrow keys):
- Column header announcement: YES
- Row header announcement: YES
- Cell context provided: YES
Reading "Active" announces: "Status column, C-2024-001 row, Active"
Assessment: PASS
  7. Test images and graphics for alternative text:
IMAGE: Organisation logo
Announced: "Organisation logo, graphic"
Assessment: PASS - Descriptive alt text
IMAGE: Chart showing monthly case volumes
Announced: "Image"
Assessment: FAIL - Non-descriptive alt text
Required: Describe chart content or provide data table alternative
IMAGE: Decorative divider
Announced: [Nothing]
Assessment: PASS - Correctly hidden with alt="" or role="presentation"

Informative images require alt text describing their content or purpose. Decorative images must be hidden from screen readers. Complex graphics (charts, diagrams) require either detailed alt text or a linked data table alternative.

Colour contrast verification

Colour contrast testing ensures that text and interactive elements are distinguishable for users with low vision or colour blindness. WCAG 2.1 AA requires minimum contrast ratios of 4.5:1 for normal text and 3:1 for large text (18pt regular or 14pt bold). User interface components and graphical objects that convey information require 3:1 contrast against adjacent colours.
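These thresholds derive from the relative-luminance formula defined in WCAG 2.1; a minimal sketch of the computation that contrast checkers perform:

```javascript
// Relative luminance of a colour given as #RRGGBB hex, per WCAG 2.1.
function relativeLuminance(hex) {
  const channels = [1, 3, 5].map(i => {
    const c = parseInt(hex.slice(i, i + 2), 16) / 255;
    // Linearise each sRGB channel as specified by WCAG
    return c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
  });
  return 0.2126 * channels[0] + 0.7152 * channels[1] + 0.0722 * channels[2];
}

// Contrast ratio (lighter + 0.05) / (darker + 0.05), range 1:1 to 21:1.
function contrastRatio(foreground, background) {
  const [lighter, darker] = [relativeLuminance(foreground), relativeLuminance(background)]
    .sort((a, b) => b - a);
  return (lighter + 0.05) / (darker + 0.05);
}

console.log(contrastRatio('#333333', '#FFFFFF').toFixed(2)); // → "12.63"
```

This is why the ratios reported by DevTools are symmetric: swapping foreground and background gives the same result.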

  1. Open the page in Chrome or Firefox DevTools. Access the colour contrast checker:

    Chrome:

1. Right-click element > Inspect
2. Select element in Elements panel
3. Styles panel shows colour values
4. Click colour swatch to open colour picker
5. Contrast ratio displayed under "Contrast"

Firefox:

1. Right-click element > Inspect Accessibility Properties
2. View contrast ratio in Accessibility panel
3. Issues listed under "Checks"
  2. Test body text contrast systematically. Sample text from each distinct background colour combination:
CONTRAST TEST: Body text
Location: Main content area
Foreground: #333333
Background: #FFFFFF
Ratio: 12.63:1
Required: 4.5:1
Result: PASS
Location: Card component
Foreground: #666666
Background: #F5F5F5
Ratio: 4.48:1
Required: 4.5:1
Result: FAIL (by 0.02)
Remediation: Darken text to #636363 for 4.52:1
  3. Test link contrast. Links must be distinguishable from surrounding text by more than colour alone, or the link colour must have 3:1 contrast against surrounding text in addition to 4.5:1 against the background:
LINK CONTRAST TEST
Link text: #0066CC
Body text: #333333
Background: #FFFFFF
Link vs background: 6.32:1 (PASS, >4.5:1)
Link vs body text: 2.09:1 (FAIL, <3:1)
Additional distinction: Underline present: YES
Result: PASS - Underline provides non-colour distinction
  4. Test form input borders and focus states:
FORM ELEMENT CONTRAST TEST
Input border:
Border colour: #CCCCCC
Background: #FFFFFF
Ratio: 1.61:1
Required: 3:1 for UI components
Result: FAIL
Input focus state:
Focus ring: #005FCC
Background: #FFFFFF
Ratio: 5.12:1
Required: 3:1
Result: PASS
Input error state:
Border colour: #DC3545
Background: #FFFFFF
Ratio: 4.02:1
Required: 3:1
Result: PASS
  5. Test data visualisations. Each data series in charts must be distinguishable by more than colour alone, or adjacent colours must have 3:1 contrast:
CHART CONTRAST TEST: Monthly case volumes
Series colours:
Series 1 (New): #2E86AB
Series 2 (Closed): #A23B72
Series 3 (Pending): #F18F01
Adjacent colour contrast:
Series 1 vs Series 2: 2.41:1 (FAIL)
Series 2 vs Series 3: 2.87:1 (FAIL)
Alternative distinction:
- Different line patterns: YES (solid, dashed, dotted)
- Data labels present: YES
Result: PASS - Non-colour distinctions provided
  6. Run automated contrast checking across all pages:
# pa11y with the axe runner includes colour contrast checks
pa11y https://example.org/app/dashboard --runner axe --reporter json | \
  jq '.issues[] | select(.code | contains("color-contrast"))'

Review automated findings, but verify each manually. Automated tools cannot evaluate text over images, gradients, or dynamic backgrounds.

Form accessibility testing

Forms require specific accessibility testing because they involve complex interactions: labelling, validation, error handling, and state management. Each form element must have a programmatically associated label, validation errors must be announced to screen reader users, and error recovery must be achievable without losing entered data.

  1. Verify every input has an associated label. Test using Chrome DevTools:
// Run in browser console to find unlabelled inputs
document.querySelectorAll('input, select, textarea').forEach(el => {
  const label = el.labels?.[0]?.textContent ||
    el.getAttribute('aria-label') ||
    el.getAttribute('aria-labelledby');
  if (!label) {
    console.error('Unlabelled input:', el);
  }
});

Valid labelling methods, in order of preference:

<!-- Method 1: Explicit label association (preferred) -->
<label for="firstName">First name</label>
<input type="text" id="firstName" name="firstName">

<!-- Method 2: Implicit label (wrapping) -->
<label>
  First name
  <input type="text" name="firstName">
</label>

<!-- Method 3: aria-label (when a visible label is not possible) -->
<input type="search" aria-label="Search cases">

<!-- Method 4: aria-labelledby (referencing existing text) -->
<h2 id="contact-heading">Contact details</h2>
<input type="tel" aria-labelledby="contact-heading phone-label">
<span id="phone-label">Phone number</span>
  2. Test required field indication. Required fields must be indicated in a way accessible to screen readers:
<!-- Correct: programmatic required indication -->
<label for="email">
  Email address
  <span aria-hidden="true">*</span>
</label>
<input type="email" id="email" required aria-required="true">

<!-- Form-level instructions -->
<p id="required-desc">Fields marked with * are required</p>
<form aria-describedby="required-desc">
  ...
</form>

Screen reader test: Navigate to required field, verify “required” is announced.

  3. Test validation error handling. Submit a form with invalid data and verify:
ERROR HANDLING TEST
Trigger: Submit form with empty required fields
Checks:
[ ] Focus moves to first error or error summary: YES/NO
[ ] Error messages associated with inputs via aria-describedby: YES/NO
[ ] Error messages included in accessible name or description: YES/NO
[ ] Errors announced by screen reader: YES/NO
[ ] Visual error indicators have 3:1 contrast: YES/NO
[ ] Error text is specific (not just "Invalid"): YES/NO

Correct error implementation:

<label for="email">Email address</label>
<input type="email"
       id="email"
       aria-invalid="true"
       aria-describedby="email-error">
<span id="email-error" class="error-message" role="alert">
  Enter a valid email address, for example name@example.org
</span>
  4. Test form instructions and help text:
<!-- Instructions associated with input -->
<label for="password">Password</label>
<input type="password"
       id="password"
       aria-describedby="password-hint password-error">
<span id="password-hint">
  Must be at least 12 characters with one number
</span>
<span id="password-error" role="alert"></span>

Screen reader test: Navigate to password field, verify hint text is announced as part of the field description.

  5. Test autocomplete attributes for personal data fields:
<input type="text" name="fname" autocomplete="given-name">
<input type="text" name="lname" autocomplete="family-name">
<input type="email" name="email" autocomplete="email">
<input type="tel" name="phone" autocomplete="tel">
<textarea name="address" autocomplete="street-address"></textarea>

Run automated check:

axe https://example.org/app/registration \
  --rules autocomplete-valid \
  --exit
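Before running the full scan, a quick script can flag autocomplete values that are not valid tokens. A sketch with a deliberately partial allowlist (the complete token list is defined in the HTML standard; extend the set for the fields your forms use):

```javascript
// Partial allowlist of autocomplete tokens from the HTML standard.
const KNOWN_TOKENS = new Set([
  'given-name', 'family-name', 'name', 'email', 'tel',
  'street-address', 'postal-code', 'country', 'organization'
]);

// Return descriptions of fields whose autocomplete value is not in
// the allowlist. Fields without an autocomplete attribute are skipped.
function unknownAutocompleteTokens(fields) {
  return fields
    .filter(f => f.autocomplete && !KNOWN_TOKENS.has(f.autocomplete))
    .map(f => `${f.name}: ${f.autocomplete}`);
}

console.log(unknownAutocompleteTokens([
  { name: 'fname', autocomplete: 'given-name' },
  { name: 'phone', autocomplete: 'telephone' } // not a valid token
]));
// → [ 'phone: telephone' ]
```

In a browser console, the field list can be collected with `[...document.querySelectorAll('[autocomplete]')].map(el => ({ name: el.name, autocomplete: el.getAttribute('autocomplete') }))`.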

CI pipeline integration

Integrate accessibility testing into your continuous integration pipeline to prevent regressions and enforce standards on new code. Configure tests to run on every pull request, with strict thresholds that fail builds when violations exceed acceptable limits.

  1. Add pa11y-ci configuration to your project:
.pa11yci.json
{
  "defaults": {
    "timeout": 30000,
    "wait": 1000,
    "standard": "WCAG2AA",
    "runners": ["axe"],
    "chromeLaunchConfig": {
      "args": ["--no-sandbox", "--disable-setuid-sandbox"]
    }
  },
  "urls": [
    "http://localhost:3000/",
    "http://localhost:3000/app/dashboard",
    "http://localhost:3000/app/cases",
    "http://localhost:3000/app/cases/new",
    "http://localhost:3000/app/reports",
    "http://localhost:3000/app/settings"
  ]
}
  2. Create a CI pipeline stage for accessibility testing (GitHub Actions example):
.github/workflows/accessibility.yml
name: Accessibility Tests
on:
  pull_request:
    branches: [main, develop]
jobs:
  accessibility:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Setup Node.js
        uses: actions/setup-node@v4
        with:
          node-version: '20'
          cache: 'npm'
      - name: Install dependencies
        run: npm ci
      - name: Build application
        run: npm run build
      - name: Start application
        run: npm start &
        env:
          PORT: 3000
      - name: Wait for application
        run: npx wait-on http://localhost:3000 --timeout 60000
      - name: Run accessibility tests
        run: npx pa11y-ci --config .pa11yci.json --json > pa11y-ci-results.json
      - name: Upload accessibility report
        if: failure()
        uses: actions/upload-artifact@v4
        with:
          name: accessibility-report
          path: pa11y-ci-results.json
  3. Configure Playwright tests with accessibility checks:
playwright.config.js
import { defineConfig } from '@playwright/test';

export default defineConfig({
  testDir: './tests',
  projects: [
    {
      name: 'accessibility',
      testMatch: /.*\.a11y\.spec\.js/,
      use: {
        viewport: { width: 1280, height: 720 },
        reducedMotion: 'reduce'
      }
    }
  ],
  reporter: [
    ['html', { outputFolder: 'playwright-report' }],
    ['json', { outputFile: 'accessibility-results.json' }]
  ]
});
  4. Set threshold-based failure criteria for gradual remediation:
tests/a11y-thresholds.spec.js
import { test, expect } from '@playwright/test';
import AxeBuilder from '@axe-core/playwright';

// Define acceptable thresholds during remediation
const THRESHOLDS = {
  critical: 0,  // No critical violations allowed
  serious: 5,   // Reducing over time
  moderate: 10,
  minor: 20
};

test('accessibility violations within threshold', async ({ page }) => {
  await page.goto('/app/dashboard');
  const results = await new AxeBuilder({ page })
    .withTags(['wcag2a', 'wcag2aa', 'wcag21aa'])
    .analyze();
  const counts = {
    critical: results.violations.filter(v => v.impact === 'critical').length,
    serious: results.violations.filter(v => v.impact === 'serious').length,
    moderate: results.violations.filter(v => v.impact === 'moderate').length,
    minor: results.violations.filter(v => v.impact === 'minor').length
  };
  expect(counts.critical).toBeLessThanOrEqual(THRESHOLDS.critical);
  expect(counts.serious).toBeLessThanOrEqual(THRESHOLDS.serious);
  expect(counts.moderate).toBeLessThanOrEqual(THRESHOLDS.moderate);
  expect(counts.minor).toBeLessThanOrEqual(THRESHOLDS.minor);
});

Reduce thresholds each sprint until reaching zero. Track progress by committing threshold changes:

# Sprint 1: Initial baseline
THRESHOLDS = { critical: 0, serious: 15, moderate: 25, minor: 40 }
# Sprint 2: 30% reduction
THRESHOLDS = { critical: 0, serious: 10, moderate: 18, minor: 28 }
# Sprint 3: 50% reduction from baseline
THRESHOLDS = { critical: 0, serious: 7, moderate: 12, minor: 20 }
# Sprint 6: Full compliance
THRESHOLDS = { critical: 0, serious: 0, moderate: 0, minor: 0 }

Verification

After completing accessibility testing, verify that your findings are actionable and your testing coverage is adequate.

Confirm automated scanning coverage:

# List the URLs covered by the pa11y-ci configuration
jq -r '.urls[]' .pa11yci.json
# Compare the list against your application's page inventory
# (and sitemap.xml, if one is published)

Verify screen reader compatibility with multiple assistive technologies. NVDA on Windows and VoiceOver on macOS cover the majority of screen reader users. Document the version tested:

SCREEN READER COMPATIBILITY VERIFICATION
NVDA 2024.3 on Windows 11:
- Navigation: PASS
- Forms: PASS
- Dynamic content: PASS
- Tables: PASS
VoiceOver (macOS 14.1):
- Navigation: PASS
- Forms: PASS
- Dynamic content: PASS
- Tables: PASS
Date tested: 2024-11-16
Pages tested: 12
Issues found: 3 (documented in issue tracker)

Confirm that all WCAG 2.1 AA criteria have been evaluated:

WCAG 2.1 AA COVERAGE VERIFICATION
Perceivable (1.x):
[x] 1.1.1 Non-text content
[x] 1.2.1-1.2.5 Time-based media (N/A - no audio/video)
[x] 1.3.1-1.3.6 Adaptable
[x] 1.4.1-1.4.13 Distinguishable
Operable (2.x):
[x] 2.1.1-2.1.4 Keyboard accessible
[x] 2.2.1-2.2.2 Enough time
[x] 2.3.1 Seizures
[x] 2.4.1-2.4.10 Navigable
[x] 2.5.1-2.5.4 Input modalities
Understandable (3.x):
[x] 3.1.1-3.1.2 Readable
[x] 3.2.1-3.2.4 Predictable
[x] 3.3.1-3.3.4 Input assistance
Robust (4.x):
[x] 4.1.1-4.1.3 Compatible

Troubleshooting

| Symptom | Cause | Resolution |
| --- | --- | --- |
| axe-core reports "color-contrast" violations on hidden elements | Scanner evaluating elements with display: none or visibility: hidden that have inline styles | Add aria-hidden="true" to hidden content containers, or exclude hidden elements with axe configuration: exclude: ['[aria-hidden="true"]'] |
| NVDA announces "clickable" on non-interactive elements | JavaScript click handlers on <div> or <span> elements without proper roles | Replace with semantic HTML (<button>, <a>) or add role="button" and tabindex="0" with keyboard event handlers |
| Screen reader announces "image" with no description | alt attribute missing or generic file name used as alt text | Add descriptive alt text or alt="" for decorative images. For complex images, add aria-describedby linking to a detailed description |
| Focus order jumps unexpectedly across page | tabindex values greater than 0 | Remove positive tabindex values. Use only tabindex="0" (add to focus order) or tabindex="-1" (programmatically focusable only) |
| Modal dialog does not trap focus | Focus management not implemented for modal | Implement focus trap: on open, focus first element; on Tab at last element, cycle to first; on Shift+Tab at first, cycle to last; on close, return focus to trigger |
| Form validation errors not announced | Error messages not associated with inputs and no live region | Add aria-describedby linking input to error message element; wrap error container with role="alert" or aria-live="polite" |
| Colour contrast checker shows different ratios than manual testing | Automated tool measuring computed styles, which may differ from source CSS due to inheritance | Test actual rendered colours; use browser DevTools colour picker on the live page rather than inspecting CSS source values |
| pa11y-ci times out on single-page applications | JavaScript rendering not complete when scanner runs | Increase the wait option in config (e.g. "wait": 5000), or use a pa11y action such as "wait for element .main-content to be visible" |
| VoiceOver reads content in wrong order | Visual layout uses CSS positioning that differs from DOM order | Rearrange the DOM to match the visual order; avoid relying on CSS flexbox order or grid placement for content sequence |
| Dynamic content not announced by screen reader | Content updates without ARIA live region or focus management | Add aria-live="polite" to the container for non-critical updates; use aria-live="assertive" for critical alerts; or move focus to the new content |
| Keyboard-activated tooltips disappear immediately | Tooltip closes on blur before user can read content | Implement the WCAG 2.1 (1.4.13) pattern: keep the tooltip visible while it is hovered or focused, allow Escape to dismiss it, and do not hide it on a timer |
| axe reports "frame-title" violation | <iframe> elements without title attribute | Add a descriptive title attribute to all iframes: <iframe title="Organisation location map" src="..."> |
| Screen reader announces duplicate content | Visible text and aria-label both present, causing double announcement | Remove the redundant aria-label when a visible text label exists; use aria-label only when visible text is not present or insufficient |
| Checkbox or radio button state not announced | Custom-styled inputs hiding native elements incorrectly | Ensure the native <input> remains in the accessibility tree; use opacity: 0 and position: absolute rather than display: none |
| Skip link does not function | Target element not focusable | Add tabindex="-1" to the skip link target (usually <main>); verify the id matches the href anchor |

See also