Manual AT Verification
Automated scanning catches roughly 57–70% of detectable WCAG issues. The rest — screen-reader flow, keyboard traps, focus order, dynamic content announcements — requires a human using assistive technology. This guide equips a partner agency's QA, not just a full-time accessibility specialist, to verify AllyProof fixes by hand.
Who this is for
- Agency QA confirming a developer's fix before the client sees it.
- Vendor compliance teams signing off on VPAT self-assessments where automation isn't sufficient.
- Procurement reviewers spot-checking a site they plan to buy software from.
Before you start
Every per-rule fix carries a Verify with … section on the issue detail page — that's the canonical recipe. This document is the broader orientation: how to set up the tools and how to reason about what you hear when the scanner says "passes".
One-time setup
Windows + NVDA (recommended starting point)
- Download NVDA from the NV Access site. NVDA is free; AllyProof does not ship it.
- Install in the default location. On first launch, NVDA reads the dialog aloud — turn your volume up and confirm you hear it.
- Configure Firefox or Chrome as the testing browser. NVDA works with both; Firefox has slightly better accessible-tree behavior.
- Learn the five bindings you'll use 90% of the time: Tab, H (next heading), D (next landmark), F (next form field), and NVDA+Space (toggle browse/focus mode).
macOS + VoiceOver
- Enable via System Settings → Accessibility → VoiceOver. The feature is built in — no install.
- Learn the VO key: Control + Option. Almost every VoiceOver binding is prefixed with it.
- Safari is VoiceOver's native pair. Chrome works but has quirks around dynamic content.
Keyboard-only check (both platforms)
Half of "manual AT testing" is just unplugging the mouse. Tab, Shift+Tab, Enter, Space, and the arrow keys — if you can't reach every interactive element and activate it, the site fails regardless of what any screen reader says.
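The order Tab moves you through follows HTML's sequential-focus rules: elements with a positive tabindex come first in ascending order, then the remaining focusable elements in DOM order, and tabindex="-1" removes an element from the Tab sequence entirely. A minimal sketch of that ordering (the element names are illustrative, not from AllyProof):

```python
# Sketch of HTML sequential focus (Tab) order: positive tabindex
# values first in ascending order, then tabindex=0 elements in DOM
# order; negative tabindex is skipped by Tab entirely.

def tab_order(elements):
    """elements: list of (name, tabindex) pairs in DOM order."""
    positive = sorted(
        (e for e in elements if e[1] > 0),
        key=lambda e: e[1],  # stable sort keeps DOM order on ties
    )
    zero = [e for e in elements if e[1] == 0]
    return [name for name, _ in positive + zero]

page = [
    ("skip-link", 1),        # positive: jumps ahead of everything
    ("logo-link", 0),
    ("search-input", 0),
    ("decorative-div", -1),  # script-focusable only, never via Tab
    ("submit-button", 0),
]
print(tab_order(page))
# → ['skip-link', 'logo-link', 'search-input', 'submit-button']
```

If an element you expect to reach never shows up in this sequence, or focus cycles inside a widget with no way out, that's exactly the failure the keyboard pass is meant to catch.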
How to verify a specific issue
- On the AllyProof issue page, read the Verify with … panel — it names the tool and the exact steps.
- Open the affected page in the specified browser, with the named AT running.
- Follow the numbered steps. Each step is a single observable outcome — either the announcement/behavior matches the expectation or it doesn't. There's no judgment call per step.
- If the check passes, leave a manual-test comment on the issue (the comment form has a Manual verification type) with the tool name, OS, and date. This is what your client sees in the audit trail.
- If the check fails, re-open the issue and note exactly where the step broke. A screenshot or a screen-reader transcript is ideal.
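The evidence a manual-test comment needs (tool, browser, OS, date, verdict) can be assembled consistently with a small helper. This is a sketch only; the AllyProof comment form's actual fields may differ, and the class and function names here are made up for illustration:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class ManualCheck:
    # Illustrative record of one manual verification pass.
    tool: str       # e.g. "NVDA 2024.4"
    browser: str    # e.g. "Firefox 128 ESR"
    os: str         # e.g. "Windows 11"
    passed: bool
    checked_on: date
    notes: str = ""

def comment_body(check: ManualCheck) -> str:
    """Render the text to paste into a Manual verification comment."""
    verdict = "PASS" if check.passed else "FAIL"
    lines = [
        f"Manual verification: {verdict}",
        f"Tool: {check.tool} / {check.browser} / {check.os}",
        f"Date: {check.checked_on.isoformat()}",
    ]
    if check.notes:
        lines.append(f"Notes: {check.notes}")
    return "\n".join(lines)

print(comment_body(ManualCheck(
    tool="NVDA 2024.4", browser="Firefox 128 ESR",
    os="Windows 11", passed=True, checked_on=date(2025, 1, 15),
)))
```

On a failure, put the step number and what you actually heard in `notes` so the developer can reproduce it.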
Common pitfalls
- Testing in focus mode when you should be in browse mode. NVDA's browse mode is what simulates reading the page; focus mode is for filling a form. If arrow keys aren't stepping through content, press NVDA+Space.
- Running the AT in the wrong browser. A page that reads correctly in Safari/VoiceOver may misbehave in Chrome/VoiceOver because of subtle accessible-tree differences. Match the browser the issue page specifies.
- Treating "the scanner passes" as done. Automation catches syntax; it cannot tell you a button's name makes sense in context. Always run the listed manual check after the scan turns green.
- Skipping keyboard-only for screen-reader-only. Screen-reader users typically navigate with a keyboard; if keyboard traps exist, the screen-reader experience is broken too. Do the keyboard pass first.
What counts as enough evidence
For a VPAT self-assessment or a public accessibility statement:
- Name the tool (e.g. NVDA 2024.4), the browser (e.g. Firefox 128 ESR), and the OS (e.g. Windows 11) used for the check.
- Record the date the check was performed — reviewers will expect it to be within the last quarter.
- For disputed findings, keep a short transcript (copy-paste of the NVDA speech viewer output, or a VoiceOver recording).
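The "within the last quarter" expectation is easy to check mechanically before submitting evidence. A sketch, assuming "last quarter" is read as roughly 90 days (a reviewer using calendar quarters would need a different cutoff):

```python
from datetime import date, timedelta

def evidence_is_fresh(checked_on: date, today: date,
                      max_age_days: int = 90) -> bool:
    """True if a manual check is recent enough for a reviewer.

    "Within the last quarter" is interpreted here as 90 days;
    adjust max_age_days if your reviewer means calendar quarters.
    """
    age = today - checked_on
    # Reject future-dated checks as well as stale ones.
    return timedelta(0) <= age <= timedelta(days=max_age_days)

today = date(2025, 6, 1)
print(evidence_is_fresh(date(2025, 4, 20), today))  # → True  (42 days old)
print(evidence_is_fresh(date(2024, 12, 1), today))  # → False (182 days old)
```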
When to escalate to an a11y specialist
Partner QA can confidently verify the 30-odd rules listed in the AllyProof rule catalog. Escalate to a dedicated accessibility tester when any of the following is true:
- The issue involves ARIA live regions, custom widgets (comboboxes, trees, grids), or dynamic content behavior — the verification isn't step-by-step mechanical.
- The site is a government, healthcare, or financial target where a mis-call has legal consequences.
- The VPAT reviewer specifically asks for third-party expert attestation — in that case, automated + partner-QA coverage is the floor, not the ceiling.