# Accessibility Audit

Test your site with screen reader, low vision, motor impaired, and keyboard-only AI personas to surface WCAG 2.1 AA violations before they affect real users.
## Scenario
Accessibility testing is often the last item on a release checklist and the first to get cut under time pressure. SimuTest's accessibility-testers persona pack simulates five distinct accessibility profiles, each with different assistive technology patterns, navigation strategies, and interaction constraints.
Unlike automated scanners that check static HTML rules, SimuTest's AI personas interact with your site dynamically, discovering issues that only appear during navigation: focus traps in modals, unlabeled icon buttons that are revealed only on hover, and keyboard-inaccessible dropdown menus.
## What this audit detects
- Missing or inadequate alt text on images and icons
- Keyboard traps in modals, dropdowns, and custom widgets
- Insufficient color contrast ratios (WCAG 1.4.3)
- Missing ARIA labels on interactive elements
- Non-descriptive link text ("click here", "read more")
- Focus order that doesn't match visual layout
- Touch targets below 44×44px on mobile
## Configuration

The accessibility-testers pack automatically distributes sessions across all five accessibility profiles. You can also run targeted tests against a specific profile with the `personas.profile` option.
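For example, a targeted run against a single profile might look like the following. Note that the profile identifier `screen-reader` is an assumed name, not a documented value; check the pack's profile list for the exact identifiers.

```yaml
personas:
  pack: "accessibility-testers"
  profile: "screen-reader"  # assumed identifier for illustration
```

Restricting to a single profile is useful when iterating on a fix that affects one assistive-technology group.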
```yaml
version: 1

defaults:
  model: claude-sonnet-4-20250514
  sessions: 100
  viewport:
    - desktop
    - mobile

tests:
  - name: "Accessibility Audit"
    url: http://localhost:3000
    task: "Navigate to the pricing page and sign up for the free tier"
    criteria:
      - accessibility_score
      - navigation_clarity
      - task_completion
    personas:
      pack: "accessibility-testers"

quality_gates:
  - metric: accessibility_score
    threshold: 8.0
    action: fail
  - metric: task_completion
    threshold: 7.0
    action: fail
```

## Node.js SDK
```typescript
import { SimuTest } from '@simutest/sdk';

const simutest = new SimuTest({ apiKey: process.env.SIMUTEST_API_KEY! });

async function runAccessibilityAudit() {
  const results = await simutest.test({
    url: 'http://localhost:3000',
    task: 'Navigate to the pricing page and sign up for the free tier',
    sessions: 100,
    criteria: ['accessibility_score', 'navigation_clarity', 'task_completion'],
    personas: { pack: 'accessibility-testers' },
  });

  console.log('Accessibility score:', results.accessibilityScore);
  console.log('Issues found:', results.issues.length);

  // Group issues by WCAG criterion
  const byWcag = results.issues.reduce(
    (acc, issue) => {
      const key = issue.wcagCriterion ?? 'uncategorized';
      acc[key] = [...(acc[key] ?? []), issue];
      return acc;
    },
    {} as Record<string, typeof results.issues>,
  );

  for (const [criterion, issues] of Object.entries(byWcag)) {
    console.log(`\n${criterion} (${issues.length} issues)`);
    for (const issue of issues) {
      console.log('  -', issue.description);
    }
  }

  // Fail the process (and the CI run) if the score misses the quality gate
  if (results.accessibilityScore < 8.0) {
    process.exit(1);
  }
}

runAccessibilityAudit().catch(console.error);
```

## WCAG 2.1 AA Mapping
SimuTest maps each accessibility finding to the relevant WCAG 2.1 criterion, giving you a compliance-ready report that you can share directly with legal and compliance teams.
| SimuTest criterion | WCAG 2.1 criterion | Level | Issue type |
|---|---|---|---|
| alt_text | 1.1.1 Non-text Content | A | Missing or inadequate alt text |
| color_contrast | 1.4.3 Contrast (Minimum) | AA | Text contrast below 4.5:1 ratio |
| keyboard_navigation | 2.1.1 Keyboard | A | Elements unreachable by keyboard alone |
| focus_trap | 2.1.2 No Keyboard Trap | A | Focus becomes trapped in a component |
| focus_visible | 2.4.7 Focus Visible | AA | Keyboard focus indicator not visible |
| aria_labels | 4.1.2 Name, Role, Value | A | Missing ARIA labels on interactive elements |
| touch_target | 2.5.5 Target Size | AAA | Touch targets below 44×44px |
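The Level column matters when triaging: in WCAG 2.1, target size (2.5.5) is a Level AAA criterion, so it does not block AA conformance even though SimuTest reports it. As a sketch, findings could be partitioned into AA-blocking and advisory buckets using the mapping in the table above — the `Issue` shape and the `criterion` field name here are assumptions for illustration, not the SDK's documented types.

```typescript
// Map SimuTest criteria to WCAG 2.1 levels (taken from the table above).
const wcagLevel: Record<string, 'A' | 'AA' | 'AAA'> = {
  alt_text: 'A',
  color_contrast: 'AA',
  keyboard_navigation: 'A',
  focus_trap: 'A',
  focus_visible: 'AA',
  aria_labels: 'A',
  touch_target: 'AAA',
};

// Hypothetical issue shape for illustration.
interface Issue {
  criterion: string;
  description: string;
}

// Level A and AA findings block AA conformance; AAA findings are advisory.
function partitionByConformance(issues: Issue[]) {
  const blocking = issues.filter(i => wcagLevel[i.criterion] !== 'AAA');
  const advisory = issues.filter(i => wcagLevel[i.criterion] === 'AAA');
  return { blocking, advisory };
}
```

A compliance report aimed at AA can then list the blocking bucket first and flag the advisory bucket as best-practice improvements.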
## Accessibility Profiles
The accessibility-testers persona pack distributes sessions evenly across five profiles, each with distinct simulated behaviors and constraints.
| Profile | Simulated behavior | Primary issues found |
|---|---|---|
| Low Vision | 200% browser zoom, relies on text scaling and sufficient contrast | Layout breakage at zoom, contrast failures |
| Blind / Screen Reader | Navigates by headings, landmarks, and ARIA roles using virtual cursor | Missing alt text, unlabeled buttons, illogical heading order |
| Motor Impaired | Keyboard-only navigation, no mouse, switch access simulation | Focus traps, non-keyboard-accessible widgets |
| Cognitive | Prefers simple language, avoids time limits, needs clear error messages | Jargon-heavy copy, timeout warnings, unclear errors |
| Color Blind | Deuteranopia simulation — cannot distinguish red from green | Color-only status indicators, red/green UI elements |
## Expected Results

A typical accessibility audit against a mid-complexity SaaS application surfaces 8–15 distinct issues. Issues are ranked by severity and annotated with the relevant WCAG criterion, the affected persona groups, and the share of sessions in which they occurred. Representative findings:
### Keyboard trap in navigation dropdown

Affects 100% of keyboard-only and screen reader sessions. The main navigation dropdown opens on hover but cannot be opened or closed with the keyboard. Tab-key users skip the dropdown entirely, and when the menu is opened via simulated mouse hover, keyboard focus is lost.
### Icon buttons missing accessible names

Affects 100% of screen reader sessions. The share, bookmark, and settings icon buttons in the dashboard toolbar have no `aria-label` or visible text. Screen readers announce them as "button" with no context.
### Insufficient color contrast on secondary text

Affects low vision and color blind sessions. The muted gray text used for descriptions and captions (#9CA3AF on white) has a contrast ratio of about 2.54:1, below the required 4.5:1 for normal text.
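The 4.5:1 threshold comes from WCAG 1.4.3, which defines contrast as (L1 + 0.05) / (L2 + 0.05) over the relative luminance of the lighter and darker colors. A self-contained sketch of that computation (not part of the SimuTest SDK) can be used to spot-check reported ratios:

```typescript
// sRGB channel linearization per the WCAG relative luminance definition.
function channel(v: number): number {
  const c = v / 255;
  return c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
}

// Relative luminance of a "#RRGGBB" color.
function luminance(hex: string): number {
  const [r, g, b] = [1, 3, 5].map(i => parseInt(hex.slice(i, i + 2), 16));
  return 0.2126 * channel(r) + 0.7152 * channel(g) + 0.0722 * channel(b);
}

// WCAG contrast ratio: (lighter + 0.05) / (darker + 0.05), range 1 to 21.
function contrastRatio(fg: string, bg: string): number {
  const [l1, l2] = [luminance(fg), luminance(bg)].sort((a, b) => b - a);
  return (l1 + 0.05) / (l2 + 0.05);
}

contrastRatio('#9CA3AF', '#FFFFFF'); // ≈ 2.54, below the 4.5:1 minimum
```

Running this against each text/background pair flagged in a report is a quick way to confirm a finding before filing a fix.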
### Color-only error state indicators
Affects color blind sessions. Form validation errors are indicated only by a red border color, with no icon or text label indicating the error state. Deuteranopia personas cannot perceive the red border and proceed without correcting the field.