Quickstart
Run your first AI-powered UX test in under 10 minutes.
Overview
SimuTest lets you understand how real users experience your website — without waiting for real traffic. Paste any URL, and SimuTest sends AI-simulated users to interact with your site. Each session thinks out loud using Claude's extended thinking, producing a detailed UX report with thinking traces that show exactly where users struggled, hesitated, or succeeded.
How it works
- Paste your URL and describe the task you want users to complete
- SimuTest spawns AI sessions that navigate your site and think out loud
- Results stream in real time — watch sessions as they run
- Get a scored UX report with thinking trace highlights and actionable issues
Step 1: Paste Your URL
Go to app.simutest.dev and sign in. On the dashboard, click New Test and paste the URL of any live webpage — your landing page, checkout flow, pricing page, or any URL you want to evaluate.
For your first test, select Quick Test (50 sessions). This gives you enough data for statistically meaningful results while completing in under 10 minutes.
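SimuTest doesn't publish how it computes significance, but a back-of-envelope check shows why 50 sessions is a useful floor. The sketch below (an illustration, not SimuTest's actual method) computes a 95% Wilson score confidence interval for an observed task completion rate:

```python
from math import sqrt

def wilson_interval(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """95% Wilson score confidence interval for a completion rate."""
    if n == 0:
        return (0.0, 1.0)
    p = successes / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    margin = (z / denom) * sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return (center - margin, center + margin)

# With 35 of 50 sessions succeeding (70%), the interval spans roughly
# 0.56 to 0.81 -- wide, but tight enough to flag major UX problems.
low, high = wilson_interval(35, 50)
print(f"{low:.2f} - {high:.2f}")
```

At 50 sessions the interval is around plus or minus 12 points, which is coarse for fine-grained comparisons but plenty to separate a broken flow from a healthy one.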
Tip: SimuTest works on any publicly accessible URL. For testing staging environments or localhost, use the SDK — see SDK Setup.
Step 2: Configure Your Test
Before running, configure what SimuTest should test and how:
| Option | Description | Default |
|---|---|---|
| sessions | Number of AI user sessions to run | 50 |
| model | Claude model for simulated users | claude-sonnet |
| viewport | Screen sizes to simulate | desktop, mobile |
| task | Natural language description of what users should try to do | — |
The task description is the most important setting. Be specific: instead of "browse the site", write "Find the pricing page and sign up for the Pro plan". Clear tasks produce more actionable results.
Test configuration example:

```yaml
url: https://your-site.com
task: "Find and purchase the premium plan"
sessions: 50
model: claude-sonnet
viewport: [desktop, mobile]
```

Step 3: Watch Results
Once your test starts, you're taken to the real-time dashboard. You don't need to wait — results stream in as sessions complete.
1. Progress bar: shows sessions completed out of the total, estimated time remaining, and the current success rate.
2. Live session feed: each completed session appears with its outcome (success/failure), task completion time, and a snippet of the AI's thinking at key decision points.
3. Emerging issues: common friction points are surfaced as they appear across multiple sessions — you'll start seeing patterns well before all sessions complete.
Step 4: Read Your Report
When all sessions finish, SimuTest generates a full UX report. The report is structured to give you both a high-level summary and deep diagnostic detail.
Overall scores
Task completion rate, average time-on-task, error rate, and a composite UX score from 0–100. Scores are benchmarked against SimuTest's dataset of similar page types.
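SimuTest's exact scoring formula isn't documented; the sketch below is a hypothetical illustration of how the listed inputs (completion rate, time-on-task versus a benchmark, and error rate) could combine into a 0–100 composite. The weights are assumptions, not SimuTest's:

```python
def composite_ux_score(
    completion_rate: float,   # fraction of sessions that finished the task
    avg_time_s: float,        # average time-on-task, seconds
    benchmark_time_s: float,  # benchmark time for similar page types
    error_rate: float,        # fraction of sessions hitting errors
) -> int:
    # Beating the benchmark time caps the time component at 1.0.
    time_score = min(benchmark_time_s / avg_time_s, 1.0)
    # Hypothetical weights: completion dominates, then speed, then errors.
    score = 100 * (0.5 * completion_rate + 0.3 * time_score + 0.2 * (1 - error_rate))
    return round(score)

# 70% completion, 90s average against a 60s benchmark, 10% error rate:
print(composite_ux_score(0.7, 90, 60, 0.1))  # 73
```

The point of a composite like this is comparability: a single number lets you track a page across test runs even when the individual metrics move in different directions.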
Thinking trace highlights
The most revealing moments from AI sessions — where users expressed confusion, reconsidered their path, or abandoned the task. These traces are the most valuable part of the report.
Top issues
Ranked list of UX problems by frequency and severity, each with affected session count, example quotes from thinking traces, and suggested fixes.
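Ranking "by frequency and severity" can be sketched as a severity-weighted frequency sort. The issue names, the 1–3 severity scale, and the scoring rule below are all illustrative assumptions, not SimuTest's actual ranking logic:

```python
from dataclasses import dataclass

@dataclass
class Issue:
    summary: str
    affected_sessions: int
    severity: int  # assumed scale: 1 = minor, 2 = major, 3 = blocker

def rank_issues(issues: list[Issue], total_sessions: int) -> list[Issue]:
    # Severity-weighted frequency: a blocker hitting a dozen sessions can
    # outrank a minor annoyance hitting most of them.
    return sorted(
        issues,
        key=lambda i: i.severity * i.affected_sessions / total_sessions,
        reverse=True,
    )

issues = [
    Issue("Coupon field pushes CTA below the fold", 12, 3),
    Issue("Hero image loads slowly", 30, 1),
    Issue("Plan names don't match pricing table", 20, 2),
]
for issue in rank_issues(issues, total_sessions=50):
    print(issue.summary)
```

Whatever the real weighting, reading the top of the ranked list first is usually the fastest path from report to fix.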
Session replay
Click any session to see the full interaction timeline: page states, clicks, scrolling, and the complete thinking trace alongside each action.
What's Next
You've run your first test. From here, see SDK Setup to test staging environments and localhost, or scale beyond the 50-session Quick Test for deeper coverage.