For product and data teams with a chart-heavy feature
Dashboard Audit & User Testing
A written audit of your dashboard or chart-heavy feature, tested against real users completing real tasks
- You get a prioritised backlog your engineers can execute without me
- You see how real users actually read your charts — and where they get stuck
- Your team leaves with a shared understanding of what is working and what is not
Who this is for
You shipped a dashboard or chart-heavy feature with a generalist frontend team. It works, but the code is brittle, the interactions feel off, and nobody is sure whether it is actually helping users make decisions.
Typical scenarios:
- A product team at a Series A–C SaaS company whose analytics view is becoming unmaintainable and whose PMs keep asking for chart changes the frontend team pushes back on
- An ML or AI product company that shipped model output, confidence, or attribution views and is seeing low engagement or user confusion in support tickets
- A design-system team whose chart components are inconsistent across the product and who need a roadmap to standardise
- A leadership team considering a rebuild and wanting an external senior opinion before committing budget
If your frontend team is using Recharts or a light D3 wrapper and you know the output is not what it should be, this is the right service.
What happens, step by step
Kick-off and access
- Within 48 hours of booking, you send a short brief (4 questions) and repo access.
- We agree on which surfaces are in scope and which user tasks matter most.
You know what is being reviewed and what is being left out, in writing, before any work starts.
Code and design review
- I go through the codebase, the rendered output on real devices, and the visual encodings.
- I apply a rubric covering architecture, performance, accessibility, chart-type fit, and interaction patterns.
A running document of issues is shared with your team mid-review. No surprises at the end.
User testing round
- Five to eight real users are given specific tasks: find a stat, answer a question, complete an action.
- We measure time-to-answer, error rate, and where they get stuck.
You see, in video form, exactly where your users fail — not guesses, not heuristics.
Written report and walkthrough
- Report: must-fix, should-fix, and nice-to-have, tied to observations.
- Code-level annotations on the 10–20 highest-leverage files.
- 90-minute walkthrough with your engineering and product team.
Your team leaves the walkthrough knowing exactly what the next sprint looks like.
Optional retest
- After your team ships the must-fix items, we retest with five more users against the same tasks.
- You see measurable improvement in time-to-answer and accuracy.
A before/after number, not a vibe check.
Typical engagement
What's Included:
- Full code and design audit
- Visual encoding & accessibility review
- Architecture & performance assessment
- Written report & prioritized backlog
- 90-minute team walkthrough