How we do design and usability audits
Design and usability audits are among the most important things we do at Res.im. Depending on the situation, these might also be called "expert reviews" or "heuristic evaluations." Whatever name we use, we always follow the same basic approach to reviews and audits: user-centred, structured, collaborative, and actionable.
The first thing we do in a usability audit is understand the users' primary scenarios, motives, and needs. In some cases, such as when we're hired to optimize an interface for a very specific task (finding and buying a product, for example), user needs and business goals are closely aligned and we can start right away.
For more complicated or multi-faceted interfaces, like B2B websites or government intranets, we might do some light preliminary research, like looking at analytics and doing a top tasks survey (if time allows).
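Tallying a top tasks survey is simple enough to sketch. The snippet below is an illustrative example, not Res.im's actual tooling; the task names and response format are hypothetical, assuming each respondent picks a handful of tasks they consider most important:

```python
from collections import Counter

# Hypothetical survey responses: each respondent selects up to
# five tasks they consider most important on the site.
responses = [
    ["check order status", "find pricing", "contact support"],
    ["find pricing", "download invoice", "check order status"],
    ["contact support", "find pricing"],
]

# Tally votes across all respondents and rank tasks by frequency.
votes = Counter(task for picks in responses for task in picks)
for task, count in votes.most_common():
    print(f"{count:2d}  {task}")
```

The ranked tally gives a rough picture of which tasks to weight most heavily during the audit.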
It’s also important to be familiar with a broad range of user behaviours. Frequent exposure to user research and usability testing helps us maintain the empathy, objectivity, and humility needed to see things through a user-focused lens, and makes it easier to recognize problem spots and explain why they’re problems.
Another distinguishing factor of a usability audit (compared to, say, a design critique) is formalized, structured outputs. Critiques are conversations in themselves. Audits build foundations for larger conversations and decisions.
The formality and thoroughness of an audit require a pre-defined framework or checklist. Jakob Nielsen’s set of usability heuristics is one of the most frequently cited frameworks, despite its datedness (1995) and the fact that Nielsen’s own consultancy uses a more sophisticated framework for its own audits.
The framework Res.im uses has evolved through years of UX work and fine-tuning to make it comprehensive and adaptable without being over-complicated.
We organize our UX principles around four core pillars. Each pillar is broken down into another three to five qualities or attributes that every interface should have, like “focuses user attention on completing key tasks,” “uses language familiar to the entire target audience,” and “engages users through interaction and feedback.”
A structured framework is important but not sufficient for doing a thorough UX audit.
There’s no way to shortcut the diligence required to actually go through an experience, step by step, task by task, looking for potential issues and opportunities to improve it.
Usability audits are more difficult to automate than technical performance, SEO, or accessibility audits. There’s no browser plugin for finding unnecessary steps in the user flow or places where visual hierarchy doesn’t align with user priorities. Reliance on human labour means that usability audits are subject to biases and blind spots, even when experts are involved.
A skilled, experienced reviewer can find a lot of usability issues, but other team members will see things from different perspectives. One team member might have a better eye for design while another has more familiarity with a specific type of user behaviour in usability tests.
Res.im’s usability audits usually include input from research, strategy, and design perspectives. Developers might also be consulted, especially when the scope of the audit includes making recommendations.
Actionability isn’t necessary for every audit, but it’s something Res.im likes to emphasize. We like seeing results.
We usually find some quick wins or low-hanging fruit — recommendations that can be addressed right away with minimal effort, like changing a label. There are also usually some bigger, longer-term recommendations that require more planning and budgeting.
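Sorting findings into quick wins and longer-term work can be expressed as a small triage step. This is a hypothetical sketch, not a description of Res.im's actual process; the findings, the effort/impact scales, and the threshold are all assumptions for illustration:

```python
# Hypothetical audit findings, scored 1-5 for effort and impact.
findings = [
    {"issue": "Ambiguous button label", "effort": 1, "impact": 2},
    {"issue": "Checkout flow has a redundant step", "effort": 4, "impact": 5},
    {"issue": "Form errors shown only on submit", "effort": 3, "impact": 4},
]

EFFORT_THRESHOLD = 2  # assumed cut-off between "quick win" and "project"

# Quick wins: anything cheap enough to fix right away.
quick_wins = [f for f in findings if f["effort"] <= EFFORT_THRESHOLD]

# Longer-term work: ranked by a rough value-for-effort score.
long_term = sorted(
    (f for f in findings if f["effort"] > EFFORT_THRESHOLD),
    key=lambda f: f["impact"] / f["effort"],
    reverse=True,
)
```

Any real prioritization also weighs factors a score can't capture (dependencies, budget cycles, stakeholder priorities), so a split like this is only a starting point for the conversation.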
There might also be some “question marks” that would benefit from follow-up testing or research. This may be the case when an interface has something proprietary or unique that we haven’t tested before. We can make informed guesses, but many hours of user exposure have taught us when it’s worth testing our assumptions.
The best design and usability audits provide clear takeaways and recommendations. Our job isn’t done until they’ve been clarified and questions have been answered.