In Part 1 of our usability testing series, we covered what usability testing is, why it matters, and when to do it. Now let's get into the how. Here are our UX agency's favorite strategies and best practices for gathering the data you need to make smart, user-centered product decisions.
Effective usability testing comes down to three steps: define your user and business objectives, screen the right participants, and conduct tests that observe real behavior without leading the witness. Get those right, and you'll have everything you need to improve your product with confidence.
What Are User and Business Objectives in Usability Testing?
Step 1: Determine User and Business Objectives
Before you test anything, you need to know what you're testing for. That means documenting two distinct sets of goals.
User objectives are about your customers. Why are they using your product? What tasks are they trying to complete, and what problems are they trying to solve?
Business objectives are about your organization. Why are you doing usability testing in the first place? What do you hope to learn?
Both sets of goals matter, and here's why. If you focus solely on business objectives, you might miss whether users actually want or need your product. If you focus solely on user objectives, you might build something customers love that isn't viable for your business.
By outlining what both stakeholders are trying to accomplish and why, your team will be able to ask the right questions during testing and determine whether your product is actually meeting those goals.
How Do You Screen Participants for Usability Testing?
Step 2: Screen Your Participants
Before diving into user testing, make sure your participants are part of your product's target audience. If you're designing a website hosting and management product, for example, you wouldn't want to test someone who has never managed a website. That person can't relate to the day-to-day tasks, challenges, and frustrations of your actual users.
Distribute a short screener survey to identify users who represent your ideal customers. We like SurveyMonkey, Google Forms, and Google Consumer Surveys because they make survey creation, distribution, and analysis easy. Bonus: if you don't already have a list of potential participants, you can use these services to purchase responses from users who fit your target demographic.
What Questions Should You Ask When Screening Usability Test Participants?
Focus on behaviors, not demographics. Demographics like age and gender don't shape a user's decisions as much as context and behavior do. Instead, build your screener around:
- Occupation
- Work responsibilities
- Personal interests
- Digital behaviors and proficiency
- Purchase-related intentions
- Workspace and environmental conditions
- Cultural norms and biases
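To make the screening step concrete, here's a minimal sketch of how you might score survey responses against behavioral criteria like the ones above. The field names, occupations, and thresholds are illustrative assumptions, not from the article; the point is that qualification hinges on behavior (what the respondent does) rather than demographics.

```python
# Hypothetical screener scoring. All field names and criteria below are
# illustrative assumptions for a website-management product, not a real
# survey schema.

TARGET_OCCUPATIONS = {"web developer", "site administrator", "marketer"}

def is_qualified(response: dict) -> bool:
    """Return True if a survey response matches the target behavioral profile."""
    return (
        response.get("occupation", "").lower() in TARGET_OCCUPATIONS
        and response.get("manages_a_website") is True
        and response.get("hours_online_per_week", 0) >= 5
    )

responses = [
    {"occupation": "Web Developer", "manages_a_website": True, "hours_online_per_week": 30},
    {"occupation": "Chef", "manages_a_website": False, "hours_online_per_week": 2},
]

qualified = [r for r in responses if is_qualified(r)]
print(len(qualified))  # only the first respondent passes the screen
```

Most survey tools export responses as CSV or JSON, so a small filter like this can turn hundreds of screener responses into a short list of candidates to contact.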
These answers will help you zero in on the people who are the best fit for your test. And while you can usually identify 80% of major UX issues with as few as 6 participants, the more data you can gather, the better.
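The "most major issues with a handful of participants" rule of thumb comes from the problem-discovery curve published by Nielsen and Landauer. A quick sketch of that curve, assuming their average per-user discovery rate of about 0.31 (your product's rate may differ):

```python
# Problem-discovery curve (Nielsen & Landauer): each additional tester
# independently finds a fraction L of the remaining usability problems.
# L = 0.31 is their published average, not a figure from this article.

def share_of_problems_found(n_users: int, discovery_rate: float = 0.31) -> float:
    """Expected fraction of usability problems uncovered by n testers."""
    return 1 - (1 - discovery_rate) ** n_users

for n in (1, 3, 6, 10):
    print(f"{n} users -> {share_of_problems_found(n):.0%}")
```

With these assumptions, six participants uncover roughly 89% of problems, which is why small, well-screened groups go a long way, and why each participant beyond that adds progressively less.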
How Do You Conduct a Usability Test?
Step 3: Conduct Usability Tests and Analyze Results
Your primary goal during testing is to observe how real people would use your product in a real environment. Tests can be conducted remotely or in person, as long as users can perform tasks naturally.
What Tools Should You Use for Usability Testing?
Here are some of our favorites:
- Remote testing: GoToMeeting
- In-person testing: TechSmith Morae
- Testing prototypes: InVision for clickable prototypes, Lookback for recording sessions
- Testing existing sites and apps: FullStory and Hotjar
How Do You Set Up a Usability Test Session?
Before the test begins, set the user's expectations about what the test is for and approximately how long it will take. More importantly, reassure them that the interface is being tested, not them. If users feel like they're the ones on trial, they won't admit when they can't find something, or they'll give you the answer they think you're looking for. Either way, your results get skewed.
During testing, use a "think aloud" protocol. Ask participants non-leading questions and have them talk through how they would complete basic tasks without any guidance. Ask them to wait for permission before clicking through to the next step. Then use follow-up questions to learn whether the interface matched their expectations and how they're feeling in the moment.
What Does a Good Usability Test Conversation Look Like?
Let's say you're designing an app and want to test whether the navigation is clear. The conversation might go something like this:
Researcher: "Show me how you would change your notification settings. Where would you look first?"
Participant: "I would click on my avatar because that's usually where settings are."
Researcher: "Ok, please do that."
::Participant clicks on avatar::
Researcher: "Is that what you expected to see?"
Participant: "Kind of. It has my contact and billing information, but not my notification settings."
Researcher: "What did you hope to see?"
Participant: "I hoped I could update other settings here, like my notifications and password."
In this test, researchers would see that the interface didn't work as intended and caused frustration. If they hear similar feedback from other users, they should consider moving notification settings to the main account settings screen, making the navigation clearer, or trying a completely different approach. As a designer or product owner, it may not be the feedback you wanted to hear, but it's the feedback you needed to hear.
What Do You Do After Usability Testing?
Once testing is complete, review the recorded sessions and data with your team. Map the findings back to the objectives you outlined in Step 1, and look for the gaps. That's where your roadmap lives.
Best Practices From the Pros
The process of facilitating usability tests is a science in and of itself. A well-designed and executed test will produce useful feedback. Common mistakes, though, can lead to biased or inadequate results. Here's what we've learned over the years.
What Should You Do During Usability Testing?
- Contact way more users than you'll need. You won't hear back from everyone, so it's good to have backup testers.
- Provide a gift to thank users for their participation. Be careful not to bribe them, though, or they may give skewed answers. We've found that something like a $25-50 Amazon gift card usually strikes the right balance.
- Write a script, but be prepared to deviate from it. Follow the user's natural path. Those detours are full of gold and a-ha moments, so don't cling to the script.
- Use a "think aloud" protocol. Encourage users to talk through their thought process and share whatever comes to mind.
- Ask open-ended follow-up questions during testing. (e.g., "Can you tell us more about that?")
What Should You Avoid During Usability Testing?
- Don't focus too much on demographics in the screening process. Age and gender don't drive decisions the way context and behavior do.
- Don't lead users down a certain path or explain the interface during testing. Saying things like "This is helping you, right?" or "Let me explain what this does..." will skew the data.
- Don't have different user segments complete the same test. Current customers and potential customers have unique needs and workflows. Separate them to get focused results.
- Don't force users to perform tasks that aren't relevant to them. Irrelevant questions lead to irrelevant answers that cloud the rest of the data.
- Don't make users feel like they are the ones being tested. Keep them in a positive, productive mindset. You want honest answers, not answers they think you want to hear.
- Don't go longer than 45 minutes per test. Users get fatigued after that and may stop providing quality feedback.
Now You Know...and Knowing Is Half the Battle
Don't you want to know that users love your product and that it meets their needs? Wouldn't you want to realize sooner rather than later if you're drifting off course, so you can avoid wasting resources and putting your company at risk?
You may be able to find out whether your product functionally works just by trying it yourself, but you can't answer those questions without putting on your lab coat and testing it with real users. Only then can you validate your assumptions and be confident that you're designing products that will lead your users and your business to success.
Thinking about doing usability testing for your product? Gathered some initial results but not sure what to do with them? Reach out to the UX team at Drawbackwards for help navigating the process and taking your product from good to great.
FAQ
How many participants do you need for usability testing? You can identify around 80% of major UX issues with as few as 6 participants. More participants will give you more data, but a small, well-screened group is enough to surface meaningful patterns.
How do you make sure usability test results aren't biased? Reassure participants that the interface is being tested, not them. Use non-leading questions, follow a think-aloud protocol, and avoid explaining or guiding users through the interface during the session.
How long should a usability test session last? Keep sessions to 45 minutes or less. After that, users tend to get fatigued and the quality of feedback drops off.
What's the difference between user objectives and business objectives in usability testing? User objectives describe what your customers are trying to accomplish with your product. Business objectives describe what your organization wants to learn or validate through testing. You need both documented before you start, or you risk building something that serves one side but not the other.
Should different customer segments be tested separately? Yes. Current customers and potential customers have different needs, mental models, and workflows. Running them through the same test muddies the results. Separate segments make it easier to get focused, actionable feedback for each group.