Moderated Usability Testing: A Tactical Guide

Kritika Oberoi
Closing the Loop with User Research
8 min read · Feb 21, 2022


Photo by Alvaro Reyes on Unsplash

Usability testing is the cheapest, fastest way to iterate on your product. You learn what not to build before even a single line of code is written, saving your team weeks of engineering time. As a bonus, you get to see users react in the most unexpected ways to your designs — confirming that people are not so predictable after all.

But like with most things in a business, decisions often come down to 💰 at the end of the day. So here are some examples of how usability testing, or the lack thereof, has impacted real products and companies:

  1. SoundCloud found over 150 usability issues by testing their mobile app before release, meanwhile…
  2. Citibank accidentally transferred $900 million instead of $8 million to a client because they relied on software with poor usability

Despite the obvious value of usability testing, many teams struggle to conduct it because they don’t know how to do it well. One of the most common statements I hear from PMs and Designers is, “I’m not sure which questions to ask”.

This article will give you an overview of how to run moderated usability tests and arm you with real-life usability testing scripts used by expert researchers and designers at companies like Growens, PandaDoc, 15Five, and Obvious.

The Basics

First things first — here are some basic guidelines to help you run a successful usability test.

What To Do

1) Keep the Scenarios Realistic: Give your user a task that is representative of what they might actually be doing on your software. For example, when we were running usability tests for a time-stamped note-taking feature at Looppanel, we asked users to take notes during a mock user interview (a user interview inside the real user interview — meta, I know!).

2) Think Out Loud: With a usability test you’re trying to simulate how your user would feel the first time they interact with your product. What you want to know is — what’s going on in this person’s mind? How are they reacting to the product the first time they see it? Are they able to navigate through it successfully?

Since we don’t live in a world where you can read someone’s mind (thank god!), we do the next best thing with usability tests — ask users to Think Out Loud. It is exactly what it sounds like — users share their thoughts, feelings, and questions, stream-of-consciousness, as they’re navigating through the prototype.

Did they look at a button on the right-hand side? Why? What did they expect it to do? You really want All. The. Details.

3) 5 Whys: You’re running a moderated usability test so you can understand the root of your user’s confusion and what their true needs and expectations are. You won’t get to the root the first time you ask a question, so keep probing with ‘Why?’ until you reach the underlying reason.

For example, when I asked users about emojis we were testing in our note-taking feature, many participants responded that they found them “confusing”. Although that’s insightful, if I don’t understand why they found them confusing, I don’t know whether to replace those emojis with new ones, replace them with text, or remove them altogether.

By probing with ‘Why’ we found that emojis were open to interpretation for users — they weren’t sure what ⭐ or ✅ should mean. Given the rapid, time-constrained nature of taking notes during a call, they often chose to skip using the emojis altogether because it simply took too long to figure out what to do with them.

Knowing why they were confused allowed us to step back and remove emojis from the note-taking view altogether, replacing them with a simple bookmark for time-stamping key moments, one that didn’t require too much hemming and hawing to figure out.

4) Observe, Observe, Observe: A usability test gives you a wealth of information — some in the user’s facial expressions, some in the movements of their mouse, and still more in the hesitant tone of their voice as they think out loud.

You’ll want to pay attention to all of the above. It does get pretty challenging, which is why it helps to (1) record the session in case you need to go back to it, and (2) have a note-taker on the call with you to catch anything you missed!

Bonus! Tips to Make Your Life Easier:

  1. Honesty is the Best Policy! Because people are often trying to be nice rather than honest, it’s helpful to explicitly clarify upfront — it’s okay to be brutally honest! I won’t be hurt; I’ll actually appreciate your candid feedback.
  2. Bring Your Own Note-taker: Get someone on your team to observe and take notes during the call. It’s nearly impossible to talk to users, observe what they’re doing, take notes, and think at the same time. But having good notes will save you SO much time in the analysis phase when you’re trying to find patterns across calls. Whenever possible, cajole, bribe or otherwise convince someone on your team to join your usability tests and take notes or bookmark key moments so it’s easy for you to collate your findings later.
  3. Always Be Recording: You never know what moment you’ll want to go back to, or who you’ll want to send a clip of a user rage clicking on the prototype — so always be recording!

What Not To Do

1) Don’t Ask Leading or Closed-Ended Questions: “Do you like Feature X?” is a bad question to ask for two reasons. First, your user’s likely response to a question like that will be “Yes, I like this” or “No, I don’t” because of the closed-ended nature of the question. As we covered above, what you really want to know is why! Second, your question isn’t neutral (you’ve embedded the word “like”), and because people often default to responding politely, they’re more likely to say “Yes, sure I do!”

A better way to phrase the same question is, “How do you feel about X feature and why?” You’re not leading the user in any particular direction, and you’re more likely to get lots of amazing insight, beyond just a simple “Yes” or “No”.

2) Don’t Answer Every Question: The point of a usability test is to see when your users are confused, which they will often be. When they are confused, they’ll default to asking questions — “What does this icon mean?”, “Where’s the right tab?”, and on and on. These questions are actually amazing data points — they point you to where your users are really, truly confused and where their expectations from your product were not met.

What you do not want to do is start answering all these questions during the test itself — that’ll take you off track from the task you’re testing for and may give the user information mid-task that wouldn’t otherwise be available in a natural setting.

Instead, what I often do is let users know upfront that I may not answer all their questions (but they should keep them coming anyway), note their questions down, and then answer them at the end of the task. That way I can dig into the reason these questions came up at all (more ‘why’ follow-up questions!).

Caveat: Answer your user’s questions if they are blocking their progress on the task! (e.g., if the user is struggling with a prototype limitation, you can jump in to answer their question)

3) Don’t Tell Users What To Do: Usability tests can be one of the biggest tests of self-control (think Marshmallow test) because your job is to watch your user struggle through your design without saying anything. At All.

You’re trying to see what your user would do if you weren’t around, so you’ve got to simulate that experience. Sadly, that means zipping it and silently watching them struggle through your prototypes 😭

📜 Let’s Get Scripted!

Now that the basics are covered, it’s time to get into the actual usability test interview scripts.

A script is very important for any user research interview, but with a usability test it’s even more crucial since you need to guide users through very specific tasks. Usability test interview scripts have 4 core parts:

1) 📝 Introduction: This is where you explain to users what a usability test is and what’s expected of them, ask for permission to record the call, and confirm that any legal documentation your company may require is signed!

2) 🔥 Warm Up Question: It’s always best to let your user “warm up” to the interview, getting them comfortable with some contextual questions that can help you evaluate their responses as well. Typically you’ll want to ask a bit about who your users are and their experience with your product or competitors, if any.

3) 🧩 Usability Tasks: This is the meat of the matter. This is where you’ll set up specific, realistic “tasks” for your users (e.g., set up a new Google Doc and invite your teammate to collaborate on it). You’ll watch them navigate through your prototype, noticing when they click the wrong part of the screen, when their faces scrunch up in confusion, and even timing how long it takes them to complete the task.

4) 🎁 Wrap Up Questions: The Outro! Ask your final follow-up questions and allow your participants to ask any questions they might have been holding onto as well.

One of my favorite questions for this part of the call is, “Is there anything I didn’t ask you that you think I should be aware of?”. Some really magical insights come up with just that one open-ended question.

Usability Testing Guides from Real Life User Researchers

Many incredible teams like Growens, PandaDoc, 15Five, and Obvious have already created usability testing scripts that their design & research teams are able to leverage every time a new study comes up.

To help you get started faster, these expert teams were gracious enough to let us share their knowledge & resources with you:

PandaDoc’s Usability Testing Script

Growens’ Usability Testing Script

15Five’s Usability Testing Script

Obvious’ Usability Testing Script

✔️ Checklist for Every Call

As one final parting gift from me to you, I wanted to share a checklist I usually keep handy during usability tests with reminders of the key tasks I need to complete. Pull this out before every call to make sure you haven’t missed anything!

1) Before the call

a) Reminders: Send a reminder email to your participant to confirm that:

  1. They can still attend the call at the agreed time
  2. They will have reliable internet access
  3. They will be able to join the call from their computer and share their screen

b) Note-takers: Ensure that your note-taker:

  1. Is added to the call invite
  2. Has a link to a document or app (like Looppanel!) where they can take notes during the call

c) Prototype: Keep your prototype link ready to share with the participant

2) During the call

a) Record the call: Ask the participant for permission and start recording the call

3) After the call

a) Debrief: Clean up your notes and check in with your note-taker for a 5-minute debrief. During the debrief, cover these topics:

  1. Key takeaways and observations that you both may have had
  2. Anything that didn’t work in the script or usability test set up itself (e.g., did the prototype not load correctly?)
  3. Questions you’d like to add or remove from the script based on your findings so far

b) Analysis of Findings (this is a whole other post!)

That’s all folks — happy testing!

Looppanel streamlines User Research with transcripts you can trust, one-click sharing of video clips, time-stamped note-taking and more. We help you get from user conversations to shareable, actionable insights in minutes.
