Having a good usability testing script is essential for a successful study. But how do you write one? We’ll explain everything you need to know in this chapter!
What is a usability testing script?
A usability testing script is a document that serves as a plan for your usability testing session. It describes the researcher’s interaction with the participant step by step, including the tasks and the questions that will be asked.
It’s most common to write a script when planning a moderated usability testing study, where it serves as a tool that helps the moderator conduct testing consistently and without omitting any steps.
When running an unmoderated test, you might not need one, as there is no direct interaction between the researcher and the participant. An unmoderated study is usually set up in a usability testing platform like UXtweak, which then follows the script automatically, based on how you configured the study.
Check out our Moderated vs. Unmoderated Usability Testing guide to learn more about both methods.
Why do you need a usability testing script?
A good usability testing script will be your roadmap: it helps you stay on track across many user sessions and makes sure you don’t forget anything.
Here are the main benefits of having a usability testing script:
- Stores all the things you need to say in one place, making it easy to interact with the participants and avoid chaos.
- Encourages you to prepare tasks and questions and think them through beforehand.
- Ensures consistency: you give all testers the same tasks and ask the same questions. This simplifies the analysis process later and helps avoid confusion.
- Helps you make sure nothing is forgotten and the test goes according to plan.
How do you write a good usability testing script?
First things first, define a clear goal you want to achieve with your study. Don’t make it too vague, and stick to 1–3 goals per study. This will allow you to laser-focus on the most crucial problems and simplify the analysis process later on.
Intro
Start by writing a good introduction. The beginning of the session is when you’ll have your first interaction with the study participant, introducing yourself and setting the general mood for the study. You have to make them comfortable from the first minutes of the session, so a great tip is to start with small talk: ask them something about their day or just say something relatable.
Setting up a comfortable environment is important because it allows the participant to be as honest as possible during the test and ensures they aren’t afraid of expressing their confusion or thinking out loud.
This is also the time to instruct them on the process of the study and once again ask for their consent. Explain what you need them to do, how long the study is going to take, and ask if they have any questions before the test starts.
Pre-study questions
Before respondents start completing tasks, it’s a great time to ask them a couple of background questions.
Typical pre-study questions are used to determine the respondent’s demographics, computer skills, and experience with the specific software or with the company whose product you’re going to test. Depending on the answers, you may personalize tasks and follow-up questions so the tester can relate to them more easily.
Tip: If you think that asking a question could influence the outcome of the testing (e.g., by bringing up something the tester wouldn’t consider on their own), there’s no shame in saving that question until after the tasks are completed.
Don’t forget to remind testers that there are no wrong or right answers and that it’s the website you’re testing, not them or their skills.
Defining the tasks
Here’s what to keep in mind while writing the tasks for your usability test:
- They should be realistic – Make sure your tasks represent common user activities on the website. (E.g., Find a pair of white sneakers and add them to the cart.)
- Formulate tasks as scenarios – This will help respondents understand them better and therefore, look for the solutions like they would in real life. (E.g., Instead of saying “Buy a digital camera” try “You’re in the market for a digital camera. Find the one you like and add it to the cart”.)
- Don’t give away the solutions – Make sure your task doesn’t contain any hints that could potentially help respondents find the solution. (E.g., If your menu contains the label “digital cameras”, instead of using the same term, use descriptive phrasing such as “You’re in the market for a device that will allow you to take high quality photos of your family. …”)
We have a whole guide explaining the process of writing a good usability testing task. Check it out for more tips!
Note down all of the questions that you would like to ask regarding the individual tasks. Some questions may be conditional depending on the respondent’s actions. There are two possibilities for timing your questions. Consider their pros and cons when writing your script:
- During tasks – Questions are asked immediately, while the subject is fresh in the tester’s mind and right in front of their eyes. The downside is that this interrupts the natural task flow and risks altering their thought process.
- After tasks – Does not intrude on task completion, and testers should still remember the subject well, though their answers may be less spontaneous than in-the-moment reactions.
Don’t forget to ask how they found the overall difficulty of the tasks. Were they hard or easy to complete?
Avoid adding too many tasks to your usability testing script. We recommend a maximum of 8 tasks per session, ideally around 5. If you add more, your study may become too lengthy and exhaust the respondent. This of course depends on the complexity of the tasks, so conduct a quick pilot with a friend or a colleague to see how long the session takes. A moderated testing session should not take longer than an hour to complete (unless you’re taking breaks).
Post-study
After all the tasks are completed, it’s time to ask the respondent for some additional feedback.
Give them room to express themselves, and you might receive additional comments that will also be helpful for the qualitative analysis of your user study.
You can ask about their general impression of the test, the website’s features and design, whether they think the product is useful, and if they would recommend it to a friend or a colleague.