Usability Testing

7 Videos in this series:
Introduction to Usability Testing (03:48)
Getting Started (03:14)
Building a Plan – Tasks (02:44)
Building a Plan – Questions (02:20)
Common Testing Errors (03:13)
Compile and Analyze (02:11)
Conclusion (01:08)

Usability Testing Part 4: Building a Plan – Questions

18 views • October 07, 2021

In this video, we look at the kinds of questions you might want to ask your test subjects.

Earlier, we talked about how test plans consist of a series of tasks and questions.

Questions can either be asked to assess the initial reactions to an app or website, or they can be used as a follow-up to a task.
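
If you keep your plan in a structured file or script, that pairing of tasks and follow-up questions can be made explicit. Here is a minimal sketch in Python; the field names (task, follow_up) are purely illustrative and not tied to any particular testing tool.

    # A minimal sketch of a test plan: each entry pairs a task with
    # follow-up questions, or stands alone as an initial-reaction question.
    # Field names and wording are illustrative only.
    test_plan = [
        {
            "task": "Find the return policy and start a return.",
            "follow_up": [
                "On a scale of 1 to 5, how easy was that to complete?",
                "Why did you choose your response?",
            ],
        },
        {
            "task": None,  # no task: an initial-reaction question
            "follow_up": ["What do you expect this home page to let you do?"],
        },
    ]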

There are a few different kinds of questions you can ask.

Multiple-choice questions are great for determining user preferences, especially in the early planning stages of a product.

They're especially suited to testing software because software has a similar, menu-like structure: if you think about it, navigating a website or program is a lot like answering a multiple-choice questionnaire.

The key to writing effective multiple-choice questions is to cover all the bases. Always include a "none of the above" or "other" choice.
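
If you were scripting your questionnaire, a multiple-choice item might look something like the sketch below. This is a hypothetical structure rather than any real survey tool's format; the point is the catch-all choice at the end.

    # Illustrative multiple-choice question. The final catch-all choice
    # keeps respondents from being forced into an answer that doesn't fit.
    question = {
        "prompt": "Which feature would you use first?",
        "type": "multiple_choice",
        "options": [
            "Search",
            "Browse by category",
            "Saved items",
            "None of the above / Other (please specify)",
        ],
    }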

Scaled responses are great for quantifying test answers. This is the type of question you're often asked, the one that starts with the words, "On a scale of one to five..."

And if you need more subjective information to supplement scaled responses, you can always follow up with something like, "Why did you choose your response?"
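
Because scaled answers are just numbers, they're easy to summarize when you compile your results later. A small sketch, assuming you've collected a list of 1-to-5 ratings for a single question:

    from statistics import mean

    # Hypothetical 1-to-5 ratings gathered for one scaled question.
    ratings = [4, 5, 3, 4, 2, 5, 4]

    print(f"Average rating: {mean(ratings):.1f} / 5")
    # Pair the number with the open follow-up ("Why did you choose your
    # response?") so you know what's behind a low or high score.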

Open-ended, written responses are good for gauging user preference, but they generally take more time and effort than multiple-choice or scaled ratings, so you want to use them sparingly. But you do want to use them: they have the added benefit of assuring testers that their opinions really matter.

Whatever the style of question you're asking, it's important to avoid leading respondents.

It's tempting to phrase a question to encourage the answer you want to hear. "Did you feel good about completing your task?" is subtly fishing for that positive response because most people don't want to appear negative or critical. Well, unless they're making a comment on YouTube.

"How do you feel about the task you were given?" is a much better question.

To sum up, always consider your stakeholders and remind yourself what you're trying to learn at this stage of the project, then decide on the type of questions you're going to ask: multiple choice, scaled rating, or written response. That choice will largely be determined by the stage the project is in.

Finally, make it clear to your testers that it is the application that is being tested, not them. This is one test where there are no wrong answers.

