Deciphering a "forever neutral" rating from a usability test

18 Aug 2010 - 10:26pm
Wenbo Wang
2008

Hi Folks,

Today during a usability testing session, I encountered a participant with "forever indifferent" results, and I thought it might be interesting to share with you.

The story: during the debrief, the user shared nothing except his occupation; no comments, no thoughts. He finished all tasks successfully, and at incredible speed, each step taking him no more than 30 seconds. I tried to draw him out during the post-test survey and casual chat, but his ratings were forever neutral, with no preference between any two choices.

More information on the usability testing setup: this was a test of the navigation and IA of a recently redesigned website (still on a staging server). During the test, my team did observe a few behaviors from the user, like "turn to the search box first," but that's not enough to go on.

This is the first time I've run into this situation. Any thoughts are most warmly welcomed. Thanks!

Best,

Wenbo

Comments

18 Aug 2010 - 11:54pm
Mathew Sanders
2009

When recruiting, I like to filter participants by asking them a non-topic question, like what super power they would choose or what the best movie they've watched lately was -- and *why*.

Usability evaluations are a bit of a weird experience - being asked to 'think aloud' isn't something that everyone is comfortable doing. I use these questions to see how articulate and comfortable people are with talking about random things.

I know this adds a bias to the evaluation - but whichever way you look at it, usability evaluations are already full of bias, so it's about understanding that the bias exists and accounting for it in the interpretation.

Anyway, hope that gives you something to think about - it works for me :)

19 Aug 2010 - 12:06pm
bkillam
2010

If the test did include the requirement to "think aloud", that could be part of the issue. Though it is included in many usability evaluations, the approach both increases anxiety and biases performance through split attention. (Not to mention it asks for data the user cannot possibly provide.)

In this case, given the participant's good performance, it may be that there was just a lack of rapport. There might not have been enough time in testing to develop a relationship of trust. (It really has to be developed before any testing starts.) Then again, it could just be the person, or their attitude at that moment. Or, since the tasks were so easy for the participant, they didn't understand why this was such a big deal and questioned what was really going on.

BTW, Mathew, do you really recruit and "filter" participants by this approach or is it something you do prior to testing as an ice breaker?

Bill

Bill Killam, MA CHFP/CUXP
President, User-Centered Design, Inc.
20548 Deerwatch Place
Ashburn, VA 20147
703-729-0998 (Office)
703-626-6318 (Mobile)
http://www.user-centereddesign.com

19 Aug 2010 - 9:58pm
Wenbo Wang
2008

Hi Bill,

Yes, I was using the "think aloud" method. That may be part of the issue. Your insights are really to the point: usability testing moderators need to build a relationship of trust with their participants.

Another btw: since I recruit quite frequently from craigslist, I've noticed an increasing number of participants who are looking for a job. Their ratings and feedback are sometimes cynical. That could be another topic to discuss.

Wenbo

20 Aug 2010 - 12:05am
Mathew Sanders
2009

Wenbo - that's a great point about people who are almost becoming semi-professional participants in usability evaluations. I've had a couple of people in the past who were very articulate, quick to respond, and gave detailed answers, but then found that they had been involved in several previous studies and were keen to go on any list we kept for future evaluations.

In this case I'm also suspicious about how reliable their feedback is, or whether they are giving answers that they have been 'trained' to give in evaluations. Always something to keep in mind!

20 Aug 2010 - 12:05am
Mathew Sanders
2009

Hi Bill, you bring up a good point about the distinction between using this type of question in recruitment versus as an ice breaker. I actually try to avoid as much recruiting as possible myself (cold calling people gives me the chills), but I ask our recruitment company to use a question like this to make sure they find people who are good at articulating themselves and reasonably at ease talking to people they've just met.

Having said that, rapport is so important, and I like to start with a warm-up that helps put people at ease. I've found that instead of having a list of questions it can be nice to just ask how their day has been, and the environment helps a lot as well. We've found that starting by sharing a coffee or tea can be a great ice breaker :)

Steve Krug promotes reading from a script to make sure that you've covered everything, but I prefer to avoid this because it makes me sound like a robot!

19 Aug 2010 - 9:48pm
Wenbo Wang
2008

Hi Mathew,

Great idea. Asking a non-topic question (or even a closed-ended question) may bring surprises. The extra bias won't be a huge problem, as long as we can extract valuable and correct findings from the analysis. In retrospect, I was totally at a loss; next time I'll use your idea!

Thanks,

Wenbo

19 Aug 2010 - 10:39am
Paul Eisen
2007

Were you presenting a single option or multiple alternative designs? This kind of fence-sitting can typically be avoided with A/B testing.
