Questioning User Research (was Eyetracking article on UXMatters)

12 Jul 2006 - 12:06pm
3 replies
380 reads
Sabine Junginger
2006

Hi Dave,

One of the biggest issues I notice in the application and understanding of user research is the confusion about its quantitative and qualitative uses. There is a persistent perception among designers that they need to produce "numbers for the marketing guys." User research as part of design research tends to be qualitative. It uses methods of observation, immersion, and scenarios, and engages participants in activities like card sorting. User research allows the design team to learn more and more about the logic users apply in accomplishing a task, but also about the circumstances they face, the problems they experience, and the opportunities for alternative approaches to a task. When user research is reduced to producing quantifiable questionnaire responses, it is not very helpful. I have seen too many user research efforts target a quantifiable outcome instead of really inquiring into how people think, act, and work. By doing so, they limit their findings from the beginning, channeling their inquiry rather than opening it. It turns out that the persuasive power of user comments and actions, especially when observed first hand by people in an organization, can be a strong ally to the designer. This is not to say that there is no room or need for quantification. To each its time and place.

Best,
Sabine

>
> I tend to agree w/ Chris. I'm very suspicious of most user research,
> especially quantitative stuff. I'm just too empathetic to get into numbers.
> But quantitative is not the only type of user research, and something I
> read today from Mark Hurst was a welcome read. It is in this
> week's "episode" of Good Experience:
> http://goodexperience.com/blog/index.php.
>
> The blog is a preview of a case study that Mark did for del.icio.us and
> the user research he did for them as part of their design work. I like the
> holistic approach and the importance it places on user research.
>
> The case study is at: http://creativegood.com/casestudies/delicious.html
>
> - dave

Comments

13 Jul 2006 - 9:23am
Sabine Junginger
2006

That's exactly my point, Mark: "we often make assumptions and we miss opportunities." If you need any evidence for this, look at all the products around you (consumer products, services, and other design results) and how they defy people's "common sense." To assume that every designer has a good grasp of common sense is to start the design process with a great fallacy. It also misses the point that common sense is situational and contextual: it requires grasping a particular situation and, as some have pointed out, its cultural aspects. How does one even recognize when one's own common sense is at odds with that of someone else? As an example, yesterday a man on the radio said, "I will vote for Ralph Reed because I go to his church. He is a Christian. That is why I vote for him." Made perfect sense to the man.

One more thought on your previous post: Could one of the reasons for the disconnect between common sense and designed products be that designers like to distinguish between "design" on the one hand and "research" on the other? This clearly provides opportunities for erroneous interpretations. Bruce Archer pointed to the dangers of treating design "relay" style. If research is not integrated into the design process and designers do not actively take part in it, findings and discoveries are easily taken out of context and misinterpreted. One of the projects I have been working on has shown the value of the early testing you also mentioned: get a prototype and let real people work with it. It probably defies the classic definition of usability testing, but the insights that can be gained about the usability issues of a particular product are tremendous. Final usability testing (of the finished product) can at best affirm what is already known, which means it shares a similar problem with focus groups, which also can only affirm at best.

Sabine

> I agree with your statement here, Robert. But I would still maintain that
> heuristic evaluations are hopefully based upon a deep knowledge base (good)
> and previous experiences (fine, but dangerous). When we rely on common
> sense, we often make assumptions and we miss opportunities.
>
> Mark
>

13 Jul 2006 - 9:48am
Becubed
2004

On 13-Jul-06, at 10:23 AM, Sabine Junginger wrote:
> If research is not integrated into the design process and designers
> do not actively take part in it, findings and discoveries are
> easily taken out of context and misinterpreted.

A terrific point, Sabine. I submit that up-front in-context
"research" with customers is not in fact research at all -- it's
simply part of the design process. It informs and inspires.

Quick example: I've recently designed a desktop app used in factories
for building roof trusses (those pre-fab wooden things that hold up
your house's roof). Round 1 of design resulted in a damn solid
interface that got the stamp of approval from customers in a
usability test. Then we entered round 2 -- this time with the benefit
of spending time on-site at truss plants. I tell you: the interface
sure changed as a result of our appreciation of the context in which
these people work and the factors that motivate them.

With respect to the discussion around common sense: I'm totally on
board with the observation that common sense alone can guide us well.
Even in a vacuum of customer insight, a skilled interaction designer
should be able to come up with a fairly elegant, well-behaved
product. But is it the right product? Does it behave in the best way
for the target customers? Is it nuanced to a degree that people will
not simply use it, they'll delight in using it?

--
Robert Barlow-Busch
Practice Director, Interaction Design
Quarry Integrated Communications Inc.
rbarlowbusch at quarry.com
(519) 570-2020


13 Jul 2006 - 10:40am
Mark Schraad
2006

Absolutely! The design and research components are not separate! They
need to be interlaced. I am 100% opposed to the standard waterfall
process mentality; I can't think of a planning process more prone to
quality compromise. I realize that some situations demand it, but
those situations should be prioritized for review and the process
redesigned.

On Jul 13, 2006, at 9:23 AM, Sabine Junginger wrote:

> Could one of the reasons for the disconnect between common sense and
> designed products be that designers like to distinguish between
> "design" on the one hand and "research" on the other? Bruce Archer
> pointed to the dangers of treating design "relay" style. If research
> is not integrated into the design process and designers do not
> actively take part in it, findings and discoveries are easily taken
> out of context and misinterpreted. [...]
