The Traceability of Research into Design

26 Oct 2010 - 9:43am
Joe Sokohl
2004

In a recent convo on Twitter, Janna DeVylder, Dave Malouf, Steve Baty and I were discussing the difficulties of traversing the Research > Design chasm. Some really great thoughts ensued. So, I thought I'd keep the convo going here.

It's a topic I'm keenly interested in. Having helped present From Design to Research at Agile '09, and having worked on this for many years, I still realize I have lots to learn.

I do feel we need to ensure our project stakeholders understand how our research leads to and informs our designed artifacts. We have an ethical requirement to trace what elements our artifacts consist of to appropriate levels of research.

Now, I don't necessarily think this means we have to say, "The reason we have a 6px radius on the top left and top right of the login box but a 0px radius on the bottom is that 6 of 9 users we observed in the field indicated that those radiuses would work better for them."

On the other hand, we can't do a bunch of research, then wander into our whiteboard-walled conference rooms and sketch designs fully formed from our heads, like Athena from Zeus. Our research must inform our design in concrete ways.

How concrete, and to what documented extent, is variable. Yet we cannot fail to practice rigor in our approach to our designs. To do otherwise borders on the unethical.

Well, anyway, that's how I see it.

Comments

26 Oct 2010 - 11:04am
Greg Petroff
2004

In my work, when we have captured video as part of talking to users, we tag the timecode as we write up insights that seem interesting or relevant from the interview (something they told us, something we observed them doing, etc.). We then print sticky notes with an insight or observation (one per sticky) from all interviews and use these for synthesis. If we find an actionable insight that we can build or design on, there are usually one or two clips that support it, and since we have the timecode it's super easy to find them. It does not really take much more time than the normal process of reviewing your interview. By the way, by revisiting the audio or video from an interview we have found lots of stuff we would have missed relying only on what we captured in our notebooks during the interview.

If we need to at a later date "defend" an insight it makes it really easy to play back the clips. You only have to do this once or twice with stakeholders for them to A. get it and B. trust you in the future.
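
The tagging workflow described above can be sketched as a tiny data model. This is a hypothetical Python sketch, not the poster's actual tooling; the `Insight` and `Theme` names, participant IDs, and sample notes are all illustrative:

```python
from dataclasses import dataclass, field

@dataclass
class Insight:
    """One observation from an interview, tagged with its source clip."""
    text: str
    interview_id: str
    timecode: str  # position in the recording, e.g. "00:14:32"

@dataclass
class Theme:
    """A synthesized, actionable insight backed by tagged observations."""
    name: str
    supporting: list = field(default_factory=list)

    def evidence(self):
        # Return (interview, timecode) pairs so a stakeholder
        # can replay the clips that support this theme.
        return [(i.interview_id, i.timecode) for i in self.supporting]

# Tag insights (one per "sticky") while reviewing the recordings.
notes = [
    Insight("Abandoned the form at the address step", "P03", "00:12:05"),
    Insight("Re-read the error message three times", "P07", "00:31:40"),
]

theme = Theme("Form errors are hard to recover from", supporting=notes)
print(theme.evidence())  # [('P03', '00:12:05'), ('P07', '00:31:40')]
```

The point of keeping the timecode on every insight is exactly the "defend an insight later" case: the supporting clips are one lookup away rather than a re-watch of the full session.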

 

26 Oct 2010 - 1:05pm
dantemurphy
2010

I have a process that works very well, and is integrated with the way observers (whether on site or remote) take notes.  The short version of the process is that you de-duplicate the categorized individual insights, number them, then follow the numbers through affinity mapping and story-plotting until you create a list of features and services that are the tactical foundation of a cohesive strategy statement.

(PLUG: If anyone is interested in learning more, I was planning to do this as a workshop and would be happy to discuss timing and logistics for a session wherever you might be.)
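
The numbering-through-affinity idea can be sketched in a few lines. This is a hypothetical Python sketch under my own assumptions, not Dante's actual process; the insight IDs, sample observations, and feature are illustrative:

```python
# De-duplicate insights, give each a stable ID, and carry the IDs
# through affinity groups to the features they motivate.

raw_insights = [
    "Users can't find the export button",
    "Users can't find the export button",  # duplicate from a second observer
    "Exported files lose their formatting",
]

# 1. De-duplicate (dict.fromkeys preserves first-seen order) and number.
numbered = {f"I{n + 1}": text
            for n, text in enumerate(dict.fromkeys(raw_insights))}

# 2. Affinity-map by ID; the full text stays one lookup away.
affinity_groups = {"Export pain points": ["I1", "I2"]}

# 3. Each feature records the insight IDs that justify it.
features = [{"feature": "Redesign the export flow",
             "traces_to": affinity_groups["Export pain points"]}]

for f in features:
    for iid in f["traces_to"]:
        print(f'{f["feature"]} <- {iid}: {numbered[iid]}')
```

Because the IDs travel unchanged from raw notes to the feature list, any item in the final strategy can be walked back to the individual observations behind it.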

26 Oct 2010 - 9:33pm
Tania Schlatter
2007

Ethics aside, ensuring that project stakeholders understand how user research informs our designs is essential to get the consensus and buy-in needed to keep projects moving forward. To do this, we learned to include high-level wireframes when we review research findings with client teams, even if they are conceptual and based on what we think is most important. This way, even if things change, there is no gap between research and design. We immediately start showing how what we learned can be incorporated in designs. Research is then a central thread in presentations and discussions during the design process. This works well for client teams because it reduces subjectivity and makes decision making (and justifying) easier.

27 Oct 2010 - 12:02am
Elizabeth Bacon
2003

I find the usage scenario (or context scenario in Cooper parlance) is an excellent tool for the transition from research to design. The research encapsulates what we know about the world today, what people are doing and wanting and experiencing. Then we aim to design for what we can envision that makes the world a better place for people & technology. Scenarios function as a verbal communication of that story we want to weave, a narrative that solves problems and captures necessary behaviors. I actually lay scenarios out side-by-side with user experience requirements for that user-centered traceability. If features are considered for feasibility, a discussion can be had about value as well as scope for coherent sets of features (as identified to be a related sequence, in a context scenario) that keep a product focused. Stories or scenarios at this level create a vision that connects research with design. The wireframes then proceed with a solid interaction framework concept.

 

Cheers,
Liz

27 Oct 2010 - 8:05pm
Laura Keller
2007

Apologies as I redundantly have this reply on the site as well; as you can tell, I don't post much :).

I agree with the methods outlined previously, especially the piece around showing insights and design implications sooner rather than later to stakeholders (rather than it appearing as if it's a black box). We've started integrating a check-point step mid-research to outline themes we are seeing/hearing/observing to start socializing them with clients who need to buy into the final insights and recos.

But ultimately - and it's not rocket science I know - the traceability of design impact to research comes much farther upstream, at the start of the funnel:

  1. Defining the problem to be solved with the stakeholders (whether it's an interface 'problem' or a strategy 'problem')
  2. The methodology (e.g., ethnography or lab-based... eye-tracking or traditional UT....) to gather insights to address that problem and ensuring stakeholders understand what they will and won't be getting ('hey, we won't be able to tell you whether 6 px or 0 px is ideal with this - is that ok?')
  3. Data capture necessary (e.g., high level note-taking or detailed, marked up video review)

Much of what will go into these decisions may be driven by hard variables (budget, timing) and soft variables (how crazy the stakeholders are, whether they want accurate or precise because there's a difference) and, importantly, what hypotheses for the design solutions already exist. As much as we want to remain high-integrity and high-rigor with insight gathering, the fact of the matter is that we all have hypotheses or bias - and that's ok - so long as we have checks/balances to ensure they don't outweigh what's true.

  • Laura


28 Oct 2010 - 1:59pm
Dave Malouf
2005

If people go back to the conversation, a view that I was positing is that "traceability" feels like a contingency for bad collaboration and, worse, for design that is too data-driven.

a) If your team is part of the research and collaborates toward the design (co-design), then the empathy and insights derived from the research are co-created and become part and parcel of the design team's own culture.

b) Even if ya can't do that (and I know most teams refuse to, or can't, work that closely), having such methodical traceability methods invites a culture among stakeholders that will constantly ask for "data defense" of every design decision.

Be very wary of the frames that you decide to work within. Be sure they map against your needs as well as the needs of your collaborators.

-- dave 

1 Nov 2010 - 9:01am
Tania Schlatter
2007

Dave,

I'm not sure what you mean by "go back to the conversation." Also looking for clarification - it sounds like your scenario "a" is ok, and "b" can be dangerous?

Tania

1 Nov 2010 - 11:03pm
Dave Malouf
2005

I meant go back to the Twitter conversation that Joe alluded to (unfortunately, it's not that easy to do). "Dangerous" is in the eye of the beholder.

31 Oct 2010 - 7:33pm
dszuc
2005

Good topic.

Inherent in all this is a place for all the insights and designs to live permanently. So many times it is put up on a wall only to be removed quickly and then forgotten. Permanence allows people who were directly involved in the project to talk around and to the pieces on the wall and also invite outsiders in.

Suggest there is also an important role to be played by a person documenting the key points from the discussions that happen and to map it back as to how this will take the design forward, forward, forward.

Make the research live and breathe through stories and design, and make it feel less like "research".

rgds,

Dan

1 Nov 2010 - 3:15pm
Paul Bryan
2008

I agree with Laura's statement: "The traceability of design impact to research comes much farther upstream, at the start of the funnel: Defining the problem to be solved..."

In the design research projects I've been involved with, the only ones that really impacted the final design were those that were specifically formulated to answer critical design questions. How many of our customers are likely to use personalization features if we offer them? How can we reduce abandonment in the cart? What criteria drive decisions about financial instruments? What behavioral patterns impact the use of mobile technology for shopping?

The methods you use for collecting data to answer these questions are as important as identifying the question to be answered. If the question involves "how many" or "how much" or "how often," you need a statistical sample to answer it reliably. If you haven't segmented your user base, then you probably need to do depth interviews. If you don't know the size of the opportunity, you probably need primary and secondary market data. If you haven't identified the key variables that influence behavior, then you probably need to do an ethnographic study.
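
Those heuristics for matching method to question could be captured in a simple lookup. A hypothetical sketch only; the key phrases are paraphrased from the paragraph above and are not an established taxonomy:

```python
# Hypothetical mapping from the shape of the design question to the
# research method suggested above.
METHOD_FOR_QUESTION = {
    "how many / how much / how often": "statistical sample",
    "user base not yet segmented": "depth interviews",
    "size of the opportunity unknown": "primary and secondary market data",
    "key behavioral variables unknown": "ethnographic study",
}

def suggest_method(question_kind: str) -> str:
    # Fall back to sharpening the question when it doesn't fit a known shape.
    return METHOD_FOR_QUESTION.get(question_kind, "clarify the question first")

print(suggest_method("how many / how much / how often"))  # statistical sample
```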

To the point about research constraining design in unwanted ways, I can't say that I've seen that. But I've seen file cabinets full of research data that didn't have any direct connection to the final design. And oddly enough, everybody involved seems ok with that. Except the analytics manager and the business owner.

Paul Bryan

Blog: http://www.virtualfloorspace.com

LinkedIn: http://www.linkedin.com/in/uxexperts

 

1 Nov 2010 - 11:12pm
Dave Malouf
2005

I often find that conversations that deal in long-term knowledge repositories & traceability are not well founded. While I appreciate the desire for efficiency, I think that as teams move on or players are changed, there comes a point where the total held empathy of any group falls below a useful threshold, & that for the meaningful goals of design research (which for me are direct observation to gain empathy with the people we are designing for), archived artifacts are not very productive.

I'm sorry to say this, but empathy cannot be simply acquired by reading a report or even a story. If pushed, I've noticed that video stories are the most compelling. There is probably some Fitts-like law we can create about degrees of separation from direct observation, the type of artifact that conveys the data, and the level of empathy conveyable.

If y'all noticed, I have not once suggested that tracing X datapoint to Y design decision is part of the plan. That is a fool's errand that will only lead to misinterpretations by unqualified & inexperienced people further up the chain.

- dave

2 Nov 2010 - 11:07pm
Christopher Rider
2009
If y'all noticed I have not once suggested that tracing X datapoint to Y design decision is part of the plan. That is a fool's errand

Hear hear!

The Agile folks are confused in many ways, but they have one thing dead to rights. Documentation is a poor substitute for actual communication.

--
Christopher

3 Nov 2010 - 9:05pm
RMattB
2010

Christopher, can you elaborate on other things the agile folks need to get clearer on?

Matt

4 Nov 2010 - 10:37am
Christopher Rider
2009

A full critique of agile is a topic for a different thread, but my main complaint is this: The agile focus tends to emphasize the small over the big. A lot of companies naturally suffer from 'tyranny of the urgent.' The focus on 2-4 week iterations just encourages this tendency.

"Design," as practiced by "Designers," is entirely superfluous (say the Agilists) because (so it's assumed) the market is evolving so fast that any design will be out of date before it can be implemented.

Although I've never seen this in person, an exceptionally talented team, working on a relatively small-scale problem, can apparently really rock with Agile. And this makes sense, because Agile was originally conceived under the assumption that an exceptionally talented team could generally deliver on a relatively small-scale problem regardless of what process was in play. So the ideal process for an exceptionally talented team should be designed mostly to stay out of the way.

Unfortunately, exceptionally talented teams are, by definition, rather rare. As are, in my admittedly limited experience (three different attempts to implement agile over about a 7-year period), successful implementations of agile.

3 Nov 2010 - 4:47pm
Paul Bryan
2008

The purpose of business-related research - regardless of whether it's the business of design, or the business of making airplanes that don't fall out of the sky, or the business of making beverages that consumers find tasty - is to make decisions. If the research that is conducted doesn't impact the decisions that are made related to the topic of research, then either the research approach is faulty or the design process is broken. If there is not a direct connection between design research and design execution, it is like my pastor said to couples who don't enjoy sex: you're not doing it right.

Paul Bryan

Blog: http://www.virtualfloorspace.com

LinkedIn: http://www.linkedin.com/in/uxexperts

3 Nov 2010 - 10:31pm
Dave Malouf
2005

Paul, I can't disagree more w/ your take on this. Well, let me rephrase that. I can't agree with the absoluteness of your position.

I do believe that research impacts design decisions, but not in nearly as cold and direct a manner as you are suggesting. "Insight" & "empathy" are not traceable. They are embedded in our soul and pocketed as inspiration for our future creativity. Heck, research I did years ago impacts my design decisions today, as does a host of other experiences that I go through, both intentionally tied to my work and unintentionally.

To create such a tight connection between research and design is short-sighted. By that I mean, if every decision has to be tied to research, where is there room for just dumb luck and random inspiration in such a model?

Now, that being said, I do not want to be absolutist in the other direction. Drawing connections between decisions and research is not bad at all. It helps sell design decisions when the story is not good enough on its own. I just don't believe in a completely data-driven paradigm for designers. It is soulless and heartless and breeds the type of design that I would not want to be a part of, personally. It's not why I'm a designer. But to each their own. If it works for ya, then keep doin' it. Me? I teach research as a tool for insight and empathy and not for actual design direction/decision.

-- dave

5 Nov 2010 - 10:59am
Joe Sokohl
2004

I tend to agree more with Paul than with Dave on this--but there is a sense of degree, there's a sense of project and team type, so absolutes are difficult.

I do feel we need to understand the why behind each design element. We need more rigor in our understanding of human factors, psychology, rhetoric, emotion, and Weltanschauung, among other elements.

5 Nov 2010 - 1:05pm
Dave Malouf
2005

I think the issue here is that we aren't considering the various factors of scale & the different types of planes we are designing on. The question isn't yes/no but when/how/what.

There is a continuum that I'm reacting to. Data will always be important, and it will always be helpful to connect data to decisions, but a culture that requires that connection for ALL aspects of design is as problematic as a culture that refuses to use data at all.

-- dave


5 Nov 2010 - 3:58pm
Paul Bryan
2008

Design decisions are always negotiable with talented people, because their decisions are firmly rooted in rationale. I've worked alongside some of the best designers in the world, and I would never tell them how to design, or try to make decisions about design execution for them. But they need to adhere to the design strategy guidelines developed from data.

For example, if research has determined that there are 5 basic behavioral segments for cellphone purchases and the design strategy is that we will address the top 3 segments directly in the user interface based on a given set of characteristics, then when we do the first internal design review I want to understand the design rationale of how we are specifically engaging those three segments. The designers are free to make all the design decisions required to develop and support their perspective, but I want to see wires connected from the research to the design. 

If research has shown that flight attendants have on average 20 minutes before flight check-in to get a download of everything new they need to know before flying, then the design solution needs to be based on a rationale that specifically reflects this research data. More ATM, less ACM. Direct wires from research to design.

If research shows that millennial females are shopping for apparel based on occasion and the personality that the clothing communicates (see http://vimeo.com/12526177  for example data), then the selection filters and taxonomy mapping that are designed into the shopping experience need to directly address occasions and personality, rather than generic patterns that a designer has pulled from Smashing Magazine or a competitor site. Direct wires from research to design. 

I'm very familiar with the heroic vision of design. Take a look at the Making of Aeron video at: http://www.hermanmiller.com/Products/Aeron-Chairs. Heroic design at its best. Then take a look at the research that was definitively reflected in the final product: http://www.foam-mattress.com/pdf/Aeron-Design.pdf.

/pb