Gathering Data from Training

11 Oct 2008 - 5:28pm
Bojhan

Hi Everyone,

I work with a lot of developers who give training on the system they
develop. Recently they noticed that many of their recommendations on
usability issues come from their experience training people. They are
trying to set up a system for gathering useful information from their
trainings; they already keep a diary during the training in which
they collect:

- Observations of users doing a task
- Observations of a user helping another user
- Questions

The training is quite long, so they are very hesitant to run usability
tests right after a training session. Any suggestions on how to
integrate usability testing into a training session, or on just
gathering data from training sessions?

Best Regards,

Bojhan Somers

Comments

11 Oct 2008 - 7:08pm
DampeS8N

As a general rule, if the system you developed needs a training course
to understand, it is poorly designed.

Usability testing beats having nothing, but it doesn't do a very
good job of telling you how to fix problems. It can really only tell
you where the problems are.

That said,

I wouldn't try to turn training into usability testing. Your users
are already strained, and adding any more complication to the process
can only hurt them. You'll end up not only with a reputation for a
hard-to-learn system, but with a kludgy and lackluster training
system as well.

My suspicion is you have hit a wall with your current observations.
You can't seem to get any more information out of people, and you
hope that one of us knows a magical spell you can utter to them to
get them to tell you how to fix it.

Let's tackle this from another angle. People were very happy with
dial phones. They worked, they got the job done, they were easy to
use. So why then were they replaced by touch-tone?

Touch-tone is a great deal faster.

Your users are using a dial, and they can't fathom anything beyond
that idiom. They will tell you that they wish it could go faster,
that perhaps a system with a faster moving dial will help them. But
they can't tell you that they need a touch-tone phone because it
doesn't exist yet.

At best, they will tell you what they like about your competition.
Stuff that you can find out cheaper just by looking at your
competition yourself.

What you need isn't a better way to do usability study, you need a
better process for design.

Since you said "I work with a lot of developers who give training on
the system they develop..." I'm going to assume you don't have an
IxD team, or at least not one that does design up front. It sounds
like your programmers are doing the design.

That is the hole you need to patch first. You are looking for a
better way to bail water, when the fix is to patch the roof.

Of course, I might have misunderstood you. Don't take me as the
be-all and end-all. Also, I've been accused of coming off as
abrasive. If this sounded abrasive please try to look past that. I'm
trying to help, not insult you. :)

Will

. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .
Posted from the new ixda.org
http://www.ixda.org/discuss?post=34208

11 Oct 2008 - 7:42pm
Bojhan

Hey William Brall,

I kind of expected this response. Sadly, this is an open source
project, so changing the culture toward designing up front, as much
as I would want to, is a slow process.

As you said, we don't want to turn training into usability testing. But
there is so much valuable user feedback in these kinds of sessions, and
I think we're wasting it if it doesn't find its way to the people
working on the usability issues.

It might be a strange idea, but these are people who are involved in
every part of the project, are plainly aware of the issues, and would
love to share the things they learn during training with the many
other developers.

Bojhan


11 Oct 2008 - 10:17pm
jmcatee

Bojhan,

I am also in a similar situation. The company I work for builds
supply chain management products, everything from warehousing and
transportation to forecasting of goods. The system is quite complex
and has a fairly steep learning curve. We also have training courses
and consultants who go on site to install the software. The
consultants are not UI people, but I am sure they see many things that
would be helpful. We are hoping to engage them for feedback but
haven't come up with a strategy for it yet.

Jamie


11 Oct 2008 - 9:16pm
DampeS8N

You'll do much better educating your developers than you will mining
your userbase. Have them read a few books. Off the top of my head:
The Inmates Are Running the Asylum - Alan Cooper
Why Software Sucks - David Platt, if I remember right.

About Face 3.0 is most likely too thick, but would be good too if you
can get them to read it. That is also by Cooper.

This kind of change has to come from inside; the users can't know
whether they could have figured something out. They are somewhat
trained to expect to be trained, and only you can untrain them.

I think you are already doing a fantastic job of getting the word
back to the developers; you just need to get them to realize they are
the problem: what works for them won't work for their users.

It is kind of like trying to tell a planet of all men that women need
to sit down to use the bathroom. Programmers are the men, users are
the women.

And the women won't get that the men stand up, so you can't expect
them to know what the problem is either.

Even if you do IxD in the middle of the project, repackage what the
programmers build behind another layer.

Have them build an API for internal use; that will make it possible
to be more flexible with your user experience.

Another alternative is to take a pause in between development, once
you release a new version, and have someone, like myself, do some IxD
for your Open Source project. I know I'd be honored to do that, and
I'm sure many others here would love to contribute to open source
more than we get to.

But in the end, you can't fix the problem without treating the
cause.

Maybe someone else here can give you more ideas about how to work
around your problem. I do contract work for the government; maybe
someone will come along who has worked through these issues in an
open source project. We IxDs are nothing if not capable of finding
ways to weasel in our ideas. We sort of have to do that all the time.

Best of Luck,
Will


12 Oct 2008 - 9:57am
Andy Edmonds

I'd recommend instrumenting your training application's usage with web
analytics. While it's not a completely controlled scenario, the
materials at least tend to be consistent, and users are given tasks at
various stages of experience with the product.

You should be able to identify persistent user errors and document
learning where it occurs by observing decreases in time and error rate
in common sequences.
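A minimal sketch of what that analysis could look like, assuming a
hypothetical event log where each instrumented task attempt records the
session number, task id, completion time, and error count (all field
names here are invented for illustration, not part of any real
analytics API):

```python
# Sketch: detecting learning effects from instrumented training sessions.
# Each event is one task attempt; comparing early vs. late sessions
# shows whether time and error rate decrease (learning) or stay high
# (a likely usability problem rather than a training gap).
from collections import defaultdict
from statistics import mean

events = [
    {"session": 1, "task": "create_order", "seconds": 210, "errors": 4},
    {"session": 1, "task": "create_order", "seconds": 180, "errors": 3},
    {"session": 3, "task": "create_order", "seconds": 95, "errors": 1},
    {"session": 3, "task": "create_order", "seconds": 80, "errors": 0},
]

def learning_summary(events):
    """Average completion time and error count per (task, session)."""
    buckets = defaultdict(list)
    for e in events:
        buckets[(e["task"], e["session"])].append(e)
    return {
        key: {
            "avg_seconds": mean(e["seconds"] for e in grp),
            "avg_errors": mean(e["errors"] for e in grp),
        }
        for key, grp in buckets.items()
    }

summary = learning_summary(events)
# A drop in both averages between session 1 and session 3 documents
# learning; a persistently high error rate flags a design issue.
```

In practice the grouping key would also include the trainer, so that
trainer-to-trainer variability can be partialled out of the comparison.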

If variability between trainers and specific sessions is high, you'll
need more data and more rigor around tying sessions to trainers so you
can partial some of that variance out. Google Analytics is not going to
give you the level of rigor you'll need, so a commercial offering or
one of the burgeoning open source solutions is recommended.

On the other hand, simply studying the behavior of trainers may reveal
core issues in the product design. Trainers often teach key
workarounds, which would block a self-guided learner.

Cheers,
Andy Edmonds, "Agile Project Management", http://www.versionone.com


12 Oct 2008 - 10:26am
Catriona Macaulay

Hi Bojhan

Your story sounds very familiar; I work on an agile open source
development project in the scientific software world. To respond to
William's point: designing a phone interface that needs minimal
training is a realistic goal; designing collaborative imaging
management and analysis software for molecular cell biologists that
needs minimal training is impossible. Sometimes users are simply
engaged in using software to achieve highly complex goals; it's the
activity, not the software itself, that demands 'learning curve
software'. And of course in the agile OSS world the idea of a user
being a tester is normal; it's the business model: release and get
the users to work out the bugs.

For us usability testing isn't just about 'finding the bugs' -
it's a vital part of our design and user research. The analogy is
with those traditional academic psychology papers that spend several
pages going through the usually hugely inconclusive formal experiment
details and then end with the usually really fascinating 'however
anecdotal observations during the experiment suggest that...'
paragraph. Usability sessions of any kind are first and foremost 'an
encounter with users' and we treat them as user research
opportunities in the widest sense.

We do in fact use training sessions as informal usability
observations and ethnographic fieldwork opportunities (partly because,
as in a lot of agile OSS projects, there is no money for training
consultants, so the 'usability guys' do it). Whether on a group or
one-to-one basis, we treat them as an opportunity to gain valuable
feedback. Depending on the session and the people involved we will
audio record, video, and/or take notes, write these up, take them to
the next developer meeting, and use them to generate tickets
(requirements) for the dev team. Handled in the right way, there is no
need to interfere with the training element. Most of our users are
only too happy to know that we are interested in their observations
and thoughts and that their contribution will be fed into future
releases. Even simply spending 10 minutes at the end over coffee
having a chat with the users will usually yield something useful. It
won't work in every situation, but in highly complex domains, and
where the project is agile/OSS/has no money, it can be a cheap and
cheerful way to wring value out of every user encounter.

And whilst educating devs is vital (though many of the devs I have
met have great design instinct and user sensitivity), they simply
don't have the time to be gathering user and design insight; that's
our job. As division of labour schemes go, it works pretty well!

Catriona Macaulay


12 Oct 2008 - 1:39pm
DampeS8N

Catriona,

I have my own issues with agile, which I'm going to avoid getting
into.

I find it interesting that you claim your projects for scientists
can't be well designed just because of the nature of the product; on
that I disagree. No software is going to be instantly understandable,
which seems to be many people's goal. I hear it all the time at
work: people create a comp, and because they and the tech-savvy people
around them say they get it, they deem it instantly understandable.

I also find it interesting that you say you are mostly informed by
users as to what is an issue with your product.

This will tell you what could be more like something else to be
better, but you'll be hard pressed to find a user who has thought
about the whole of the process enough to give you really valuable
info: the kind of info that would enable you to create a system that
is learnable, one that even these scientists can learn to use without
a manual. They need to learn their own terminology and their own trade,
but that should be the only requirement to be able to learn how to use
the software.

The general rule is if the user can figure out how to run the
software, they should be able to figure out how to use it.

I don't think that, "...designing collaborative imaging management
and analysis software for molecular cell biologists that needs
minimal training is impossible."

If it were broken down into modes that meet the scientists' mental
models of the process, if they were presented in a clear and
straightforward way, and if any manipulation were done in a way that
makes sense for the scientist, I think you could have easy-to-use
software.

I designed software for NASA; I know how complicated some of these
processes can be. It is just a matter of breaking them down.

You might need someone who does this work every day to do it, but it
isn't impossible. Why give up on it so easily?

I know it seems like the right thing to do to talk to the users about
what they want, but software companies have been doing this forever
and still release crap software. You need to pass the right info
through the right filter.

I think that Bojhan and others have all the info needed. The trick
now is getting that info into the hands of someone familiar with IxD,
even if it means learning all about IxD and doing it themselves.

(I can't believe I got through that without busting on agile)

Best Wishes,

Will


12 Oct 2008 - 2:09pm
Catriona Macaulay

Hi William

Thanks for a thoughtful response. Let me pick up a few threads...

On 12 Oct 2008, at 12:39, William Brall wrote:
>
> I have my own issues with agile, which I'm going to avoid getting
> into.
>

I think anyone who has ever worked in any capacity on an agile project
has their issues with agile (include the gang of devs I work with) -
but that's the environment we are working in.

> I find it interesting that you claim your projects for scientists
> can't be well designed just because of the nature of the product, on
> that I disagree.

OK, so I didn't make myself clear, as I certainly did not mean to imply
(let alone claim) that! What I am trying to say is that the goal of
software that someone should be able to 'make sense' of with no
training within their own domain, regardless of how complex that domain
is or what the particular software goal is, is a very high one. There
is nothing inherently wrong with designing a tool that requires
training, practice and commitment from its users. The piano is a great
example of just such a tool. No one would claim the piano was designed
to be played without instruction and considerable practice. That
doesn't make it a bad tool, just a complex tool for a complex job that
has proven to be remarkably persistent; as Ivan Illich says, it's a
very convivial tool (if you have been trained!). There is something
inherently wrong with designing a phone that requires that. It's all
about the context. I am not claiming our project can't be well
designed because of the nature of the project; I am proposing that
the idea that a product or system is badly designed if users need
training to use it is not always appropriate.

> No software is going to be instantly understandable,
> which seems to be many people's goal. I hear it all the time at
> work: people create a comp, and because they and the tech-savvy people
> around them say they get it, they deem it instantly understandable.

Absolutely agree; the case of Lucy Suchman and the photocopier green
button is the great example of this. The myth around the 'birth' of
design ethnography is that Suchman observed that people wanted to make
single copies of documents and then Xerox invented the green button
(that story gets repeated endlessly, much to her dismay). Her own
version is that they already had the green button, and she was
intrigued by the company's desire to try to mask the complexity of
their copiers.

>
>
> I also find it interesting that you say you are mostly informed by
> users as to what is an issue with your product.

Nope, I didn't say that. What I did say is that we do a lot of user
insight gathering, usability testing, etc. That informs our project
thinking, but it is just one amongst many activities the project
teams engage in. Balancing user insight from 'what they tell us' and a
great deal of field observation (as you point out later, listening
to users only gets you so far!) with dev insight, market insight,
expert analysis, etc. is what mostly informs us (along with quite a
lot of developer instinct).
>
>
> They need to learn their own terminology and their own trade,
> but that should be the only requirement to be able to learn how to use
> the software.

The problem is that in the environment we are working in, something of
a paradigm shift is occurring: old models of scientific work are in
the early stages of giving way to a new paradigm of 'collaborative
open source science', and like most big shifts in science it is
bringing with it new ways of working. The trade and the terminology
are in flux.

>
> If it were broken down into modes that meet the scientists' mental
> models of the process, if they were presented in a clear and
> straightforward way, and if any manipulation were done in a way that
> makes sense for the scientist, I think you could have easy-to-use
> software.

Nicely put, and that's our goal, with the caveat that my point above
raises: some of this is so new in science that there is no pre-existing
mental model. We need to help them acquire a new one, and part of that
is something the software itself will contribute to (though we are
not so foolish as to think that will be the only part of that complex
process; standards, science institutions, education systems, and
hardware firms are all part of it).

>
>
> I designed software for NASA, I know how complicated some of these
> processes can be, it is just a matter of breaking them down.
>

It would be great if you had some case studies of this stuff you could
share (though I guess working for NASA would have come with a hefty
raft of NDAs)... Breaking down some of the processes we are dealing
with has proven to be a very 'dirty problem'.

>
> I know it seems like the right thing to do to talk to the users about
> what they want, but software companies have been doing this forever
> and still release crap software. You need to pass the right info
> through the right filter.

And once again no argument here from me. As an ethnographer I am as
suspicious of 'talking with users' as the devs I work with are of
agile methods :)

>
>
> (I can't believe I got through that without busting on agile)
>
>

Congratulations :) Such a tempting target, but a little like shooting
fish in a barrel!
