Should analytics be an IA/XP role?

27 Nov 2007 - 12:29pm
27 replies
222 reads
Fred Beecher
2006

On 11/27/07, Mike Scarpiello <mscarpiello at gmail.com> wrote:
>
> Do you think analyzing data using tools like Omniture and Coremetrics
> should
> fall under the user experience umbrella?
>
> Just wondering.

Good question, Mike. At our consultancy, Web analytics falls under the
"online marketing" umbrella, which is separate from the user experience
umbrella. Granted, our situation (being a consultancy working on multiple,
disparate projects) is different from that of most of our clients, who are
mid- to large-sized organizations concerned with a limited set of Web
properties.

At these organizations (that actually *have both* analytics and UXP
practices) I've typically seen analytics and UXP as distinct units beneath a
larger umbrella organization, e.g., e-marketing. In this situation, they
tend to work very closely together without one being subordinate to the
other.

I believe that this is an ideal situation, because while analytics can be a
significant source of data for a UXP practice, it's also a significant
source of data for other elements of the business. Keeping it under UXP
would, in my opinion, unnecessarily limit its scope and capabilities.

- Fred

Comments

27 Nov 2007 - 3:09pm
Chad Mortensen
2007

The company that I work for uses analytics to back up almost
everything we do, from multivariate testing of new interface designs
within the user experience department to merchandising products
within the marketing department.

I definitely agree that it has uses in multiple areas of the business,
although whoever is in charge of analyzing the data should have a
background in statistics and be able to determine whether your results
are statistically significant given the amount of traffic, frequency,
day of the week, market conditions, etc., etc.

It's easy to glance at some charts spit out by your analytics
package and make false assumptions.
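One way to see why: even when nothing about a site has changed, totals
drift from week to week, and on low-traffic sites the drift alone can
look like a trend. A toy simulation in Python, with every number
invented:

# Toy simulation of the caution above: the underlying conversion rate
# never changes, yet weekly totals still swing, especially at low
# traffic. All numbers are invented.
import random

random.seed(3)

TRUE_DAILY_MEAN = 20  # conversions/day on a low-traffic site, constant

weekly_totals = []
for _ in range(6):
    # Crude Poisson stand-in: daily counts with sqrt-scale noise.
    week = sum(max(0, round(random.gauss(TRUE_DAILY_MEAN,
                                         TRUE_DAILY_MEAN ** 0.5)))
               for _ in range(7))
    weekly_totals.append(week)

for i, total in enumerate(weekly_totals, 1):
    delta = total / weekly_totals[0] - 1
    print(f"week {i}: {total:>4} conversions ({delta:+.1%} vs week 1)")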


27 Nov 2007 - 8:10pm
Robert Hoekman, Jr.
2005

> Do you think analyzing data using tools like Omniture and Coremetrics
> should
> fall under the user experience umbrella?

Definitely falls under UX. So much can be learned about human behavior from
stats, it's unreal. And stats don't lie, which is more than we can say about
humans (even when these "lies" are unintentional).

-r-

27 Nov 2007 - 8:46pm
Katie Albers
2005

At 6:10 PM -0700 11/27/07, Robert Hoekman, Jr. wrote:
> > Do you think analyzing data using tools like Omniture and Coremetrics
>> should
>> fall under the user experience umbrella?
>
>
>Definitely falls under UX. So much can be learned about human behavior from
>stats, it's unreal. And stats don't lie, which is more than we can say about
>humans (even when these "lies" are unintentional).
>
>-r-

Oh dear. Oh my. If you're consulting a statistician who can't make
any set of data say anything you want them to say then you should
find a better statistician. Of course statistics lie. Statistics
properly manipulated can tell you just about anything about anyone in
any situation. It's like the old joke about the difference between a
bookkeeper and an accountant: When you ask how much money you made
last year a bookkeeper will answer the question and the accountant
will ask you how much money you want to have made.

Data don't have meaning without context and context is amazingly
flexible. To give just a few examples that leap to my mind whenever
someone says that statistics don't lie I cite the following:

A study early in the co-education process of a previously all-men's
college said that 1/3 of all women admitted had married faculty
members. Mind you, there were only 6 women who'd been admitted (so
that's two marriages), and the social life of the college was all
frat-based, with girls imported for events, thank you very much. Both
of the faculty members in question were also brand-new PhDs.

As we all know, 50% of all marriages end in divorce. Except that they
don't, and they never have. One year in the early '60s, a study noticed
that in that particular year there were half as many divorces as
marriages. But the couples divorcing in a given year mostly married in
earlier years, so the ratio says nothing about how many marriages end
in divorce. You'll never find anyone (except me) who will call your
attention to the fact that those data are unrelated to the conclusion.

The point is not that the numbers are wrong, nor that they are exactly
"false." Both were intended to elucidate the behavior of a certain
group of people under certain circumstances, but they tell us
absolutely nothing about human behavior, except that in the US (at
least) we tend to believe things that have numbers attached to them.

There are a million examples...many much more pointed than
these...and books are constantly being written on the application and
misapplication of statistics, but the central fact remains: If you
want someone to believe what you're saying, find a number that seems
to support it.

Katie
--

----------------
Katie Albers
katie at firstthought.com

27 Nov 2007 - 9:19pm
Robert Hoekman, Jr.
2005

> Of course statistics lie.

I definitely see your point, and I'm not going to presume to know a lot
about statistics, but I'm talking about some pretty basic math, I think, so
I'm wondering if you can elaborate based on an example.

Let's say you make some small design tweaks to a homepage and see a 25%
increase in conversions within a week after the new version goes live, and
the increase stays decently stable for months afterwards. Before, you had 6k
conversions per month, and now you have around 7.5k per month.
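For what it's worth, the basic sanity check here would be a
two-proportion test, something like this sketch in Python (scipy
assumed; the visitor counts are invented, since a rate needs a
denominator):

# Two-proportion z-test on before/after conversion rates. Visitor
# counts are invented here, since a rate needs a denominator.
from math import sqrt
from scipy.stats import norm

conv_before, visitors_before = 6000, 120000   # 5.0% conversion rate
conv_after,  visitors_after  = 7500, 120000   # 6.25% conversion rate

p1 = conv_before / visitors_before
p2 = conv_after / visitors_after
pooled = (conv_before + conv_after) / (visitors_before + visitors_after)
se = sqrt(pooled * (1 - pooled) * (1 / visitors_before + 1 / visitors_after))
z = (p2 - p1) / se
p_value = 2 * norm.sf(abs(z))  # two-sided

print(f"rate went {p1:.2%} -> {p2:.2%}; z = {z:.1f}, p = {p_value:.3g}")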

Can you elaborate on how something as simple as this can be misinterpreted?
I believe it can be, I'm just having trouble seeing how.

Also, I realize many stats and such are a little trickier to interpret than
this, but generally, site stats aren't that complicated.

Once, for example, I noticed that out of 1,100 new registrants for a
subscription-based application in the span of one week, only about 10% had
actually gone through the setup process for the app after paying their
initial subscription fee. This was easy to spot because the EULA was the
first page a new user hit, and only 100 or so of these new users had hit the
EULA that week.

Tracing the process backwards, I saw that the email that went to new users
after signing up contained 42 links (stupid Marketing dept!), only 1 of
which went to a Help doc about how to set up the application and get
started. This link was buried in the middle of a long block of text. The
safe bet was that most people were not seeing the link.

We stripped out the vast majority of the links, left in only a few key ones,
focused the email around the 3-step process for setting up the app instead
of linking off to a Help doc, and generally cleaned things up quite a bit. A
week later, the percentage of people who hit the EULA had gone from 10% to
over 80%.
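The whole diagnosis was really just step-by-step counting. A toy
version of the funnel arithmetic, with every count invented:

# Toy funnel in the spirit of the EULA example; every count is invented.
# Each step is a page or event the analytics package counts directly.
funnel = [
    ("signed up",            1100),
    ("opened welcome email",  900),
    ("clicked setup link",    150),
    ("hit EULA page",         110),
    ("completed setup",        95),
]

top = funnel[0][1]
prev = top
for step, count in funnel:
    print(f"{step:<22} {count:>5}  "
          f"{count / top:>6.1%} of signups  "
          f"{count / prev:>6.1%} of previous step")
    prev = count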

Again, how can simple numbers like this be misinterpreted?

-r-

27 Nov 2007 - 9:42pm
SemanticWill
2007

Definitely! On the huge project I am working on right now, Coremetrics
falls under the Experience Design group.

will evans
user experience architect
wkevans4 at gmail.com
617.281.1281

On Nov 27, 2007, at 8:10 PM, "Robert Hoekman, Jr." <robert at rhjr.net>
wrote:

>> Do you think analyzing data using tools like Omniture and Coremetrics
>> should
>> fall under the user experience umbrella?
>
>
> Definitely falls under UX. So much can be learned about human
> behavior from
> stats, it's unreal. And stats don't lie, which is more than we can
> say about
> humans (even when these "lies" are unintentional).
>
> -r-

27 Nov 2007 - 10:07pm
SemanticWill
2007

Your example is not stats. A sample set of 6 is called an anecdote.
Turning it into a percentage is not stats, and there's no amount of
bootstrapping that will make it so either. If you are not using a
statistician fluent in regression analysis and using SPSS or SAS, then
you cannot lay claim to doing quant. Numbers don't lie. People lie.
People also ask the wrong questions and then interpret the answers the
wrong way based on assumptions, bias, ignorance, or stupidity, but real
quant done properly by qualified people is eminently useful. (IMHO)

will evans
user experience architect
wkevans4 at gmail.com
617.281.1281

On Nov 27, 2007, at 8:46 PM, Katie Albers <katie at firstthought.com>
wrote:

> Oh dear. Oh my. If you're consulting a statistician who can't make
> any set of data say anything you want them to say then you should
> find a better statistician. Of course statistics lie. [...]

27 Nov 2007 - 10:15pm
SemanticWill
2007

Sorry about the spelling -

will evans
user experience architect
wkevans4 at gmail.com
617.281.1281

On Nov 27, 2007, at 10:07 PM, William Evans <wkevans4 at gmail.com> wrote:

> [...]

27 Nov 2007 - 10:21pm
Melvin Jay Kumar
2007

Hi Katie,

You said it all. =)

"If you want someone to believe what you're saying, find a number that seems
to support it."

Although I don't like to give numbers for a lot of the IA/UX work I
do, in the business/corporate environment you cannot sell or get
funding or approval to go forward without numbers, so I provide the
numbers they need. =)

But that's just to satisfy the needs of the business/corporate
situation. The real work comes into play when you don't fixate on the
specific numbers but instead understand the context of what you are
trying to do, and use these analytics tools to see if they can help
you understand some of the qualitative things you are working on. The
data helps if you know what you are looking for....

It's kind of similar to usability testing: numbers and percentages are
often useful for the business/corporate people who like and need
numbers, but a lot of the value comes not from the specifics but from
the context, the qualitative feedback and observations, the frequency
of certain errors, etc., etc.

"90% of users failed in completing task A", and when you dig further
you realize they only tested with two people, and those were even the
wrong test participants.

Tsk...Tsk......

Regards,

Jay Kumar

On 11/28/07, Katie Albers <katie at firstthought.com> wrote:
> [...]

27 Nov 2007 - 11:00pm
Katie Albers
2005

I'm aware that this is not what people think of when they think of
stats. But the fact of the matter is that if you are simply given the
final number -- 33% of all co-eds in a certain year married
professors -- very few people are likely to question the underlying
data. But what I think is more to the point: It was years before
anyone asked why that had happened.

But let's assume that it had been a sample set of 600 and the
percentage had been the same: what would that tell us about the sample
set? Absolutely nothing, except that 1/3 of them acted a certain way.
Which slowed down absolutely no one when it came to drawing
conclusions about the behavior of the women in this set.

The problem with data is that it does not tell us WHY. Let's assume
that we get a datum that 90% of all visitors to a site drop out on a
page. So, we make some changes, and now we have 10% dropping out on
the same page. Which -- if any -- of those changes made the
difference? Maybe it was because in the meantime we passed into the
Christmas gift-giving season and people were more tolerant of being
asked for certain data (or whatever). Perhaps it was because some
large corporation changed our status so we could be accessed
internally. Maybe someone with a particularly influential blog
recommended us. Maybe another similar site came onto the scene that
influenced our users to think of our methods as standard. Basically,
the data itself tells us bugger all. And I suspect we don't really
care; we just want to be able to attribute the improvement to our
work....and it doesn't even matter that we can't say *which* of the
changes is most important. Perhaps 1 of the changes, if made in
isolation, would have caused a ~0% drop out rate on that page.
Perhaps the other changes actually negated the underlying improvement.
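The confound is easy to make concrete with a toy simulation (all
numbers invented): let something external shift the rate at the same
moment the redesign ships, and a naive before/after comparison credits
the redesign with the whole difference.

# Toy confound: the conversion rate rises for a seasonal reason, the
# redesign ships at the same moment, and the before/after delta gets
# credited to the redesign. All numbers are invented.
import random

random.seed(7)

def conversions(rate, visitors=30000):
    return sum(random.random() < rate for _ in range(visitors))

before = conversions(rate=0.020)    # October, old design
after = conversions(rate=0.028)     # December, new design; the extra
                                    # 0.8 points is purely seasonal
lift = after / before - 1
print(f"before: {before}, after: {after}, apparent lift: {lift:+.0%}")
# The design change contributed nothing, yet the dashboard shows ~+40%.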

And on the Web (as with all human endeavor) there's no way to isolate
the changes. You can't do a pure A/B test because there are just too
many possible reactions at any point -- all of which will be
categorized as "did what we planned" or "did something else."

My main issue with statistics in site development/interaction
design/user experience is that they tend to obscure more than they
illuminate. This is because everyone thinks they understand
statistics (they took math and know that 65% is more than 15%, or they
took intro to statistics or even advanced statistics and know how to
do all the various mathematical tricks involved in deriving
statistics), and they're wrong. If you never found out how to make
statistics lie, then you have no business using them for any purpose
besides using them -- knowingly -- to support your point -- whatever
it may be. Almost all statistics usage today confuses correlation
with causation, and overall that's incredibly dangerous.

Katie

At 10:07 PM -0500 11/27/07, William Evans wrote:
>Your example is not stats. A sample set of 6 is called an anecdote.
>Turning it into a percentage is not stats, and there's no amount of
>bootstrapping that will make it so either. If you are not using a
>statistician fluent in regression analysis and using SPSS or SAS, then
>you cannot lay claim to doing quant. Numbers don't lie. People lie.
>People also ask the wrong questions and then interpret the answers
>the wrong way based on assumptions, bias, ignorance, or stupidity, but
>real quant done properly by qualified people is eminently
>useful. (IMHO)
>
>[...]

--

----------------
Katie Albers
katie at firstthought.com

27 Nov 2007 - 11:44pm
Mark Schraad
2006

I had a professor that loved to tell us that statistics were invented
in order to legitimize social sciences... cause they weren't 'real'
science.

Mark

On Nov 27, 2007, at 10:21 PM, Melvin Jay Kumar wrote:

> Hi Katie,
>
> You said it all. =)
>
> "If you want someone to believe what you're saying, find a number
> that seems
> to support it."
>
> Although I don't like to give numbers for a lot of the IA/UX work I
> do, in the business/corporate environment you cannot sell or get
> funding or approval to go forward without numbers, so I provide the
> numbers they need. =)
>

27 Nov 2007 - 11:57pm
Steve Baty
2009

As much as it might be fun to generalise and beat up on statistics, let's
clear up a few points:
i) The word statistics refers to: an area of mathematical study; a single
piece, or group, of numerical data which describes a characteristic of some
object; and a set of methods used in the analysis of said data.
ii) Whilst numerical data is generally considered to be factual, the quality
of the methodology used, the assumptions made about that data, the
objectivity of the statistician, and the quality of the conclusions all
affect how accurately the resulting description of the object of study
reflects reality;
iii) The usefulness of quantitative analysis should not be confused with the
quality of some particular piece of analysis. Just because I make awful
sketches doesn't mean sketching is a bad idea (it just means you should hide
the pencils & crayons from me);
iv) You don't need a statistician to analyse quantitative data, but it
certainly helps in some circumstances, particularly when dealing with
complex statistical analysis;
v) A small sample does not mean your data is useless (or an anecdote). It
just means that the margin of error in any conclusions that you draw might
be so great as to render the result meaningless in the context of your
inquiry (see the sketch after this list);
vi) A large sample does not create inherently 'better' results. It does give
you more data to work with, and it does reduce the margin of error;
vii) There is no single statistical method that makes your analysis more or
less correct. Different techniques will provide different insights;
viii) You can never be 100% certain when dealing with statistics or
statistical analysis. The whole point is not to be certain, but to be able
to quantify the uncertainty.
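To put points (v), (vi) and (viii) in numbers, here is a minimal sketch
in Python of a normal-approximation 95% interval around an observed
rate; the n=6, 33% case deliberately echoes the anecdote upthread:

# Points (v), (vi) and (viii) in numbers: a 95% confidence interval for
# an observed proportion at several sample sizes (normal approximation).
from math import sqrt

def ci95(p_hat, n):
    half = 1.96 * sqrt(p_hat * (1 - p_hat) / n)
    return max(0.0, p_hat - half), min(1.0, p_hat + half)

for n in (6, 60, 600, 6000):
    lo, hi = ci95(1 / 3, n)
    print(f"n = {n:>4}: observed 33%, 95% CI [{lo:.1%}, {hi:.1%}]")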

I do encourage everyone to learn more about statistics, and particularly
about how to critically assess the quality of statistical analysis when it's
presented to you.

To the OP, analytics fall under the UX purview in my team, but that's
because I prefer to do my own analysis. I can see how that might not be the
case in other environments, but I would definitely suggest the UX team gains
access to the outputs from whoever does handle it.

Regards
Steve

----------------------------------------------
Steve 'Doc' Baty B.Sc (Maths), M.EC, MBA
User Experience Strategist
M: +61 417 061 292

Member, UPA - www.upassoc.org
Member, IxDA - www.ixda.org
Member, Web Standards Group - www.webstandardsgroup.org
Contributor, UXMatters - www.uxmatters.com

28 Nov 2007 - 12:19am
Steve Baty
2009

On 28/11/2007, Mark Schraad <mschraad at mac.com> wrote:
>
> I had a professor that loved to tell us that statistics were invented
> in order to legitimize social sciences... cause they weren't 'real'
> science.
>
> Mark
>

Mark, this just shows how little they knew about the history of statistical
theory and practice.

This area of study has attracted some of the greatest names in mathematics
over the past 450 years, growing out of the study of probabilities by Cardano
(1560) and by Fermat and Pascal (mid-1600s). The work we use today in the
design of quantitative studies dates back to the late 1600s and early 1700s,
and many of the analytical techniques we use were derived and proven from
that period through to the late 1800s. The latter part of the 20th century
saw the introduction of many of the multivariate analysis techniques
commonly used today.

Regards,
Steve

----------------------------------------------
Steve 'Doc' Baty B.Sc (Maths), M.EC, MBA
Director, User Experience Strategy
Red Square
P: +612 8289 4930
M: +61 417 061 292

Member, UPA - www.upassoc.org
Member, IxDA - www.ixda.org
Member, Web Standards Group - www.webstandardsgroup.org

28 Nov 2007 - 6:38am
Mark Schraad
2006

Since he held PhDs in both mathematics and economics from Berkeley,
I am going to posit that it was not so much a lack of knowledge as
a lack of respect for how social science presents its work
(along with a sharp sense of humor).

Mark

On Nov 28, 2007, at 12:19 AM, Steve Baty wrote:

> On 28/11/2007, Mark Schraad <mschraad at mac.com> wrote:
>>
>> I had a professor that loved to tell us that statistics were invented
>> in order to legitimize social sciences... cause they weren't 'real'
>> science.
>>
>> Mark
>>
>
> Mark, this just shows how little they knew about the history of
> statistical theory and practice. [...]
>
> Regards,
> Steve

28 Nov 2007 - 7:09am
Fred Beecher
2006

On 11/27/07, Robert Hoekman, Jr. <robert at rhjr.net> wrote:
>
>
> Definitely falls under UX. So much can be learned about human behavior
> from
> stats, it's unreal. And stats don't lie, which is more than we can say
> about
> humans (even when these "lies" are unintentional).

This is *really* surprising to me... that you and Will believe analytics
falls *under* UXP. It depends on *why* you're using analytics, I suppose,
but if you're using it only for UXP purposes I don't think you're getting
the full business value from the practice. There are many good business uses
for analytics that fall outside the scope of UXP.

For example, what about tracking the effectiveness of advertising? Of
pay-per-click campaigns? Testing the messaging of one campaign against that
of another? These situations, while they have UXP components, fall over the
line into market research. I don't know about you guys, but market research
is neither my area of interest nor expertise.

Not to mention that the skills it takes to be an effective Web analyst are
different from (though similar to) the skills it takes to be an effective
UXP practitioner. Myself, I do primarily UXP but I also do analytics
projects. And when I'm doing a straight-up analytics project, it's not
always easy for me to take my UXP hat off (it never does, and it never
should, come off completely) and get down to questions about aspects of the
practice that aren't directly UXP-related. Here are some things analytics
practitioners need to do:

- Determine technical analytics requirements

- Determine how other information systems will get integrated into the
analytics solution (CRM data, product data, etc.)

- Create RFIs and RFPs for vendors to determine the correct analytics
solution for the given situation

- Monitor and test the accuracy of an implementation

- Address business issues that don't directly relate to UXP (e.g., keeping
track of which products are selling best, the aforementioned market research
tasks, etc.)

I don't know about you guys, but I've got enough to do on a typical UXP
project without all that junk thrown on top of it!

There's another distinction as well that Katie addressed. Analytics doesn't
answer the "why." UXP *can.* And this is why I constantly champion *strong
collaboration* between UXP and analytics. They need to be separate practices
due to their different focuses and different uses, but these separate
practices do need to work very closely with one another because they provide
value for one another. Analytics provides quantitative data for UXP to use,
and UXP provides qualitative data to help give meaning to the quantitative.

- Fred

28 Nov 2007 - 10:37am
Todd Warfel
2003

On Nov 27, 2007, at 10:21 PM, Melvin Jay Kumar wrote:

> "90% of users failed in completing tasks A" , when you digg further
> you realize they only tested with two people and those were the
> wrong test participants even.

Which is exactly why we started using:

"90% of users failed (10% reporting)" to clearly indicate how many
failed/succeeded and how many that represents out of the total tested,
providing the full picture and context.

Cheers!

Todd Zaki Warfel
President, Design Researcher
Messagefirst | Designing Information. Beautifully.
----------------------------------
Contact Info
Voice: (215) 825-7423
Email: todd at messagefirst.com
AIM: twarfel at mac.com
Blog: http://toddwarfel.com
----------------------------------
In theory, theory and practice are the same.
In practice, they are not.

28 Nov 2007 - 10:38am
Todd Warfel
2003

On Nov 27, 2007, at 11:00 PM, Katie Albers wrote:

> The problem with data is that it does not tell us WHY.

Exactly. Metrics can tell us what, but they don't tell us why. We
cannot find out why w/o actually observing people.

Cheers!

Todd Zaki Warfel
President, Design Researcher
Messagefirst | Designing Information. Beautifully.
----------------------------------
Contact Info
Voice: (215) 825-7423
Email: todd at messagefirst.com
AIM: twarfel at mac.com
Blog: http://toddwarfel.com
----------------------------------
In theory, theory and practice are the same.
In practice, they are not.

28 Nov 2007 - 11:15am
Robert Hoekman, Jr.
2005

> Your example is not stats.

Again with the semantics! ;) All right, call 'em "numbers". Metrics?
Analytics? Call 'em whatever you want. The thread wasn't about the term we
use, it was about whether or not analytics should be an IxD role. I believe
they should be, because you can learn a lot about the effectiveness of your
design through them.

-r-

28 Nov 2007 - 11:20am
Robert Hoekman, Jr.
2005

> Exactly. Metrics can tell us what, but they don't tell us why. We
> cannot find out why w/o actually observing people.

Eh - we can make a pretty dern good guess a lot of the time. Sure, there are
certain things you'll never notice without observing people, but with
experience, some decent educated guesses will solve 80% of the usability
problems.

-r-

28 Nov 2007 - 11:31am
Anonymous

Hey all. One thing I failed to mention is that I was really interested in
whether the UX person should be the administrator for the analytics system:
setting up click streams, page tags, conversion events, things like that.

I agree that pulling reports and looking at usage data should fall in our
area.

On Nov 27, 2007 10:44 PM, Mark Schraad <mschraad at mac.com> wrote:

> I had a professor that loved to tell us that statistics were invented
> in order to legitimize social sciences... cause they weren't 'real'
> science.
>
> Mark
>
> [...]

28 Nov 2007 - 11:38am
Todd Warfel
2003

On Nov 28, 2007, at 11:20 AM, Robert Hoekman, Jr. wrote:

>> Exactly. Metrics can tell us what, but they don't tell us why. We
>> cannot find out why w/o actually observing people.
>
> Eh - we can make a pretty dern good guess a lot of the time. Sure,
> there are certain things you'll never notice without observing
> people, but with experience, some decent educated guesses will solve
> 80% of the usability problems.

The no brainer stuff I don't take issue with. It's the other 20% I'm
speaking of.

I'd much rather teach people not to make a "good guess" but to be
certain based on something more concrete. There are some of us in this
field who have intuition, who have been doing this long enough to know
what the pattern means, who had proper training before getting into
this field, or who quite simply are just really good designers. But
that's a small percentage of the sum total of people doing design work.

Yes there are plenty of good, well designed products out there, but
they're terribly overshadowed by the number of poorly designed products.

I think we need to be a bit more responsible in our statements and
advocate using better methods for those getting into the field to
start balancing the scales out a bit between bad and good design.

Follow the guidelines until you know how and when you should break them.

Cheers!

Todd Zaki Warfel
President, Design Researcher
Messagefirst | Designing Information. Beautifully.
----------------------------------
Contact Info
Voice: (215) 825-7423
Email: todd at messagefirst.com
AIM: twarfel at mac.com
Blog: http://toddwarfel.com
----------------------------------
In theory, theory and practice are the same.
In practice, they are not.

28 Nov 2007 - 11:41am
Parth Upadhye
2007

I agree with Beecher and would only add the following, based on my
experience in analytics: UX professionals can elaborate and create
business cases, which should then be handed over to the analytics
team. This is very much like the relationship between business cases
and requirements.

I am currently working on a project where the business person is too
involved in the technical aspects and is thereby ignoring the very
reason for the exercise: business cases. Without those, analytics
don't mean anything.


28 Nov 2007 - 12:26pm
Robert Hoekman, Jr.
2005

> It is true that stats/quantitative data can only tell us the "what", but
> people seem to be implying that this is not useful information. I find that
> very strange.

Amen. Our job is to apply our knowledge and experience and such to the
"what" and figure out the "why" so it can be improved. Without the data, you
can only guess. With the data, you can guess well.

-r-

28 Nov 2007 - 12:51pm
Mark Schraad
2006

I am not sure anyone has said that stats are not useful. I think the general notion here is one of caution. When working with statistics, you are basically manipulating some of the variances in order to isolate and observe other variances. The very nature of statistics is data manipulation. Even an R-squared distorts the data, in spite of it often being very telling.

In particular, when I hear vague or compound stats being thrown around I get a bit testy. The other day I heard an NPR voice reporting that in 5% of results, the respondents were almost 3 times more likely to show improvement over the control group. What the heck does that mean? It has the potential to be grossly misinterpreted.

The final bit of caution: stats and data should inform designers, not make the decision for them.

Mark

On Wednesday, November 28, 2007, at 12:30PM, "Robert Hoekman, Jr." <robert at rhjr.net> wrote:
>> It is true that stats/quantitative data can only tell us the "what", but
>> people seem to be implying that this is not useful information. I find that
>> very strange.
>
>
>Amen. Our job is to apply our knowledge and experience and such to the
>"what" and figure out the "why" so it can be improved. Without the data, you
>can only guess. With the data, you can guess well.
>
>-r-

28 Nov 2007 - 11:53am
destraynor
2007

On Nov 28, 2007 4:20 PM, Robert Hoekman, Jr. <robert at rhjr.net> wrote:

> > Exactly. Metrics can tell us what, but they don't tell us why. We
> > cannot find out why w/o actually observing people.
>
> Eh - we can make a pretty dern good guess a lot of the time. Sure, there
> are
> certain things you'll never notice without observing people, but with
> experience, some decent educated guesses will solve 80% of the usability
> problems.
>
> -r-
>
>
It is true that stats/quantitative data can only tell us the "what", but
people seem to be implying that this is not useful information.
I find that very strange. Analytics data provides huge insight into what is
happening on a website, and often I don't need to perform user testing to
find out the why, as previous experience helps.

Here are some simple numbers I gained from Google Analytics reports this
week.

2,051 (8%) of users who searched did not enter any search term.
28% of users did not make it past a certain page in the purchase process,
yet this page comes after payment details.
15% of users partially completed a form, but abandoned it.
The special offers page was seen by only 8% of users, but of those 8%, 55%
went on to purchase something.

Access to such accurate data is hugely beneficial when evaluating a site,
or planning a user test.
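Numbers like these fall out of plain aggregation over the raw event
log. A sketch of the kind of computation involved, with pandas assumed
and the log format and column names invented:

# Deriving percentages like those above from a raw event log. The log
# format, column names, and data here are all invented for illustration.
import pandas as pd

events = pd.DataFrame({
    "visitor": ["a", "a", "b", "b", "b", "c", "d", "d"],
    "event":   ["search", "purchase", "search", "offers", "purchase",
                "search", "offers", "purchase"],
    "term":    ["shoes", None, "", None, None, "hats", None, None],
})

searches = events[events["event"] == "search"]
empty_share = (searches["term"].fillna("") == "").mean()
print(f"searches with no term: {empty_share:.0%}")

per_visitor = events.groupby("visitor")["event"].agg(set)
saw_offers = per_visitor.apply(lambda s: "offers" in s)
purchased = per_visitor.apply(lambda s: "purchase" in s)
print(f"saw offers page: {saw_offers.mean():.0%}; "
      f"of those, purchased: {purchased[saw_offers].mean():.0%}")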

Regards,

Des
--

Des Traynor,
Usability Analyst,
iQ Content Ltd.
http://www.iqcontent.com

28 Nov 2007 - 3:01pm
Jocelyn Spence
2007

On this topic, can anyone suggest a good book for someone in our field
to learn statistics from? I have people who are competent and happy to
guide me, but I'd need a text to work from.

TIA,
Jocelyn Spence
as of next Monday, User Experience Architect at Cimex Media Ltd


28 Nov 2007 - 4:41pm
Mark Schraad
2006

Try Statistics for People Who (Think They) Hate Statistics by Neil J. Salkind

http://www.amazon.com/Statistics-People-Think-They-Hate/dp/141295150X/ref=pd_bbs_sr_2?ie=UTF8&s=books&qid=1196286069&sr=8-2

On Wednesday, November 28, 2007, at 04:29PM, "Jocelyn Spence" <stronglanguage.us at gmail.com> wrote:
>On this topic, can anyone suggest a good book for someone in our field
>to learn statistics from? I have people who are competent and happy to
>guide me but I'd need a text to work from.
>
>TIA,
>Jocelyn Spence
>as of next Monday, User Experience Architect at Cimex Media Ltd

28 Nov 2007 - 6:09pm
Erin Walsh
2007

Coincidentally enough, today's episode of the User Experience Podcast
concerns web analytics.

http://uxpod.com/

Enjoy,
Erin
