Comments on the Peak Usability/Spirit Softworks 2004 Salary Survey for User Experience Design & Usability Professionals

14 Jan 2005 - 6:10am
2 replies
Carl Myhill

Apologies for cross-posting this.

Hi All,

Having finally upgraded Acrobat so I could read these recently published
(but long-awaited) salary surveys, I was quite surprised by what I saw, and
am equally surprised that nobody else has commented on this.

The reports are available here...
Spirit Softworks:
http://www.spiritsoftworks.com/resources/2004-salary-survey.htm
Peak Usability:
http://www.peakusability.com.au/resources/usabilitysalarysurvey.htm

I appreciate that these guys have put a lot of effort into analysing these
data, but then 820 of us gave our time to fill out the survey.

Whilst there are some interesting data in the report, I am pretty appalled
by the way it has been presented, particularly the fact that so much of
the data are not represented.

Here are a couple of points which sprang out:

- the word 'average' is used throughout the report. Which average is being
used? Mean? Median? Mode? Presumably the Mean is being reported but this is
not clear because often with salary data the Median is used, particularly by
governments.

- there is very little attempt to provide any 'descriptive statistics'
saying anything meaningful about the spread of the data. Not even basic
things like the highest and lowest values. What is the standard deviation?
What are the quartile ranges? Perhaps a picture of the range of values
reported? (I've sketched the sort of thing I mean just after these points.)
The UPA did a pretty good job of this in their 2000 report
http://www.upassoc.org/upa_publications/upa_voice/survey/2000_survey.html
(I seem to recall the 2003 version being very good as well, but can no
longer find it online)
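
To be concrete, here is the sort of thing I mean - a rough sketch only,
assuming the raw responses were available as a plain list of numbers. The
salaries below are made-up placeholders, not survey data, and the quartile
call needs Python 3.8+:

    # Basic descriptive statistics for a list of salary responses.
    # Placeholder figures only - not from either survey.
    import statistics

    salaries = [38_000, 42_500, 45_000, 51_000, 58_000, 62_000, 75_000, 120_000]

    print("n        :", len(salaries))
    print("min/max  :", min(salaries), max(salaries))
    print("mean     :", round(statistics.mean(salaries)))
    print("median   :", statistics.median(salaries))
    print("std dev  :", round(statistics.stdev(salaries)))

    # Quartile cut points (Q1, median, Q3) - handy for 'upper quartile' pay policies.
    q1, q2, q3 = statistics.quantiles(salaries, n=4)
    print("quartiles:", q1, q2, q3)

Even that handful of lines would give an HR department far more to work
with than a single unlabelled 'average'.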

Salary surveys are a very useful tool for us as individual professionals
looking at our own situations, and also when we're recruiting. I am
extremely disappointed with this effort, which gives me very little
information I can actually use. I have no clue as to how the "averages"
presented may have been influenced by outliers, for example. There is a
little interesting stuff here, such as the professions where UXD people have
worked before, but very little else that is substantive.

I normally hand salary surveys to our HR department so that they have
something more accurate than standard IT salary surveys on which to judge
us. I will not be doing that with this report - what could they do with it?
All they could say is whether we were paid above or below "average", which
is of little use without a look at the spread of the data; even a quartile
range would be handy. Some companies even have a policy of paying staff in
the upper quartile - how could they judge that from this?!

It seems a great shame that this has been presented in this way, especially
when the UPA seem to be pretty good at presenting this information - why not
just copy the way the UPA do it?

I apologise to the authors of the report if these comments seem unduly
ungrateful or over-critical. It seems to me that some very basic elements of
descriptive statistics would greatly improve this report, and I'm extremely
surprised that they are quite so absent from what was published.

Does anyone else have any comments on this? Am I being overly grumpy this
week?

I look forward to the publication of the UPA's recent salary survey.

Carl

PS Perhaps the authors could publish the raw data so we can slice it up
ourselves.

Comments

14 Jan 2005 - 12:50pm
Donna Timara

Myhill, Carl S (GE Energy) <carl.myhill at ps.ge.com> wrote:

> Having finally upgraded Acrobat so I could read these recently published
> (but long-awaited) salary surveys, I was quite surprised by what I saw, and
> am equally surprised that nobody else has commented on this.

The visually literate community is supposed to be statistically dumb.
What else were you expecting? IMHO, this group is so qualitatively inclined,
and so quantitatively challenged, that numbers just go flying over its head…

> Here are a couple of points which sprang out:
>
> - the word 'average' is used throughout the report. Which average is being
> used? Mean? Median? Mode? Presumably the Mean is being reported but this is
> not clear because often with salary data the Median is used, particularly by
> governments.
>
> - there is very little attempt to provide any 'descriptive statistics'
> saying anything meaningful about the spread of the data. Not even basic
> things like the highest and lowest values. What is the standard deviation?
> What are the quartile ranges? ...
<snip>

Ditto my thoughts!
I was wondering where the box and scatter plots are that could show me
where the bulk of the population sits, and where and how many the
outliers are, etc. As it stands, if I had to take this analysis in to
claim I am underpaid (or vice versa), it would be insufficient.

I was also expecting to see more on the process/methodology they
followed. Besides, I'm fairly sure this was not a scientific survey -
otherwise, why not explain the margin of error/confidence interval for
each set of observations?
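
By margin of error I mean nothing fancier than this kind of
back-of-the-envelope sketch - made-up numbers only, assuming the raw
responses were available as a list, and using the usual 1.96 normal
approximation for a 95% interval:

    # Rough margin-of-error and five-number-summary sketch (placeholder data).
    import math
    import statistics

    salaries = [38_000, 42_500, 45_000, 51_000, 58_000, 62_000, 75_000, 120_000]

    n = len(salaries)
    mean = statistics.mean(salaries)
    se = statistics.stdev(salaries) / math.sqrt(n)   # standard error of the mean
    moe = 1.96 * se                                  # ~95% margin of error
    print(f"mean salary: {mean:.0f} +/- {moe:.0f} (95% CI, n={n})")

    # The five-number summary is all a box plot needs.
    q1, q2, q3 = statistics.quantiles(salaries, n=4)
    print("five-number summary:", min(salaries), q1, q2, q3, max(salaries))

With the real 820 responses the interval would of course be tighter; the
point is that without some measure of spread we can't tell how much
weight to put on any of the "averages".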

> Does anyone else have any comments on this? Am I being overly grumpy this
> week?

No, you are not alone, but we are almost a minority. I will quote what Howard
Dean said yesterday: "...there are folks [in this party] who know but
don't talk… and those who talk, but don't know…"

I am not averse to somebody taking a step in a new direction and trying
new professions/expertise, but some disclaimers, some explanation of the
method behind the madness, would prevent damage. I know these are pretty
serious things to expect from Usability/Design professionals. However, if
you do what you don't know, you toss away your credibility.

That said, I too found it a good attempt to gather data and present
it in an interesting way. I would love to have the data myself to do
my own analysis. And I hope, if you make it public, you will still
maintain the privacy of individuals somehow.

:DT

14 Jan 2005 - 2:30pm
Todd Warfel

While my interest was piqued by the survey, I think the format wasn't
that useful.

First rule of thumb we go by on reports: over 30 pages, people don't
read it - it's a giant paperweight. We target under 20 pages, and shoot
for under 10-12, at least for the meat of it (excluding the appendix).

This is based on our experience with clients and feedback from them.

So, the length is a major barrier for this thing.

Also, the format. It seemed like I was reading the same thing over and
over again, possibly represented differently, but I wasn't really sure.
It wasn't clear what I was reading or how it was different from the
next page.

I think there's a bunch of potentially important data in that report,
but I didn't find it that useful because it wasn't that usable IMHO.

I'd be interested in seeing a revised version that's clearer and to the
point. Too much info and no clear direction IMHO.

On Jan 14, 2005, at 12:50 PM, Donna Timara wrote:

> That said, I too found it a good attempt to gather data and present
> it in an interesting way. I would love to have the data myself to do
> my own analysis. And I hope, if you make it public, you will still
> maintain the privacy of individuals somehow.

Cheers!

Todd R. Warfel
Partner, Design & Usability Specialist
MessageFirst | making products easier to use
--------------------------------------
Contact Info
V: (607) 339-9640
E: twarfel at messagefirst.com
W: messagefirst.com
AIM: twarfel at mac.com
--------------------------------------
In theory, theory and practice are the same.
In practice, they are not.
