How do you do heuristic evaluations? Examples?

7 Dec 2007 - 8:10am
tdellaringa

Sorry if this is a double post; I can't seem to locate my original posting
from my other email account.

I've done about a dozen heuristic evals, and they've usually been pretty high
level. An internal client has considered going with Forrester for the 'web
site reports' they offer, but my boss would rather they use us internally to
do the same work. Certainly we can do it.

He wanted to know if the evaluations could be more polished or detailed. So
my question is: what heuristics do you usually test against? Do you lay those
out in your document and then answer them by group? It would be great if a
couple of example docs could be posted, even if the specific client info was
stripped out.

I have seen Nielsen's heuristic article and his list of 10 heuristics, which I
have used in the past. I'm wondering whether that is enough or if it is too general.

Thanks

Tom

Comments

7 Dec 2007 - 9:30am
White, Jeff

On Dec 7, 2007 8:10 AM, Tom Dell'Aringa <pixelmech at gmail.com> wrote:

> I have seen Nielsen's heuristic article and his list of 10 heuristics, which I
> have used in the past. I'm wondering whether that is enough or if it is too general.

I did one a few months ago. I can't share the document itself, but below are
the heuristics I used, which were a combination of Nielsen's and several
others, followed by a rough sketch of how a list like this can be laid out as
a report skeleton. The links I collected while researching which heuristics
to use are below as well. It's been a while, so this may not be the best
list; I'm just pulling them off my del.icio.us bookmarks. Also, be sure to
check out the work of Jill Gerhardt-Powals on heuristics if you haven't
already. Hope this helps.

1. Automate unwanted workload
* Free cognitive resources for high-level tasks
* Eliminate mental calculations, estimations, comparisons,
and unnecessary thinking

2. Reduce uncertainty
* Display data in a manner that is clear and obvious

3. Fuse data
* Reduce cognitive load by bringing together lower level
data into a higher level summation

4. Present new information with meaningful aids to interpretation
* Use a familiar framework, making it easier to absorb
* Use everyday terms, metaphors, etc.

5. Use names that are conceptually related to function
* Context-dependent
* Attempt to improve recall and recognition

6. Group data in consistently meaningful ways to decrease search time

7. Limit data-driven tasks
* Reduce the time spent assimilating raw data
* Make appropriate use of color and graphics

8. Include in the displays only that information needed by the user
at a given time
* Allow users to remain focused on critical data
* Exclude extraneous information that is not relevant to current tasks

9. Provide multiple coding of data when appropriate

10. Practice judicious redundancy (to resolve the possible conflict
between heuristics 6 and 8)

11. Visibility of system status

12. Recognition rather than recall
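
Tom, since you asked about laying the heuristics out in the document and
answering them by group: here's a rough Python sketch of how the checklist
above could be kept as data and turned into a report skeleton with one
fill-in section per heuristic. This is just an illustration, not a client
deliverable; the 0-4 severity scale and the section labels are my own
assumptions.

# Rough sketch: keep the heuristic checklist as data and generate a
# plain-text report skeleton, one fill-in section per heuristic.
# The 0-4 severity scale and section labels are assumptions.

HEURISTICS = [
    "Automate unwanted workload",
    "Reduce uncertainty",
    "Fuse data",
    "Present new information with meaningful aids to interpretation",
    "Use names that are conceptually related to function",
    "Group data in consistently meaningful ways",
    "Limit data-driven tasks",
    "Include only the information needed by the user at a given time",
    "Provide multiple coding of data when appropriate",
    "Practice judicious redundancy",
    "Visibility of system status",
    "Recognition rather than recall",
]

def report_template(heuristics):
    """Return a report skeleton with one fill-in section per heuristic."""
    lines = []
    for i, heuristic in enumerate(heuristics, start=1):
        lines.append("%d. %s" % (i, heuristic))
        lines.append("   Findings:")
        lines.append("   Severity (0-4):")
        lines.append("   Recommendation:")
        lines.append("")
    return "\n".join(lines)

print(report_template(HEURISTICS))

Each evaluator fills in a copy, which also makes it easy to compare notes
across evaluators afterward.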

http://www.usabilitybok.org/methods/p275

Cognitive engineering principles for enhancing human-computer
performance (ACM) http://tinyurl.com/2lnbgc

http://www.humanfactors.com/downloads/may99.asp

http://jthom.best.vwh.net/usability/

http://www.stcsig.org/usability/resources/toolkit/toolkit.html#heuristics

http://www.stcsig.org/usability/topics/articles/he-checklist.html

Jeff

9 Dec 2007 - 2:24pm
Chauncey Wilson

> He wanted to know if the evaluations could be more polished or detailed. So
> my question is: what heuristics do you usually test against? Do you lay those
> out in your document and then answer them by group? It would be great if a
> couple of example docs could be posted, even if the specific client info was
> stripped out.
>
> I have seen Nielsen's heuristic article and his list of 10 heuristics, which I
> have used in the past. I'm wondering whether that is enough or if it is too general.

Hello Tom,

Here are some notes on heuristic evaluation that might be useful.
Jeff White described the research-based heuristics that were endorsed
by Bailey. Muller, Matheson, Page, and Gallup (1995, p. 14) extend the
basic heuristic evaluation method by adding a new category of
heuristics called "Task and Work Support". Muller and his colleagues
note that the earlier sets of heuristics were generally
product-oriented and concerned with problems in isolation. The new
task and work support heuristics focus on user goals (produce quality
work, keep sensitive material private, enhance my skills) and a
positive experience in the workplace. Here are the
enhanced heuristics, which are part of a participatory heuristic
inspection that DOES involve users:

"Skills -- The system supports, extends, supplements, or enhances the
user's skills, background knowledge, and expertise. The system does
not replace them. Wizards support, extend, or execute decisions made
by users.
Pleasurable and respectful interaction with the user -- The user's
interactions with the system enhance the quality of her or his
experience. The user is treated with respect. The design reflects the
user's professional role, personal identity, or intention. The design
is aesthetically pleasing — with an appropriate balance of artistic as
well as functional value.
Quality work -- The system supports the user in delivering quality
work to her or his clients (if appropriate). Attributes of quality
work include timeliness, accuracy, aesthetic appeal, and appropriate
levels of completeness.
Privacy -- The system helps the user to protect personal or private
information belonging to the user or to his or her clients."

The basic heuristic evaluation can be enhanced in a number of ways, including:

• Using an object/task approach, where evaluators do a task-based
evaluation and then an object-based evaluation.
• Considering basic human factors principles that are not included in
the original sets of heuristics.
• Adapting existing heuristics to your domain, or creating
context-specific heuristics.
• Using multiple evaluators who are trained on the heuristics and the
heuristic evaluation process.
• Evaluating explicit features and applying specific usability criteria
to each feature (Yahuda & McGinn, 2007) to determine problem severity.
• Meeting as a group to discuss the evaluators' individual lists and
combine the problems into a single list with agreed-on severity levels
(a rough sketch of this merging step follows this list).
• Developing one or more solutions for each problem or problem category.
• Determining the effectiveness of heuristic evaluation by tracking which
problems were fixed, what solutions were implemented (compared to
those suggested in the report), how much effort was expended on the
change, and whether there are meta-problems that should be addressed.
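
To make the group-merge step above concrete, here is a small Python sketch
(my own illustration, not from any of the papers cited) of combining the
evaluators' individual problem lists into one list. It counts how many
evaluators reported each problem and shows the spread of their severity
ratings, so the group meeting can start with the problems where ratings
disagree most. The 0-4 severity scale and the example problems are
assumptions.

from collections import defaultdict
from statistics import mean

def merge_findings(evaluator_lists):
    """evaluator_lists: one list per evaluator of (problem, severity)
    tuples. Returns {problem: stats} with the evaluator count, mean
    severity, and the spread between highest and lowest rating."""
    ratings_by_problem = defaultdict(list)
    for findings in evaluator_lists:
        for problem, severity in findings:
            ratings_by_problem[problem].append(severity)
    return {
        problem: {
            "evaluators": len(ratings),
            "mean_severity": round(mean(ratings), 1),
            "spread": max(ratings) - min(ratings),  # big spread = discuss first
        }
        for problem, ratings in ratings_by_problem.items()
    }

# Example: three evaluators rating on a 0-4 severity scale.
merged = merge_findings([
    [("no feedback during upload", 3), ("jargon in field labels", 2)],
    [("no feedback during upload", 4)],
    [("jargon in field labels", 1), ("no feedback during upload", 3)],
])
for problem, stats in sorted(merged.items(),
                             key=lambda item: -item[1]["mean_severity"]):
    print(problem, stats)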

Sears (1997) developed a technique called the "heuristic walkthrough"
that combines attributes of three UCD methods: heuristic evaluation,
perspective-based inspection, and the cognitive walkthrough. In Sears'
method, the evaluators are given a prioritized list of user tasks, a set
of heuristics, and "thought-provoking" questions derived from the
cognitive walkthrough method (Cockton, Lavery, & Woolrych, 2002).
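
To give a feel for what an evaluator's packet in a heuristic walkthrough
might contain, here is a small hypothetical sketch. The tasks, heuristics,
and questions are illustrative examples of mine, not the materials from
Sears (1997); the questions paraphrase the standard cognitive walkthrough
questions.

# Hypothetical sketch of a heuristic walkthrough packet: a prioritized
# task list, a heuristic set, and thought-provoking questions. Contents
# are illustrative, not taken from Sears (1997).

WALKTHROUGH_PACKET = {
    "tasks": [  # in priority order; evaluators work through them in turn
        "Create a new account",
        "Upload a report and share it with a colleague",
        "Recover a forgotten password",
    ],
    "heuristics": [
        "Visibility of system status",
        "Recognition rather than recall",
        "Group data in consistently meaningful ways",
    ],
    "questions": [  # paraphrasing the cognitive walkthrough questions
        "Will users know what they need to do at this step?",
        "Will users notice that the correct action is available?",
        "Will users connect the action with the effect they want?",
        "If the correct action is taken, will users see progress?",
    ],
}

for section, items in WALKTHROUGH_PACKET.items():
    print(section.upper())
    for item in items:
        print("  - " + item)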

Chauncey
