Any usability studies on free hand gestures?

1 May 2008 - 7:17am
Amnon Dekel

Hi All
I am working on the design of a free-hand gesture-based interface (not a touch
interface, but free-hand gestures performed in the air in front of a sensor
system) and am failing to find any published usability studies in this field.
I have found many technical papers on how to extract and recognize gestures
from the visual domain, but nothing on which gestures and models might be
considered better. I am trying to minimize the amount of testing I will need
to do in order to decide on the gesture language to adopt. Any tips / links
will be appreciated.

Amnon Dekel
UX Designer & Researcher
Cell: +972 54 813-8160


1 May 2008 - 1:45pm
Kevin Doyle

Hi Amnon,

Johnny Chung Lee has developed a computer interface tech that uses
hand gestures and head movements... using a $40 Wii controller (aka:
Wiimote) and some custom programming.

Definitely check out his movies -- how he's interacting with the
computer is simply mind-blowing.

He's getting a LOT of attention because it's relatively easy and
cheap to do, uses tech that's readily available, and seems so
amazingly futuristic and cutting-edge. Here's his movie at TED:

Here's a blurb from his site about using the Wiimote for tracking
hand gestures:

Tracking Your Fingers with the Wiimote:
Using an LED array and some reflective tape, you can use the infrared
camera in the Wii remote to track objects, like your fingers, in 2D
space. This lets you interact with your computer simply by waving
your hands in the air, similar to the interaction seen in the movie
"Minority Report". The Wiimote can track up to 4 points
simultaneously. The multipoint grid software is a custom C# DirectX program.
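
To give a feel for what that tracking data looks like in practice, here's a rough Python sketch (not Lee's actual C# code) that maps the Wiimote's reported IR blob coordinates to screen coordinates. The 1024x768 camera resolution is the Wiimote's documented IR sensor range; the horizontal mirroring is an assumption about how the camera faces the user:

```python
def camera_to_screen(ir_points, screen_w, screen_h):
    """Map up to 4 IR blobs from Wiimote camera space (0..1023 x 0..767)
    to screen coordinates.

    Assumes the camera faces the user, so the image is mirrored
    horizontally relative to the user's point of view (flip x).
    """
    CAM_W, CAM_H = 1024, 768  # Wiimote IR camera resolution
    screen_points = []
    for cx, cy in ir_points[:4]:  # the Wiimote reports at most 4 blobs
        sx = (1.0 - cx / (CAM_W - 1)) * screen_w  # mirror x for the user
        sy = (cy / (CAM_H - 1)) * screen_h
        screen_points.append((sx, sy))
    return screen_points
```

A real implementation would also smooth the points between frames and handle blobs dropping in and out, but the coordinate mapping above is the core of turning "waving your hands in the air" into cursor positions.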

Here's his description for head-tracking:

Head Tracking for Desktop VR Displays using the Wii Remote:
Using the infrared camera in the Wii remote and a head mounted sensor
bar (two IR LEDs), you can accurately track the location of your head
and render view dependent images on the screen. This effectively
transforms your display into a portal to a virtual environment. The
display properly reacts to head and body movement as if it were a
real window creating a realistic illusion of depth and space.

The program only needs to know your display size and the size of your
sensor bar. The software is a custom C# DirectX program and is
primarily provided as sample code for developers without support or
additional documentation. You may need the most recent version of
DirectX installed for this to work.
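
The geometry behind the head tracking is simple enough to sketch. This Python fragment (an illustration, not Lee's code) estimates head position from the two LED dots using similar triangles: the known physical LED separation and the angle the dots subtend in the camera give distance. The 45-degree field of view is an assumed figure for the Wiimote's IR camera, and square pixels are assumed:

```python
import math

CAM_W, CAM_H = 1024, 768            # Wiimote IR camera resolution
FOV_H = math.radians(45)            # assumed horizontal field of view
RAD_PER_PX = FOV_H / CAM_W          # assumes square pixels

def head_position(dot1, dot2, led_separation_mm):
    """Estimate head position (x, y, z in mm) relative to the camera
    from the two IR dots of a head-mounted sensor bar."""
    (x1, y1), (x2, y2) = dot1, dot2
    # Angle subtended by the sensor bar in the camera image.
    px_dist = math.hypot(x2 - x1, y2 - y1)
    angle = px_dist * RAD_PER_PX
    # Similar triangles: half the bar length over tan(half angle).
    z = (led_separation_mm / 2.0) / math.tan(angle / 2.0)
    # Midpoint of the dots, as angles off the camera's optical axis.
    mx = (x1 + x2) / 2.0 - CAM_W / 2.0
    my = (y1 + y2) / 2.0 - CAM_H / 2.0
    x = math.tan(mx * RAD_PER_PX) * z
    y = math.tan(my * RAD_PER_PX) * z
    return x, y, z
```

Once you have the head position, rendering the "portal" effect is just feeding (x, y, z) into an off-axis projection matrix each frame, which is why the program only needs the display size and sensor-bar size as input.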
