Tog on Gestures will force the mouse into retirement

30 Dec 2008 - 6:49am
13 replies
SemanticWill
2007

@hannusalonen tweeted this article from the Financial Times this morning.
I'm including the full text here because it's behind an annoying
signup/registration process. http://tinyurl.com/9hmpj4

"The Tog: Mouse users are "little more than cavemen, running around pointing
at symbols and 'grunting' with each click". http://bit.ly/yP1M"
Gestures will force the mouse into retirement

By Jessica Twentyman

Published: September 17 2008

At almost 30 years old, is the computer mouse ready for retirement?
Certainly, a growing band of human-computer interaction (HCI) specialists
believe so. The crude language of "point and click", they argue, seriously
limits the "conversations" we have with our computers.

Among them is Bruce "Tog" Tognazzini, a veteran HCI expert who joined Apple
in 1978 as its 66th employee and founded the company's Human Interface Group
during his 14 years there. These days, after spells at Sun Microsystems and
online healthcare company WebMD, Mr Tognazzini is a respected consultant,
author and speaker with usability company the Nielsen Norman Group.

"In many ways, our continued reliance on the computer mouse reduces us to
little more than cavemen, running around pointing at symbols and 'grunting'
with each click," he says. "A revolution is long overdue, because we need
more sophisticated tools that will allow us to increase our vocabulary way
beyond that caveman grunt." Plus, the link between the computer mouse and
cases of repetitive strain injury (RSI) is hardly an argument in its
favour, he adds.

Luckily, he says, those "more sophisticated" tools are right in front of our
faces and we already know how to use them. They are, in fact, our fingers.

"Look at the facts: we've typically got 10 of these 'tools'; they move in a
multitude of different ways; and gestural language, which came long before
verbal language, is an established and intuitive form of self-expression.
Even primates can be trained to express needs and intentions using their
fingers," he points out.

What has historically been lacking is the ability of computers to read and
understand our gestures - but that is changing very quickly. In fact,
real-time video interpretation and inertial sensors are already being used
to recognise facial expression and physical movement in a number of consumer
technology devices, says Steven Prentice, an analyst with IT market research
company Gartner.

He traces the roots of this migration to two recent events: the launch of
the Nintendo Wii games console in 2006 and of the Apple iPhone in 2007.
Through clever use of accelerometers and optical sensor technology, the Wii
Remote (or "wiimote") is already enabling millions of people to practise
their golf swings, play rock guitar or swordfight with imaginary enemies.
And since the iPhone was launched, strong sales and high user satisfaction
have reinforced just how powerful and intuitive a multitouch interface can
be.

These early announcements have been followed by a string of others in
consumer technology. In recent months, Panasonic, Sony and NEC have all
demonstrated applications that use facial and movement recognition. These
include, for example, video displays from Panasonic that can identify users
from their faces, serve up content choices based on their individual
preferences, and allow screen control by hand gestures.

It's easy for business leaders and chief information officers to dismiss
such trends in consumer preference as minimally relevant to enterprise
computing - but that's a "dangerous oversimplification", warns Mr Prentice.
"Not all consumer-targeted technologies find their way directly into
enterprise IT environments," he concedes, "but the growing adoption of these
technologies by individuals in their 'personal infrastructures' is leading
to increasing frustration and dissatisfaction with constraints and
restrictions the corporate IT environment often imposes on users."

Fortunately, it's not just the consumer technology firms that have their
eyes on gestural technologies. At Accenture Technology Labs, research
director Kelly Dempski has a long track record in exploring how they can be
used in business applications, most recently concentrating on building
multi-touch, interactive display walls.

Accenture has installed such walls, for example, in O'Hare International
Airport in Chicago and John F. Kennedy Airport in New York. Consisting of
multiple screens housed in giant custom frames, they use graphics and
touch-screen technology to allow passengers to check the weather at their
destination, read the latest news from CNN, or find out how their team
scored while they were in flight, by simply touching areas on the screen.

This technology, says Mr Dempski, could have equally valuable back-office
applications, presenting vital internal data from back-end enterprise
resource planning (ERP) systems to employees in a control room at a utility
firm, for example. "The aim is to create a mode of interaction that requires
zero training but offers a high degree of interactivity," says Mr Dempski.

At Microsoft, meanwhile, researcher Desney Tan is taking HCI to new levels:
muscle-computer interaction. Mr Tan and his colleagues, alongside
researchers from the Universities of Washington and Toronto, have developed
an armband worn on the forearm that recognises finger movements by
monitoring muscle activity. They have called it MUCI, which stands for
muscle-computer interface, and its aim is to make controlling computers and
gadgets easier in situations where the user is otherwise engaged - for
example, when driving a car or in a meeting.

"The human body is a prolific signal-generator," he says. "My work is
focusing on the potential of tapping into the electromagnetic signals that
the brain sends to muscles, which has the potential to harness a whole range
of subtler movements than simply a press or a pinch on an interactive
screen."

MUCI currently works extremely well in situations where major arm movements
are constrained and finger gestures are made on a flat surface, he says.
Tests on volunteers have shown that after calibration, the system can
recognise the position and pressure of all 10 digits with 95 per cent
accuracy.
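
The article does not describe how the armband turns muscle activity into
recognised gestures, but a common approach in muscle-computer interface
research is to window the electromyography (EMG) signal, compute simple
per-channel features, and train a classifier during the calibration step.
The sketch below is illustrative only - the channel count, window size,
RMS features and SVM classifier are assumptions, not details of the MUCI
prototype.

import numpy as np
from sklearn.svm import SVC

# Illustrative sketch of EMG-based finger-gesture recognition.
# The channel count, window size, RMS features and SVM are assumptions;
# the article does not describe MUCI's actual pipeline.

N_CHANNELS = 8   # hypothetical number of armband electrodes
WINDOW = 256     # samples per analysis window

def rms_features(window):
    """Root-mean-square amplitude of each EMG channel in one window."""
    return np.sqrt(np.mean(window ** 2, axis=0))   # shape: (N_CHANNELS,)

def calibrate(windows, labels):
    """'Calibration': fit a classifier on labelled gesture windows."""
    X = np.stack([rms_features(w) for w in windows])
    clf = SVC(kernel="rbf")
    clf.fit(X, labels)
    return clf

def recognise(clf, window):
    """Predict which finger gesture produced this window of EMG data."""
    return int(clf.predict(rms_features(window).reshape(1, -1))[0])

# Example with synthetic data standing in for real EMG recordings.
rng = np.random.default_rng(0)
train = [rng.normal(scale=g + 1, size=(WINDOW, N_CHANNELS))
         for g in range(3) for _ in range(20)]
labels = [g for g in range(3) for _ in range(20)]
model = calibrate(train, labels)
print(recognise(model, rng.normal(scale=3, size=(WINDOW, N_CHANNELS))))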

"Where we want to take this research next is to capturing gestures made in
three-dimensional space," he says, adding that the ability to do that and
still recognise gestures with a fair degree of accuracy will start to open
the door to a huge range of potential applications - even recognising and
translating sign language used by deaf people.

Naturally, applications based on gestural computing place a huge strain on
underlying hardware, which is forced to process a larger volume and wider
range of more subtle signals. Among chip manufacturers, this is forcing a
shift in focus from traditional central processing units (CPUs) to the
graphics processing units (GPUs) that, up to now, have primarily been used
in gaming and virtual world environments.

In essence, a GPU is a dedicated graphics rendering device for personal
computers and games consoles that is very efficient at manipulating and
displaying computer graphics.

More important, the ability to process information in a highly parallel way
makes GPUs far more effective at handling a large range of complex
algorithms than CPUs, which process them in a linear, one-at-a-time fashion,
explains Richard Huddy, worldwide head of developer relations at chip
company Advanced Micro Devices (AMD).

"Say you've got an application that uses a webcam to capture shots of a
human subject and analyse their gestures. It will need to figure out the
relationship between each frame and its predecessor in, perhaps,
one-sixtieth of a second, and there's a lot of maths involved in doing that
in a smooth and uninterrupted way. A GPU will do that much, much better," he
says.
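
To see why this work suits a parallel processor, note that comparing a
frame with its predecessor is an independent calculation for every pixel.
The sketch below shows that data-parallel formulation in NumPy, where the
element-wise operations stand in for the many GPU threads Mr Huddy
describes; the threshold and frame size are illustrative assumptions, not
details from AMD.

import numpy as np

# Illustrative only: a data-parallel formulation of frame-to-frame motion
# analysis. Every pixel is compared with its predecessor independently,
# which is why this work maps well onto a GPU's many parallel cores.
# The threshold and frame size are made-up values for the sketch.

MOTION_THRESHOLD = 25   # minimum per-pixel change that counts as motion

def motion_map(prev, curr):
    """Boolean mask of pixels that changed noticeably between two frames."""
    diff = np.abs(curr.astype(np.int16) - prev.astype(np.int16))
    return diff > MOTION_THRESHOLD        # element-wise, fully parallel

def motion_centroid(mask):
    """Rough 'where did the movement happen?' estimate for gesture tracking."""
    ys, xs = np.nonzero(mask)
    if len(xs) == 0:
        return None
    return float(xs.mean()), float(ys.mean())

# Example with two synthetic 480x640 greyscale frames.
rng = np.random.default_rng(1)
prev = rng.integers(0, 256, size=(480, 640), dtype=np.uint8)
curr = prev.copy()
curr[100:150, 200:260] = 255              # simulate a moving hand region
print(motion_centroid(motion_map(prev, curr)))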

In order to get its slice of the gestural computing market, AMD is already
locked in a pitched battle with rivals Intel and Nvidia to deliver advanced
GPU capabilities to hardware manufacturers as soon as possible, with a slew
of new product announcements planned for 2009.

All this means that businesses need to be prepared. No one is predicting the
instant demise of the computer mouse, and certainly not of the keyboard as a
text-entry tool. "Despite the many disadvantages of a design nearing its
centenary, nothing else currently comes close to the functionality of the
conventional tactile keyboard," says Mr Prentice of Gartner.

But there will be a "strong and unstoppable" trend towards a control
interface for technology that is based on simple human gestures, rather than
on indirect manipulation via physical objects such as a mouse, he predicts.

He says that revolution is three to five years off for mainstream business,
but it's not too soon for business leaders to "suspend their natural
scepticism" and start to think about how gestural computing might be used to
address their organisations' most intractable user interface issues.

"The phrase 'paradigm shift' is an overused one, but it's not often that
such fundamental elements of the computer interface change, and the
opportunities for enterprises able to capitalise on these changes will be
substantial," he says.

~ will

"Where you innovate, how you innovate,
and what you innovate are design problems"

---------------------------------------------------------------------------------------------
Will Evans | User Experience Architect
tel: +1.617.281.1281 | will at semanticfoundry.com
aim: semanticwill
gtalk: semanticwill
twitter: semanticwill
skype: semanticwill
---------------------------------------------------------------------------------------------

Comments

30 Dec 2008 - 8:16am
Jakub Linowski
2008

Somehow every time someone says the mouse will go into retirement I am
not convinced. It's a seriously well-designed product which has
lasted for what, over 40 years? The mouse provides quite a bit of
precision. Yes, perhaps it takes effort to learn, but with time
people can move items around at pixel-level detail, which I doubt will
be possible to do with fingers. Furthermore, when using the mouse the
hand rests at a 90 degree angle and is supported by a desk, which
suits longer working hours. Will people be able to move their fingers
and wave their arms for 9 to 5, 5 days a week? Unlikely as it will
require more physical energy.

The way I see it, retirement of the mouse is an over exaggeration.
Gestures definitely are a new way of interacting and will increase in
popularity. However, I think gestures will diversify our ways of
interaction and not replace the old, the same way paper still
supports us today alongside computers.

Or am I in the denial phase? :)


2 Jan 2009 - 1:19pm
Dan Saffer
2003

On Dec 30, 2008, at 5:16 AM, Jakub Linowski wrote:

> Furthermore, when using the mouse the
> hand rests at a 90 degree angle and is supported by a desk, which
> suits longer working hours. Will people be able to move their fingers
> and wave their arms for 9 to 5, 5 days a week? Unlikely as it will
> require more physical energy.

I tried to address this concern with our Desktop Touchscreen Concept
we did at Kicker:
http://www.kickerstudio.com/blog/2008/12/concept-desktop-touchscreen-system/

This is a major flaw with systems like HP's TouchSmart. Reaching
across your keyboard to manipulate things on a screen all day, every
day is going to get tiresome.

>
>
> The way I see it, retirement of the mouse is an over exaggeration.
> Gestures definitely are a new way of interacting and will increase in
> popularity. However, I think gestures will diversify our ways of
> interaction and not replace the old, the same way paper still
> supports us today alongside computers.
>

I agree with you. Although I think mice that incorporate touch and/or
gestures are the, well, present. How many people are viewing this on a
laptop or mobile phone without a mouse? I don't think I've used a
traditional mouse in probably three years or more.

> On Dec 31, 2008, at 12:20 PM, Scott Berkun wrote:
>
>>
>> One conflict between design & innovation is this: we simultaneously
>> believe
>> design is about making things easier for people, but also want to
>> see big
>> changes happen. But innovation, especially big UI changes like, say,
>> eliminating mice, or toolbars, or depending on gesture language,
>> are major
>> inconveniences to people simply trying to live their lives (as
>> opposed to
>> designers and early adopters who go out of their way to experience
>> change).

Agreed. Like all changes, the value of the change has to be apparent,
even if that value isn't immediately obvious. Touchscreens and
gestural interfaces are *a* solution to a particular set of
technological/cultural/physical space issues, some of which we're only
now discovering. They are good in some situations, not so good in
others. It's the same with all technology.

Dan

Dan Saffer
Designing Gestural Interfaces (O'Reilly)
http://www.designinggesturalinterfaces.com

2 Jan 2009 - 7:09pm
Troy Gardner
2008

I don't see the mouse going away. I tend to view gestures like Wii devices,
operating in a bigger space: I don't see many people playing Wii tennis in a
chair, and I don't see many using mouse+keyboard while standing up.

Having played with multitouch (I owned a TouchStream keyboard/mouse) and
with FIR and webcams, there are many issues with gesture-based input that
aren't easy to solve. One of the most obvious is the lack of tactile
feedback. Errors increase as fingers/arms drift in programs that track
gestures against a static position. Stacking gestures also produces more
lag and errors (did you mean o, O, 0, or C?), leading to a conversation
full of misunderstandings with the interface rather than a one-to-one
deterministic device like the mouse and keyboard. There are some solutions
to this, but I believe the IO between device and application has to be
bidirectional, meaning that the gestures available are based on the context
of what you are working with ...like lateral inhibition in the brain.
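
To make the 'bidirectional IO' point concrete, here is a toy illustration
(made up for this discussion, not code from any real gesture system): the
application advertises which gestures are currently valid, and ambiguous
candidates such as o/O/0/C are resolved only against that restricted set.

# Toy illustration of context-limited gesture matching; the contexts,
# vocabularies and scores below are invented for the example.

# Hypothetical recogniser output: candidate symbols with confidence scores.
candidates = {"o": 0.34, "O": 0.31, "0": 0.28, "C": 0.07}

# Hypothetical application contexts and the gestures each one accepts.
CONTEXT_VOCAB = {
    "numeric_field": {"0", "1", "2", "3"},
    "text_field": {"o", "O", "C"},
    "media_player": {"C"},   # e.g. a circular scrub gesture
}

def resolve(candidates, context):
    """Pick the best-scoring candidate the current context accepts."""
    allowed = CONTEXT_VOCAB.get(context, set())
    valid = {g: s for g, s in candidates.items() if g in allowed}
    return max(valid, key=valid.get) if valid else None

print(resolve(candidates, "numeric_field"))  # -> '0' (o, O and C filtered out)
print(resolve(candidates, "media_player"))   # -> 'C'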

The mouse, as much as it's maligned, can do single-point gestures (and, if
two mice were used, two-point gestures: move, scale, rotate, etc.) and can
support chording with buttons. A mouse or trackball at rest is one of the
lowest-energy and most comfortable devices, as it conforms to the hand. The
pinky/ring finger aren't well suited to doing any heavy work.

The biggest RSI issues come from overuse of the index finger, which can be
solved with zero-contact switches (like the TouchStream) or hand/foot
clicks, and from poor workspace design, in particular the distance between
keyboard and mouse, and reach.

2 Jan 2009 - 10:15pm
Dave Malouf
2005

I wonder how a device like this changes the requirement for a mouse:
http://www.gizmowatch.com/entry/olpc-xo-2-dual-touchscreen-concept-laptop-to-sell-for-75/

I know it is only a concept but if I think about multiple touch
planes instead of a single one, I can do both indirect (touch pad)
and direct (touch screen) and maintain a coherence between the
content and the actions taking place.

Just a thought.

-- dave


3 Jan 2009 - 1:51am
Krystal Higgins
2008

RSI issues will remain no matter what the input device (even if it's just
gestural), since it's a catch-all term. Treadmill addicts, tennis
pros, painters, truck drivers - all these people experience their own
forms of repetitive strain injury (rotator cuff injury, shin
splints, neck strain, sciatica, degenerative discs). Even if we are
only twitching our arms around in uninhibited air, we'll manage to
develop some sort of injury - do it enough, and the body will
protest.

But as some of the earlier folks have pointed out, expanding on the
options we have for interacting with a device is key to reducing
these injuries. If I'm using gestural at home, voice recog in the
car, and traditional pen-tablet (I haven't used a traditional mouse
in 5 years) at work, then I'm reducing my RSI chances on, say, my
elbow.


2 Jan 2009 - 3:47pm
Janna
2008

How can a group of well-educated, innovative and forward-thinking
people not be interested in some of this research? The computer
keyboard is a soft side-step from typewriter to computing machine.
The mouse is OK, but as one of many sufferers of a repetitive stress
injury, I'd say it has created more medical problems than it has
solved.
Gestures, air typing, and new forms of data entry should all be
explored. We are nowhere near the end of discovering how to interact
with data and computing machines, but just at the beginning.
Our stationary machines became laptops, then laptops got smaller and
mobile. What comes next is thanks to the researchers pushing the
boundaries of technology. I, for one, would love a world of truly
ubiquitous computing where I have options to interact with technology
without a physical device at hand.


2 Jan 2009 - 10:07pm
Joshua Muskovitz
2008

I agree with the masses here. Mice and keyboards are remarkably
efficient devices.

I try to imagine operating my microwave with a Wiimote, running
Photoshop with a multitouch screen (what, fingers aren't
transparent? who knew?), or using voice commands for practically
anything.

They are all novel technologies, and all are finding their niche
uses, which is great. But I seriously doubt any of these will replace
the doorknob on my house, whose behavior is almost universally
understood, requires no electricity, and operates more efficiently
than any alternatives. And similarly, I find it hard to imagine that
any of them will allow me to enter this comment as efficiently as my
keyboard and mouse.


3 Jan 2009 - 5:58am
SteveJBayer
2008

Wouldn't adding a touch-sensitive surface to replace the conventional
two buttons and scroll wheel on a mouse be a way of adding finger
gesture sensing technology to conventional mice?

The technology for adding touch sensing to contoured surfaces may be a
few years off, but it would certainly be more convenient for a user to
use their index finger to move the cursor over relatively small
distances than to do the same by moving the entire physical mouse.


5 Jan 2009 - 7:19pm
Helen Killingbeck
2005

I would love to expand this discussion to how gestural interaction, instead
of the mouse, will help those users who have been hindered by the mouse. I
am thinking of those who use screen readers and other assistive
technologies. Before the mouse we had the keyboard, and those who could use
only the keyboard were left behind when the mouse arrived.

Thoughts on how this new form of interaction could support universal
usability?

Helen


5 Jan 2009 - 11:51am
Sam Menter
2008

I imagine a touchscreen control, roughly the same size as a computer
keyboard but with the same aspect ratio as the monitor, that you use to
control the main monitor / PC. It would respond to various gestures - 2
fingers, 3 fingers, swiping, tapping, etc.
This could combine the benefits of touchscreen controls with the
benefit of having the screen at a comfortable height.

A specific gesture could then make the pad display a keyboard when necessary
and other controls as / when needed.

Open up your media player and it displays media controls. Software
developers could develop specific control interfaces for their application.

Does this exist already? Has anyone ever built something similar? All the
technology exists, it would be a case of pulling it all together.

Sam
www.pixelthread.co.uk


5 Jan 2009 - 11:09am
Torey Maerz
2008

My prediction is that the mouse will not go away until it just does
what I think. http://is.gd/eBpt


5 Jan 2009 - 11:07pm
mark ahlenius
2008

Hi,

the "technology" exists, but perhaps the specific application
implementation you mentioned may not. While I was in the Labs I was
researching some touch screens for table tops and ran across a couple of
systems which would detect gestures without touching the screens.
They could be done (if my memory serves me right) on up to 40"
displays. One method used sound and the other was using light. I'm
sure this
is all pretty well known stuff anyway, but I thought it was pretty neat.

To develop that kind of stuff full time and get paid would be a sweet
job! Of course you'd also have to find a practical use for it which
could sell, but we'd leave that up to someone else. ;-}

'mark


6 Jan 2009 - 12:50am
Elizabeth Bacon
2003

Let's all help kill the mouse! I'm OK with keeping the keyboard for
awhile, though. :)

Cheers,
Liz
