Do you seriously think the Future Of Interaction should be a single finger?

7 Dec 2011 - 8:33am
11 replies
1187 reads
Maurice
2009

With an entire body at your command, do you seriously think the Future Of Interaction should be a single finger?

Here's an interesting article I came across.
http://worrydream.com/ABriefRantOnTheFutureOfInteractionDesign/

What could we find if we look beyond?
Just putting it out there!

Comments

8 Dec 2011 - 1:00am
EmilySun
2010

That's a great article. I'm clearly biased since I work at Sifteo, but there is a lot of interesting research around tangible interactions that has the potential for expanding beyond just a single finger. 

8 Dec 2011 - 2:30am
ptamzz
2010

That is a nice article, but I want to address the last part: our body has limbs, arms, a torso and so on, but having them doesn't necessarily mean you have to use them everywhere, just as you don't use your feet to send an email!

If you can bring in tangible interactions, that would be one great leap, but to complete simple tasks, why should we force our whole body into consideration?

One thing I'm certain about in interaction is that users don't want to make a lot of physical movements. That's the main reason Minority Report style interactions are under debate (http://www.kickerstudio.com/blog/2010/11/why-you-want-but-wont-like-a-minority-report-style-interface/). The main reason we are stuck with our fingers is that we've got ten of them, we can make many gestures and movements, and we can do all that with the least physical effort our body can perform. This saves us a lot of trouble and energy, and spares us fatigue. Imagine typing an email: you probably make thousands of keystrokes without getting tired!

My point is, I strongly believe the future belongs to the fingers (well, maybe apart from speech recognition, brain activity detection and the like). Bringing tangible interactions through them, detecting which finger you are using: that kind of technology will enhance interaction and perhaps bring about the next big thing.

All in all, if you can make a sandwich just by pressing a single button on your iPhone with your finger, then maybe it's my shortsightedness, but I can't imagine a simpler way to do it with any other part of your body.

8 Dec 2011 - 7:05am
martinsz
2011

Hi, do you think simpler is always better? Don't you think that in some cases there are exceptions to the rule? We don't have to have, and we don't have, only one type of input device. There are knobs, strings, screens, keys, buttons, mice, touchpads, Kinect, speech, the brain. We're just advancing on several fronts at once. Maybe in one line of work, like websites, catering to mobile/touchscreen users is the thing to do in 2012, and maybe 2013 too, but "the future" you mention is a long way off, and the choice of input device should really depend on where you are, where you come from, and what you want to do with it.

When we design interaction, we don't have to design for computers only; we should also choose the selection of devices on which our services will be available, each of them supporting a different context and reaching a different user group.

Martín.

9 Dec 2011 - 5:03am
ptamzz
2010

I guess 'simpler interactions' are the rule of thumb!

Simpler means it's convenient, affordable, achievable, intuitive; it serves the right context at the right time. A whole lot of factors make it simple. It's not that 'simplicity' comes first; rather, simplicity is the term we give to an interaction that has satisfied these several factors. The interaction and the design come before we call it simple. A good interface that gives a proper, sound interaction is termed simple; it's not that the design is simple, so the experience must be good!

I agree, we have several types of input devices: knobs, strings, touch screens, keys, buttons, mice, Kinect, speech, the brain and so on. But if we consider them in real-life interactions, almost all of these are controlled mainly by our fingers and palm (without fingers, palms are almost irrelevant). We turn a knob with our palm grip, pluck strings with our fingers, touch screens with our fingers, unlock a lock with our fingers holding the key, push buttons with our fingers, move a mouse with our finger grip, and so on. Going forward, I sense that fingers are one critical device that's going to play a role in the interaction realm, be it real life, digital life or whatever: they have the power to give precision without requiring huge physical movement. Yes, the fingers are attached to the palm, the wrist, the hand and the arm, so all of these are responsible for the final interaction, but my point is that the forefront of the connection is always going to be the fingers.

With Kinect, SixthSense or John Underkoffler's Minority Report style interactions, I believe these kinds of interactions will mostly go with games, physical training simulators and the like, but will fail to grasp the major day-to-day interactions, such as checking a flight schedule, sending an email or even making a cup of coffee. I see the future of making a cup of coffee as merely a push of a button or a voice command, rather than making hand and arm gestures to levitate the coffee maker, put coffee in it and so on. Almost everything will be automated, and that's one reason we'll need to go jogging and do some physical exercise ourselves in the near future!

 

9 Dec 2011 - 11:05pm
martinsz
2011

You're right. As for moving arms to perform tasks, I think the idea is "natural user interfaces", where natural means: nothing I wouldn't do to accomplish the task if the computer were not helping me. If for a task, let's say changing a lightbulb, I had to move my hand up, maybe the interface for a system that changes lightbulbs could involve that arm movement to fire the task. It would be natural in the sense that you see the lightbulb, you extend your arm towards it, and the computer catches your "natural" movement and changes it. So the point is that body movements can be used to determine user intent, even before the user gives an explicit order (with his fingers).
About fingertips: they are one of the most sensitive parts of our body, with lots of nerve endings, and we have a good capacity to train ourselves to move our fingers with precision. That's why I think fingers became so important.
BUT! I don't enjoy that the current lifestyle makes us so sedentary. So while I think fingers are good, determining user intent from body posture will help avoid office-chair paralysis, which happens when people sit at their desks all day in order to keep their fingers on the right spots of their terminals.
Martín.

8 Dec 2011 - 9:59am
Maurice
2009

Great feedback.

Honestly? I don't know what the answer is to the question.

However, I do believe thinking ahead, beyond and outside the box opens the door for new possibilities.
While "touch" is popular today I still feel the need to think expansively.

I loved my iPad initially and took it everywhere. Now it's attached to my wall in the kitchen and basically used to check recipes, the weather, and for video calls with Skype.
 
The invention of the mouse and keyboard has enabled us to interact with computers efficiently for decades, but at the expense of a healthy lifestyle in many ways.

I don't expect people to start typing with their toes, nor do I expect people to start weighing themselves by doing handstands, nor do I expect people to start wearing their underwear on the outside.

The fact is, things evolve only when we explore other ways of doing what we consider "normal".
Air travel was unthinkable at one time; no one was in a rush to give up their horse-drawn carriages until the automobile changed that. Pen and paper, then typewriters, then keyboards... modems, then wireless... on and on.

My intention is to open a debate on what might or could be possible.

Some interesting links
.............................
SOE (spatial operating environment)
http://oblong.com/

Pattie Maes TED presentation:
http://www.youtube.com/watch?v=nZ-VjUKAsao&feature=player_embedded#!

SixthSense - wearable gestural interface:
http://www.youtube.com/watch?v=ZfV4R4x2SK0&feature=player_embedded#!

Temperature Sensitive Glass:
https://www.inventables.com/technologies/temperature-sensitive-glass

Smart Glass:
http://www.smartglassinternational.com/topics/smart-glass-videos/

Controlling Phones With the Body Electric:
http://bits.blogs.nytimes.com/2010/02/17/controlling-phones-with-the-body-electric/

8 Dec 2011 - 5:05pm
cfmdesigns
2004

Hmm. What led you to attach it to the wall? You're implying that the allure and utility wore off, but I'm wondering how much of that is "out of sight, out of mind" -- once you attached it to the wall, its purpose was narrowed and thus the allure and utility paled accordingly.

To be sure, if I didn't have a BT keyboard with mine, attached to its case, the behavior patterns would be different.

8 Dec 2011 - 5:35pm
Maurice
2009

@cfmdesigns

Very true and I agree.
Interestingly enough mine is actually attached to a swivel.
As for the allure, my teenage daughters also lost interest in the iPad within two weeks. It's still great for road trips. =)
We still all use laptops.

Note:
As a designer, my "work" iPad is put to use every day at work; it's a requirement for us when designing for multiple platforms. It definitely has its place in history and has definitely changed the way most of us interact, or expect to interact, with devices.

Again. Great comment!
Maybe I should look into getting a BT keyboard (and a stylus for sketching) =)

8 Dec 2011 - 11:05pm
cfmdesigns
2004

Looking back to how Steve Jobs introduced the iPad, as falling between the iPhone and a laptop, I certainly find myself agreeing. The iPhone is for more immediate and small uses, the laptop for mouse and perhaps processor-intensive activities, and the iPad falls smack between -- bigger than the phone, more flexible than the laptop. It's a market slice which we didn't really know existed until we had something good to sit in it.

The BT keyboard allows it to infringe much more on the laptop's "space". With it and stuff like the iWork suite, the things I need the laptop for shrink massively (down to Dreamweaver and Photoshop, frankly, and I eagerly look forward to having a Dreamweaver equivalent on the iPad; I'm sure it's not too far away).

I do like the idea of a wall mount option like you described, so long as I didn't forget to take it off frequently.

9 Dec 2011 - 12:44pm
marcintreder
2011

Great discussion, guys. Just one thing to add: don't forget that we're evolutionarily adapted to use our hands as the primary way to work with any tool. Eye-hand coordination, enormously prehensile hands, the shape and construction of the thumb: that's the way it is. Using hands is quite comfortable. I guess the only way to really make use of more parts of the human body for interacting with tools would be to move toward communication with tools. Using the face (facial muscles can express so much!), voice and maybe whole-body language might be a good direction for the future. Complicated, but promising.

Marcin

9 Dec 2011 - 2:40pm
fingerpuk
2011

One thing to consider is accessibility: many people don't have the ability to use anything other than simple gestures with one finger, and many don't even have that. As interaction designers we must never forget that although we have the ability to make very pretty, fun GUIs, that's not what we're about.

I believe good design, not just interaction design but design in general, is about removing everything you don't need until you have the core idea laid bare. The problem with the Microsoft (and Nokia) visions of the future is that they are not representative of what is most likely to happen; all the gadgets and GUIs they display are about the consumption of content instead of its creation, and they are all overdesigned and look like Minority Report on smaller mobile devices.

But they are fun to watch.
