A different approach

2 Jun 2011 - 6:09am
avin

Touch screen application interfaces have been following a common frame design for quite some time now.
When I got an opportunity to work on a couple of similar projects, the first thing I did was observe how people interact with these devices, and I went through most of the similar applications.
As I did, a lot of questions came to mind, such as:

1. Technology has grown so much; hasn't the way we interact with it grown as well?

2. Normally a person uses just one hand to interact with a non-touch-screen phone, but most people use both hands to interact with a touch screen. (Maybe one hand holds the phone while the other does the interacting; either way, both hands are occupied.)

3. Why aren't we thinking a little differently from the adopted pattern? Are we trying to play it safe?

This is when I decided to take a chance and come up with something different: a solution that makes life with touch screens simpler.

After weeks of brainstorming and wrestling with different options, I got a little closer to a possible solution.

What I did was look closely at how exactly the interaction happens. What do we use most to interact? The answer I found was the thumb. All I had to do then was understand how the thumb actually moves, and I found that it moves mainly along one axis without any strain.

Keeping the thumb as the primary means of interaction, I tried to come up with a design solution.

What are the most used elements (controls) in a media player?

1. Play
2. Pause
3. Next
4. Previous
5. Stop
6. Volume

With this in mind, I moved the first five controls to the right edge of the screen to make them easily accessible with the thumb (a rough layout sketch follows below).
For volume, there have always been physical controls on the top side of the phone, which means that when you hold the phone, the volume controls are always within very easy reach of your index and middle finger.
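To make the layout idea concrete, here is a minimal SwiftUI sketch of the concept (my own illustration, not the actual project code): the five primary controls stacked in a strip along the right edge, with the rest of the screen left free for content. The view name and the empty button actions are placeholders.

```swift
import SwiftUI

// A rough sketch of the thumb-oriented layout: the five most-used
// media controls sit in a vertical strip along the right edge,
// within the natural arc of the right thumb.
struct ThumbPlayerView: View {
    var body: some View {
        HStack {
            // Album art / track info would occupy the remaining area.
            Spacer()

            // Control strip hugging the right edge of the screen.
            VStack(spacing: 24) {
                Button("Previous") { /* skip back */ }
                Button("Play")     { /* start playback */ }
                Button("Pause")    { /* pause playback */ }
                Button("Next")     { /* skip forward */ }
                Button("Stop")     { /* stop playback */ }
            }
            .padding(.trailing, 16)
        }
    }
}
```

A mirrored version of the same strip along the left edge would be one way to accommodate left-handed use.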

Comments

4 Jun 2011 - 6:44am
Jochen Wolters

Avin:

Some thoughts on the three "requests for comment" you posit:

1. Yes, human-machine interaction has been expanded quite significantly by the introduction of multi-touch gestures.

2. I doubt that you can generalize this assumption. Depending on context, users will want to access their mobile phones one-handedly, e.g., when carrying a bag of groceries while they receive a call. Such kinds of constraints are caused by the usage context, not the device's UI.

3. Regarding your example of a media player's controls being positioned along the right edge of the screen: What about left-handed users?

Greetings,

Jochen.
