However, I argue that major advances in UI will come from a close examination of inter-related factors such as physiology, technology and behavior.
The question of arm fatigue
One of the insights Steve Jobs shared was that touch screens on iMacs made no sense: arm fatigue sets in when you constantly manipulate a vertical screen. Surprisingly, LeapMotion attempts similar screen manipulation in a virtual vertical plane, with the added disadvantage of no tactile feedback when a gesture has registered. How do you know when a gesture begins and ends? This certainly does not look like an improvement in UI.

Others, perhaps, have been paying closer attention. Note that the new Microsoft Surface comes with a touchpad on its keyboard even though the Surface has a full touch screen.
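The begin/end question is essentially a segmentation problem: without a click or a touch, the system has to infer gesture boundaries from motion alone. As a rough illustration of why that is awkward, here is a minimal sketch of a velocity-threshold segmenter; the thresholds, the HandSample feed, and the class names are assumptions for illustration, not LeapMotion's actual API.

```python
from dataclasses import dataclass

# Illustrative thresholds; a real system would have to tune these empirically.
START_SPEED = 0.25      # metres/second above which we assume a gesture has begun
END_SPEED = 0.05        # metres/second below which we assume the hand has settled
END_DWELL_FRAMES = 15   # consecutive slow frames required to declare the gesture over


@dataclass
class HandSample:
    x: float
    y: float
    z: float
    speed: float  # magnitude of the hand velocity reported by the tracker


class GestureSegmenter:
    """Splits a continuous stream of hand samples into discrete gestures."""

    def __init__(self) -> None:
        self.in_gesture = False
        self.slow_frames = 0
        self.current: list[HandSample] = []

    def update(self, sample: HandSample) -> list[HandSample] | None:
        """Feed one tracker frame; returns a completed gesture or None."""
        if not self.in_gesture:
            if sample.speed > START_SPEED:
                # Fast movement: guess that a gesture has started.
                self.in_gesture = True
                self.slow_frames = 0
                self.current = [sample]
            return None

        self.current.append(sample)
        if sample.speed < END_SPEED:
            self.slow_frames += 1
            if self.slow_frames >= END_DWELL_FRAMES:
                # The hand has been slow long enough: guess that the gesture ended.
                gesture, self.current = self.current, []
                self.in_gesture = False
                return gesture
        else:
            self.slow_frames = 0
        return None
```

Every "guess" in the comments is the point: with no tactile confirmation, both boundaries are statistical inferences, and any threshold will sometimes cut a gesture short or merge two gestures into one.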
The Kinect model
Coming back to 3D gestures, the Microsoft Kinect has a more natural model of gestures, one that may not be hyper-accurate but gets the job done. One can tell it is a winner from the rabid excitement in the developer community around hacking the system and doing cool things with it. Never one to miss an opportunity, Microsoft has posted an SDK for it. Once Kinect is embedded in laptops and mobile phones, the possibilities of combining it with AR (augmented reality) get quite interesting.

Another company with a similar concept is SoftKinetic, which is experimenting with a range of use cases, from flipping a presentation to AR-like interactions with large displays.
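To make a use case like "flipping a presentation" concrete, here is a minimal sketch of how a horizontal swipe might be recognized from tracked hand positions. The position feed, thresholds, and action names are hypothetical and are not the actual Kinect or SoftKinetic SDK API; they only show the shape of the logic.

```python
import time

# Illustrative swipe parameters; a shipping system would calibrate these per user.
SWIPE_DISTANCE = 0.40   # metres the hand must travel horizontally
SWIPE_WINDOW = 0.50     # seconds within which that travel must happen


class SwipeDetector:
    """Detects left/right hand swipes from a stream of (timestamp, hand_x) samples."""

    def __init__(self) -> None:
        self.history: list[tuple[float, float]] = []

    def update(self, hand_x: float, timestamp: float | None = None) -> str | None:
        """Feed one tracked hand x-position; returns an action name or None."""
        now = timestamp if timestamp is not None else time.time()
        self.history.append((now, hand_x))
        # Keep only samples inside the sliding time window.
        self.history = [(t, x) for t, x in self.history if now - t <= SWIPE_WINDOW]

        xs = [x for _, x in self.history]
        travel = xs[-1] - xs[0]
        if travel > SWIPE_DISTANCE:
            self.history.clear()
            return "next_slide"       # hand moved right fast enough: advance
        if travel < -SWIPE_DISTANCE:
            self.history.clear()
            return "previous_slide"   # hand moved left fast enough: go back
        return None
```

Coarse, whole-arm gestures like this are exactly where the Kinect model works well: the thresholds are forgiving, there is no need for millimetre accuracy, and the user gets visual confirmation from the slide changing rather than from tactile feedback.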