Future User Interface

During yet another engaging conversation (I keep making a mental note that we need to start recording these), we discussed the future of user interfaces as part of a sprawling, interconnected discussion over dinner at Din Tai Feng.

We debated how, in the future, we will no longer control our computers with a mouse, but with our fingers, hands and arms.

The important thing to note is that the mouse is still, by far, one of the most intuitive and hardest-to-replace pieces of control hardware today. It lets our wrist rest on the table while we move a single cursor effortlessly across a mouse pad or any other surface, clicking with precision on anything on the 2D GUI.

As suggested by many modern sci-fi movies, the future of computing looks more like what Tony Stark has in his lab: he moves his arms and hands dynamically to zoom in and out and manipulate the GUI, and directs what he wants with verbal commands.

I strongly objected that this is a viable form of control, because it is not possible to hold your arm up for a prolonged period. Ten minutes, maybe, and your arm is as good as dead. I argued that control has to happen through the hand and fingers alone, with a trigger of some sort that tells the computer whether gesture control is on or off (a rough sketch of that idea follows).
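To make the trigger idea a little more concrete, here is a minimal, entirely hypothetical sketch in Python: a dedicated trigger event flips a gesture mode on and off, and hand-tracking frames are only interpreted while the mode is active. The class and method names are my own invention, not any real gesture-tracking API.

```python
from enum import Enum, auto


class GestureMode(Enum):
    IDLE = auto()    # hand and finger movement is ignored
    ACTIVE = auto()  # hand and finger movement drives the GUI


class GestureController:
    """Hypothetical controller: a trigger (button, pinch, pedal, anything)
    toggles gesture tracking, so the arm can rest between short bursts."""

    def __init__(self):
        self.mode = GestureMode.IDLE

    def on_trigger(self):
        # The trigger event flips whether hand/finger input is interpreted.
        self.mode = (GestureMode.ACTIVE if self.mode is GestureMode.IDLE
                     else GestureMode.IDLE)

    def on_hand_frame(self, frame):
        # Forward tracking data to the GUI only while gesture mode is active.
        if self.mode is GestureMode.ACTIVE:
            return frame  # e.g. hand position / finger pose for the cursor
        return None
```

The point of the sketch is only the separation: tracking can run continuously, but it only counts as input while the user has explicitly switched it on.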

Joshua, after pondering, felt that the on/off could perhaps be even simpler than a button: a rhythm that notifies the computer. I find this extremely interesting and groundbreaking. Toggling on and off with the same hand means both hands are freed up for control (see the sketch after this paragraph).
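Joshua's rhythm idea could be as simple as counting taps: if the same hand taps out a short pattern within a time window, gesture mode flips. A hypothetical sketch, where the tap count and window length are arbitrary assumptions of mine:

```python
import time


class RhythmToggle:
    """Hypothetical rhythm trigger: a set number of taps from the same hand
    inside a short window toggles gesture mode, with no separate button."""

    def __init__(self, taps_required=3, window_s=0.9):
        self.taps_required = taps_required
        self.window_s = window_s
        self.tap_times = []
        self.active = False

    def on_tap(self, now=None):
        now = time.monotonic() if now is None else now
        self.tap_times.append(now)
        # Keep only taps that fall inside the rhythm window.
        self.tap_times = [t for t in self.tap_times if now - t <= self.window_s]
        if len(self.tap_times) >= self.taps_required:
            self.active = not self.active  # flip gesture mode on/off
            self.tap_times.clear()
        return self.active


toggle = RhythmToggle()
toggle.on_tap(0.0)
toggle.on_tap(0.3)
print(toggle.on_tap(0.6))  # True: three taps in under 0.9s switch gestures on
```

A real version would obviously need to tell a deliberate rhythm apart from ordinary hand movement, but the core of the idea fits in a few lines.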

Building on that, integrating limited arm gestures with dual-hand gestures opens up a whole universe of new possibilities for managing and controlling the GUI.

I think we are on to something.

More in our heads, and more to come.