This article discusses the different types of interfaces that we are going to be confronting in the future, as well as ones that have already invaded our market. It is nice to see them all (at least the ones we can come up with for now) in one place. The article identifies five different interfaces beyond our current mouse point-and-click interface: multi-touch, gesture, voice recognition, eye tracking, and the brain-computer interface. It's interesting to consider how we went from a simple click of the mouse to finger swiping on iPad screens or scrolling on trackpads. While gesture technology hasn't come quite as far into our daily lives, it has been integrated into other areas. We went from a stationary controller for gaming, to the Wiimote, which allowed us to gesture, to the Kinect, which tracks our body movements (results may vary on this). This is extraordinary when you consider it; it has happened in the span of just a few years, really. It reminds me, though, of the radio quote from Douglas Adams's "The Hitchhiker's Guide to the Galaxy":
A loud clatter of gunk music flooded through the Heart of Gold cabin as Zaphod searched the sub-etha radio wave bands for news of himself. The machine was rather difficult to operate. For years radios had been operated by means of pressing buttons and turning dials; then as the technology became more sophisticated the controls were made touch-sensitive–you merely had to brush the panels with your fingers; now all you had to do was wave your hand in the general direction of the components and hope. It saved a lot of muscular expenditure, of course, but meant that you had to sit infuriatingly still if you wanted to keep listening to the same program.
I certainly hope that we don't sacrifice the usability of a device in exchange for the novelty of its interface. But we shall see.
Voice interface technologies have been expanding as well (Siri, for example, still has some problems, but how awesome is it to ask a non-sentient object a question and receive an answer? Maybe not the right one, but it still responds!). Eye tracking is used mostly in research (for now). My fear with eye-tracking interfaces, though, is how you control them. Do you flick your eyes? Stare at an instruction? Blink? We shall have to see.
The final one is the brain-computer interface, and this is the one I am most excited for. We already have people who are cyborgs in their own right, using special computerized lenses to actually see after losing an eye. The implications go far beyond convenience. Think of the speed of our interactions! The firing of synapses could be the speed at which we interact; just imagine the possibilities. Imagine losing an arm and a leg, having them replaced, and never batting an eye, because the only thing you have to deal with is the loss of feeling in the limb. You can control it just as before. And who knows: when your arm responds at the speed of synapses, maybe your expectation that there should be feeling could result in actual feeling?
This article was written in 2010, but I feel it's still very relevant. We have not fully come to grips with the future of interface design, and while I imagine there has been more than a little research done, it has not truly entered the consumer market. I look forward to seeing these new interfaces begin to sneak further into our daily lives.
If you want to read it for yourself, it's located at this site: Tech News Daily