The Computer Mouse of the Future
Part of what we strive to do at H+ is just what our name says: humanize technology. We want to make technology work better with people instead of having to train people to work better with technology.
The first electronic computers (as we know them) used punch cards, and later a command prompt, requiring users to memorize the specific commands for every action they wanted to perform.
A major advancement came in the 1960s, when a new input device was created for a research project at Stanford Research Institute. It was called the mouse because its small size and the cable trailing from its back made it resemble the animal.
It was not an immediate success, but eventually it came into more common use, thanks to its relative precision and ease of use.
Even now, the mouse reigns as the king of computer interfacing, for those same reasons. But recently - whether because of increased processing power, a growing awareness of the limitations of current human-computer interaction, or some combination of the two - there have been many new attempts to redefine how we interact with our technology.
One of the most successful of these technologies has been Microsoft's Kinect, which is designed to track a person's entire body, allowing full-body motions and gestures as input.
Originally released with the Xbox 360, it eventually came out for the PC as well, and has been extremely popular with researchers. Another very successful technology - also designed for a gaming console - was Nintendo’s Wii controller.
The Wii controller is different in that it still operates as a handheld controller, measuring where the player is pointing rather than the configuration of the player's body, and it even has an on-screen cursor. It is more of a blend between the traditional mouse pointer and newer body-tracking technology.
While the Kinect was good for full-body tracking, it wasn't very good for hand gestures. The Leap Motion was designed with this type of interaction in mind: it detects individual finger positions in 3D within a defined volume just above the sensor, allowing for precise finger gestures.
One of the problems with all of these peripherals is that they require the user to abstract a desired action into a physical movement - for instance, swiping to move forward or backward through a browser, or making a fist to select an object. I believe the future of interaction is to think an action and have it happen directly. And though that may still be a long way off, the beginnings of that technology exist in products like the Emotiv EPOC headset.
The headset uses EEG sensors to read the wearer's electrical brain activity and output a signal based on that information. The field of brain-computer interfacing (BCI) is very much in its infancy, but it is one I expect to be very important in the near future for how we interface with our technology. It will be the first time in our history as a species that we can directly will something to happen without making a physically abstracted gesture.
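To make the idea concrete, here is a toy sketch of the kind of signal chain such a headset implies - estimate the power in one frequency band of an EEG window, and fire a command when it crosses a threshold. Every name and number here (the functions, the 128 Hz sample rate, the alpha band, the threshold, the "select" command) is my own illustrative assumption, not any real headset's API:

```python
# Toy band-power trigger: illustrative only, not a real BCI API.
import numpy as np

def band_power(signal, fs, lo, hi):
    """Mean spectral power of a 1-D signal between lo and hi Hz."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    power = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= lo) & (freqs <= hi)
    return power[mask].mean()

def detect_intent(window, fs=128, threshold=1.0):
    """Map a short EEG window to a command: strong alpha-band
    (8-12 Hz) activity triggers a hypothetical "select" action."""
    if band_power(window, fs, 8, 12) > threshold:
        return "select"
    return None
```

A real BCI pipeline adds much more on top of this - band-pass filtering, artifact rejection, and a classifier trained per user - but the core step of turning brain activity into a discrete command looks broadly like the above.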
At H+, we want to make using technology as natural and intuitive as possible. So we experiment with the newest technology available, and we think as creatively as we can to build the most human-centric technology possible.
- Michael, Creative Developer
For more information, visit our website at http://hplustech.com