Have you ever been walking down the street and thought someone was talking to you, only to realize they were wearing iPhone earphones with a sneaky mouthpiece tucked into the wires? Well, a new level of confusion could soon be coming to Android phones.
With a new program that its developers hope to have ready in 2012, we may start seeing people simply stare at their phones without moving their hands. But they won't be willing their phones to work through brainwaves; they'll be moving their eyes. "Senseye" uses a small camera to track the user's pupils, analyzes the movements with a computer algorithm and then performs the corresponding action on the phone or tablet.
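To get a feel for the kind of pipeline described above, here is a minimal, purely illustrative sketch: a camera reports pupil coordinates, a simple rule classifies the gaze direction, and that direction is mapped to a phone or tablet action. Every name, threshold, and action mapping here is an assumption for illustration, not Senseye's actual code or API.

```python
def classify_gaze(x, y, dead_zone=0.2):
    """Map normalized pupil coordinates (-1..1 on each axis) to a coarse
    gaze direction. The dead zone treats small movements as "center"."""
    if abs(x) <= dead_zone and abs(y) <= dead_zone:
        return "center"
    # Whichever axis the eye moved farther along wins
    if abs(x) > abs(y):
        return "right" if x > 0 else "left"
    return "down" if y > 0 else "up"

# Hypothetical mapping from gaze direction to a UI action
ACTIONS = {
    "left": "previous page",
    "right": "next page",
    "up": "scroll up",
    "down": "scroll down",
    "center": "no-op",
}

def dispatch(x, y):
    """Turn one pupil reading into the action the device should perform."""
    return ACTIONS[classify_gaze(x, y)]

print(dispatch(0.8, 0.1))   # looking right -> "next page"
print(dispatch(0.0, -0.6))  # looking up -> "scroll up"
```

A real system would of course add calibration, smoothing over many camera frames, and blink detection, but the core idea — turning eye position into a discrete command — looks roughly like this.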
Watch the prototype in action:
Learn more about the technology and see more demos:
Although eye-tracking devices are not yet widely used, the technology itself is not entirely new: it has been used for years to help people without speech communicate. Check out this story of a 9-year-old girl who speaks through a machine simply by selecting words and phrases with her pupils.
[H/T PC World]