Smartphones today are impressive: we can complete everyday tasks by touching the screen, and some handsets even support air gestures, letting us control them by waving a hand over the device. But what if we could control our smartphones with our eyes? How cool would that be?
Scientists, including one of Indian origin, are building new mobile software that can accurately recognize where a person is looking in real time, an advance that could let smartphones and other devices be operated by eye movements.
In an effort to make eye tracking compact, accurate, and cheap enough to be built into smartphones, researchers are crowdsourcing the collection of gaze data and using it to teach mobile software how to determine where a person is looking.
Researchers at the Max Planck Institute for Informatics in Germany, the Massachusetts Institute of Technology (MIT), and the University of Georgia in the United States have recently been able to train software to recognize where a person is looking with an accuracy of about a centimeter on a smartphone.
Aditya Khosla, a graduate student at MIT, said, “It’s still not accurate enough to use for consumer applications.”
Nevertheless, Khosla believes the system’s accuracy will improve with more data. Until now, the technology has been expensive and has required hardware that made it difficult to add the capability to gadgets like phones and tablets.
It could make eye tracking far more widespread, and could also serve as a way to navigate your phone or play games without tapping the screen.
The researchers began by building an application called GazeCapture that collected data about how people look at their phones in varied surroundings outside the confines of a lab.
Users’ gaze was recorded with the phone’s front camera as they were shown blinking dots on the smartphone screen. To make sure they were paying attention, they were then shown a dot with an “L” or “R” inside it, and they had to tap the left or right side of the screen in response.
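The attention check described above can be sketched as a small validation step. This is a hypothetical simplification, not GazeCapture’s actual code: the function names and the idea of discarding failed samples are assumptions for illustration.

```python
# Hypothetical sketch of GazeCapture's attention check: the app shows a
# dot containing "L" or "R", and the user must tap the matching side of
# the screen for the recorded gaze frame to count as a valid sample.

def passes_attention_check(dot_letter: str, tap_side: str) -> bool:
    """Return True if the user's tap matches the letter shown in the dot."""
    expected = "left" if dot_letter == "L" else "right"
    return tap_side == expected

def collect_sample(dot_letter: str, tap_side: str, gaze_frame):
    """Keep a recorded gaze frame only if the attention check passed."""
    if passes_attention_check(dot_letter, tap_side):
        return gaze_frame  # user was paying attention; keep the sample
    return None  # discard samples from inattentive users
```

Filtering out inattentive responses this way would keep mislabeled gaze points out of the training data, which matters when collection happens unsupervised, outside a lab.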
The GazeCapture data was then used to train software called iTracker. The phone’s camera captures your face, and the software considers factors such as the position and direction of your eyes and head to determine where your gaze is focused on the screen.
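A minimal sketch of that inference step, under an assumption the article does not make: here a simple calibrated linear mapping stands in for the trained model, whereas the real iTracker learns the mapping from the GazeCapture data. The function name and calibration factors are invented for illustration.

```python
# Sketch only: combine eye position and a head-orientation correction
# into an estimated on-screen gaze point. A linear mapping stands in
# for iTracker's learned model.

def estimate_gaze_point(eye_center, head_offset, scale=(10.0, 10.0)):
    """Estimate where the eyes are focused on the screen.

    eye_center:  (x, y) position of the eyes in the camera frame
    head_offset: (dx, dy) correction for head direction
    scale:       hypothetical calibration factors (camera units -> cm)
    """
    x = (eye_center[0] + head_offset[0]) * scale[0]
    y = (eye_center[1] + head_offset[1]) * scale[1]
    return (x, y)  # estimated gaze point on the screen, in cm
```

The output being in centimeters matches the accuracy figure quoted earlier: an estimate within about a centimeter of the true gaze point.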
Khosla said that about 1,500 people have used the GazeCapture app so far, adding that if the researchers can get data from 10,000 users, they will be able to cut the software’s error rate in half, which would be good enough for a range of eye-tracking applications.
Many users will love this as soon as the app is released. Imagine being able to read email and text messages without touching the screen.
What are your thoughts on this?