Sometimes voice control just isn’t enough. Imagine cranking the AC, dimming the lights, and queuing up your Ed Sheeran or Dua Lipa playlist with nothing more than your eyes and a slight flick of your wrist.
Maybe someday. Researchers at Carnegie Mellon University in Pittsburgh have developed a gaze-tracking tool called EyeMU that allows users to control smartphone apps, including streaming music services, with their eyes and simple hand gestures. No touchscreen needed.
The Future Interfaces Group, part of the school’s Human-Computer Interaction Institute, combined a gaze predictor with a smartphone’s motion sensors to enable commands. As in: Look at a notification to lock it in, then flick the phone to the left to dismiss it or to the right to respond. Or move the phone closer to enlarge an image or away to disengage the gaze control. And that leaves one hand free for other tasks, like sipping your latte.
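To make that pairing concrete, here’s a minimal sketch in Python of how a gaze estimate and a wrist flick might be fused into a single command. This is not EyeMU’s implementation; the gaze targets, gyroscope threshold, and function names are all hypothetical stand-ins for the two-modality idea.

```python
# A minimal sketch (not EyeMU's code) of fusing a gaze target with a
# motion gesture into one command. All names and thresholds are
# hypothetical illustrations.
from dataclasses import dataclass

@dataclass
class GazeEstimate:
    target: str        # e.g. "notification", "photo"
    confidence: float  # 0.0 - 1.0

def classify_flick(angular_velocity: float) -> str:
    """Classify a wrist flick from one gyroscope axis reading (rad/s)."""
    if angular_velocity > 2.0:
        return "flick_right"
    if angular_velocity < -2.0:
        return "flick_left"
    return "none"

def dispatch(gaze: GazeEstimate, gesture: str) -> str:
    """Gaze says what you're looking at; the gesture says what to do with it."""
    if gaze.confidence < 0.8 or gesture == "none":
        return "ignore"
    actions = {
        ("notification", "flick_left"): "dismiss",
        ("notification", "flick_right"): "respond",
    }
    return actions.get((gaze.target, gesture), "ignore")

# Example: the user stares at a notification and flicks the phone right.
print(dispatch(GazeEstimate("notification", 0.93), classify_flick(2.7)))  # respond
```

The point of the second modality is visible in the dispatch table: the gaze alone never triggers anything, which is what keeps casual glances from firing commands.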
Google’s take on eyes-only control, a free app called Look to Speak that was featured recently in an Oscars advertisement, was developed for people with disabilities. Try the Android-only app and you’ll see how adding EyeMU’s simple hand gestures could make a difference.
“The big tech companies like Google and Apple have gotten pretty close with gaze prediction,” says Chris Harrison, director of the Future Interfaces Group, “but just staring at something alone doesn’t get you there. The real innovation in this project is the addition of a second modality, such as flicking the phone left or right, combined with gaze prediction. That’s what makes it powerful. It seems so obvious in retrospect.”
Getting gaze analysis and prediction accurate enough to control a smartphone has long been elusive. Andy Kong, a senior computer science major at Carnegie Mellon, wrote a program that uses a laptop’s camera to track the user’s eyes and move the on-screen cursor accordingly, as an alternative to expensive commercial eye-tracking systems. That work became the foundation of EyeMU.
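As a rough illustration of that camera-to-cursor idea (not Kong’s actual program), the sketch below uses OpenCV’s stock Haar cascade to detect an eye in the webcam feed and nudges the cursor, via the third-party pyautogui package, toward the eye’s position in the frame. Real gaze prediction needs pupil localization, per-user calibration, and a learned model; this only shows the shape of the pipeline.

```python
# Rough sketch of webcam-driven cursor control: find an eye with OpenCV's
# bundled Haar cascade, then map its position in the frame to screen
# coordinates. This tracks where the eye sits in the image, not true gaze
# direction, so it is only a starting point.
import cv2          # pip install opencv-python
import pyautogui    # pip install pyautogui

cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_eye.xml")
screen_w, screen_h = pyautogui.size()
cap = cv2.VideoCapture(0)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    eyes = cascade.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5)
    if len(eyes) > 0:
        x, y, w, h = eyes[0]
        # Normalize the detected eye's center and scale it to the screen.
        fx = (x + w / 2) / frame.shape[1]
        fy = (y + h / 2) / frame.shape[0]
        pyautogui.moveTo(fx * screen_w, fy * screen_h)
    cv2.imshow("eye tracker sketch", frame)
    if cv2.waitKey(1) == 27:  # press Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```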
“Current phones only respond when we ask them for things, whether by speech, taps, or button clicks,” says Kong. “If the phone is widely used now, imagine how much more useful it would be if we could predict what the user wanted by analyzing gaze or other biometrics.”