Glasses that track your eyes and facial expressions, without cameras? Though it may sound like science fiction, two recent technological advances are bringing it to life. The technologies, called GazeTrak and EyeEcho, are being developed at Cornell University by a group led by Ke Li, a doctoral student in information science. Because they use sonar instead of cameras, they offer longer battery life and better user privacy.
GazeTrak uses one speaker and four microphones arranged around the inside of each lens frame on a pair of glasses. The speakers emit pulses of inaudible sound that echo off the eyeball and are picked up by the microphones. By analyzing the millisecond-scale differences in these echoes, AI-based software can track the direction of the user's gaze. The system is unaffected by loud background noise and consumes only 5% as much power as conventional camera-based eye-tracking wearables.
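To give a flavor of the underlying signal processing, here is a minimal sketch, not the team's actual pipeline, of how the delay between an emitted pulse and its echo could be estimated with cross-correlation. The pulse shape, sample rate, and echo delay are all hypothetical values chosen for illustration:

```python
import numpy as np

def echo_delay_ms(emitted: np.ndarray, received: np.ndarray, sample_rate: int) -> float:
    """Estimate the delay (ms) between an emitted pulse and its echo by
    finding the lag that maximizes their cross-correlation."""
    corr = np.correlate(received, emitted, mode="full")
    lag = int(np.argmax(corr)) - (len(emitted) - 1)
    return 1000.0 * lag / sample_rate

# Hypothetical numbers: a 1 ms, 20 kHz pulse sampled at 96 kHz,
# echoing back 24 samples (~0.25 ms) later at reduced amplitude.
fs = 96_000
t = np.arange(0, 0.001, 1 / fs)
pulse = np.sin(2 * np.pi * 20_000 * t)
received = np.zeros(len(pulse) + 100)
received[24:24 + len(pulse)] += 0.3 * pulse
print(f"estimated echo delay: {echo_delay_ms(pulse, received, fs):.3f} ms")
```

In a real system these tiny delay differences across the four microphones, rather than any single delay, would feed the AI model that infers gaze direction.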
EyeEcho, on the other hand, uses one speaker and one microphone located next to each of the glasses' two arm hinges. It tracks facial expressions by analyzing how subtle movements of the facial skin change the time between each emitted pulse and its echo. After just four minutes of training, EyeEcho proved highly accurate at reading expressions, even during a range of everyday activities.
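As a rough illustration of the idea, and not EyeEcho's published method, the sketch below converts changes in round-trip echo delay into relative skin displacement using the speed of sound; the delay trace is invented for the example:

```python
import numpy as np

SPEED_OF_SOUND_MM_PER_MS = 343.0  # ~343 m/s in air at room temperature

def skin_displacement_mm(delays_ms: np.ndarray) -> np.ndarray:
    """Convert round-trip echo delays (ms) into skin displacement (mm)
    relative to the first frame. The sound travels to the skin and back,
    hence the division by 2."""
    return (delays_ms - delays_ms[0]) * SPEED_OF_SOUND_MM_PER_MS / 2.0

# Invented delay trace: the echo returns sooner as the cheek rises
# toward the microphone during a smile, then relaxes back.
delays = np.array([0.250, 0.248, 0.244, 0.242, 0.246, 0.250])
print(skin_displacement_mm(delays))  # negative values = skin moved closer
```

A learned model would then map such movement patterns over time to specific facial expressions.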
Because they can be integrated into third-party smart glasses or virtual reality headsets, GazeTrak and EyeEcho offer a more discreet and energy-efficient alternative to camera-based systems. These technologies open up exciting possibilities for future wearables, giving users new, privacy-preserving ways to engage with their devices.