In a groundbreaking revelation, scientists have unlocked the ability to discern where someone is looking by merely listening to the subtle sounds emitted by their ears. Dr. Jennifer Groh, a professor at Duke University, spearheads this innovative research that sheds light on the intricate connection between eye movements and auditory signals.
In a recent report published in Proceedings of the National Academy of Sciences, Dr. Groh and her team build upon their 2018 discovery that ears produce imperceptible noises during eye movements. These sounds, previously unnoticed, are now found to carry information about the direction of one’s gaze. Groh explains, “You can actually estimate the movement of the eyes, the position of the target that the eyes are going to look at, just from recordings made with a microphone in the ear canal.”
The reciprocal nature of this phenomenon is equally fascinating. Knowing where someone was looking, the researchers could predict the waveform of the corresponding ear sound. Groh speculates that these sounds arise when eye movements prompt the brain to engage the ear's sound-regulating machinery, possibly the middle ear muscles or the hair cells that amplify quiet sounds.
The practical implications extend beyond mere curiosity. One of the lead authors, Stephanie Lovich, suggests that understanding the relationship between ear sounds and vision could pave the way for novel clinical tests for hearing impairments. Lovich explains, “If each part of the ear contributes individual rules for the eardrum signal, then they could be used as a type of clinical tool to assess which part of the anatomy in the ear is malfunctioning.”
The study also challenges previous assumptions about the role of the ear's sound-regulating mechanisms. Much as the eye’s pupils adjust to light, the ears modulate hearing. The team’s 2018 discovery hinted that these mechanisms are also activated by eye movements, indicating a sophisticated line of communication between the eyes and the ears.
To delve deeper, the research team ran eye tests in which participants fixated a static green dot on a computer screen and then shifted their gaze as the dot jumped to new positions, while an eye tracker recorded their eye movements. Ear sounds, captured by microphone-embedded earbuds, were analyzed alongside those recordings. The results were remarkable—distinctive signatures in the ear sounds corresponded to different directions of eye movement, enabling the researchers to decode the sound waves and pinpoint where the participants were looking.
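To make the decoding idea concrete, here is a minimal, hypothetical sketch of how gaze direction might be regressed from ear-canal recordings. It is not the study's actual analysis pipeline: the data below are synthetic, and the epoch length, sampling choices, and ridge-regression model are assumptions made purely for illustration.

```python
# Illustrative sketch only (not the study's method): decode horizontal gaze
# displacement from ear-canal microphone epochs time-locked to eye-movement onset.
# All data here are synthetic; in the real experiment the audio would come from
# microphone-embedded earbuds and the gaze labels from an eye tracker.
import numpy as np
from sklearn.linear_model import RidgeCV
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

n_trials = 400    # one epoch per eye movement (assumed trial count)
n_samples = 200   # audio samples per epoch (assumed epoch length)

# Synthetic gaze labels: horizontal target displacement in degrees.
gaze_deg = rng.uniform(-18, 18, size=n_trials)

# Synthetic ear-canal audio: a small damped oscillation whose amplitude scales
# with the size and direction of the eye movement, buried in microphone noise.
t = np.linspace(0, 1, n_samples)
template = np.sin(2 * np.pi * 3 * t) * np.exp(-3 * t)
audio = gaze_deg[:, None] * template[None, :] * 0.02
audio += rng.normal(scale=0.5, size=audio.shape)  # measurement noise

# Each epoch's waveform serves as the feature vector; ridge regression maps it
# to the gaze label, and held-out trials test how well the mapping generalizes.
X_train, X_test, y_train, y_test = train_test_split(
    audio, gaze_deg, test_size=0.25, random_state=0
)
model = RidgeCV(alphas=np.logspace(-3, 3, 13)).fit(X_train, y_train)

pred = model.predict(X_test)
print(f"R^2 on held-out trials: {model.score(X_test, y_test):.2f}")
print(f"Example: true {y_test[0]:+.1f} deg, decoded {pred[0]:+.1f} deg")
```

In this toy setup the regression recovers the synthetic gaze signal because the waveform amplitude was built to scale with eye displacement; the real analysis faces far noisier signals and individual differences, but the basic logic of mapping eardrum waveforms to gaze is the same.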
The ongoing exploration by Dr. Groh includes investigations into how these ear sounds differ in individuals with hearing or vision impairments. Additionally, she is probing whether individuals without such impairments exhibit ear signals that can predict their performance in tasks requiring auditory and visual information integration.
As Dr. Groh aptly concludes, “We think this is part of a system for allowing the brain to match up where sights and sounds are located, even though our eyes can move when our head and ears do not.”
This fascinating intersection of sight and sound opens new avenues for understanding human perception and holds promise for the development of innovative clinical tools.
Source: Duke University