Produced by the Office of Marketing and Communications
UMD Doctoral Student, Multi-Institutional Partners Develop Method That Forgoes Bulky, Complex Instruments
Ordinary earbuds could stand in for expensive, uncomfortable and bulky auditory testing equipment, according to research from a UMD graduate student and his team.
Illustration by iStock
In the clamor of a crowded café or the din of a busy street, your brain performs a quiet miracle—tuning into one voice while filtering out the rest. This ability, known as auditory attention, is crucial to how we learn, communicate and connect.
But capturing and measuring this process in real-world settings has long been a stubborn challenge for researchers, thanks in part to equipment that’s inconvenient and complicated to operate, and that can’t always differentiate your body’s signals from all the other activity happening in and around your head.
A team of researchers from the University of Maryland, the University of Glasgow and Nokia Bell Labs, Cambridge, is working to change that—starting with something millions of people already own: ordinary earbuds. A paper detailing this system was presented recently at the 2025 IEEE International Conference on Acoustics, Speech and Signal Processing in Hyderabad, India.
The idea originated during UMD computer science doctoral student Harshvardhan Takawale’s internship at Nokia Bell Labs last summer. The Nokia team had been using sensory earbuds to monitor physiological signals like heart rate and blood pressure. But they began wondering: What if they could go beyond physical health and tap into cognitive states, like attention?
“That curiosity led to the central idea—using in-ear muscle contractions to detect auditory attention,” said Takawale, lead author of the study describing the system, which could stand in for the bulky EEG headsets or uncomfortable in-ear sensors commonly used in auditory attention tests. “It’s an exciting concept that bridges physiology, acoustics and cognitive sensing.”
The resulting system uses off-the-shelf earbuds and a barely noticeable ultrasonic tone to listen for tiny internal changes—specifically, subtle, involuntary muscle contractions inside the ear that occur when we concentrate on listening. Just like we might squint to focus visually, our ear muscles subtly adjust to enhance auditory focus.
The system emits an inaudible signal from a speaker inside a standard earbud. This signal bounces off the muscles within the ear and is picked up by a microphone in the same device. When these muscles contract, they subtly alter the shape and tension of the ear canal. The signal processing pipeline detects these resulting micro-vibrations and correlates them with the user’s attention state. When you're engaged, the vibrations become more stable; when your mind wanders, they grow erratic.
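The article doesn't give the paper's actual signal design, but the pipeline it describes can be sketched roughly in Python. The carrier frequency, window size and I/Q envelope demodulation below are all assumptions for illustration, not the authors' implementation: the idea is simply that a steady ear-canal echo yields a low-variance envelope ("focused"), while an erratic one yields a high-variance envelope ("distracted").

```python
import numpy as np

FS = 48_000          # assumed earbud sample rate
CARRIER_HZ = 20_000  # hypothetical near-ultrasonic probe tone

def probe_tone(duration_s: float) -> np.ndarray:
    """Inaudible sine tone played through the earbud speaker."""
    t = np.arange(int(FS * duration_s)) / FS
    return np.sin(2 * np.pi * CARRIER_HZ * t)

def echo_stability(mic: np.ndarray, win: int = 480) -> float:
    """Crude attention proxy: demodulate the microphone signal at the
    carrier frequency and measure how steady its amplitude envelope is
    across short (10 ms) windows. Lower values = steadier reflections."""
    t = np.arange(len(mic)) / FS
    # I/Q demodulation: recover the envelope of the carrier echo
    i = mic * np.cos(2 * np.pi * CARRIER_HZ * t)
    q = mic * np.sin(2 * np.pi * CARRIER_HZ * t)
    n = len(mic) // win
    env = np.hypot(i[:n * win].reshape(n, win).mean(axis=1),
                   q[:n * win].reshape(n, win).mean(axis=1))
    # Normalized variance of the envelope across windows
    return float(np.var(env) / (np.mean(env) ** 2 + 1e-12))
```

A steady-amplitude echo (a stand-in for "focused") scores lower on this metric than an echo whose amplitude wobbles over time ("distracted"); the real system would additionally isolate muscle-driven micro-vibrations from motion and noise.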
Takawale said validating these attention shifts in a controlled way was one of the biggest hurdles. The team addressed it using the well-documented psychological principle that attention is limited and shared across senses. When we concentrate on a visually demanding task, our auditory focus naturally declines.
Building on this, the researchers designed a study where participants alternated between focused listening and playing a cognitively demanding game—providing clear periods of “attention” and “distraction” for comparison.
To test the approach, eight participants wore the team’s wearable prototype on one ear, with over-the-head earmuffs to minimize external noise. In some segments, they concentrated solely on listening to audio; in others, they played the visual game while the audio continued in the background.
During focused moments, the system picked up clear, consistent vibration patterns. When attention waned, the signal became chaotic. That pattern held true for every participant. The system was able to distinguish attention states with nearly 86% accuracy—especially impressive for a method that does not rely on brain waves or eye tracking.
The team also explored how attention changes over time. By analyzing 20-second chunks of data in 10-second intervals, they were able to detect natural transitions in attention—whether sudden or gradual.
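That windowing scheme — 20-second chunks advanced in 10-second steps, so consecutive windows overlap by half — can be sketched in a few lines of Python (the function name and interface are illustrative, not from the paper):

```python
def sliding_windows(samples, fs, win_s=20.0, hop_s=10.0):
    """Split a recording into overlapping analysis windows:
    20-second chunks taken every 10 seconds, as described above."""
    win, hop = int(win_s * fs), int(hop_s * fs)
    return [samples[i:i + win] for i in range(0, len(samples) - win + 1, hop)]
```

Because each window shares half its samples with the next, a gradual drift in attention shows up as a smooth change in the per-window stability score rather than an abrupt jump.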
Still, the system has limitations. It cannot yet distinguish attention-related signals from unrelated body movement, and accuracy may vary depending on ear shape or earbud fit.
Takawale, who is advised by Nirupam Roy, an assistant professor of computer science with an appointment in the University of Maryland Institute for Advanced Computer Studies, said the team plans to improve the system’s reliability and test it across more users and listening situations.
Practical applications could be just around the corner. Imagine a podcast app that marks where your attention drifted, or hearing aids that adjust to your focus in real time. In education, the system might track student engagement during online lectures. Down the line, it could even help detect attention disorders earlier or power interfaces that respond to our mental state.
The beauty of the approach lies in its simplicity and scalability.
“Our system runs on the speaker and mic already inside most earbuds,” Takawale said. “That makes it ready for the real world.”