Most hearing aid wearers know that just because you can hear a sound doesn’t mean you can understand it. If you’re out and about while wearing a hearing aid, you can be inundated with a variety of sounds without knowing what they mean. This apparent disconnect occurs because, while hearing and understanding are both critical parts of our daily life, they’re two distinct processes.
The New Study
A new collaborative study between researchers at Belgium’s KU Leuven and the University of Maryland seeks to better understand the relationship between hearing and comprehension. Using their newly developed automated test, the researchers analyzed EEG recordings to determine whether someone who heard a sound could actually understand it. An EEG, or electroencephalogram, uses electrodes placed on a patient’s head to record and analyze their brainwaves. The researchers wanted to see if EEGs could improve current tests of listening comprehension and shed light on the processes involved.
During the study, the researchers measured patients’ brain waves as they listened to sentences. This technique allowed the researchers to assess whether or not a patient actually comprehended a sentence based on their neurological response. Instead of asking patients if they understood a sound, the researchers sought to measure understanding directly at the brain level.
Essentially, the researchers looked at the correlation between the brain waves recorded while someone listens to a sentence and the brain waves expected if the person were processing and understanding it. If these brain waves are quite similar, the researchers can infer comprehension; if not, the patient probably didn’t understand.
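To make the idea concrete, here is a toy sketch of that correlation logic using synthetic signals. This is not the researchers’ actual pipeline; the signal shapes, noise levels, and the simple Pearson-correlation score are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "expected" neural response to a sentence (in practice this
# would come from a model of how the brain tracks speech).
expected = np.sin(np.linspace(0, 20 * np.pi, 1000))

# Simulated EEG from a listener who understood the sentence:
# it tracks the expected response, plus measurement noise.
eeg_understood = expected + 0.5 * rng.standard_normal(1000)

# Simulated EEG from a listener who did not understand:
# essentially unrelated to the expected response.
eeg_not_understood = rng.standard_normal(1000)

def comprehension_score(eeg, expected):
    """Pearson correlation between measured and expected brain waves."""
    return np.corrcoef(eeg, expected)[0, 1]

# A higher correlation suggests the brain tracked (and likely understood)
# the sentence; a near-zero correlation suggests it did not.
print(comprehension_score(eeg_understood, expected))      # high
print(comprehension_score(eeg_not_understood, expected))  # near zero
```

The comparison itself needs no input from the patient, which is what makes the approach objective and automatic.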
Potential Implications For Future Research
The benefit of this method is that a patient’s capacity to understand sounds can be measured independently of their current state of mind. It is both objective and automatic, so there’s no behavioral component involved. Thus, data collected through the EEG can be incredibly valuable for developing better diagnostic tests of speech comprehension issues.
Such a method also has a number of potential implications that could revolutionize hearing-related research and medical practice. By measuring understanding directly at the brain level, researchers could potentially diagnose patients who can’t actively and reliably participate in a hearing or speech understanding test, such as young children, people with developmental disabilities, or people in comas. In fact, EEG tests such as the one used in this study can work regardless of whether the patient is paying attention; babies often sleep through these tests but still provide informative results.
While this all sounds exciting, it’s important to keep in mind that the technology is at a very early stage. However, in the long run, this study and its conclusions could drastically change the way people with hearing loss and hearing aids are diagnosed and treated. One neat potential outcome is the development of smart hearing aids or cochlear implants that fine-tune their signals based on whether the wearer actually understands what they’re hearing.
For now, the technology is still young. The researchers are currently looking to understand the effects of a listener’s attention on their speech comprehension. Additionally, they are testing their technology on a broad range of hearing aid and cochlear implant users to better understand its effectiveness within these groups. An exciting future of hearing healthcare developments is upon us, and this study is a step in the right direction.