Seminar Recap – Listening through noise: Search for autism biomarkers

It’s not easy to see signs of autism in a child who is barely walking and talking.

So while doctors who know what to look for can accurately diagnose an autism spectrum disorder when a child is 18 to 24 months old — allowing treatment to begin at the earliest possible stage — the average diagnosis doesn’t come until a child is 4 years old.  Higher-functioning cases aren’t discovered, on average, until a child is 6.

“In order to improve outcomes, we need to be doing a better job at detecting autism earlier,” said Loisa Bennetto, PhD, principal investigator at the Bennetto Lab at the University of Rochester.  “We need to find ways to do this before the language delays in autism really start to take hold.”

According to research findings from Bennetto and colleague Anne Luebke, PhD, one of the first signs may come from the ears.

Bennetto and Luebke shared their research, which was supported by a PILOT grant from the CTSI, at a March seminar in Helen Wood Hall auditorium.


Past studies have shown that children with autism struggle with visual cues.  Their comprehension of a spoken statement is aided less by seeing the speaker’s face than that of children without an autism spectrum disorder, and they are slower to react to a verbal description when a person is gesturing to them in addition to speaking.

Bennetto and Luebke wanted to find out if their hearing was different as well.

Their research consisted of a hearing-in-noise test in which youngsters with and without autism were asked to identify specific phrases over background babble or nondescript broadband noise.

For both the babble and the broadband test, children with autism scored significantly worse.  Researchers performed a similar test with a sequence of musical notes in broadband noise.  Again, children with autism scored significantly worse.

Moreover, children with autism were shown to have reduced otoacoustic emissions (OAEs) at certain frequencies of sound but not all frequencies.

“We saw robust group differences in OAE amplitudes in speech frequency regions,” said Luebke, an associate professor in biomedical engineering and neurobiology and anatomy who specializes in work on the cochlear efferent feedback pathway.


One of the important implications of the study is that children with autism are actually doubly disadvantaged when it comes to communicating.

“If people without autism have trouble hearing in background noise, seeing the speaker’s lips allows you to do a better job listening.  Kids with autism don’t have that advantage,” said Bennetto.  “So if they also have trouble with hearing in noise as well, then that’s a double problem for them.”

The hearing-in-noise language tests that Bennetto and Luebke used with the children would not work for newborns or toddlers, who wouldn’t be able to perform the call-and-response the tests require.

However, OAE tests are regularly used to measure the hearing of newborns.  So adapting an OAE test to use as an autism indicator could lead to earlier diagnoses — and more proactive treatment.

Said Bennetto: “It’s something that I think we could easily do for toddlers with autism.”
