SOUTH KINGSTOWN — The University of Rhode Island’s Collaborative Cognitive Neuroscience Lab is partnering with Yale- and University of Connecticut-affiliated Haskins Laboratories in New Haven on a study of how children living with autism learn language. Researchers are expected to begin recruiting study participants later this summer.

With a focus on children ages 10 through 18, the research will “incorporate electroencephalogram (EEG) sensors that monitor brain activity with eye-tracking technology to determine the level of audio-visual integration occurring as children observe human and computer-animated faces speaking,” according to URI.

“One way we learn language is by looking at people’s faces and expressions and watching how their mouths move in addition to listening to what they are saying,” said URI assistant professor of communicative disorders Alisa Baron. “In children with autism, we find that they have difficulty making eye contact or looking at people’s faces as they speak. So they are missing out on critical information regarding language and communication.”

Said her colleague Vanessa Harwood: “Our goal for the end result of this study is to develop effective interventions that will support and reinforce those types of looking behaviors that may help improve language processing.”

Parents of children living with autism are also invited to participate in the study, which will involve several sessions. Parents interested in learning more about how they or their children can take part are asked to email ccnl@etal.uri.edu.

Recruitment will also involve efforts by URI’s Speech and Hearing Center and the Rhode Island Consortium on Autism Research and Treatment.