Two different regions of the brain are critical to integrating semantic information while reading, a finding that could shed more light on why people with aphasia have difficulty with semantics, according to new research from UTHealth Houston.
The study, led by first author Elliot Murphy, PhD, postdoctoral research fellow in the Vivian L. Smith Department of Neurosurgery with McGovern Medical School at UTHealth Houston, and senior author Nitin Tandon, MD, professor and chair ad interim of the department in the medical school, was published today in Nature Communications.
Language depends largely on the integration of vocabulary across multiple words to derive semantic concepts, including reference to events and objects, and statements of truth. However, how people integrate semantic information while reading has remained poorly understood.
"Typically, we take pieces from different words and derive a meaning that's separate. For example, one of the definitions in our study was 'a round red fruit' — the word 'apple' doesn't appear in that sentence, but we wanted to know how patients made that inference," Murphy said. "We were able to expose the dynamics of how the human brain integrates semantic information, and which areas come online at different stages."
To uncover this, researchers studied intracranial recordings in 58 epilepsy patients who read written word definitions, which were either referential or nonreferential to a common object, as well as phrases that were either coherent ("a person at the circus who makes you laugh") or incoherent ("a place where oceans shop"). Sentences were presented on the screen one word at a time, and researchers focused their analysis over the time window when the final word in the sentence was presented.
Overall, they found that different areas of the language network showed sensitivity to meaning across a small window of rapidly cascading activity. Specifically, they discovered the existence of complementary cortical mosaics for semantic integration in two areas: the posterior temporal cortex and the inferior frontal cortex. The posterior temporal cortex is activated early on in the semantic integration process, while the inferior frontal cortex is particularly sensitive to all aspects of meaning, especially in deep sulcal sites, or grooves in the folds of the brain.
Murphy said these findings can help illuminate the inner dynamics of aphasia, a disorder that affects a person's ability to express and understand written and spoken language. It can occur suddenly after a stroke or head injury, or develop slowly from a growing brain tumor or disease.
People with aphasia often have difficulty with semantic integration, meaning that while they can understand individual words, they cannot make additional semantic inferences.
"Damage to both the frontal and posterior temporal cortexes disrupts semantic integration, which we see happen in individuals with various aphasias," Murphy said. "We speculate that this intricately designed mosaic structure helps make sense of the varying semantic deficits people experience after frontal strokes."
Co-authors with UTHealth Houston included Kathryn M. Snyder, MD/PhD student; and Patrick S. Rollo, research associate and third-year medical student, both with the Vivian L. Smith Department of Neurosurgery and the Texas Institute for Restorative Neurotechnologies (TIRN) at McGovern Medical School. Tandon is the Nancy, Clive and Pierce Runnels Distinguished Chair in Neuroscience of the Vivian L. Smith Center for Neurologic Research and the BCMS Distinguished Professor in Neurological Disorders and Neurosurgery with McGovern Medical School and a member of TIRN. Tandon is also a faculty member with The University of Texas MD Anderson UTHealth Houston Graduate School of Biomedical Sciences, where Snyder is also a student. Kiefer J. Forseth, MD, PhD, and Cristian Donos, PhD, both formerly with UTHealth Houston and now with the University of California at San Diego and the University of Bucharest in Romania, respectively, also contributed to the study.