A new study suggests that our ability to make sense of Groucho's words and Harpo's pantomimes in an old Marx Brothers movie is rooted in the same regions of the brain.
The researchers have shown that brain regions long identified as centers where spoken or written words are decoded may also be crucial in interpreting wordless gestures. The findings suggest that these regions may play a much broader role in interpreting symbols than previously believed and, for that reason, may be the evolutionary starting point from which language originated.
“In babies, the ability to communicate through gestures precedes spoken language, and you can predict a child’s language skills based on the repertoire of his or her gestures during those early months. These findings not only provide compelling evidence regarding where language may have come from, they help explain the interplay that exists between language and gesture as children develop their language skills,” says James F. Battey, Jr., M.D., Ph.D., director of the National Institute on Deafness and Other Communication Disorders (NIDCD).
Scientists have long known that sign language is processed largely in the same regions of the brain as spoken language. These regions include the inferior frontal gyrus, or Broca’s area, toward the front left side of the brain, and the posterior temporal region, commonly referred to as Wernicke’s area, toward the back left side. That signed and spoken language activate the same brain regions is not surprising, because sign language functions much like spoken language, with its own vocabulary and rules of grammar.
In this study, the researchers collaborated with scientists from Hofstra University School of Medicine, Hempstead, N.Y., and San Diego State University to find out whether non-language-related gestures are processed in the same regions of the brain as language. Non-language-related gestures are hand and body movements that convey meaning on their own, without having to be translated into specific words or phrases.
Two types of gestures were considered for the study: pantomimes, which mimic objects or actions, such as unscrewing a jar or juggling balls, and emblems, which are commonly used in social interactions and which signify abstract, usually more emotionally charged concepts than pantomimes do. Examples include a hand sweeping across the forehead to indicate ‘it’s hot in here!’ or a finger to the lips to signify ‘be quiet.’
While inside a functional MRI scanner, 20 healthy, English-speaking volunteers (nine men and 11 women) watched video clips of a person either acting out one of the two gesture types or voicing the phrases that the gestures signify. As controls, the volunteers also watched clips of the person making meaningless gestures or speaking pseudowords; these had been chopped up and randomly reorganized so the brain would not interpret them as language.
Each volunteer watched 60 video clips for each of the six stimulus types, with the clips presented in 45-second blocks at a rate of 15 clips per block. A mirror attached to the head coil enabled the volunteer to watch the video projected on the scanner room wall. The scientists then measured brain activity for each type of stimulus and examined the similarities, the differences, and any communication taking place between individual parts of the brain.
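To make the timing concrete, here is a minimal sketch in Python of the block-design arithmetic described above; the stimulus-type names, the grouping of the constants, and the randomized block ordering are illustrative assumptions, not details taken from the study's actual presentation software.

```python
# Illustrative sketch of the block-design arithmetic described in the article.
# The names and the randomized ordering below are assumptions for illustration.
import random

STIMULUS_TYPES = [
    "pantomimes",
    "emblems",
    "speech_for_pantomimes",
    "speech_for_emblems",
    "meaningless_gestures",
    "pseudo_words",
]

CLIPS_PER_TYPE = 60    # 60 video clips for each stimulus type
CLIPS_PER_BLOCK = 15   # 15 clips presented per block
BLOCK_SECONDS = 45     # each block lasted 45 seconds

blocks_per_type = CLIPS_PER_TYPE // CLIPS_PER_BLOCK   # -> 4 blocks per type
seconds_per_clip = BLOCK_SECONDS / CLIPS_PER_BLOCK    # -> 3 seconds per clip

# One possible (assumed) randomized schedule of all 24 blocks:
schedule = [t for t in STIMULUS_TYPES for _ in range(blocks_per_type)]
random.shuffle(schedule)

print(f"{blocks_per_type} blocks per stimulus type, {seconds_per_clip:.0f} s per clip")
print(f"Total presentation time: {len(schedule) * BLOCK_SECONDS / 60:.0f} minutes")
```

Under these assumptions, the numbers reported in the article imply three seconds per clip and 24 blocks in all, about 18 minutes of video per volunteer.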
The authors found that both the gesture and the spoken-language stimuli strongly activated the inferior frontal and posterior temporal areas, the long-identified language regions of the brain.
“If gesture and language were not processed by the same system, you’d have spoken language activating the inferior frontal and posterior temporal areas, and gestures activating other parts of the brain. But in fact we found virtual overlap,” says senior author Allen Braun, M.D.
“Our results fit a longstanding theory which says that the common ancestor of humans and apes communicated through meaningful gestures and, over time, the brain regions that processed gestures became adapted for using words. If the theory is correct, our language areas may actually be the remnant of this ancient communication system, one that continues to process gesture as well as language in the human brain,” he explains.
Current thinking in the study of language is that, like a smart search engine that pops up the most appropriate website at the top of its search results, the posterior temporal region serves as a storehouse of words from which the inferior frontal gyrus selects the most suitable match.
The study authors suggest that rather than being limited to interpreting words alone, these regions may be able to apply meaning to any incoming symbol, be it a word, a gesture, an image, a sound, or an object.
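As a loose illustration of that analogy, and emphatically not a model of the brain, the Python sketch below shows a “storehouse” that returns candidate meanings for any incoming symbol, word or gesture alike, and a “selector” that picks the best-scoring match; the symbols, meanings, and scores are all invented for the example.

```python
# Purely illustrative analogy: a "storehouse" of candidate meanings and a
# "selector" that picks the best match, echoing the proposed division of
# labor between the posterior temporal region and the inferior frontal
# gyrus. All entries and scores below are invented.

STOREHOUSE = {
    "gesture:finger_to_lips":       [("be quiet", 0.9), ("thinking", 0.3)],
    "gesture:hand_across_forehead": [("it's hot in here", 0.8), ("relief", 0.6)],
    "word:quiet":                   [("be quiet", 0.95), ("calm", 0.5)],
}

def select_meaning(symbol: str) -> str:
    """Pick the most suitable candidate meaning, like a search engine
    putting the best match at the top of its results."""
    candidates = STOREHOUSE.get(symbol, [])
    if not candidates:
        return "<no stored meaning>"
    meaning, _score = max(candidates, key=lambda pair: pair[1])
    return meaning

print(select_meaning("gesture:finger_to_lips"))  # -> be quiet
print(select_meaning("word:quiet"))              # -> be quiet
```

Note that in this toy picture a gesture and a word can resolve to the same stored meaning, which is the intuition behind the overlap the study reports.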
According to Dr. Braun, these regions may also offer a hint about how language evolved. He adds that a better understanding of the brain systems that support gesture and words may assist in the treatment of some patients with aphasia, a disorder that hinders a person’s ability to produce or understand language.
The findings of the study have been published in the journal Proceedings of the National Academy of Sciences (PNAS).