How Does Our Brain Process Language?
Verbal communication includes speech, the tone of the words, and other sounds, like laughter, crying, or expressions of alarm.
How do our brains help us understand verbal information from others?
Our ears take in all the sounds of the world around us (people, cars, birds, music, etc.). At the very first step of the neural pathway in the ear, a built-in filter enhances our ability to hear speech sounds by suppressing unwanted ones. That’s why you can follow one person talking at a noisy party!
All sound information projects up to a particular part of our brain, the auditory cortex, through the following pathway[1]. Each step of the pathway processes, or analyzes, different features of a sound, such as frequency, intensity, and location. The pathway is not purely ascending: feedback loops occur at every step. First, our inner ear separates sounds by frequency. Higher-pitched sounds have higher frequencies; lower-pitched sounds, lower frequencies. The frequencies are tonotopically mapped along the basilar membrane of the cochlea, meaning each position along the membrane responds best to a particular frequency. Louder, or more intense, sounds are conveyed to the brain by increased activity of the cochlear nerve fibers. The cochlear nerve terminates in the cochlear nuclear complex in the brain stem. From there, auditory information passes to a midbrain region, the inferior colliculus, which relays it to the medial geniculate nucleus of the thalamus in the forebrain; the thalamus, in turn, sends the information up to the primary auditory cortex.
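The tonotopic map described above is often approximated in the hearing-science literature by the Greenwood function. The sketch below is a minimal illustration, assuming the commonly cited human parameters (A ≈ 165.4 Hz, a ≈ 2.1, k ≈ 0.88), which come from that literature rather than from this article:

```python
def greenwood_frequency(x: float) -> float:
    """Approximate the best frequency (Hz) at relative position x along the
    basilar membrane, where x = 0 is the apex (low frequencies) and x = 1 is
    the base (high frequencies). Parameters are the standard human fit
    (an assumption from the hearing literature, not from this article)."""
    A, a, k = 165.4, 2.1, 0.88
    return A * (10 ** (a * x) - k)

# The mapped range roughly matches human hearing, about 20 Hz to 20 kHz:
print(round(greenwood_frequency(0.0)))  # apex: roughly 20 Hz
print(round(greenwood_frequency(1.0)))  # base: roughly 20,000 Hz
```

Note how a single smooth function turns position on the membrane into pitch, which is what "tonotopic mapping" means in practice.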
Sounds that make up speech then take their own special pathway, projecting up to brain areas unique to humans and to language, including Broca’s area in the inferior frontal gyrus, Wernicke’s area in the superior temporal gyrus, a region of the middle temporal gyrus, and the inferior parietal and angular gyri of the parietal lobe[2]. Each language area serves a different function.
To understand a sentence, the brain has to break it into parts. The first step is to separate speech sounds from non-speech sounds. The brain site where this happens has not been conclusively identified. However, a particular region of the auditory cortex has been found to recognize phonemes, the smallest units of speech sound (such as individual consonants and vowels), and distinguish them from non-speech. Another region analyzes tone.
The next step is to understand semantics and syntax. Semantics refers to the meanings of words. Syntax refers to the rules for combining words into phrases and sentences, and for understanding the relationships among words.
Scientists are still debating different theories for how semantic and syntactic information are processed, and how they are integrated. We do know that both the anterior and posterior parts of Wernicke’s area in the superior temporal gyrus are involved in processing syntax. When word lists are presented alongside sentences, these regions activate only for the sentences[3].
The posterior region of Wernicke’s area also appears to be involved in processing semantic information. Interestingly, this region has also been shown to integrate speech, motion, and face processing. All of these components help us interpret speech, which is why it is easier to understand speech when watching someone speak than when listening alone. While the role of Broca’s area is still up for debate, it is known to be involved in both language production and comprehension[4]. Comprehending speech also requires working, or short-term, memory, as our brain keeps track of what the verb is and what it acts on in the sentence.
In neurotypical people, the left side of the brain is more important in the basic processing of language. The right hemisphere, however, is engaged in interpreting prosody, the pitch and tone of a sentence. In English, for example, we know a sentence is a question because the speaker’s voice rises at its end.
The processing of language is complex. An amazing thing about our brain is that all of this processing takes place in a little over half a second!
References
[1] Kandel ER, Schwartz JH, Jessell TM (2000) Principles of Neural Science, 4th edition. McGraw-Hill, New York. 1414 pp.
[2] Friederici AD (2011). “The brain basis of language processing: From structure to function.” Physiol Rev 91:1357-1392.
[3] Ibid.
[4] Ibid.