Blog Explanation

This blog brings together content that is notable, important, or otherwise interesting from a human givens point of view.

Sunday 11 May 2014

Language Moves Your Inner Dancer
There are deep, surprising connections between words and the brain’s sense of motion through space
Apr 22, 2014 | By Lindsay Harris

If you are sitting at a computer right now, take a look at your keyboard. If your keyboard is laid out like most, the “delete” key is farther away from you than the “shift” key. Go ahead and hold down the “shift” key, and don’t release it until I tell you to.
This was the procedure followed by participants in a study recently published in the journal PLoS One. Participants were seated in front of a response apparatus with two buttons, one closer to them than the other. They read sentences, displayed one word at a time, while holding down one of the buttons, and their brain activity was recorded with scalp electrodes. After reading the last word of each sentence, they were instructed to move their hand to the other button.
Still holding down the shift key? Okay, move your hand to “delete” after you finish reading the following sentence:
As e-mail becomes more common, the amount of snail mail most people receive grows less and less.
The findings of this seemingly simple study might surprise you. On average, participants took longer to move their hand upward (e.g., from “shift” to “delete”) after reading a sentence that ended with the expression “less and less” than after reading a sentence that ended with “more and more.” The researchers argued that moving one’s hand in an upward motion is more difficult than moving one’s hand in a downward motion after reading “less and less,” because encountering that phrase activates areas of motor cortex that are also activated when lowering—not raising—one’s hand.
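For readers who like to see the mechanics, here is a toy sketch of the two-button, word-by-word reading paradigm described above. It is not the researchers’ code or stimuli: the “more and more” sentence, the presentation delay, and the use of the Enter key as a stand-in for moving to the second button are all invented for illustration. It simply shows how a reaction time can be measured from the onset of a sentence’s final word.

```python
# Toy sketch of the reading/reaction-time paradigm (illustrative only).
# A sentence is shown one word at a time; timing starts when the final
# word appears and stops when the reader presses Enter, standing in for
# moving the hand from one button to the other.
import time

SENTENCES = {
    "ends with 'less and less'": "As e-mail becomes more common, the amount of "
                                 "snail mail most people receive grows less and less.",
    # Hypothetical contrast sentence, not taken from the study's materials:
    "ends with 'more and more'": "As the holidays approach, the amount of junk "
                                 "mail most people receive grows more and more.",
}

def run_trial(sentence, delay=0.4):
    words = sentence.split()
    for word in words[:-1]:
        print(word, end=" ", flush=True)
        time.sleep(delay)          # present one word at a time
    print(words[-1])               # final word: reaction timing starts here
    start = time.perf_counter()
    input("Press Enter as soon as you have 'moved your hand': ")
    return time.perf_counter() - start

for condition, sentence in SENTENCES.items():
    rt = run_trial(sentence)
    print(f"{condition}: reaction time {rt:.3f} s\n")
```

Comparing the timed responses across the two sentence endings is, in miniature, the contrast the study ran, except that the researchers also crossed the endings with upward versus downward hand movements.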
These were not the first researchers to find evidence in support of the theory of embodied simulation, the notion that we understand the meanings of words by activating the parts of our brains we use to interact with the things those words refer to. (E.g., I know what you mean when you say “e-mail,” the theory goes, because the motor regions of the brain I use to place my fingers on a mouse, and to move my eyes across a screen, are activated when I hear the word.) Other studies have shown, for example, that it takes longer to move your hand toward you after reading the sentence “Close the drawer,” or to press a button with your fingers outspread after reading a sentence describing an action that requires a closed hand, like hammering a nail.
The PLoS One study, however, shows that our understanding of abstract quantifiers, not just concrete language, is grounded in perceptual simulation. The linguist George Lakoff has argued that we understand abstractions through the use of conceptual metaphors. For example, the colorless, shapeless, weightless quantifier “more” is understood in terms of the concrete direction “up.” (The metaphor results in idioms such as “rising incomes” and “high numbers.”) The fact that the “more is up” metaphor seems natural to us—the more one has of something, the taller the pile of it becomes, and the higher one must raise one’s arm to add to the pile—is why the metaphor was adopted in the first place. The PLoS One study, led by Connie Qun Guan of Beijing’s University of Science and Technology, provides evidence that a motor simulation of the act of stacking is undertaken whenever one encounters the phrase “more and more.”
Guan’s study is also the first to provide neurophysiological evidence of a causal relationship between sensorimotor activity and language comprehension during silent reading. The fact that we have a hard time lowering our hand when we see “more” suggests that the brain simulates an upward movement when it encounters the word, but not necessarily that the simulation is what allows us access to the word’s meaning. In principle, comprehension of language might lead to the activation of neural motor systems (you see the word “more” and access its meaning; you simulate stacking; and then you have trouble lowering your hand) rather than the other way around (the simulation triggered by seeing the word “more” allows you access to its meaning; difficulty lowering your hand is a secondary effect of the simulation). Guan’s team, however, argues that the electrical signals recorded from participants’ brains during the experiment show a motor response to key words that occurred faster than comprehending a word and then performing a simulation would allow. Motor cortex was active shortly after presentation of a key word, signaling that a simulation occurred early in the chain of neural events.
The embodied simulation phenomenon has practical implications. By now we have all heard the claims that talking on the phone while driving is dangerous, because it diverts a portion of our limited attentional resources away from the road. The research supporting these claims is strong, but evidence for embodied simulation suggests that chatting-while-driving is dangerous for another reason. If part of your sensorimotor system is engaged in language comprehension while you are having a conversation, then less of it is engaged in processing incoming visual, auditory, and tactile information from the real world. The impairment is so slight as to be insignificant in most situations—not much tends to happen in a second’s time during a walk around the block or a game of Scrabble, and a temporary reduction in visual processing capacity during these activities will at worst lead to a stubbed toe, or a double letter score where you might have had a triple.  One second in a car moving at 60 miles per hour, however, equals 88 feet.  Yet one more unanticipated consequence of our modern, speeded-up world—and one more reason to shut off that cell phone or shush the person in the passenger seat when navigating a sticky situation.
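The “88 feet” figure is just a unit conversion, but it is worth seeing spelled out. A quick check, sketched in Python (the constants are the standard mile-to-feet and hour-to-second conversions, nothing from the study):

```python
# Distance covered in one second at 60 miles per hour.
MPH = 60
FEET_PER_MILE = 5280
SECONDS_PER_HOUR = 3600

feet_per_second = MPH * FEET_PER_MILE / SECONDS_PER_HOUR
print(feet_per_second)  # 88.0 feet traveled in a single second
```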