Language affects half of what we see
By William Harms, University of Chicago, and Robert Sanders, UC Berkeley Media Relations | 31 January 2006
BERKELEY – The language we speak affects half of what we see, according to researchers at the University of California, Berkeley, and the University of Chicago.
Scholars have long debated whether our native language affects how we perceive reality – and whether speakers of different languages might therefore see the world differently. The idea that language affects perception is controversial, and results have conflicted.
A paper published this month in the journal Proceedings of the National Academy of Sciences supports the idea – but with a twist. The paper suggests for the first time that language affects perception in the right half of the visual field, but much less, if at all, in the left half.
The paper, “Whorf Hypothesis is Supported in the Right Visual Field but not in the Left,” is by Aubrey Gilbert, Richard Ivry and Paul Kay at UC Berkeley and Terry Regier at the University of Chicago.
This new finding is suggested by the organization of the brain, the researchers say. Language function is processed predominantly in the left hemisphere of the brain, which receives visual information directly from the right visual field. “So it would make sense for the language processes of the left hemisphere to influence perception more in the right half of the visual field than in the left half,” said Regier, an associate professor of psychology at the University of Chicago who proposed the idea behind the study.
The team confirmed the hypothesis through experiments designed and conducted in psychology professor Ivry’s lab at UC Berkeley.
“We were thrilled to find this sort of effect and are very interested in investigating it further,” said Gilbert, a UC Berkeley graduate student in the Helen Wills Neuroscience Institute and the study’s lead author. The experiments tested UC Berkeley undergraduates and also a patient whose brain hemispheres had been surgically separated.
Many of the distinctions made in English, such as between colors, do not appear in other languages, and vice versa, according to the researchers. For instance, English uses two different words for the colors blue and green, while many other languages – such as Tarahumara, an indigenous language of Mexico – instead use a single color term that covers shades of both blue and green. An earlier study by Kay, a UC Berkeley professor emeritus of linguistics and a senior research scientist at the International Computer Science Institute in Berkeley, and his colleagues had shown that speakers of English and Tarahumara perceive colors differently: English speakers found blues and greens to be more distinct from each other than did speakers of Tarahumara, as if the English green/blue linguistic distinction sharpened the perceptual difference between the colors themselves.
Subjects were shown a circle of squares and asked whether the odd-colored square was on the left or right. When the odd man out was on the right, subjects spotted a blue square among green squares more quickly than they spotted a different shade of green among green squares. When the odd square was on the left, there was no such difference in response times.
The present study essentially repeated the English part of that earlier test, but also made sure that colors were presented to either the right or the left half of the visual field – something the earlier study hadn’t done – so as to test whether language influences the right half of our visual world more than the left half, as predicted by brain organization.
In each experimental trial of the current study, participants saw a ring of 12 small colored squares. All but one of the squares were the same color. The “odd man out” square appeared in either the right or left half of the circle, and participants were asked to indicate its position with a keyboard response. Critically, the color of this square had either the same name as the other squares (green, for example, but a different shade of green from the other squares), or a different name (a shade of blue, while the others were all a shade of green).
Participants responded more quickly when the color of the odd-man-out square had a different name than the color of the other squares – as if the linguistic difference had heightened the perceptual difference – but this occurred only when the odd-man-out square was in the right half of the visual field, not the left. This was the predicted pattern.
Earlier studies addressing the possible influence of language on perception tended to look for a simple “yes” or “no” answer: Either language affects perception, or it does not. In contrast, the current findings support both views at once. Language appears to sharpen visual distinctions in the right visual field, and not in the left. In their paper, the researchers conclude that our representation of the visual world “may be, at one and the same time, filtered and not filtered through the categories of language.”
The work was supported by the National Science Foundation and the National Institutes of Health.