Words help determine what we see

The language we speak affects half of what we see, according to researchers at the University of California, Berkeley, and the University of Chicago.

Scholars have long debated whether our native language affects how we perceive reality — and whether speakers of different languages might therefore see the world differently. The idea that language affects perception is controversial, and results have conflicted. A paper published this month in the Proceedings of the National Academy of Sciences supports the idea — but with a twist.

The paper suggests that language affects perception in the right half of the visual field, but much less, if at all, in the left half. The study, “Whorf Hypothesis is Supported in the Right Visual Field but not in the Left,” by Aubrey Gilbert, Terry Regier, Paul Kay, and Richard Ivry, is the first to propose that language may shape just half of our visual world.

Terry Regier is Associate Professor of Psychology at the University of Chicago. Gilbert is a graduate student in the Helen Wills Neuroscience Institute at UC Berkeley. Kay is Professor Emeritus of Linguistics and a senior research scientist at the International Computer Science Institute in Berkeley. Ivry is Professor of Psychology and director of UC Berkeley’s Institute of Cognitive and Brain Sciences, and a member of the Helen Wills Neuroscience Institute.

The finding is consistent with the organization of the brain, the researchers say. Language is processed predominantly in the left hemisphere of the brain, which receives visual information directly from the right visual field. “So it would make sense for the language processes of the left hemisphere to influence perception more in the right half of the visual field than in the left half,” said Regier, who proposed the idea behind the study.

The team confirmed the hypothesis through experiments designed and conducted in Richard Ivry’s lab at UC Berkeley, testing both Berkeley undergraduates and a patient whose brain hemispheres had been surgically separated. “We were thrilled to find this sort of effect and are very interested in investigating it further,” said Gilbert, the lead author on the study. “The evening I first reviewed the split-brain patient data I called people at home in my excitement to share the findings.”

Many of the distinctions made in English do not appear in other languages, and vice versa. For instance, English uses two different words for the colors blue and green, while many other languages, such as Tarahumara, an indigenous language of Mexico, instead use a single color term that covers shades of both. An earlier study by Paul Kay and colleagues had shown that speakers of English and Tarahumara perceive colors differently: English speakers found blues and greens to be more distinct from each other than Tarahumara speakers did, as if the English “green”/“blue” linguistic distinction sharpened the perceptual difference between the colors themselves. The present study essentially repeated the English part of that earlier test, with one addition the earlier study lacked: colors were presented to either the right or the left half of the visual field, to test whether language influences the right half of our visual world more than the left, as predicted by brain organization.

In each experimental trial of the present study, participants saw a ring of colored squares. All the squares were exactly the same color except for an “odd-man-out” of a different color, which appeared in either the right or the left half of the ring. Participants indicated which side of the ring the odd-man-out was on by pressing a key. Critically, the color of the odd-man-out either had the same name as the other squares (e.g. a shade of “green” among squares of a different shade of “green”) or a different name (e.g. a shade of “blue” among shades of “green”).

The researchers found that participants responded more quickly when the color of the odd-man-out had a different name from the color of the other squares, as if the linguistic difference had heightened the perceptual difference. But this occurred only when the odd-man-out was in the right half of the visual field, not when it was in the left half. This was the predicted pattern.
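The trial design can be sketched in a few lines of code. The sketch below is a minimal illustration, not the authors’ experimental software: the number of squares, the specific shade values, the ring layout, and the response function are all assumptions made for the example; only the overall design (a same-name or different-name odd-man-out shown in the left or right visual field, with a timed left/right keyboard response) comes from the description above.

```python
# Minimal sketch of the trial structure described above (not the authors' code).
# Shade values, ring size, and positions are illustrative assumptions.
import random
import time

SHADES = {
    "green_1": (0, 160, 90),   # hypothetical RGB values for two greens
    "green_2": (0, 190, 120),
    "blue_1": (0, 120, 200),   # and one blue
}

def make_trial():
    """Build one trial: a ring of identical squares plus one odd-man-out."""
    base = "green_1"
    odd = random.choice(["green_2", "blue_1"])   # same-name vs. different-name color
    side = random.choice(["left", "right"])       # visual field of the odd-man-out
    ring = [base] * 12
    # Assume positions 0-5 lie in the right half of the ring, 6-11 in the left.
    pos = random.randrange(0, 6) if side == "right" else random.randrange(6, 12)
    ring[pos] = odd
    return {"ring": ring, "side": side,
            "naming": "different" if odd == "blue_1" else "same"}

def run_trial(trial, respond):
    """Show the ring (presentation stubbed out), time the left/right response, score it."""
    start = time.perf_counter()
    answer = respond(trial["ring"])               # e.g. a key press returning "left" or "right"
    rt = time.perf_counter() - start
    return {"correct": answer == trial["side"], "rt": rt,
            "side": trial["side"], "naming": trial["naming"]}

if __name__ == "__main__":
    # The predicted pattern: faster reaction times for "different"-name odd-man-outs,
    # but only on trials where side == "right".
    print(make_trial())
```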

Earlier studies of the possible influence of language on perception tended to look for a simple yes-or-no answer: either language affects perception, or it does not. In contrast, the current findings support both views at once: language appears to sharpen visual distinctions in the right visual field but not in the left. The researchers conclude that “our representation of the visual world may be, at one and the same time, filtered and not filtered through the categories of language.”

From University of Chicago

