What you hear can change what you see

Getting a verbal cue - hearing a word or an instruction - can change what you see. In a study published today, scientists report that spoken language can alter your perception of the visible world.

The study in PLoS One reveals that people given a series of visual tests scored dramatically better when they were first prompted with a verbal cue. Asked to find a specific letter in a crowded picture, people were much more likely to find it when they were given the auditory cue "letter B" beforehand. Interestingly, being shown an image of the letter B before looking at the picture did not help them pick out the letter any better than a control group could.

This study has implications for fields as diverse as aviation and the military, where people have to assimilate a lot of visual information while also taking verbal commands. It might be possible, for example, to help people locate targets or patterns by linguistically prompting them as they look out a window or pore over sets of images.

So why do verbal cues work when visual cues don't? The scientists speculate:

Interestingly, although auditory verbal cues increased detection sensitivity, visual cues did not. This finding makes some sense when one considers that linguistic cues involve a non-overlapping format of sensory information that is globally statistically independent of the visual format of information in the detection task itself. By contrast, visual cues involve the same format of information as the detection task, and therefore do not provide converging sensory evidence from independent sources when the to-be-detected stimulus is presented.

In other words, a combination of auditory and visual stimuli creates a kind of triangulation effect in your brain, allowing you to zero in on the sought-after object more easily.
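If you want a feel for why independent evidence helps, here's a back-of-the-envelope simulation in Python - my own illustration, not the researchers' model or code. It uses the classic signal detection measure d' and some made-up numbers to show that averaging two statistically independent noisy "channels" separates signal from noise better than either channel alone:

    # A minimal sketch of the "converging evidence" idea, in the spirit
    # of signal detection theory. Not from the paper: two statistically
    # independent noisy readings of the same signal, averaged together,
    # separate "target present" from "target absent" better than one
    # reading alone. All numbers here are illustrative assumptions.
    import random
    import statistics

    random.seed(0)
    TRIALS = 20000
    SIGNAL = 1.0  # assumed signal strength (arbitrary units)
    NOISE = 1.0   # assumed noise standard deviation per channel

    def sensitivity(n_channels):
        """Rough d': separation of the two response distributions,
        in units of their pooled spread."""
        present, absent = [], []
        for _ in range(TRIALS):
            # Each channel adds its own independent noise; combine by averaging.
            present.append(statistics.mean(
                SIGNAL + random.gauss(0, NOISE) for _ in range(n_channels)))
            absent.append(statistics.mean(
                random.gauss(0, NOISE) for _ in range(n_channels)))
        pooled = ((statistics.pvariance(present)
                   + statistics.pvariance(absent)) / 2) ** 0.5
        return (statistics.mean(present) - statistics.mean(absent)) / pooled

    print(f"one source of evidence:  d' = {sensitivity(1):.2f}")
    print(f"two independent sources: d' = {sensitivity(2):.2f}")
    # Prints roughly 1.00 vs 1.41 - independent noise partly cancels
    # when the sources are combined, so detection gets easier.

The key assumption is independence: a second reading of the same channel wouldn't help, because its noise would be correlated with the first - which is the researchers' explanation for why a visual cue adds nothing to a visual task.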

The researchers say they are just at the beginning of their research, and end their article with a lot of interesting questions for further study:

First, does the cuing effect generalize to more complex objects? Because the cuing effect was observed in a design that intermixed cued and uncued trials, the cue-induced facilitation must be a transient component, but its duration and temporal profile are at present unknown. Second, how general are the present findings of a cross-modality advantage for visual detection? Future work will need to explore whether the cross-modality advantage is present in the reverse direction: is detection of an auditory target improved more by a visual cue than a corresponding auditory cue? Based on the present results, the answer is unclear; however, ongoing studies suggest that the format of the cue, in addition to its modality, is important: verbal auditory cues (e.g., "cow") facilitated visual identification and discrimination more than nonverbal auditory cues (e.g., the sound of a cow mooing). Finally, future research will need to investigate the process by which learning to associate new labels with new stimuli enhances detection of these stimuli. Such work may inform our understanding of how, and to what degree, learning different languages can induce differences in perceptual processing.

This last question is particularly interesting. Do people who know multiple languages stand a better chance of identifying objects or patterns because they can use different "perceptual processes" to see the world around them? If you're interested in how perception works in the brain, you really must read this whole article - the researchers also include a very detailed account of all the experiments they did and how they verified their results.

via PLoS One

Image by Harsányi András/Shutterstock