In the researcher’s words: Neural mechanisms underlying attentional selection

Sunday, May 1, 2011

Sabine Kastner, professor of psychology and the Princeton Neuroscience Institute

Real-world scenes, such as cityscapes or mountain vistas, are cluttered and contain many different objects. The capacity of the visual system to process the information that is present in these scenes is rather limited, so the brain has developed neural mechanisms to select the information that is most relevant for guiding current behavior.

Traditionally, the problem of attentional selection from natural scenes has been studied in laboratory settings, where the clutter of a scene is mimicked by visual displays containing a large number of simpler objects, such as shapes or letters. While this approach helps to control many physical parameters of the stimuli present in a visual display, such reductionism may fail to capture the true complexity of natural vision. In an attempt to understand attentional selection from ecologically more relevant visual displays, we have begun to investigate the neural mechanisms underlying natural scene categorization in the human visual system using functional brain imaging with an MRI scanner.

In daily life, we often extract visual object information at the categorical level from our environment. For example, when we cross the street, we look for cars as a general category, and not necessarily for a specific exemplar that belongs to that category. In a study reported in Nature, we asked our subjects to detect either people or cars in more than 2,000 outdoor scenes that were briefly presented while our subjects underwent brain imaging.

[Figure: fMRI activity in the object-selective cortex (shown in orange) while subjects detected either people or cars in more than 2,000 outdoor scenes. The activity depended on the subject's task, reflecting only the searched-for category. Image courtesy of Marius Peelen and Sabine Kastner]

We found that the scene-evoked neural activity in the object-selective cortex, a brain region that responds more strongly to objects than to other visual stimuli, depended entirely on the subject's task: it reflected information about the category "people" when subjects performed the people-detection task but not when they performed the car-detection task, and vice versa. These results suggest that looking for a particular piece of information in a scene renders us "blind" to other information that is present but not relevant to behavior. Our findings imply that neural activity in sensory processing areas is primarily determined by internally generated signals related to ongoing behavior, rather than by the physical properties of the visual world.
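Claims of this kind, that a response pattern "reflects information about" a category, are typically established with a pattern-decoding analysis: the multivoxel response evoked by a scene is compared against category-specific "template" patterns, and the scene is assigned to the best-matching category. The sketch below is a minimal, hypothetical illustration of that idea using simulated data, not the study's actual pipeline; all names, sizes, and signal weights are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
n_voxels = 100  # hypothetical number of object-selective voxels

# Invented category "template" patterns, standing in for averaged
# responses to isolated people or cars across those voxels.
people_template = rng.normal(size=n_voxels)
car_template = rng.normal(size=n_voxels)

def decode_category(scene_pattern, templates):
    """Assign a scene-evoked pattern to the template it correlates with most."""
    scores = {name: np.corrcoef(scene_pattern, t)[0, 1]
              for name, t in templates.items()}
    return max(scores, key=scores.get), scores

# Simulated scene response: mostly "people" signal plus voxel noise.
scene = 0.8 * people_template + 0.5 * rng.normal(size=n_voxels)

templates = {"people": people_template, "car": car_template}
label, scores = decode_category(scene, templates)
print(label)  # the people correlation dominates for this simulated scene
```

The point of the sketch is only the logic: if category information is present in the pattern, the correlation with that category's template is reliably higher; if the category is task-irrelevant and its information is absent, the correlations do not differ.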

Peelen, Marius V., Li Fei-Fei, and Sabine Kastner. 2009. "Neural Mechanisms of Rapid Natural Scene Categorization in Human Visual Cortex." Nature 460: 94–97.