A single mechanism can account for human perception of depth in mixed correlation random dot stereograms

Relating neural activity to perception is one of the most challenging tasks in neuroscience.
Stereopsis—the ability of many animals to see in stereoscopic 3D—is a particularly tractable
problem because the computational and geometric challenges faced by the brain are
very well understood. In essence, the brain has to work out which elements in the left eye's
image correspond to which in the right eye's image. This process is believed to begin in primary
visual cortex (V1). It has long been thought that neurons in V1 achieve this by computing
the correlation between small patches of each eye’s image. However, recent psychophysical
experiments have reported depth perception in stimuli for which this correlation is zero,
suggesting that another mechanism might be responsible for matching the left and right
images in this case. In this article, we show how a simple modification to model neurons
that compute correlation can account for depth perception in these stimuli. Our model
cells mimic the response properties of real cells in the primate brain, and importantly, we
show that a perceptual decision model that uses these cells as its basic elements can capture
the performance of human observers on a series of visual tasks. That is, our computer
model of a brain area, based on experimental data about real neurons and using only a single
type of depth computation, successfully explains and predicts human depth judgments
in novel stimuli. This reconciles the properties of human depth perception with those of
neurons in V1, bringing us closer to understanding how neuronal activity causes
perception.
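To make the computation described above concrete, here is a purely illustrative sketch (in Python with NumPy) of the classic patch-correlation idea: a detector correlates a small patch of the left eye's image with a patch of the right eye's image shifted by a candidate disparity, and the detector whose shift matches the true disparity responds most strongly. This is not the authors' model, and the paper's "simple modification" to such correlation-computing units is not reproduced here; the function name, window size, and toy stimulus are hypothetical choices for illustration only.

# Illustrative sketch (not the authors' model): a bank of correlation detectors,
# each correlating a small left-eye patch with a right-eye patch shifted by a
# candidate disparity.
import numpy as np

def patch_correlation(left, right, center, disparity, window=11):
    """Pearson correlation between a small left-eye patch and the right-eye
    patch shifted by `disparity` samples (hypothetical helper)."""
    half = window // 2
    l = left[center - half : center + half + 1]
    r = right[center - half + disparity : center + half + 1 + disparity]
    l = l - l.mean()
    r = r - r.mean()
    denom = np.sqrt((l * l).sum() * (r * r).sum())
    return float((l * r).sum() / denom) if denom > 0 else 0.0

# Toy 1-D "images": a random pattern (a stand-in for one row of a random-dot
# image) shown to the left eye, and a copy shifted by 3 samples shown to the
# right eye, i.e. a true disparity of 3.
rng = np.random.default_rng(0)
pattern = rng.standard_normal(64)
true_disparity = 3
left = pattern
right = np.roll(pattern, true_disparity)

# The detector tuned to the true disparity gives the strongest response.
center = 32
responses = {d: patch_correlation(left, right, center, d) for d in range(-5, 6)}
best = max(responses, key=responses.get)
print(f"estimated disparity: {best} (true: {true_disparity})")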
File: HenriksenCummingRead2016.PDF (0.7 MiB)
Date: May 20, 2016
Authors: Henriksen S, Cumming BG, Read JCA