Standard models of perception are stimulus-driven, meaning that the
external perceptual event drives the brain's perception-related
activity. However, the tide may be turning: recent ideas suggest that
our perceptual experiences and visually guided behaviors are influenced
by top-down processes in the brain – specifically, the brain's
predictions about the external world. Recently, scientists at the University of Wisconsin–Madison demonstrated that perceptual expectations about when a stimulus will appear are instantiated in the brain by configuring prestimulus alpha-band oscillations to optimize the effectiveness of subsequent neural processing.
The researchers state
that their findings provide direct evidence that forming temporal
predictions about when a stimulus will appear can bias the phase of
ongoing alpha-band oscillations (one of the dominant oscillations in the
human brain) toward an optimal phase for visual processing, and so may be a mechanism for the top-down control of visual processing by temporal predictions.
Doctoral candidate Jason Samaha discusses the paper that he and his colleagues published in Proceedings of the National Academy of Sciences,
first addressing what was involved in providing evidence that
perceptual expectations about when a stimulus will appear are
instantiated in the brain by configuring prestimulus alpha-band oscillations to optimize subsequent processing efficacy.
"Alpha-band oscillations have been studied for a long time, especially
recently in the context of visual attention," Samaha tells Phys.org.
"Because neural oscillations have many potentially relevant features –
that is, neural processing may be related to the amplitude or phase or
frequency of these oscillations as well as the way that they interact – a
major challenge remains in delineating which properties of these
oscillations are relevant and why." In this study, he notes, the
scientists focused on a single but important physiological signal: the
alpha-band oscillation phase (peak or trough) at a critical point in
time.
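To make that quantity concrete: the instantaneous phase of an alpha-band oscillation at a given moment is commonly estimated from an EEG trace by band-pass filtering around 8–12 Hz and taking the angle of the analytic signal. The sketch below is purely illustrative – the filter settings, sampling rate, and synthetic data are assumptions for demonstration, not the authors' actual pipeline.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def alpha_phase_at(eeg, fs, t_sec, band=(8.0, 12.0), order=4):
    """Estimate instantaneous alpha-band phase (radians) at time t_sec.

    eeg : 1-D array of EEG samples from one channel/trial
    fs  : sampling rate in Hz
    """
    # Band-pass filter to isolate the alpha band (zero-phase to avoid lag)
    b, a = butter(order, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    alpha = filtfilt(b, a, eeg)
    # The analytic signal yields the instantaneous phase at every sample
    phase = np.angle(hilbert(alpha))
    return phase[int(round(t_sec * fs))]

# Synthetic example: a 10 Hz oscillation embedded in noise
fs = 500
t = np.arange(0, 2.0, 1 / fs)
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)
print(alpha_phase_at(eeg, fs, t_sec=1.0))  # phase just before a hypothetical stimulus
```

A value near 0 corresponds to a peak and a value near ±π to a trough, which is the peak-versus-trough distinction the researchers focus on.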
The paper also describes the researchers' investigation of whether
the alpha-band phase can be guided by top-down control in a temporal
cueing task as a mechanism through which perceptual predictions can
optimally configure prestimulus neural activity. "Since oscillations
necessarily unfold over time, they've been argued to provide intrinsic
timescales for coordinated neural processing."
Moreover," Samaha continues, "events in the world often follow a
predictable time course, such as predicting when a thrown baseball will
land in the glove of the catcher. We wanted to understand whether these
types of temporal expectations improve the way that the visual system
processes expected events – and if so, whether this was accomplished by
adjusting the brain's intrinsic oscillatory timescale in anticipation of
the expected event." To accomplish this, the scientists had to design a
task that could both induce these kinds of expectations and measure
their effect on visual perception.
A key issue in designing the task was testing whether cueing human observers to the time at which a target visual stimulus
would appear would bias the phase of ongoing alpha-band oscillations
toward an optimal phase for visual discrimination. "A range of
interesting findings has recently come to light suggesting that visual
perception may be physically modulated, in that our perception of the
same stimulus may change depending on whether that stimulus occurs at the
peak or the trough of our alpha oscillations," Samaha points out. "Our
findings suggest that temporal expectations may change the timing of
these oscillations such that visual events that are expected in time can
land at the optimal phase of the alpha rhythm." To pinpoint this
effect, the researchers first had to establish that the participants in
their studies had an optimal alpha phase, which meant replicating previous findings. "We could then ask whether telling people
when a visual stimulus would occur could push them towards their best
phase at the predicted moment of the visual stimulus."
The team conducted two experiments to make these determinations. The
first established that cues predictive of the moment of target
appearance significantly enhanced orientation discrimination and
subjective visibility. "It's been a fairly recent discovery that
establishing expectations, either about when a visual event will occur
or what that visual event will be, can actually change the way that
visual event is processed at early stages of the visual system," Samaha
explains. "In the case of temporal expectations we have known for some
time that an individual would be quicker to respond to an expected
event, but this does not mean that temporal expectations enhance
perception, per se. Therefore, we used a demanding visual task
while measuring perceptual accuracy as well as participants' subjective
reports to determine that temporal expectations actually affect the way
that people see a stimulus."
In their second experiment, the scientists replicated the first experiment's behavioral effect while concurrently recording an electroencephalogram (EEG) – a recording of the brain's electrical activity that measures voltage fluctuations resulting from ionic currents within neurons – and found that temporal predictions biased the phase of ongoing alpha-band oscillations toward each participant's optimal phase for visual discrimination.
"To describe complex neural signals that evolve over time requires
some novel analytic methods," Samaha recounts. "In particular, phase is a
nonlinear variable, and tools for analyzing it are constantly being
developed in neuroscience. By applying circular statistics to our neural
data, we were able to observe that temporal expectations can alter the
phase of neural oscillations so that, at the expected moment in time, the oscillation will be at a specific phase angle that is optimal for visual processing."
Unlike classical statistics, which assume linear variables, circular statistics are designed for variables that wrap around periodically – in this case, the phase angles within the recurring cycle of alpha-band oscillations.
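As a rough illustration of the kind of circular statistics involved – not the authors' exact analysis – the sketch below computes the mean resultant vector length over trial-wise phase angles and an approximate Rayleigh test for whether those phases cluster around a preferred angle. The von Mises sample standing in for "prestimulus alpha phases" is synthetic.

```python
import numpy as np

def mean_resultant_length(phases):
    """Length of the mean resultant vector (0 = uniform phases, 1 = identical phases)."""
    return np.abs(np.mean(np.exp(1j * np.asarray(phases))))

def rayleigh_test(phases):
    """Rayleigh test for non-uniformity of circular data.

    Returns (R, p): mean resultant length and an approximate p-value;
    a small p suggests the phases cluster around a preferred angle.
    """
    phases = np.asarray(phases)
    n = phases.size
    R = mean_resultant_length(phases)
    z = n * R**2
    # Standard series approximation to the Rayleigh p-value (Zar, Biostatistical Analysis)
    p = np.exp(-z) * (1 + (2 * z - z**2) / (4 * n)
                      - (24 * z - 132 * z**2 + 76 * z**3 - 9 * z**4) / (288 * n**2))
    return R, float(min(max(p, 0.0), 1.0))

# Synthetic example: phase angles (radians) from many trials of one participant,
# drawn so that they cluster around 0 rad (a hypothetical "optimal" phase)
rng = np.random.default_rng(0)
clustered = rng.vonmises(mu=0.0, kappa=2.0, size=200)
print(rayleigh_test(clustered))
```

In this spirit, a phase distribution that is concentrated at the expected stimulus moment – rather than spread uniformly around the circle – is the kind of signature consistent with expectations pulling the oscillation toward an optimal phase.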
Interestingly, the study's results touch on two theoretical domains
that are currently in flux – namely, temporal prediction vs. temporal
attention. "There have been recent suggestions that attention and
prediction are not the same," Samaha tells Phys.org. "If I know
when something will happen, it could be said that I can predict when it
will happen – but if I don't need to do anything in response to that
event, then I may not really be paying attention to it. In our
experiments, stimuli were both predictable and attended, so it remains
an open question for future research whether the effects we observed
would disappear if attention was not allocated to the stimulus."
Relatedly, Samaha notes that the independence of attention and
consciousness is an important question for psychology and neuroscience
in understanding how aware we are of events in our environment that may
cause our behavior. "When we pay very close attention to a complex
stimulus, we sometimes feel like we are aware of more than we can
actually discriminate behaviorally. The opposite is true in some cases,
where we do not recall seeing something but it nevertheless predicts our
behavior later on. These findings suggest that our attention can
sometimes change our behavior without our being able to consciously
introspect on those changes. In our experiments, we measured people's
ability to behaviorally discriminate stimuli and we also measured their
subjective reports about those stimuli – if they thought they saw them
or not. We did this in case the two measures would dissociate from each
other, but we actually found that they tracked each other pretty well –
that is, temporal expectations improved individuals' discrimination and also increased the frequency of their subjective reports of having
consciously seen the stimulus. It therefore remains another question for
future research as to whether temporal expectations and attention can
differentially impact consciousness and behavioral responses."
In terms of ongoing research, an interesting implication of their
results is that visual perception may occur in discrete processing
windows, clocked by alpha oscillations, rather than in a continuous
stream. "We're currently testing these ideas and relating them to
theories of consciousness," he tells Phys.org. As discussed
above, they're also interested in exploring the independence of
attention and consciousness – that is, to determine whether independently
manipulating stimulus visibility, temporal attention, and temporal
prediction reveals another dimension along which consciousness
dissociates from other high-level cognitive processes.
"Another important step in this line of research is to understand the
source of control of these processes: While we've shown that
expectations can change the state of perceptual systems prior to the
expected stimulus,
we don't yet have a full account of how those changes are being
implemented by higher-level systems that may encode goals and
expectations."
Samaha notes that there may well be translational applications of their
study. "A growing body of basic research into the neural basis of
predictions may have implications for certain clinical conditions that
are beginning to be understood as an alteration of
predictive processes. Schizophrenia and autism are being thought of in
this way, at least in part – and some evidence suggests that these
conditions are associated with differential patterns of oscillatory
brain activity. More generally, predictions greatly shape our behavior
and our conscious perception – so understanding how these predictions
take place in the brain," Samaha concludes, "can hopefully inform
theoretical advances as to how information processing takes place in
neural systems."
SOURCE: Medicalxpress