Multi-sensory integration
CoSMo 2012
Gunnar Blohm
Aug 6-7, 2012
Introduction
The world is highly variable:
- Sensory uncertainties
- Noisy neural codes
- Conflicting sensory cues (e.g. illusions)

Questions:
- How does the brain generate a perceptual experience despite all this uncertainty?
- How can we infer a state (i.e. a code in the brain, an attribute of an object, a sensory state, etc.)?
- What is the optimal way to act in this noisy world?
- How do we decide what cue to trust and/or how much?
Introduction
Multi-sensory combination
- The study of how different sensory modalities (vision, touch, sound, etc.) get combined into a perceptual experience that is coherent and unified.
- E.g. multi-sensory integration: McGurk effect, ventriloquism, ...
- The brain always uses all available useful information.
- Information from different sources is combined in a statistically optimal fashion.
What is multi-sensory integration?
Example: Ventriloquism

Integration of vision and audition
[Image: Edgar Bergen with his sidekick Charlie McCarthy]
What is multi-sensory integration?
Example: reaching


- Evaluation of current hand position:
  - Vision
  - Current joint angles (proprioception and/or efference copy)
- Noisy signal estimates
Signal-dependent sensory noise

- Reach variability depends on gaze fixations (Blohm & Crawford, 2007)
- Signal-dependent noise in muscle spindles explains arm position variability (Scott & Loeb, 1994)
- Neuronal noise is signal-dependent (Poisson) (Maimon & Assad, 2009)
Mathematical framework for Bayesian integration

Cue combination

- The brain always uses all available useful information.
- Information from different sources is combined in a statistically optimal fashion.

Optimal Bayesian observer:
$$p(X \mid A, V) = \frac{p(A, V \mid X)\, p(X)}{p(A, V)}$$

Independent observations A, V:
$$p(A, V \mid X) = p(V \mid X)\, p(A \mid X)$$

If uniform priors, then
$$p(X \mid A, V) \propto p(V \mid X)\, p(A \mid X)$$

[Figure: likelihood functions p(V | X) and p(A | X), and the combined likelihood p(A, V | X)]
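To make the product rule concrete, here is a minimal numerical sketch (my own illustration, not from the slides): two independent Gaussian likelihoods for an auditory and a visual cue are multiplied on a grid of candidate states X; the observation values and noise levels are arbitrary.

```python
import numpy as np

# Grid of candidate states X (e.g. possible source locations, in degrees)
x = np.linspace(-20, 20, 1001)

def gaussian_likelihood(x, obs, sigma):
    """Likelihood p(obs | X = x) under a Gaussian noise model."""
    return np.exp(-0.5 * ((obs - x) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

# Two noisy, conditionally independent cues: auditory (A) and visual (V)
lik_A = gaussian_likelihood(x, obs=5.0, sigma=4.0)   # audition: less reliable
lik_V = gaussian_likelihood(x, obs=0.0, sigma=1.0)   # vision: more reliable

# With a uniform prior, the posterior is proportional to the product of the likelihoods
posterior = lik_A * lik_V
posterior /= np.trapz(posterior, x)                  # normalize to integrate to 1

print("posterior mean:", np.trapz(x * posterior, x))  # pulled toward the more reliable cue
```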
Bayesian integration

Cue combination

Gaussian likelihood functions: for two cues with means $\mu_1, \mu_2$ and variances $\sigma_1^2, \sigma_2^2$,
$$p(A, V \mid X) \propto \exp\!\left(-\frac{(X - \mu_1)^2}{2\sigma_1^2}\right) \exp\!\left(-\frac{(X - \mu_2)^2}{2\sigma_2^2}\right) \propto \exp\!\left(-\frac{(X - \hat{\mu})^2}{2\hat{\sigma}^2}\right)$$
with
$$\hat{\mu} = \frac{\sigma_2^2\,\mu_1 + \sigma_1^2\,\mu_2}{\sigma_1^2 + \sigma_2^2} = \frac{\mu_1/\sigma_1^2 + \mu_2/\sigma_2^2}{1/\sigma_1^2 + 1/\sigma_2^2}, \qquad \hat{\sigma}^2 = \frac{\sigma_1^2\,\sigma_2^2}{\sigma_1^2 + \sigma_2^2}$$

[Figure: likelihood functions p(V | X) and p(A | X), and the narrower combined likelihood p(A, V | X)]
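The closed-form result can be checked with a short reliability-weighted averaging function (again a sketch; the numbers are arbitrary):

```python
def combine_gaussian_cues(mu1, var1, mu2, var2):
    """Optimal combination of two Gaussian cues: precision-weighted mean,
    and a combined variance smaller than either single-cue variance."""
    w1 = (1 / var1) / (1 / var1 + 1 / var2)       # weight = relative reliability of cue 1
    mu_hat = w1 * mu1 + (1 - w1) * mu2
    var_hat = var1 * var2 / (var1 + var2)
    return mu_hat, var_hat

# Vision (precise) and audition (noisy) report different locations
print(combine_gaussian_cues(mu1=0.0, var1=1.0, mu2=5.0, var2=16.0))
# -> estimate close to the visual cue, combined variance < 1.0
```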
Example: breast cancer

Marginal probabilities of breast cancer (prevalence among all 54-year-olds):
  P(BC+) = .003        P(BC-) = .997

Test characteristics:
  Sensitivity: P(test+ | BC+) = .90, so P(test- | BC+) = .10
  Specificity: P(test- | BC-) = .89, so P(test+ | BC-) = .11

Joint probabilities:
  P(BC+, test+) = .003 × .90 = .0027
  P(BC+, test-) = .003 × .10 = .0003
  P(BC-, test+) = .997 × .11 = .10967
  P(BC-, test-) = .997 × .89 = .88733
                                _______
                                  1.0

P(BC+ | test+) = .0027 / (.0027 + .10967) ≈ 2.4%
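The same arithmetic in a few lines (illustrative only; values as on the slide):

```python
p_bc = 0.003                  # prior: prevalence of breast cancer among 54-year-olds
sensitivity = 0.90            # P(test+ | BC+)
specificity = 0.89            # P(test- | BC-)

p_bc_and_pos = p_bc * sensitivity                   # 0.0027
p_nobc_and_pos = (1 - p_bc) * (1 - specificity)     # 0.997 * 0.11 = 0.10967

p_bc_given_pos = p_bc_and_pos / (p_bc_and_pos + p_nobc_and_pos)
print(round(p_bc_given_pos, 3))                     # ~0.024, i.e. about 2.4%
```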
Estimation of priors?

- Based on a priori belief
- Difficulty: priors can be subjective and/or objective
- A non-uniform prior acts like a cue in cue integration:
$$p(X \mid A, V) = \frac{p(A, V \mid X)\, p(X)}{p(A, V)}$$
- How to build an objective prior? Based on prior evidence...
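To illustrate the claim that a non-uniform prior acts like an extra cue, here is a small sketch (my own illustration, not from the slides): for Gaussian likelihoods and a Gaussian prior, the prior enters the precision-weighted average exactly as a third "cue" would.

```python
import numpy as np

def combine(means, variances):
    """Precision-weighted combination of any number of Gaussian terms
    (sensory cues and/or a Gaussian prior)."""
    means = np.asarray(means, dtype=float)
    precisions = 1.0 / np.asarray(variances, dtype=float)
    post_var = 1.0 / precisions.sum()
    post_mean = post_var * (precisions * means).sum()
    return post_mean, post_var

print(combine([0.0, 5.0], [1.0, 16.0]))             # two cues, uniform prior
print(combine([0.0, 5.0, 0.0], [1.0, 16.0, 4.0]))   # same cues + Gaussian prior as a third term
```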
Estimation of priors?

- Kalman filter (~1960): recursive Bayesian estimation
- Given a hidden Markov process with state $x_k$ (i.e. a chain of events):
$$x_k = F_k x_{k-1} + B_k u_k + n^{(1)}_k, \qquad n^{(1)}_k \sim N(0, Q_k)$$
$$z_k = H_k x_k + n^{(2)}_k, \qquad n^{(2)}_k \sim N(0, R_k)$$
- Initial belief (prior): uniform or some other function
- Each observation $z_k$ can be used to update the belief:
$$p(x_k \mid z_k, Z_{k-1}) = \frac{p(z_k \mid x_k)\, p(x_k \mid Z_{k-1})}{p(z_k \mid Z_{k-1})}, \qquad Z_{k-1} = z_1 \ldots z_{k-1}$$
(Wikipedia)
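A minimal scalar Kalman filter illustrating the predict/update recursion (a sketch with made-up model parameters, not code from the course):

```python
import numpy as np

rng = np.random.default_rng(0)

# Scalar model: x_k = F x_{k-1} + process noise,  z_k = H x_k + measurement noise
F, H = 1.0, 1.0
Q, R = 0.01, 1.0               # process and measurement noise variances (assumed known)

true_x = 0.0
x_hat, P = 0.0, 10.0           # initial belief: mean and (deliberately large) variance

for k in range(50):
    # Simulate the hidden state and a noisy observation
    true_x = F * true_x + rng.normal(0, np.sqrt(Q))
    z = H * true_x + rng.normal(0, np.sqrt(R))

    # Predict: propagate the belief through the dynamics (this is the prior for step k)
    x_pred, P_pred = F * x_hat, F * P * F + Q

    # Update: combine prediction (prior) with the new observation (likelihood)
    K = P_pred * H / (H * P_pred * H + R)   # Kalman gain = relative reliability of z_k
    x_hat = x_pred + K * (z - H * x_pred)
    P = (1 - K * H) * P_pred

print(round(true_x, 2), round(x_hat, 2), round(P, 3))   # estimate tracks the hidden state
```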
Bayesian computations in population codes

- Representing uncertainty with population codes
- Probabilistic population codes (Ma et al., 2006):
  - Poisson-like neural noise
  - Variance inversely related to the gain of the population code
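A toy version of this idea (a sketch under simplifying assumptions: independent Poisson neurons with Gaussian tuning curves tiling the space; all parameter values are invented): the likelihood decoded from the population narrows as the gain increases.

```python
import numpy as np

rng = np.random.default_rng(1)

centers = np.linspace(-40, 40, 81)   # preferred stimuli of the population
sigma_tc = 8.0                       # tuning-curve width

def tuning(s, gain):
    """Mean firing rates: Gaussian tuning curves scaled by a common gain."""
    return gain * np.exp(-0.5 * ((s - centers) / sigma_tc) ** 2)

def log_likelihood(r, s_grid):
    """log p(r | s) for independent Poisson neurons: sum_i r_i log f_i(s) - f_i(s)."""
    f = np.array([tuning(s, 1.0) for s in s_grid])   # (grid, neurons); for a tiling
    return r @ np.log(f).T - f.sum(axis=1)           # population, gain only adds a constant

s_true, s_grid = 5.0, np.linspace(-20, 20, 401)

for gain in (2.0, 10.0):                             # low vs high gain (e.g. low/high contrast)
    r = rng.poisson(tuning(s_true, gain))            # one noisy population response
    logL = log_likelihood(r, s_grid)
    L = np.exp(logL - logL.max())
    L /= np.trapz(L, s_grid)
    mean = np.trapz(s_grid * L, s_grid)
    var = np.trapz((s_grid - mean) ** 2 * L, s_grid)
    print(f"gain={gain}: decoded likelihood variance ~ {var:.2f}")  # higher gain -> smaller variance
```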
Bayesian integration

Cue combination

E.g. Visual-vestibular integration
Gu et al. Nat Neurosci 2008
Sensory-motor integration
Multi-sensory integration examples


Back to ventriloquism…
Audio-visual integration
Knill & Pouget, TINS 2004
Multi-sensory integration across ref. frames

- Sensory signals have to be transformed into a common reference frame BEFORE integration
- Transformations depend on the relative orientation of eyes, head, shoulder...
- The CNS needs to estimate eye-head-shoulder angles
- But sensory estimations are noisy!
- Does this noise affect multi-sensory integration?

[Diagram: Sensory Input 1 → T1 and Sensory Input 2 → T2, where the transformations T1, T2 depend on eye, head and shoulder orientation; the transformed signals feed multi-sensory integration, which drives perception and action]
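To see why noise in the reference-frame transformation matters, here is a toy illustration (my own, with invented numbers, not the Burns & Blohm model itself): rotating a proprioceptive hand-position estimate into a gaze-centred frame using a noisy head-roll estimate inflates its variance, so optimal integration should down-weight it.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100_000

# Proprioceptive hand-position cue in a body-fixed frame (2D, arbitrary units)
prop_body = np.array([10.0, 0.0])
prop_noise = 1.0

# To combine it with vision (gaze-centred) it must first be rotated by the head-roll
# angle, but the head-roll estimate is itself noisy
head_roll = np.deg2rad(30.0)
roll_noise = np.deg2rad(5.0)

samples = prop_body + rng.normal(0, prop_noise, size=(n, 2))
roll_est = head_roll + rng.normal(0, roll_noise, size=n)
c, s = np.cos(roll_est), np.sin(roll_est)
rot = np.stack([np.stack([c, -s], axis=1), np.stack([s, c], axis=1)], axis=1)  # (n, 2, 2)
prop_gaze = np.einsum('nij,nj->ni', rot, samples)

print("variance before transformation:", np.var(samples, axis=0))
print("variance after noisy transformation:", np.var(prop_gaze, axis=0))
# The transformed cue is more variable than the original, so its weight in optimal
# integration should drop -- the effect examined in Burns & Blohm (2010)
```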
Multi-sensory perception
Burns, Nashed & Blohm (2011)
Multi-sensory perception


Head roll affects discriminability of hand-target distance
Signal-dependent head roll noise in transformation
Burns, Nashed & Blohm (2011)
Multi-sensory integration for reaching

Visual-proprioceptive integration
Sober & Sabes (2003, 2005)
Multi-sensory integration for reaching

Sober & Sabes (2003, 2005) dual comparison model
Multi-sensory integration
Multi-sensory integration for reaching

- Statistical formulation (1st order):
  - Multi-sensory integration
  - Reference frame transformation
  - Inverse kinematics (linear)
Burns & Blohm (2010)
Multi-sensory integration for reaching
Burns & Blohm (2010)
Head roll influences reach errors and reach variability & sensory weights
Neuronal implementation?

- Divisive normalization & marginalization
- Relevant for sensory processing, visual search, object recognition, multi-sensory integration, coordinate transformations, navigation, inference, motor control, etc.
Ohshiro, Angelaki, DeAngelis (2011)
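For readers new to the operation, here is a generic divisive-normalization sketch (a simplified generic form, not the specific Ohshiro, Angelaki & DeAngelis circuit): each unit's drive is divided by the pooled activity of the population.

```python
import numpy as np

def divisive_normalization(drive, sigma=1.0, n=2.0):
    """Generic normalization: R_i = drive_i**n / (sigma**n + sum_j drive_j**n)."""
    d = np.asarray(drive, dtype=float) ** n
    return d / (sigma ** n + d.sum())

# Two unimodal populations driving a multi-sensory layer (values are arbitrary)
visual = np.array([0.2, 1.0, 3.0, 1.0, 0.2])
vestibular = np.array([0.1, 0.5, 1.5, 0.5, 0.1])

print(divisive_normalization(visual))               # response to one cue alone
print(divisive_normalization(visual + vestibular))  # bimodal drive, normalized by the larger pool
```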
Neuronal implementation?

- Alternatively, if probabilistic population codes are used:
  - Gaussian RFs
  - Poisson-like spiking
Beck, Latham, Pouget (2011)
Problems with explicit divisive normalization

- The curse of dimensionality
  - N^(d·s) neurons needed
    - N: neurons along each dimension
    - d: # dimensions
    - s: # signals to be combined
  - Example: N = 100, d = 3, s = 2 → 10^12 neurons needed!
- Network connectivity constraints
  - Huge connectivity: # connections for each neuron > # neurons
  - Precise structure required (extraordinary regularity)
- Representation of population coding
  - All input and output codes must be the same (Weak Fusion Model)
  - Mixture of different codes is not trivial
- Alignment of population codes
  - Perfect alignment of codes required
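A one-line check of the example's combinatorics (illustrative):

```python
N, d, s = 100, 3, 2                    # neurons per dimension, # dimensions, # signals
print(f"{N ** (d * s):.0e} neurons")   # 1e+12 neurons for an explicit joint representation
```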
A possible alternative: implicit approximate normalization (IAN)

- In machine learning, marginalization is known as the partition function problem.
- Explicit partition functions are typically impossible to compute, but non-probabilistic approaches allow for solutions.
Standage, Lillicrap, Blohm (in preparation)
Properties of the IAN model
Properties of the IAN model

Multi-sensory suppression
That’s all folks