Left over from units...
• Hodgkin-Huxley model:
$I_{net} = g_{Na} m^3 h (V_m - E_{Na}) + g_K n^4 (V_m - E_K) + g_L (V_m - E_L)$
m, h, n: voltage gating variables with their own dynamics that determine when channels open and close
• Bias weight
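A minimal Python sketch of the net-current equation above. The reversal potentials follow the slides (+55 mV for Na+, −70 mV for K+ and leak); the maximal conductances are typical textbook Hodgkin-Huxley values, not given in the slides.

```python
# Hodgkin-Huxley net current, as in the equation above.
# Conductances (mS/cm^2) are standard textbook values (assumption);
# reversal potentials (mV) follow the slides.
g_Na, g_K, g_L = 120.0, 36.0, 0.3
E_Na, E_K, E_L = 55.0, -70.0, -70.0

def i_net(v_m, m, h, n):
    """Net membrane current given voltage V_m and gating variables m, h, n."""
    i_na = g_Na * m**3 * h * (v_m - E_Na)  # transient Na+ current
    i_k  = g_K  * n**4     * (v_m - E_K)   # delayed-rectifier K+ current
    i_l  = g_L             * (v_m - E_L)   # passive leak current
    return i_na + i_k + i_l

# Example: near rest, with gating variables at typical resting values,
# there is only a small net inward Na+ current.
print(i_net(-70.0, m=0.05, h=0.6, n=0.32))
```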
Networks
Layers and layers of detectors...

Networks
1. Biology of networks: the cortex
2. Excitation:
• Unidirectional (transformations)
• Bidirectional (pattern completion, amplification)
3. Inhibition: Controlling bidirectional excitation.
4. Constraint Satisfaction: Putting it all together.

Cortex: Neurons
Two separate populations:
• Excitatory (glutamate): Pyramidal, Spiny stellate.
• Inhibitory (GABA): Chandelier, Basket.
Excitatory vs Inhibitory Neurons
• Excitatory neurons both project locally and make long-range projections between different cortical areas
• Inhibitory neurons primarily project within small, localized regions of cortex
• Excitatory neurons carry the information flow (long-range projections)
• Inhibitory neurons are responsible for (locally) regulating the activation of excitatory neurons

The Neuron and its Ions
[Figure: membrane with excitatory synaptic input (Na+, +55 mV), inhibitory synaptic input (Cl−, −70 mV), a K+ leak channel (−70 mV), and the Na/K pump; Vm sits between −70 mV and 0 mV]
Glutamate → opens Na+ channels → Na+ enters (excitatory)
GABA → opens Cl− channels → Cl− enters if Vm ↑ (inhibitory)
Laminar Structure of Cortex
• Layer = a bunch of neurons with similar connectivity
• Localized to a particular region (physically contiguous)
• All cells within a layer receive input from approximately the same places (i.e., from a common collection of layers)
• All cells within a layer send their outputs to approximately the same places (i.e., to a common collection of layers)

Three Functional Layers
[Figure: Input (Layer 4) receives from Sensation (Thalamus); Hidden (Layers 2,3); Output (Layers 5,6) projects to Subcortex, Motor/BG]
Hidden layer transforms input-output mappings.
More hidden layers → richer transformations → less reflex-like... smarter? more “free”?
Networks
1. Biology: Cortical layers and neurons
2. Excitation:
• Unidirectional (transformations)
• Bidirectional (pattern completion, amplification)
3. Inhibition: Controlling bidirectional excitation.
4. Constraint Satisfaction: Putting it all together.

Excitation (Unidirectional): Transformations
[Figure: hidden layer of digit detectors (0–9) connected to an input layer]
• Detectors work in parallel to transform the input activity pattern to the hidden activity pattern.
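A rough sketch of detectors working in parallel: each hidden unit compares its weight vector against the input and fires if its net input clears a threshold. The weights, threshold, and layer sizes below are invented for illustration, not taken from the digits network.

```python
import numpy as np

# Sketch: a layer of detectors transforming an input pattern into a
# hidden pattern in parallel. Each row of W is one detector's weights.
rng = np.random.default_rng(0)
n_input, n_hidden = 25, 10                 # e.g., 5x5 inputs, 10 digit detectors
W = rng.normal(size=(n_hidden, n_input))   # hypothetical weight matrix
theta = 1.0                                # hypothetical activation threshold

def transform(input_pattern):
    net = W @ input_pattern                # all detectors compute net input at once
    return (net > theta).astype(float)     # thresholded hidden activity pattern

hidden = transform(rng.integers(0, 2, size=n_input).astype(float))
print(hidden)
```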
Emphasizing/Collapsing Distinctions
[Figure: the same network, with hidden digit detectors (0–9) over input digit patterns in scrambled order]
• Emphasizes some distinctions, collapses across others.
• Function of what detectors detect (and what they ignore).
Other (more interesting) examples?...

[transform.proj] digit detectors:
• tested with noisy digits
• tested with letters
Cluster Plots
• Cluster plots provide a means of visualizing similarity relationships between patterns of activity in a network
• Cluster plots are constructed based on the distances between patterns of activity
• Euclidean distance = square root of the sum (across all units) of the squared differences in activation:
$d = \sqrt{\sum_i (x_i - y_i)^2}$
• Example...
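A minimal sketch of the distance computation above; the two activity patterns are hypothetical.

```python
import numpy as np

# Euclidean distance between two activity patterns, as defined above.
def euclidean(x, y):
    return np.sqrt(np.sum((np.asarray(x) - np.asarray(y)) ** 2))

# Hypothetical activity patterns over 5 units:
a = [1.0, 0.0, 1.0, 0.0, 0.0]
b = [1.0, 0.0, 0.0, 1.0, 0.0]
print(euclidean(a, b))  # sqrt(0 + 0 + 1 + 1 + 0) = sqrt(2) ~ 1.414
```

Computing this distance for every pair of patterns gives the distance matrix that a cluster plot is built from.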
Emphasizing/Collapsing Distinctions: Categorization
[transform.proj] cluster plots (digits, noisy digits, hidden):
[Figure: cluster plots — a) NoisyDigits patterns, b) Hidden_Acts patterns; each digit 0–9 appears with three noisy variants]
Emphasize distinctions: Different digits separated, even though they have perceptual overlap.
Collapse distinctions: Noisy digits categorized as same, even though they have perceptual differences.
Detectors are Dedicated, Content-Specific
[Figure: cluster plots — a) Letters patterns, b) Hidden_Acts patterns, for the digit network tested with letters]

Localist vs. Distributed Representations
• Localist = 1 unit responds to 1 thing (e.g., digits, grandmother cell).
• Distributed = Many units respond to 1 thing, one unit responds to many things.
• With distributed representations, units correspond to stimulus features as opposed to complete stimuli.
Digits With Distributed Representations
[Figure: digits network with a hidden “feature” layer over the input layer]

Advantages of Distributed Representations
Efficiency: Fewer Units Required
The digits network can represent 10 digits using 5 “feature” units. Each digit is a unique combination of the 5 features, e.g.,
“0” = feature 3
“1” = features 1, 4
“2” = features 1, 2
“3” = features 1, 2, 5
“4” = features 3, 4
“5” = features 1, 2, 3
There are 32 unique ways to combine 5 features.
There are > 1 million unique ways to combine 20 features.
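A small sketch of the efficiency argument. The feature assignments follow the slide's example; the counts come from the fact that k binary features allow 2^k combinations.

```python
# Digits coded as combinations of 5 features (numbering from the slide).
digit_features = {
    "0": {3},
    "1": {1, 4},
    "2": {1, 2},
    "3": {1, 2, 5},
    "4": {3, 4},
    "5": {1, 2, 3},
}

# A localist code needs one unit per item; a distributed code of
# k binary features can distinguish 2**k items.
print(2 ** 5)    # 32 unique combinations of 5 features
print(2 ** 20)   # 1,048,576 (> 1 million) combinations of 20 features
```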
Advantages of Distributed Representations
Similarity and Generalization:
If you represent stimuli in terms of their constituent features, stimuli with similar features get assigned similar representations.
This allows you to generalize to novel stimuli based on their similarity to previously encountered stimuli.
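One way to make "similar features → similar representations" concrete is feature overlap. The feature sets below are hypothetical, and Jaccard similarity is just one convenient overlap measure, not the one the slides commit to.

```python
# Sketch: similarity as feature overlap. A response learned for one
# stimulus can generalize to novel stimuli with overlapping features.
def overlap(a, b):
    """Fraction of shared features (Jaccard similarity)."""
    return len(a & b) / len(a | b)

known = {"has_tail", "four_legs", "fur", "barks"}   # e.g., a learned stimulus
novel = {"has_tail", "four_legs", "fur", "meows"}   # similar novel stimulus
other = {"wings", "feathers", "beak"}               # dissimilar novel stimulus

print(overlap(known, novel))  # 0.6 -> strong generalization expected
print(overlap(known, other))  # 0.0 -> little generalization expected
```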
Advantages of Distributed Representations
Robustness (Graceful Degradation):
Damage has less of an effect on networks with distributed (vs. localist) representations.
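A toy lesion experiment illustrating the point; the layer sizes, feature density, and lesion size are arbitrary choices for the sketch.

```python
import numpy as np

# Sketch of graceful degradation: "lesion" some units and see how much
# of each item's representation survives.
rng = np.random.default_rng(1)
n_units, n_items = 20, 10

localist = np.eye(n_items, n_units)                   # 1 dedicated unit per item
distributed = (rng.random((n_items, n_units)) < 0.4)  # ~8 shared units per item
distributed = distributed.astype(float)

lesioned = rng.choice(n_units, size=5, replace=False) # knock out 5 units
mask = np.ones(n_units)
mask[lesioned] = 0.0

# Localist: any item whose single dedicated unit is hit loses everything.
# Distributed: every item keeps most of its active features.
print((localist * mask).sum(axis=1))
print((distributed * mask).sum(axis=1))
```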
Advantages of Distributed Representations
Accuracy: By coarse-coding.
[Figure: overlapping tuning curves — Activation vs. a continuous dimension]
These tuning curves are commonly seen, e.g., in V1.

Advantages of Distributed Representations
Efficiency: Fewer total units required.
Similarity: As a function of overlap.
Generalization: Can use novel combinations.
Accuracy: By coarse-coding (e.g., color, position).
Robustness: Redundancy; damage has less of an effect.
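A sketch of why coarse coding buys accuracy: broad, overlapping tuning curves let the population encode a value more finely than any single unit's tuning width. The Gaussian curves, unit spacing, and activity-weighted readout below are illustrative assumptions, not a model from the slides.

```python
import numpy as np

# Coarse coding over a continuous dimension (e.g., orientation, position).
centers = np.linspace(0.0, 10.0, 6)   # preferred values of 6 units
width = 2.0                            # broad ("coarse") tuning

def population_response(stimulus):
    """Gaussian tuning: each unit responds by distance to its preference."""
    return np.exp(-((stimulus - centers) ** 2) / (2 * width ** 2))

def decode(activity):
    """Activity-weighted average of preferred values (population vector)."""
    return np.sum(activity * centers) / np.sum(activity)

act = population_response(3.7)
print(decode(act))  # ~3.7, despite the units being spaced 2.0 apart
```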
Networks: Bidirectional Excitation
1. Top-down expectations about low-level features.
2. Pattern completion.
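As a rough illustration of pattern completion (not the pat_complete.proj simulation itself), here is a minimal sketch using symmetric excitatory weights: units belonging to a stored pattern excite each other, so a partial cue recruits the missing units. The stored pattern and threshold are made up for the example.

```python
import numpy as np

# Pattern completion via bidirectional (symmetric) excitatory weights.
pattern = np.array([1, 1, 1, 0, 0, 0], dtype=float)
W = np.outer(pattern, pattern)        # symmetric: i->j equals j->i
np.fill_diagonal(W, 0.0)              # no self-excitation

act = np.array([1, 0, 1, 0, 0, 0], dtype=float)  # partial cue (unit 1 missing)
for step in range(5):
    net = W @ act
    act = (net > 0.5).astype(float)   # simple threshold activation

print(act)  # the full stored pattern [1, 1, 1, 0, 0, 0] is completed
```

Note that with purely excitatory bidirectional weights, activity can only spread; this is exactly why the slides stress that inhibition is needed to control it.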
[pat_complete.proj]
Word Superiority Effect: Top-Down Amplification
Identify the second letter in:
NEST (faster)
DEST (slower)
Weird! You have to recognize the letter before you can recognize the word, so how can the word help letter recognition?
[amp_topdown.proj]
Application to Word Superiority Effect
Networks: Bidirectional Excitation
1. Top-down processing (“imagery”).
2. Pattern completion.
3. Amplification/bootstrapping.
4. Need inhibition! [amp_top_down_dist.proj]

Networks
1. Biology: The cortex
2. Excitation:
• Unidirectional (transformations)
• Bidirectional (top-down processing, pattern completion, amplification)
3. Inhibition: Controlling bidirectional excitation.
4. Constraint Satisfaction: Putting it all together.