Hierarchical spatio-temporal memory for machine learning
based on laminar minicolumn structure
Janusz A. Starzyk, Yinyin Liu
Ohio University, Athens, OH
INTRODUCTION
• Spatio-temporal memories are fundamental to self-organization and learning in bio-inspired systems.
• Short-term memory (STM) and long-term memory (LTM) are the two major types of memory in neurobiological research on the human brain.
• They occupy different regions of the human brain and have different structural organizations.
• They interact with each other.
• Input information goes through the STM so that it can be stored in the LTM.
• Information from LTM is retrieved to STM, where it is updated and new associations are created (Fig. 1).

Fig. 1 Interaction between LTM and STM (information is stored from STM into LTM and retrieved from LTM back to STM)
MINICOLUMN STRUCTURE
The layered, uniform structure of identical processing units, postulated by Mountcastle as a minicolumn organization [1][2], supports the building of biological intelligence in the human neocortex.
• Neurons in different layers of the minicolumn are proposed to have specific functions in the interaction between STM and LTM.
• When information is retrieved from LTM to STM, a particular layer of neurons receives stimulation from the LTM.
• When information is stored from STM to LTM, stimulation from the STM activates the minicolumns corresponding to the elements of a sequence.
• Activation coming from STM or LTM is differentiated from the real environment input by a different level of signal strength.
In this work, a laminar minicolumn structure with multiple layers of neurons, proposed and studied in the visual cortex by Grossberg [3] (Fig. 2), is used to implement the fundamental learning mechanism of the spatio-temporal memory. It has several characteristics:
• Inhibition among layer 4 neurons and a selective circuit between layers 6 and 4 → perceptual grouping and domination of the local winners.
• Feedback from layer 2 to layer 6 → folded 2→6→4 feedback.
• Feedback from the higher-level layer 6 → resolves ambiguity and selects the input.
• Feedback does not propagate forward → network stability.
Fig. 2 Laminar minicolumn (input activation, lateral inhibition, and feedback from a higher level)
LONG-TERM MEMORY
LTM cell:
• One long-term memory (LTM) cell stores one particular sequence, whose length determines the number of required primary neurons (PNs) (Fig. 3). Cells can be combined into a hierarchy (Fig. 4).
• Symbol neurons (SNs) excite the primary neurons through the input weights W_in:
  S_PN(t) = S_SN(t) W_in    (1)
• The PNs are interconnected to induce the temporal association and to model the dynamics:
  S_PN(t) = S_PN(t-1) W_P + S_SN(t) W_in    (2)
• The LTM cell overall output is
  S_out(t) = max( S_PN(t) )    (3)
Fig. 3 LTM cell
Fig. 4 Hierarchical LTM: LTM cells on level h-1 feed, through the STM and a winner-take-all (WTA) circuit, the LTM cells on level h
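As a minimal numerical sketch of Eqs. (1)-(3), the Python/NumPy fragment below simulates a single LTM cell storing "ARRAY". Only the symbols S_SN, S_PN, S_out, W_in, and W_P come from the equations; the one-hot symbol coding, the chain-like lateral links, and the unit weight values are illustrative assumptions, not the authors' implementation.

# Sketch of the LTM-cell update, Eqs. (1)-(3), for a cell storing "ARRAY".
# The weight layout (one PN per sequence position, one-hot symbol codes,
# unit-strength links) is assumed only for illustration.
import numpy as np

symbols = "ARY"                      # symbol neurons used in the "ARRAY" example
sequence = "ARRAY"                   # sequence stored by this LTM cell
n_pn = len(sequence)                 # one primary neuron (PN) per element

# W_in: symbol neurons -> primary neurons; PN i responds to sequence[i]
W_in = np.zeros((len(symbols), n_pn))
for i, ch in enumerate(sequence):
    W_in[symbols.index(ch), i] = 1.0

# W_P: lateral PN -> PN links that pass activity to the next position
W_P = np.zeros((n_pn, n_pn))
for i in range(n_pn - 1):
    W_P[i, i + 1] = 1.0

def one_hot(ch):
    v = np.zeros(len(symbols))
    v[symbols.index(ch)] = 1.0
    return v

S_PN = np.zeros(n_pn)
for t, ch in enumerate("ARRAY"):                      # present an input sequence
    S_SN = one_hot(ch)                                # symbol-neuron activity
    S_PN = S_PN @ W_P + S_SN @ W_in                   # Eq. (2); the input term is Eq. (1)
    S_out = S_PN.max()                                # Eq. (3): LTM cell output
    print(f"t{t + 1}: input {ch}, S_out = {S_out:.0f}")

With these toy weights the output grows from 1 to 5 as more of "ARRAY" is matched, while an input that does not continue the stored sequence leaves the output low.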
LTM based on minicolumns:
• Sequence comes from the real input (LTM cell for "ARRAY", Fig. 5):
  - Minicolumns representing the symbols "A", "R", and "Y" are found through competition.
  - Signal flow: input → layers 6/4 of the winning columns → layer 2 of the winning columns → PNs → LTM cell output.
  - The real input strongly stimulates the PNs in the LTM cell.
  - Strong activation of the layer 2 neurons of the winning minicolumns helps their layer 6 neurons win the local competition → the PNs are connected with layer 6 using Hebbian learning.
  - The output of the LTM cell enters a layer 6 neuron on the higher LTM level, so that "ARRAY" can be combined with other possible sequences to build a complex sequence memory.
• Sequence comes from the STM:
  - Signal flow: input → layer 6 of the winning columns → PNs → LTM cell output.
  - The STM stimulation does not flow up the minicolumn and does not overlap with the real sensory input.
  - It only slightly stimulates the PNs in the LTM cell.
• By comparing the level of stimulation, the LTM is able to differentiate the recalled information from the real sensory input (see the sketch below).
Fig. 5 LTM cell with minicolumns for "A", "R", and "Y"
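A toy illustration of this last point follows. The gain values and the threshold are assumptions; the poster only states the principle that the STM-driven (layer-6 only) path stimulates the PNs more weakly than the full sensory path.

# Toy illustration: the LTM tells real sensory input from STM-driven recall by
# the level of PN stimulation. Gains and threshold are assumed values.
SENSORY_GAIN = 1.0   # input -> layers 6/4 -> layer 2 -> PNs (strong path)
STM_GAIN = 0.3       # input -> layer 6 -> PNs (weak path, no layer-2 boost)
THRESHOLD = 0.6

def pn_stimulation(symbol_strength, from_stm):
    gain = STM_GAIN if from_stm else SENSORY_GAIN
    return gain * symbol_strength

def source_of(stimulation):
    return "recalled from STM" if stimulation < THRESHOLD else "real sensory input"

print(source_of(pn_stimulation(1.0, from_stm=False)))  # real sensory input
print(source_of(pn_stimulation(1.0, from_stm=True)))   # recalled from STM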
LTM cell learning:
• Through competition, a winning LTM cell with the maximum output signal strength stores the sequence by adjusting its weights.

LTM cell recalling:
• The signal strength on the LTM output neuron represents the match between the sequence stored in this LTM cell and the input sequence.
• Sequence learned: "perforation". Normalized outputs for test inputs:

Input text        Normalized output
perforation       1.0000
perforations      0.8833
peroration        0.7232
reformation       0.6733
defloration       0.6551
percolation       0.5030
performance       0.5115
deforestation     0.4536
penetration       0.3566
perversion        0.3324
prerogative       0.3107
perk              0.2237
mantilla          0.0601
gorilla           0.1817
prus              0.1118
prast             0.2111
barcelona         0.1217
forest            0.2082
preference        0.1505
cat               0.0549
manifestation     0.1026
manifesto         0.0800
aba               0.0000
bba               0.0000
abb               0.0000
abbbba            0.0000
abbaba            0.0000
abba              0.0000
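The sketch below illustrates these two steps: several LTM cells score an input, the cell with the maximum normalized output wins, and only the winner adjusts its weights. The bag-of-letters coding and the additive Hebbian-style update are assumptions made for illustration; the poster's LTM cell matches sequences temporally via Eqs. (1)-(3), which this sketch does not reproduce.

# Sketch of learning by competition and of the normalized recall output.
# The bag-of-letters code and the additive weight update are assumptions.
import numpy as np

ALPHABET = "abcdefghijklmnopqrstuvwxyz"

def encode(seq):
    """Bag-of-letters vector for a lowercase string."""
    x = np.zeros(len(ALPHABET))
    for ch in seq:
        x[ALPHABET.index(ch)] += 1.0
    return x

def normalized_output(weights, seq):
    """Match between a cell's stored pattern and an input, normalized to [0, 1]."""
    x = encode(seq)
    denom = np.linalg.norm(x) * np.linalg.norm(weights) + 1e-12
    return float(weights @ x / denom)

rng = np.random.default_rng(0)
cells = [rng.random(len(ALPHABET)) * 0.01 for _ in range(3)]   # untrained LTM cells

# Learning: the cell with the maximum output wins and stores the sequence.
seq = "perforation"
winner = int(np.argmax([normalized_output(w, seq) for w in cells]))
cells[winner] += encode(seq)              # Hebbian-style weight adjustment (assumed)

# Recalling: the winner now gives its highest normalized output for "perforation".
for test in ["perforation", "perforations", "percolation", "gorilla", "abba"]:
    print(f"{test:15s} {normalized_output(cells[winner], test):.4f}")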
SHORT-TERM MEMORY
STM cell:
• A universal playback machine (Fig. 6).
• Its size is limited, as in human STM.
• Storage neurons for writing; playback neurons for reading.
• Write/erase pointers and read pointers.
• The pointers run in a closed loop so that the storage neurons can be reused.
Writing (sensor input to the STM cell):
• The write pointer is activated from the symbol position neuron.
• A signal from the layer 6 neuron of an active minicolumn activates a column in the STM.
• A storage neuron in the STM cell fires only when it gets two activations (from the input and from the write pointer).
Reading (from the STM cell):
• The read pointer is activated from the storage position neuron.
• The read pointer disinhibits the playback neurons.
• Playback neurons fire when they are activated from the storage neurons and the read pointer is not active.
• Recalled sequence: ARRAY.
Fig. 6 STM architecture (storage and playback neurons, write/erase and read pointers, excitation and inhibition links)
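A minimal sketch of this playback behavior, assuming a fixed number of storage positions and a write pointer that advances cyclically. The class and method names are invented for illustration; only the behavior (limited size, position-by-position writing, order-preserving playback without associations) follows the poster.

# Sketch of the STM cell as a universal playback machine with a limited,
# reusable set of storage positions and cycling write/read pointers.
class STMCell:
    def __init__(self, size=10):
        self.storage = [None] * size   # storage neurons (limited, reusable)
        self.write_ptr = 0             # write/erase pointer (closed loop)
        self.length = 0

    def write(self, symbol):
        """A storage position fires only when it gets both the input symbol
        and the write-pointer activation; the pointer then advances."""
        self.storage[self.write_ptr] = symbol
        self.write_ptr = (self.write_ptr + 1) % len(self.storage)
        self.length = min(self.length + 1, len(self.storage))

    def read(self):
        """The read pointer cycles over the stored positions and disinhibits the
        playback neurons, which replay the sequence without any associations."""
        start = (self.write_ptr - self.length) % len(self.storage)
        return [self.storage[(start + i) % len(self.storage)]
                for i in range(self.length)]

stm = STMCell(size=10)
for ch in "ARRAY":
    stm.write(ch)
print("".join(stm.read()))   # -> ARRAY

Writing more symbols than the cell's size simply reuses the oldest positions, mirroring the limited capacity of human STM.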
CONCLUSIONS
In this work, the laminar minicolumn is used to build the proposed STM and LTM structures. The STM is built as a playback machine that stores and recalls a given sequence without making any associations. The sequential LTM built on minicolumns can store and recall a sequence by associating symbols, and it is able to differentiate the real environment input from the recalled information.
The proposed memory models operate efficiently and stably, are biologically plausible, and have a number of the properties desired for building self-organizing, hierarchical hardware structures.
BIBLIOGRAPHY
[1] E. G. Jones, "Microcolumns in the cerebral cortex," Proc. Natl. Acad. Sci. USA, vol. 97, no. 10, 2000, pp. 5019-5021.
[2] V. B. Mountcastle, "Response properties of neurons of cat's somatic sensory cortex to peripheral stimuli," J. Neurophysiol., vol. 20, 1957, pp. 374-407.
[3] S. Grossberg, "How does the cerebral cortex work? Learning, attention, and grouping by the laminar circuits of visual cortex," Technical Report CAS/CNS-97-023, 1998.