Neural Networks
Chapter 2
Joost N. Kok
Universiteit Leiden
Hopfield Networks
• Network of McCulloch-Pitts neurons
• Output is +1 iff \( \sum_j w_{ij} S_j \ge \theta_i \) and is -1 otherwise
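A minimal sketch of this threshold rule in Python (function and variable names such as `neuron_output` and `theta_i` are illustrative, not from the slides):

```python
import numpy as np

def neuron_output(weights_i, states, theta_i):
    """Output of one McCulloch-Pitts unit: +1 iff its weighted input reaches the threshold."""
    weighted_input = np.dot(weights_i, states)
    return 1 if weighted_input >= theta_i else -1
```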
Hopfield Networks
• Associative Memory Problem:
Store a set of patterns in such a way that
when presented with a new pattern, the
network responds by producing whichever
of the stored patterns most closely
resembles the new pattern.
Hopfield Networks
• “Resembles” is measured by Hamming distance (see the sketch after this list)
• Configuration space = all possible states of the network
• Stored patterns should be attractors
• Basins of attraction around the stored patterns
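A small sketch of the Hamming distance between two ±1 patterns (the function name is illustrative):

```python
import numpy as np

def hamming_distance(pattern_a, pattern_b):
    """Number of neurons in which two +/-1 patterns disagree."""
    return int(np.sum(np.asarray(pattern_a) != np.asarray(pattern_b)))
```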
Hopfield Networks
• N neurons
• Two states: -1 (silent) and +1 (firing)
• Fully connected
• Symmetric Weights
• Thresholds
Hopfield Networks
[Figure: fully connected network with edges labelled by weights w13, w16, w57 and neuron states -1 and +1]
Hopfield Networks
• State: \( S = (S_1, \ldots, S_{25}) \)
• Weights: \( w = \begin{pmatrix} w_{1,1} & \cdots & w_{1,25} \\ \vdots & \ddots & \vdots \\ w_{25,1} & \cdots & w_{25,25} \end{pmatrix} \)
• Dynamics: \( S_i := \operatorname{sgn}\Bigl( \sum_{j=1}^{25} w_{ij} S_j \Bigr) \)
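A sketch of this asynchronous dynamics for a general N-neuron network, repeatedly updating one randomly chosen unit (names and the number of steps are illustrative):

```python
import numpy as np

def update_network(weights, states, steps=1000, rng=None):
    """Asynchronous dynamics: pick a random unit i and set S_i := sgn(sum_j w_ij * S_j)."""
    rng = np.random.default_rng() if rng is None else rng
    states = states.copy()
    n = len(states)
    for _ in range(steps):
        i = rng.integers(n)
        h = weights[i] @ states            # local field of unit i
        states[i] = 1 if h >= 0 else -1    # sgn, with sgn(0) taken as +1
    return states
```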
Hopfield Networks
• Hebb’s learning rule:
– Make connection stronger if neurons have the
same state
– Make connection weaker if the neurons have a
different state
Hopfield Networks
[Figure: two neurons (neuron 1 and neuron 2) connected by a synapse]
Hopfield Networks
• Weight between neuron i and neuron j is given by
\( w_{ij} = \frac{1}{p} \sum_{\mu=1}^{p} \xi_i^{(\mu)} \xi_j^{(\mu)} \)
where \( \xi^{(1)}, \ldots, \xi^{(p)} \) are the p stored patterns
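A sketch of this Hebbian weight computation for p stored ±1 patterns (the 1/p normalization follows the formula above; zeroing the diagonal is a common convention, not stated on the slides; names are illustrative):

```python
import numpy as np

def hebbian_weights(patterns):
    """w_ij = (1/p) * sum over mu of xi_i^(mu) * xi_j^(mu),
    for an array `patterns` of shape (p, N) with +/-1 entries."""
    p, _ = patterns.shape
    weights = patterns.T @ patterns / p
    np.fill_diagonal(weights, 0.0)   # no self-connections (common convention)
    return weights
```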
Hopfield Networks
• Opposite patterns give the same weights
• This implies that they are also stable points of the network
• Capacity of Hopfield Networks is limited: approximately 0.14 N patterns
Hopfield Networks
• Hopfield defines the energy of a network:
\( E = -\tfrac{1}{2} \sum_{ij} S_i S_j w_{ij} + \sum_i S_i \theta_i \)
• If we pick unit i and the firing rule does
not change its Si, it will not change E.
• If we pick unit i and the firing rule does
change its Si, it will decrease E.
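A sketch of this energy in code (thresholds default to zero; names are illustrative):

```python
import numpy as np

def energy(weights, states, theta=None):
    """E = -1/2 * sum_ij S_i * S_j * w_ij + sum_i S_i * theta_i."""
    theta = np.zeros(len(states)) if theta is None else theta
    return -0.5 * states @ weights @ states + states @ theta
```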
Hopfield Networks
• Energy function: \( H = -\tfrac{1}{2} \sum_{ij} w_{ij} S_i S_j \)
• Alternative Form: \( H = C - \sum_{(ij)} w_{ij} S_i S_j \), where the sum runs over each pair (ij) once and C is a constant
• Updates: \( S_i' := \operatorname{sgn}\Bigl( \sum_j w_{ij} S_j \Bigr) \)
Hopfield Networks
H  H   wij S S j   wij Si S j
'
'
i
j i
H  H  2Si
'
w S
j i
ij
j i
j
 2Si
w S
ij
j
j
 2wii
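A small self-contained check of this monotonicity under the stated assumptions (symmetric Hebbian weights, zero diagonal, asynchronous sgn updates; all names and sizes are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20
patterns = rng.choice([-1, 1], size=(3, n)).astype(float)
weights = patterns.T @ patterns / 3.0     # Hebbian weights, 1/p normalization
np.fill_diagonal(weights, 0.0)            # w_ii = 0, so the -2*w_ii term vanishes

def energy(w, s):
    return -0.5 * s @ w @ s

states = rng.choice([-1, 1], size=n).astype(float)
for _ in range(200):
    i = rng.integers(n)
    e_before = energy(weights, states)
    states[i] = 1.0 if weights[i] @ states >= 0 else -1.0
    assert energy(weights, states) <= e_before + 1e-9   # H never increases
```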
Hopfield Networks
• Extension: use a stochastic firing rule
– Si := +1 with probability g(hi)
– Si := -1 with probability 1-g(hi)
Hopfield Networks
• Nonlinear function: \( g(x) = \frac{1}{1 + e^{-\beta x}} \)
[Figure: plot of g(x) versus x for different values of β]
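A sketch of one stochastic unit update using this g (names are illustrative):

```python
import numpy as np

def stochastic_update(weights, states, i, beta, rng):
    """Set S_i := +1 with probability g(h_i) = 1/(1 + exp(-beta*h_i)), else -1."""
    h = weights[i] @ states                    # local field h_i
    p_plus = 1.0 / (1.0 + np.exp(-beta * h))   # g(h_i)
    states[i] = 1 if rng.random() < p_plus else -1
    return states
```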
Hopfield Networks
• Replace the binary threshold units by binary stochastic units.
• Define \( \beta = 1/T \)
• Use “temperature” T to make it easier to cross energy barriers.
– Start at high temperature, where it is easy to cross energy barriers.
– Reduce slowly to low temperature, where good states are much more probable than bad ones.
[Figure: energy landscape with minima labelled A, B and C]
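A self-contained sketch of such an annealing schedule (geometric cooling from a high to a low temperature; all parameter values are illustrative):

```python
import numpy as np

def anneal(weights, states, t_start=5.0, t_end=0.05, sweeps=200, rng=None):
    """Run stochastic updates while slowly lowering the temperature T (beta = 1/T)."""
    rng = np.random.default_rng() if rng is None else rng
    states = states.copy()
    n = len(states)
    for t in np.geomspace(t_start, t_end, sweeps):        # slow cooling schedule
        beta = 1.0 / t
        for _ in range(n):                                # one sweep = n unit updates
            i = rng.integers(n)
            h = weights[i] @ states
            p_plus = 1.0 / (1.0 + np.exp(-beta * h))      # Pr[S_i := +1] = g(h_i)
            states[i] = 1 if rng.random() < p_plus else -1
    return states
```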
Hopfield Networks
• Kick the network out of spurious local minima
• Equilibrium: \( S_i \) becomes time-independent
• \( \Pr[S_i = 1] = f_\beta(h_i) = \frac{1}{1 + \exp(-2 \beta h_i)} \), with \( \beta = \frac{1}{T} \)
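A sketch of this equilibrium firing probability (names are illustrative):

```python
import numpy as np

def prob_fire(weights, states, i, temperature):
    """Pr[S_i = +1] = f_beta(h_i) = 1 / (1 + exp(-2*beta*h_i)), with beta = 1/T."""
    beta = 1.0 / temperature
    h = weights[i] @ states    # local field h_i
    return 1.0 / (1.0 + np.exp(-2.0 * beta * h))
```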