Neural networks 3
1
Hopfield network (HN) model
• A Hopfield network is a
form of recurrent artificial
neural network invented by
John Hopfield in 1982
• The network is entirely
interconnected
– All neurons are both input
and output
• with binary threshold units (see the code sketch below):
F(x) = 1 if x > 0, -1 otherwise
These are single-layer recurrent networks
2
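As a small illustration (not part of the original slides), this binary threshold activation can be written directly in Python; the name `threshold` is my own choice:

```python
def threshold(x: float) -> int:
    """Binary threshold unit of a Hopfield neuron: +1 if the input is positive, else -1."""
    return 1 if x > 0 else -1
```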
Hopfield network model
Applications
• Recalling or Reconstructing corrupted
patterns
• Large-scale computational intelligence
systems
• Handwriting Recognition Software
• The Hopfield model is used to solve optimization problems such as the travelling salesman problem, task scheduling, etc.
• Practical applications of HNs are limited because the number of training patterns can be at most about 14% of the number of nodes in the network.
• If the network is overloaded (trained with more than the maximum acceptable number of attractors), it will not converge to clearly defined attractors.
3
Hopfield network model
• Hopfield nets serve as content-addressable
memory systems with binary threshold nodes
(associative memory)
– The information is distributed and stored in the network connections.
• If one element disappears, the information is not lost
• Information is accessed by its content (CAM: content-addressable memory)
• The network progressively builds its attractors
– In the learning phase, neurons adjust their synapses based on their activities and predefined rules
– Hebb rule
– Widrow-Hoff rule (delta rule)
4
Hopfield network model
Associative memory: learning (figure-only slide)
5
Hopfield network model
Learning (Hebb rule):
The weights are symmetric, with no self-connections:

W_ij = W_ji = (1/|P|) · Σ_{p ∈ P} s_i(p) · s_j(p),   W_ii = 0

|P|: number of training patterns (input combinations)
s_i(p): the i-th bit of pattern p
(this rule is implemented in the code sketch below)
6
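A minimal NumPy sketch of the Hebb rule above (my own illustration, not code from the slides; `hebb_weights` is an assumed name, and each training pattern is a row of ±1 values):

```python
import numpy as np

def hebb_weights(patterns: np.ndarray) -> np.ndarray:
    """Hebb-rule weights: W_ij = (1/|P|) * sum over patterns of s_i(p) * s_j(p), with W_ii = 0."""
    num_patterns, _ = patterns.shape
    W = patterns.T @ patterns / num_patterns   # sum of outer products, scaled by 1/|P|
    np.fill_diagonal(W, 0.0)                   # no self-connections: W_ii = 0
    return W
```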
Hopfield network model
• Let's train this network on the following patterns:
• Pattern 1: OA(1) = -1, OB(1) = -1, OC(1) = 1
• Pattern 2: OA(2) = 1, OB(2) = -1, OC(2) = -1
• Pattern 3: OA(3) = -1, OB(3) = 1, OC(3) = 1
w1,1 = 0
w1,2 = 1/3(OA(1) × OB(1) + OA(2) × OB(2) + OA(3) × OB(3)) = 1/3((-1) × (-1) + 1 × (-1) + (-1) × 1) = 1/3(-1) = -1/3
w1,3 = 1/3(OA(1) × OC(1) + OA(2) × OC(2) + OA(3) × OC(3)) = 1/3((-1) × 1 + 1 × (-1) + (-1) × 1) = 1/3(-3) = -1
w2,2 = 0
w2,1 = 1/3(OB(1) × OA(1) + OB(2) × OA(2) + OB(3) × OA(3)) = 1/3((-1) × (-1) + (-1) × 1 + 1 × (-1)) = 1/3(-1) = -1/3
w2,3 = 1/3(OB(1) × OC(1) + OB(2) × OC(2) + OB(3) × OC(3)) = 1/3((-1) × 1 + (-1) × (-1) + 1 × 1) = 1/3(1) = 1/3
w3,3 = 0
w3,1 = 1/3(OC(1) × OA(1) + OC(2) × OA(2) + OC(3) × OA(3)) = 1/3(1 × (-1) + (-1) × 1 + 1 × (-1)) = 1/3(-3) = -1
w3,2 = 1/3(OC(1) × OB(1) + OC(2) × OB(2) + OC(3) × OB(3)) = 1/3(1 × (-1) + (-1) × (-1) + 1 × 1) = 1/3(1) = 1/3
(these values are checked in the code sketch below)
7
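Running the `hebb_weights` sketch from the previous slide on these three patterns reproduces the weights computed by hand (an illustrative check, not part of the slides):

```python
patterns = np.array([
    [-1, -1,  1],   # pattern 1: (OA, OB, OC)
    [ 1, -1, -1],   # pattern 2
    [-1,  1,  1],   # pattern 3
])
W = hebb_weights(patterns)
print(W)
# expected output (up to formatting):
# [[ 0.         -0.33333333 -1.        ]
#  [-0.33333333  0.          0.33333333]
#  [-1.          0.33333333  0.        ]]
```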
Hopfield network model
Utilization:
– A pattern is entered into the network by setting all nodes (neurons) to a specific value, or by setting only part of the nodes.
– The network is then subjected to a number of iterations using asynchronous or synchronous updating.
– Updating is stopped after a while (typically once the state no longer changes); the neuron states are then read out to see which pattern the network has settled into.
8
Hopfield network model
There are two ways of updating them:
• Asynchronous: one picks one neuron, calculates the
weighted input sum and updates immediately. This
can be done in a fixed order, or neurons can be
picked at random, which is called asynchronous
random updating.
• Synchronous: the weighted input sums of all neurons
are calculated without updating the neurons. Then all
neurons are set to their new value, according to the
value of their weighted input sum.
9
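The two update schedules described above can be sketched as follows (my own illustration, not code from the slides; a zero weighted sum leaves a neuron's state unchanged, matching the update rule on the next slide):

```python
import numpy as np

def asynchronous_step(W: np.ndarray, s: np.ndarray, order=None) -> np.ndarray:
    """Update neurons one at a time; each update sees the latest states of the others."""
    s = s.copy()
    neuron_order = np.random.permutation(len(s)) if order is None else order
    for i in neuron_order:
        h = W[i] @ s                     # weighted input sum of neuron i
        if h != 0:                       # zero sum: keep the previous state
            s[i] = 1 if h > 0 else -1
    return s

def synchronous_step(W: np.ndarray, s: np.ndarray) -> np.ndarray:
    """Compute all weighted sums from the old states, then update every neuron at once."""
    h = W @ s
    return np.where(h > 0, 1, np.where(h < 0, -1, s))
```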
Hopfield network model
Neuron state update rule:
if Σ_{j ∈ N} W_ij · s_j(t) > 0  then  s_i(t+1) = +1
if Σ_{j ∈ N} W_ij · s_j(t) < 0  then  s_i(t+1) = -1
if Σ_{j ∈ N} W_ij · s_j(t) = 0  then  s_i(t+1) = s_i(t)
10
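As an illustrative worked check of this rule (my own example, using the weights computed on slide 7): start from pattern 2 with neuron A corrupted, i.e. s = (-1, -1, -1). Updating neuron A gives the weighted sum w1,2 × OB + w1,3 × OC = (-1/3)(-1) + (-1)(-1) = 4/3 > 0, so A flips to +1 and the network returns to pattern 2 = (1, -1, -1), where every neuron's weighted sum then agrees with its current sign, so the state is stable.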
Hopfield network model
Network energy: stable states have low energy
E = -1/2 · Σ_{i,j ∈ N} W_ij · s_i · s_j + Σ_{i ∈ N} θ_i · s_i
(θ_i is the threshold of neuron i)
This value is called the "energy" because the definition
ensures that when units are randomly chosen to update,
the energy E will either lower in value or stay the same.
11
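A short sketch of this energy function (my own illustration; the name `energy` is assumed, and thresholds default to zero as in the earlier learning example):

```python
import numpy as np

def energy(W: np.ndarray, s: np.ndarray, theta=None) -> float:
    """Hopfield energy: E = -1/2 * sum_ij W_ij s_i s_j + sum_i theta_i s_i."""
    if theta is None:
        theta = np.zeros(len(s))
    return float(-0.5 * s @ W @ s + theta @ s)
```

For the three-neuron example above (with zero thresholds), the corrupted state (-1, -1, -1) has E = 1, while the restored pattern (1, -1, -1) has E = -5/3, so the asynchronous update in the previous worked example lowers the energy, as claimed.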
Hopfield network model
Shortcomings of HNs
• The number of training patterns can be at most about 14% of the number of nodes in the network.
• If more patterns are used, then:
– the stored patterns become unstable;
– spurious stable states appear (i.e., stable states which do not correspond to stored patterns).
• The network can sometimes misinterpret a corrupted pattern.
12
Hopfield network model
Learning HNs through example
• Moving on to a slightly more complex problem, described in Haykin's neural networks book.
• The book uses N = 120 neurons and trains the network with 120-pixel images, where each pixel is represented by one neuron.
• The following 8 patterns were used to train the neural network.
13
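As a rough check against the roughly 14% capacity limit mentioned earlier (my own arithmetic, not from the slides): with N = 120 neurons, about 0.14 × 120 ≈ 17 patterns can be stored reliably, so the 8 training patterns used in this example are well within that limit.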
Hopfield network model
Flow chart summarizing the overall process:
1. Train the HN using standard patterns
2. Update the weight vectors of the network
3. Run the trained network with a corrupted pattern
4. The network returns the reconstructed pattern
14
Hopfield network model
Learning HNs through example
15
Hopfield network model
Learning HNs through example
16
Hopfield network model
Shortcomings of HNs
17
Project (exam)
18