Introduction to Artificial Intelligence
(G51IAI)
Dr Matthew Hyde
Neural Networks
More precisely: “Artificial Neural Networks”
Simulating, on a computer, what we understand about neural networks in the brain
Lecture Outline
• Biology
• History
• Perceptrons
• Multi-Layer Networks
• The Neuron’s Activation Function
• Linear Separability
• Learning / Training
• Time Steps
[The outline topics are split across Lecture 1 and Lecture 2]
Biology
Neural Networks
• We have between 15 and 33 billion neurons
• A neuron may connect to as many as 10,000 other neurons
• Many neurons die as we progress through life
• We continue to learn
• From a computational point of view, the fundamental processing unit of the brain is the neuron
• A neuron consists of a cell body (a soma with a nucleus)
• Each neuron has a number of dendrites, which receive signals from other neurons
• Each neuron also has an axon, which goes out and splits into a number of strands to make connections to other neurons
• The point at which neurons join other neurons is called a synapse
• Signals move between neurons via electrochemical reactions
• The synapses release a chemical transmitter which enters the dendrite; this raises or lowers the electrical potential of the cell body
• The soma sums the inputs it receives, and once a threshold level is reached an electrical impulse is sent down the axon (often known as firing)
• This increases or decreases the electrical potential of another cell (excitatory or inhibitory)
• Long-term firing patterns are formed – basic learning
• Plasticity of the network
• Long-term changes occur as patterns are repeated
Videos
http://www.youtube.com/watch?v=sQKma9uMCFk
http://www.youtube.com/watch?v=-SHBnExxub8
Short neuron animation
http://www.youtube.com/watch?v=vvxXnQuvTD8&NR=1
Various visualisations
History
• McCulloch & Pitts (1943) are generally recognised as the designers of the first neural network
  - One neuron
  - The idea of a threshold
  - Many of their ideas are still used today
• Hebb (1949) developed the first learning rule
  - The McCulloch & Pitts network has fixed weights
  - If two neurons are active at the same time, the strength of the connection between them should be increased
• Rosenblatt (1958)
  - The ‘perceptron’
  - Same architecture as McCulloch & Pitts, but with variable weights
• During the 50s and 60s
  - Many researchers worked on the perceptron amidst great excitement
  - This model can be proved to converge to the correct weights
  - A more powerful learning algorithm than Hebb’s
• 1969 saw the death of neural network research
  - Minsky & Papert
  - The perceptron cannot learn certain types of important functions
  - Research on ANNs went into decline for about 15 years
• Only in the mid 80s was interest revived
  - Parker (1985) and LeCun (1986) independently discovered multi-layer networks to solve the problem of non-linear separability
  - Bryson & Ho had published an effective learning algorithm in 1969: ‘backpropagation’
  - In fact Werbos and others made the link to neural networks in the late 70s and early 80s
Perceptrons
(single layer, feed-forward networks)
The First Neural Networks
It consisted of:
• A set of inputs (dendrites)
• A set of weights (synapses)
• A processing element (neuron)
• A single output (axon)
McCulloch and Pitts Networks
[Diagram: inputs X1 and X2 connect to neuron Y with weight 2 each; input X3 connects to Y with weight -1]
The activation of a neuron is binary. That is, the neuron either fires (activation of one) or does not fire (activation of zero).
[Same diagram; the neuron Y has a threshold θ]
Output function:
If (input sum < Threshold)
    output 0
Else
    output 1
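As a rough illustration (not part of the original slides), here is a minimal Python sketch of such a unit. The function name mcp_neuron and the threshold used in the example are assumptions, since the slide leaves θ unspecified:

# Sketch of a McCulloch-Pitts unit (illustrative; names are not from the slides).
def mcp_neuron(inputs, weights, theta):
    # Multiply each input by the weight on its path and add them up
    total = sum(x * w for x, w in zip(inputs, weights))
    # Fire (output 1) only if the net input reaches the threshold
    return 1 if total >= theta else 0

# The network above: X1 and X2 excite Y with weight 2, X3 inhibits it with weight -1.
# The slides do not fix theta here; theta = 4 is one choice that makes any active
# inhibitory input block firing, as described below.
print(mcp_neuron([1, 1, 0], [2, 2, -1], 4))   # 2 + 2 + 0 = 4, so Y fires: 1
print(mcp_neuron([1, 1, 1], [2, 2, -1], 4))   # 2 + 2 - 1 = 3, so Y does not fire: 0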
• Each neuron has a fixed threshold. If the net input into the neuron is greater than or equal to the threshold, the neuron fires.
• Neurons in a McCulloch-Pitts network are connected by directed, weighted paths.
• If the weight on a path is positive the path is excitatory, otherwise it is inhibitory. In the diagram above, X1 and X2 encourage the neuron to fire, while X3 prevents the neuron from firing.
• The threshold is set such that any non-zero inhibitory input will prevent the neuron from firing. (This is only a rule for McCulloch-Pitts networks!)
• It takes one time step for a signal to pass over one connection.
Worked Examples on Handout 1
Does this neuron fire? Does it output a 0 or a 1?
[Diagram: a neuron with several weighted inputs and an unknown output; the weighted inputs sum to 3.5]
Threshold (θ) = 4
3.5 < 4, so the neuron outputs 0
Threshold Function:
If input sum < Threshold
    return 0
Else
    return 1
1) Multiply the inputs to the neuron by the weights on their paths
2) Add the inputs
3) Apply the threshold function
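The three steps above can be followed directly in code. The input and weight values below are hypothetical (the slide's exact numbers are not recoverable from the transcript); they are chosen only so that the weighted sum comes to 3.5, matching the answer on the slide:

inputs  = [1, 0, 1]        # hypothetical input values
weights = [2, 0.5, 1.5]    # hypothetical weights on their paths
theta   = 4                # threshold from the slide

total = sum(x * w for x, w in zip(inputs, weights))   # steps 1 and 2: multiply, then add (3.5)
output = 0 if total < theta else 1                    # step 3: apply the threshold function
print(total, output)                                  # 3.5 0   (3.5 < 4, so the neuron outputs 0)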
Answers
• Using the McCulloch-Pitts model we can model some logic functions
• In the exercise, you have been working on these logic functions:
  • AND
  • OR
  • NOT AND

AND Function
[Diagram: X and Y each connect to Z with weight 1; Threshold (θ) = 2]
Threshold Function:
If input sum < Threshold
    return 0
Else
    return 1

X Y Z
1 1 1
1 0 0
0 1 0
0 0 0
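Using the hypothetical mcp_neuron sketch from earlier, the AND unit (weights 1 and 1, threshold 2) can be checked against this truth table:

# AND unit: weights 1 and 1, threshold 2, so it fires only when both inputs are 1
for x in (1, 0):
    for y in (1, 0):
        print(x, y, mcp_neuron([x, y], [1, 1], 2))
# 1 1 1
# 1 0 0
# 0 1 0
# 0 0 0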
OR Function
[Diagram: X and Y each connect to Z with weight 2; Threshold (θ) = 2]
Threshold Function:
If input sum < Threshold
    return 0
Else
    return 1

X Y Z
1 1 1
1 0 1
0 1 1
0 0 0
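The same check for the OR unit (weights 2 and 2, threshold 2), again reusing the mcp_neuron sketch:

# OR unit: weights 2 and 2, threshold 2, so one active input is enough to fire
for x in (1, 0):
    for y in (1, 0):
        print(x, y, mcp_neuron([x, y], [2, 2], 2))
# 1 1 1
# 1 0 1
# 0 1 1
# 0 0 0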
NOT AND (NAND) Function
(This one is not a McCulloch-Pitts network)
[Diagram: X and Y each connect to Z with weight -1; Threshold (θ) = -1]
Threshold Function:
If input sum < Threshold
    return 0
Else
    return 1

X Y Z
1 1 0
1 0 1
0 1 1
0 0 1
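And the same check for the NAND unit (weights -1 and -1, threshold -1):

# NAND unit: weights -1 and -1, threshold -1 (negative weights and threshold,
# which is why this is not a McCulloch-Pitts network in the strict sense)
for x in (1, 0):
    for y in (1, 0):
        print(x, y, mcp_neuron([x, y], [-1, -1], -1))
# 1 1 0
# 1 0 1
# 0 1 1
# 0 0 1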
One additional example
AND NOT Function
[Diagram: X connects to Z with weight 2; Y connects to Z with weight -1; Threshold (θ) = 2]
Threshold Function:
If input sum < Threshold
    return 0
Else
    return 1

X Y Z
1 1 0
1 0 1
0 1 0
0 0 0
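And for the AND NOT unit (weight 2 on X, weight -1 on Y, threshold 2):

# AND NOT unit: fires only when X is active and Y is not
for x in (1, 0):
    for y in (1, 0):
        print(x, y, mcp_neuron([x, y], [2, -1], 2))
# 1 1 0
# 1 0 1
# 0 1 0
# 0 0 0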
Multi-Layer Neural Networks
Modelling Logic Functions
XOR Function
[Diagram: X1 connects to Y1 with weight 2 and to Y2 with weight -1; X2 connects to Y2 with weight 2 and to Y1 with weight -1; Y1 and Y2 each connect to Z with weight 2]

X1 X2 Z
1  1  0
1  0  1
0  1  1
0  0  0

X1 XOR X2 = (X1 AND NOT X2) OR (X2 AND NOT X1)
Modelling Logic Functions
[Diagram: X1 connects to Y2 with weight -1; X2 connects to Y2 with weight 2; Y2 computes X2 AND NOT X1]

X1 X2 Y2
1  1  0
1  0  0
0  1  1
0  0  0

X1 XOR X2 = (X1 AND NOT X2) OR (X2 AND NOT X1)
Modelling Logic Functions
[Diagram: X1 connects to Y1 with weight 2; X2 connects to Y1 with weight -1; Y1 computes X1 AND NOT X2]

X1 X2 Y1
1  1  0
1  0  1
0  1  0
0  0  0

X1 XOR X2 = (X1 AND NOT X2) OR (X2 AND NOT X1)
Modelling Logic Functions
XOR
[Diagram: Y1 and Y2 each connect to Z with weight 2; Z computes Y1 OR Y2]

X1 X2 Z
1  1  0
1  0  1
0  1  1
0  0  0

Y1 Y2 Z
1  1  1
1  0  1
0  1  1
0  0  0

X1 XOR X2 = (X1 AND NOT X2) OR (X2 AND NOT X1)
Modelling Logic Functions

X1 X2 Y1 Y2 Z
1  1  0  0  0
1  0  1  0  1
0  1  0  1  1
0  0  0  0  0

X1 XOR X2 = (X1 AND NOT X2) OR (X2 AND NOT X1)
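Putting the three sub-networks together gives a two-layer XOR network. The sketch below reuses the hypothetical mcp_neuron helper from earlier and assumes a threshold of 2 for every unit (the value used on the AND NOT and OR slides); note that, with one time step per connection, Z appears two time steps after the inputs:

def xor_net(x1, x2):
    # Hidden layer (available one time step after the inputs are presented)
    y1 = mcp_neuron([x1, x2], [2, -1], 2)    # Y1 = X1 AND NOT X2
    y2 = mcp_neuron([x1, x2], [-1, 2], 2)    # Y2 = X2 AND NOT X1
    # Output layer (a second time step)
    return mcp_neuron([y1, y2], [2, 2], 2)   # Z = Y1 OR Y2

for x1 in (1, 0):
    for x2 in (1, 0):
        print(x1, x2, xor_net(x1, x2))
# 1 1 0
# 1 0 1
# 0 1 1
# 0 0 0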
Key Idea!
• Perceptrons cannot learn (cannot even represent) the XOR function
• We will see why next lecture
• Multi-Layer Networks can, as we have just shown
Example of Learning with a multilayer neural network
http://www.youtube.com/watch?v=0Str0Rdkxxo
http://matthewrobbins.net/Projects/NeuralNetwork.html
Provided courtesy of: Matthew Robbins, Games Programmer, Melbourne, Australia
[Diagram: a fully connected multi-layer network whose two outputs are the left wheel speed and the right wheel speed]
“Fully connected”: all neurons are connected to all the neurons in the next level
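As a rough sketch of what “fully connected” means (illustrative only, not the network from the video): every neuron in a layer receives a weighted input from every neuron in the previous layer.

# Illustrative fully connected layer: weights[j][i] is the weight from
# input i to output neuron j, so every output sees every input.
def fully_connected_layer(inputs, weights, thetas):
    return [
        1 if sum(x * w for x, w in zip(inputs, row)) >= theta else 0
        for row, theta in zip(weights, thetas)
    ]

# Hypothetical 3-input, 2-output layer (the two outputs standing in for
# the "left wheel speed" and "right wheel speed" decisions)
print(fully_connected_layer([1, 0, 1],
                            [[2, -1, 1], [0.5, 1.5, -1]],
                            [2, 1]))          # [1, 0]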
Lecture Summary
• Biology
• History (remember the names and what they did)
• Perceptrons
• Multi-Layer Networks