Artificial Intelligence
In the Real World
Computing Science
University of Aberdeen
Artificial Intelligence In the Movies
Artificial Intelligence In the Real World
Artificial Intelligence Began in 1956…
Great expectations…
“Machines will be capable, within twenty years, of doing any work that a man can do.”
– Herbert Simon, 1965

What Happened?
- Machines can’t do everything a man can do…
- People thought machines could replace humans… instead they are usually supporting humans
  - Healthcare, Science, Government, Business, Military…
- The most difficult problems are solved by human + machine
  - astronomy, nuclear physics, genetics, maths, drug discovery…
Neural Networks
- Neural Networks are a popular Artificial Intelligence technique
- Used in many applications that help humans
- The idea comes from trying to copy the human brain…
Fascinating Brain Facts…
- 100,000,000,000 = 10^11 neurons
- 100,000 are irretrievably lost each day
- Each neuron connects to 10,000 – 150,000 others
- If every person on the planet made 200,000 phone calls in a day, that would be about the same number of connections as in a single human brain
- The grey part is folded to fit – unfolded it would cover the surface of an office desk
- The grey cells occupy only 5% of the brain; the other 95% is taken up by the communication network between them
- About 2×10^6 km of wiring (to the moon and back twice)
- Pulses travel at more than 400 km/h (250 mph)
- 2% of body weight… but consumes 20% of the body’s oxygen – all the time, even when sleeping

What about copying neurons in computers?
Biological Inspiration
Artificial Neural Network (ANN)
- Loosely based on the biological neuron
- Each unit is simple, but many are connected in a complex network
- If enough inputs are received:
  - the neuron gets “excited”
  - and passes on a signal, or “fires”
- An ANN is different from the biological version:
  - an ANN unit outputs a single value
  - a biological neuron sends out a complex series of spikes
  - biological neurons are not fully understood
Image from Purves et al., Life: The Science of Biology, 4th Edition, Sinauer Associates and W. H. Freeman
Now play with the flash animation to see how synapses work:
http://www.mind.ilstu.edu/curriculum/neurons_intro/flash_summary.php?modGUI=232&compGUI=1828&itemGUI=3160
(Maybe this is a bit too long – about 3 or 4 mins)
The Perceptron
[Diagram: four inputs (input1–input4), each connected to an adder through a weight (e.g. weight4 on input4); the summed total is compared with a threshold to produce the output.]
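As a concrete illustration of the diagram, here is a minimal sketch of that calculation in Python. The particular weights and inputs below are made up for the example (the slides only label one weight and the threshold); the point is the structure: a weighted sum compared against a threshold.

```python
# Minimal perceptron forward pass: a weighted sum of the inputs compared with a threshold.
# The weight and input values below are illustrative, not taken from the slides.

def perceptron_output(inputs, weights, threshold):
    total = sum(x * w for x, w in zip(inputs, weights))
    return 1 if total > threshold else 0   # the unit "fires" only if the sum exceeds the threshold

inputs = [1, 0, 1, 1]                      # input1 .. input4
weights = [0.3, 0.2, 0.1, 0.4]             # weight1 .. weight4 (hypothetical values)
print(perceptron_output(inputs, weights, threshold=0.5))   # 0.3 + 0.1 + 0.4 = 0.8 > 0.5, so it prints 1
```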
The Perceptron
Training data (1 = yes, 0 = no):

  student  name     first last year  male  works hard  lives in halls  First this year
  1        Richard  1                1     0           1               0
  2        Alan     1                1     1           0               1
  3        Alison   0                0     1           0               0
  4        Jeff     0                1     0           1               0
  5        Gail     1                0     1           1               1
  6        Simon    0                1     1           1               0

Note: example from Alison Cawsey
The Perceptron
[Diagram: the four inputs “first last year”, “male”, “hardworking” and “lives in halls” each feed an adder through a weight; the sum is compared with a threshold of 0.5 to give the output. One of the weights is labelled, starting at 0.2.]

Training: the students are presented one at a time (Richard, Alan, Alison, Jeff, Gail, Simon, and then around the data again). For each student the perceptron’s output is compared with the true answer (“First this year”), and whenever the output is wrong the weights are adjusted slightly. Over the walkthrough the labelled weight drops from 0.2 to 0.15 and then to 0.10, briefly returning to 0.15 before settling at 0.10.
The Perceptron
Finished – training is complete, and the perceptron is ready to try unseen examples.

An unseen example:

  student  name   first last year  male  works hard  lives in halls  First this year
           James  0                1     0           1               ?

The trained perceptron computes its weighted sum for James and compares it with the threshold of 0.5 to predict whether he will get a First this year.
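To make the training loop concrete, here is a sketch of the classic perceptron learning rule applied to the student data above. The slides do not give the starting weights, the learning rate, or the exact update rule, so those details are assumptions; only the data and the 0.5 threshold come from the example.

```python
# Perceptron training on the student data (a sketch).
# Assumed details (not in the slides): starting weights, learning rate, and the
# classic perceptron update  w <- w + rate * (target - output) * input.

def output(inputs, weights, threshold=0.5):
    total = sum(x * w for x, w in zip(inputs, weights))
    return 1 if total > threshold else 0      # fire only if the sum exceeds the threshold

# columns: first last year, male, works hard, lives in halls  ->  First this year
data = [
    ("Richard", [1, 1, 0, 1], 0),
    ("Alan",    [1, 1, 1, 0], 1),
    ("Alison",  [0, 0, 1, 0], 0),
    ("Jeff",    [0, 1, 0, 1], 0),
    ("Gail",    [1, 0, 1, 1], 1),
    ("Simon",   [0, 1, 1, 1], 0),
]

weights = [0.2, 0.2, 0.2, 0.2]   # assumed starting values
rate = 0.04                      # assumed learning rate

for _ in range(20):              # several passes over the training data
    for _name, inputs, target in data:
        error = target - output(inputs, weights)
        if error != 0:           # adjust the weights only when the prediction is wrong
            weights = [w + rate * error * x for w, x in zip(weights, inputs)]

print(weights)
print("James gets a First this year?", output([0, 1, 0, 1], weights))
```

With these assumed settings the largest weights end up on “first last year” and “works hard”, which matches the slides’ point about which inputs matter, and the unseen example James is predicted not to get a First (output 0).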
The Perceptron
- The simple perceptron works OK for this example
- But sometimes it will never find weights that fit everything
- In our example:
  - important: getting a First last year, being hardworking
  - not so important: being male, living in halls
- Suppose there was an “exclusive or”:
  - important: (male) OR (lives in halls), but not both
  - a single perceptron can’t capture this relationship (see the check below)
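One way to see this is to try every weight setting on a coarse grid for a two-input perceptron and test it against the exclusive-or pattern: none of them works, because no single weighted sum can fire for (1,0) and (0,1) but not for (1,1). This brute-force check is an added illustration, not part of the original slides.

```python
# Exhaustive check: no two-input perceptron with threshold 0.5 reproduces exclusive-or.
# (Changing the threshold would not help either: XOR is not linearly separable.)

xor_cases = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

def output(x, w, threshold=0.5):
    return 1 if x[0] * w[0] + x[1] * w[1] > threshold else 0

grid = [i / 10 for i in range(-20, 21)]    # candidate weights from -2.0 to 2.0 in steps of 0.1
solutions = [
    (w1, w2)
    for w1 in grid
    for w2 in grid
    if all(output(x, (w1, w2)) == target for x, target in xor_cases)
]
print(solutions)   # prints [] -- no weight pair gets all four cases right
```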
Stock Exchange Example

  company  name                    less than 2 years old  paid dividend >10% last year  share price increases in following year
  1        Robot Components Ltd.   1                      1                             0
  2        Silicon Devices         1                      0                             1
  3        Bleeding Edge Software  0                      0                             0
  4        Human Interfaces Inc.   1                      1                             0
  5        Data Management Inc.    0                      1                             1
  6        Intelligent Systems     1                      1                             0
Multilayer Networks
- We saw that a simple perceptron can’t capture relationships among its inputs
- Multilayer networks can capture complicated relationships
Stock Exchange Example
[Diagram: the same stock-exchange inputs now feed a hidden layer of units, which in turn feeds the output unit.]
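In this table the share price rises exactly when one, but not both, of the two conditions holds – the same exclusive-or pattern a single perceptron cannot capture. A network with one hidden layer can. The sketch below uses hand-picked weights purely for illustration; the slides show only the structure (inputs, a hidden layer, an output), not any particular weight values.

```python
# A two-layer network with hand-picked weights that captures the exclusive-or pattern
# in the stock-exchange table. The weights and thresholds are illustrative choices.

def step(total, threshold=0.5):
    return 1 if total > threshold else 0

def predict(young, dividend):
    # Hidden layer: one unit detects "either condition", another detects "both".
    either = step(1.0 * young + 1.0 * dividend)                 # fires if at least one input is 1
    both   = step(1.0 * young + 1.0 * dividend, threshold=1.5)  # fires only if both inputs are 1
    # Output: "either but not both" -> share price increases.
    return step(1.0 * either - 1.0 * both)

table = [
    ("Robot Components Ltd.",  1, 1, 0),
    ("Silicon Devices",        1, 0, 1),
    ("Bleeding Edge Software", 0, 0, 0),
    ("Human Interfaces Inc.",  1, 1, 0),
    ("Data Management Inc.",   0, 1, 1),
    ("Intelligent Systems",    1, 1, 0),
]

for name, young, dividend, rises in table:
    print(name, predict(young, dividend) == rises)   # True for every row
```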
Neural Net example: ALVINN
- An autonomous vehicle controlled by an Artificial Neural Network
- Drives at up to 70 mph on public highways
Note: most images are from the online slides for Tom Mitchell’s book “Machine Learning”

[Diagram: the ALVINN network. The input is a 30x32-pixel image (960 values, one input unit per pixel) with no special visual processing. The inputs feed 4 hidden units, which feed 30 output units corresponding to steering instructions, from sharp left through straight ahead to sharp right. Learning means adjusting the weight values. The weight pictures show one hidden node at a time; the size/colour of each link corresponds to its weight.]
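To show what that 960 → 4 → 30 shape looks like in code, here is a forward-pass sketch. Only the layer sizes come from the slides; the random weights stand in for whatever ALVINN actually learned, and the sigmoid activation is an assumption.

```python
# Shape-only sketch of an ALVINN-style network: 960 pixel inputs -> 4 hidden units
# -> 30 steering outputs. The weights are random placeholders for learned values.
import numpy as np

rng = np.random.default_rng(0)

n_inputs, n_hidden, n_outputs = 30 * 32, 4, 30   # sizes taken from the slides

w_hidden = rng.normal(scale=0.1, size=(n_inputs, n_hidden))
w_output = rng.normal(scale=0.1, size=(n_hidden, n_outputs))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def steer(image):
    """image: 30x32 array of pixel brightness values."""
    x = image.reshape(-1)                 # flatten to 960 input values
    hidden = sigmoid(x @ w_hidden)        # 4 hidden activations
    outputs = sigmoid(hidden @ w_output)  # 30 output activations
    return outputs.argmax()               # index 0 = sharp left ... 29 = sharp right

camera_frame = rng.random((30, 32))       # stand-in for one camera image
print(steer(camera_frame))
```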
Let’s try a more complicated example with the program…
In this example we’ll get the program to help us build the neural network.
Neural Network Applications
Particularly good for pattern recognition:
- Sound recognition – voice, or medical
- Character recognition (typed or handwritten)
- Image recognition (e.g. human faces)
- Robot control – hand-arm-block.mpg
- ECG patterns – has the patient had a heart attack?
- Applications for credit cards or mortgages
- Data mining on customers
- Other types of data mining – science
- Spam filtering
- Shape in Go…
and many more!
What are Neural Networks Good For?
- When training data is noisy or inaccurate
  - e.g. camera or microphone inputs
- Very fast performance once the network is trained
- Can accept input numbers from sensors directly
  - a human doesn’t need to interpret them first

Disadvantages?
- Need a lot of data – training examples
- Training time could be very long
  - this is the big problem for large networks
- The network is like a “black box”
  - a human can’t look inside and understand what has been learnt
  - precise logical rules would be easier to understand