Approaches to A.I.
Two axes: human vs. rational, and thinking vs. acting.
Thinking like humans
• Cognitive science
• Neuron level
• Neuroanatomical level
• Mind level
Acting like humans
• Understand language
• Play games
• Control the body
• The Turing Test
Thinking rationally
• Aristotle, syllogisms
• Logic
• “Laws of thought”
Acting rationally
• Business approach
• Results oriented
(Artificial) Neural Networks
• Biological inspiration
• Synthetic networks
• Non-von Neumann
• Machine learning
• Perceptrons – MATH
• Perceptron learning
• Varieties of Artificial Neural Networks
Brain – Neurons
10 billion neurons (in humans)
Each one has an electro-chemical state
Brain – Network of Neurons
Each neuron has on average 7,000 synaptic connections with other neurons.
A neuron “fires” to communicate with neighbors.
Modeling the Neural Network
von Neumann Architecture
Separation of processor and memory.
One instruction executed at a time.
Animal Neural Architecture
von Neumann
• Separate processor and memory
• Sequential instructions
Birds and bees (and us)
• Each neuron has state and processing
• Massively parallel, massively interconnected
The Perceptron
• A simple computational model of a single neuron.
• Frank Rosenblatt, 1957
• 𝑓(𝑥) = 1 if 𝑤 ∙ 𝑥 − 𝑏 > 0, and 0 otherwise
• The entries in 𝑤 and 𝑥 are usually real-valued (not limited to 0 and 1)
The Perceptron
Perceptrons can be combined to make a network
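The threshold rule for a single unit is small enough to write out directly. A minimal sketch in Python, assuming weights and inputs are plain lists; the example call uses the weight vector given later on the learning-rule slide:

```python
# A minimal sketch of Rosenblatt's perceptron, following the slide's rule:
# f(x) = 1 if w . x - b > 0, and 0 otherwise.

def perceptron(w, x, b):
    """Fire (return 1) when the weighted sum of inputs exceeds the bias."""
    weighted_sum = sum(wi * xi for wi, xi in zip(w, x))
    return 1 if weighted_sum - b > 0 else 0

# With weights (0.5, 0.5, -1.0) and bias 0, the unit fires exactly when
# avg(x1, x2) > x3.
print(perceptron([0.5, 0.5, -1.0], [12, 9, 6], 0))   # → 1
print(perceptron([0.5, 0.5, -1.0], [-2, 8, 15], 0))  # → 0
```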
How to “program” a Perceptron?
• Programming a Perceptron means determining the values in 𝑤.
• That’s worse than C or Fortran!
• Back to induction: Ideally, we can find 𝑤 from a set of classified inputs.
Perceptron Learning Rule
Training data:

x1    x2    x3    Output
12     9     6      1
-2     8    15      0
 3     0     3      0
 9  -0.5     4      1

Output: 1 if avg(x1, x2) > x3, 0 otherwise

Valid weights: 𝑤1 = 0.5, 𝑤2 = 0.5, 𝑤3 = −1.0, 𝑏 = 0

Perceptron function: 1 if 0.5𝑥1 + 0.5𝑥2 − 𝑥3 − 0 > 0, 0 otherwise
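The slide shows the training data and one valid weight vector but not the update step itself. A sketch of the classic Rosenblatt learning rule, w ← w + η(t − y)x with the bias moved the opposite way, run on the table above (the learned weights need not match the slide's, since many weight vectors separate this data):

```python
# Perceptron learning rule on the training data above: on each mistake,
# nudge the weights toward (or away from) the misclassified input.

def predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) - b > 0 else 0

data = [([12, 9, 6], 1), ([-2, 8, 15], 0), ([3, 0, 3], 0), ([9, -0.5, 4], 1)]

w, b, eta = [0.0, 0.0, 0.0], 0.0, 0.1
for _ in range(1000):                         # epochs (far more than needed)
    mistakes = 0
    for x, target in data:
        err = target - predict(w, b, x)       # -1, 0, or +1
        if err:
            mistakes += 1
            w = [wi + eta * err * xi for wi, xi in zip(w, x)]
            b -= eta * err                    # bias moves opposite the weights
    if mistakes == 0:                         # converged: every row correct
        break

print([predict(w, b, x) for x, _ in data])    # → [1, 0, 0, 1]
```

Because the data is linearly separable (the slide's weights prove it), the perceptron convergence theorem guarantees this loop terminates.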
Varieties of Artificial Neural Networks
• Neurons that are not Perceptrons.
• Multiple neurons, often organized in layers.
Feed-forward network
Recurrent Neural Networks
Hopfield Network
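As a concrete illustration of the feed-forward idea, here is a sketch of a two-layer network of threshold units, each layer's outputs feeding the next. The weights are hand-picked for illustration (not learned) so that the network computes XOR, a function no single perceptron can compute:

```python
# A tiny feed-forward network of threshold units arranged in two layers.

def layer(weights, biases, x):
    """One layer of threshold units: each fires when w . x > b."""
    return [1 if sum(w * xi for w, xi in zip(ws, x)) > b else 0
            for ws, b in zip(weights, biases)]

def xor_net(x1, x2):
    hidden = layer([[1, 1], [1, 1]], [0.5, 1.5], [x1, x2])  # OR unit, AND unit
    (out,) = layer([[1, -1]], [0.5], hidden)                # OR and not AND
    return out

print([xor_net(a, b) for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]])  # → [0, 1, 1, 0]
```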
On Learning the Past Tense
of English Verbs
• Rumelhart and McClelland, 1980s
Neural Networks
• Alluring because of their biological inspiration
– degrade gracefully
– handle noisy inputs well
– good for classification
– model human learning (to some extent)
– don’t need to be programmed
• Limited
– hard to understand, impossible to debug
– not appropriate for symbolic information processing