Computational Similarity and The Advantage
The human brain computes differently from a computer.
The brain is a highly complex, nonlinear, parallel computational
engine.
It has approximately 10^10 neurons with over 6 × 10^13
interconnections.
Neural events occur at millisecond speeds as opposed to the
nanosecond intervals in computers.
The brain obtains its computational power from
the large number of neurons
and the even larger number of interconnections.
A computer's interconnection capacity is 5 to 6 times smaller.
Example: A brain takes 100-200 milliseconds to identify a familiar
face embedded in an unfamiliar scene. A computer might never
achieve this; it might take days to accomplish much less.
Dr. E.C. Kulasekere ()
Neural Networks
4 / 23
The Consequences and Expectations
Consequences
Interest in building a mathematical model of a brain cell.
Arrange such models into a network forming a computational
engine.
Expectation
To build a neuron-based computer with as little as 0.1% of the
performance of the human brain.
Use this model to perform tasks that would be difficult to achieve
using conventional computation.
Artificial Neural Networks
An ANN is an information-processing system that has certain
performance characteristics common with biological neural networks.
A neural network is a massively parallel distributed processor that
has a natural propensity for storing experiential knowledge and
making it available for use. (Haykin)
It resembles the brain in two respects:
Knowledge is acquired by the network through a learning process.
Inter-neuron connection strengths, known as synaptic weights, are
used to store knowledge.
Biological Neural Networks
The generic biological neuron
[Figure: the generic biological neuron, showing the cell body (soma), nucleus, dendrites, axon, axonal arborization, and synapses to other cells.]
Biological Neural Networks
The Structure
Soma: The cell body, which contains the nucleus.
Dendrites: Inputs from other neurons arrive through these.
Hence they act as input channels.
Axon: Since it is electrically active, unlike the dendrites, it is
considered the output channel.
Synapse: These are the terminating points of the axon signal. A
synapse either accelerates or retards the signal before it reaches the soma.
Larger synapse areas are considered excitatory while smaller
ones are inhibitory. This modulation is thought to be responsible for learning.
Biological Neural Networks
The similarities
The processing element receives many signals.
Signals may be modified by a weight at the receiving synapse.
The processing element sums the weighted inputs.
With sufficient input, a neuron transmits a single output.
The output from a particular neuron may go to many other
neurons.
Fault tolerance capacity.
Biological Neural Networks
Fault tolerance capability
Delayed recognition
Being able to recognize many input signals that are somewhat different
from any signal we have seen before
Damage tolerance
Being able to tolerate damage to the neural network itself. Even in the
case of traumatic loss, other neurons can sometimes be trained to take
over the functions of the damaged cells.
Assumptions for the Mathematical Model
Information processing occurs at many simple elements called
neurons.
Signals are passed between neurons over connection links.
Each connection link has an associated weight, which, in a typical
neural net, multiplies the signal transmitted.
Each neuron applies an activation function (usually nonlinear) to
its net input (sum of weighted input signals) to determine its output
signal.
A Simplified ANN Model
[Figure: inputs X1, X2, X3 feed neuron Y through weights w1, w2, w3.]
The net input s to neuron Y is
s = w1 x1 + w2 x2 + w3 x3.
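The net-input computation above can be written in a few lines of Python (a minimal sketch; the weight and input values are arbitrary illustrative numbers, not from the slides):

```python
# Net input s to neuron Y: the weighted sum of its inputs.
def net_input(weights, inputs):
    return sum(w * x for w, x in zip(weights, inputs))

# Three inputs, as in the figure (illustrative values).
w = [0.5, -0.2, 0.1]   # w1, w2, w3
x = [1.0, 2.0, 3.0]    # x1, x2, x3
s = net_input(w, x)    # 0.5*1.0 - 0.2*2.0 + 0.1*3.0 = 0.4
print(s)
```

This single sum is the whole "processing" of the simplified neuron; an activation function (discussed later) is then applied to s to produce Y's output.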
Fundamental Features of NNs Operation
Layered, with neurons in the same layer behaving identically.
Behavior is determined by the activation function and the weights.
Within a layer: the same activation function and pattern of links.
Neurons within a layer are either fully interconnected or not
connected at all.
The arrangement of layers and connection patterns is called the net
architecture.
Net architectures are of the feed-forward or recurrent type.
Characterization of a Neural Network
Architecture: its pattern of connections between the neurons.
Single-layer feed-forward network.
Multilayer feed-forward network.
Recurrent network.
Training or learning algorithm: its method of determining the
weights on the connections.
Supervised learning.
Unsupervised learning.
Reinforcement learning.
Activation function: its method of determining the output of the
neuron.
Threshold function.
Signum function.
Sigmoid function.
Network Architectures
Single-Layer Feed forward network
[Figure: inputs X1, …, Xi, …, Xn fully connected to output units Y1, …, Yj, …, Ym through weights wij.]
Network Architectures
Multi-Layer Feed forward network
[Figure: inputs X1, …, Xn connected to hidden units Z1, …, Zp through weights ν, and the hidden units connected to outputs Y1, …, Ym through weights w.]
Network Architectures
Recurrent Network
[Figure: recurrent network of units A1, …, Ai, Aj, …, Am with mutual feedback connections (marked −) among the units.]
Learning Methods
Supervised learning
The ANN is trained repeatedly by a “teacher”.
Each input presented to the network has an associated
desired output that is also presented.
In each learning cycle, the error between the actual and the desired
output is used to adjust the weights.
When the error falls to an acceptable amount, learning stops.
A network thus trained has the recall capability.
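The supervised cycle described above can be sketched with the classic delta rule, one common choice of weight update (not the only one; the training data, learning rate, and error tolerance below are illustrative assumptions):

```python
# Delta-rule training of a single linear neuron (a minimal sketch).
def train(samples, n_inputs, lr=0.1, tol=1e-3, max_cycles=10000):
    w = [0.0] * n_inputs
    for _ in range(max_cycles):
        worst = 0.0
        for x, desired in samples:            # teacher supplies desired output
            actual = sum(wi * xi for wi, xi in zip(w, x))
            error = desired - actual          # actual vs. desired
            for i in range(n_inputs):         # adjust weights to reduce error
                w[i] += lr * error * x[i]
            worst = max(worst, abs(error))
        if worst < tol:                       # error acceptable: stop learning
            break
    return w

# Learn y = 2*x1 - x2 from a few examples (illustrative data).
data = [([1.0, 0.0], 2.0), ([0.0, 1.0], -1.0), ([1.0, 1.0], 1.0)]
w = train(data, 2)
print(w)   # approximately [2.0, -1.0]
```

After training, presenting any of the inputs again recalls an output close to the teacher's target, which is the recall capability the slide refers to.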
Learning Methods
Unsupervised learning
A “teacher” is not involved.
The network uses only the inputs.
The inputs are clustered automatically based on some closeness or
similarity criterion.
Meaning is associated with these clusters depending on the data.
Networks trained this way are sometimes called self-organizing networks.
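One simple way to realize such automatic clustering is competitive learning, where each input pulls the closest weight vector toward itself (a minimal sketch; the 1-D data, two cluster units, learning rate, and cycle count are illustrative assumptions):

```python
# Competitive learning: each input pulls its nearest "cluster unit" closer.
def cluster(inputs, centers, lr=0.2, cycles=20):
    centers = list(centers)
    for _ in range(cycles):
        for x in inputs:
            # The unit whose weight is closest to x wins...
            j = min(range(len(centers)), key=lambda k: abs(centers[k] - x))
            # ...and moves toward x; the other units are unchanged.
            centers[j] += lr * (x - centers[j])
    return centers

# Two natural groups, near 0 and near 10 (illustrative data).
data = [0.1, -0.2, 0.3, 9.8, 10.1, 10.3]
print(cluster(data, [1.0, 9.0]))  # one center settles near 0, the other near 10
```

No desired outputs are ever supplied: the grouping emerges purely from the similarity of the inputs, and any meaning attached to the two clusters comes from inspecting the data afterwards.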
Learning Methods
Reinforcement learning
The error between the actual and the desired responses is not
computed.
The “teacher” assigns a pass/fail signal after each learning cycle.
If the signal is “fail”, the network continues readjusting the weights
in a new learning cycle.
This type is a special case of the supervised training method.
Activation Functions
The activation y of neuron Y is given by some function of its net
input, y = F(s).
[Figure: three activation functions plotted against the net input: (a) step function with threshold t, (b) sign function, (c) sigmoid function.]
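The three activation functions named above can be written out directly (a minimal sketch; t denotes the step function's threshold, taken here as a parameter defaulting to 0):

```python
import math

def step(s, t=0.0):
    """Threshold (step) function: 1 if net input reaches threshold t, else 0."""
    return 1 if s >= t else 0

def sign(s):
    """Signum function: +1 for nonnegative net input, -1 otherwise."""
    return 1 if s >= 0 else -1

def sigmoid(s):
    """Logistic sigmoid: smooth, S-shaped squashing of net input into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-s))

print(step(0.4), sign(-0.4), sigmoid(0.0))  # 1 -1 0.5
```

The step and sign functions give hard binary decisions, while the sigmoid is differentiable, which is what makes it suitable for gradient-based weight adjustment.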
When to Use Neural Networks
Where an ANN provides the only practical solution.
Where other solutions exist but an ANN gives an easier or better
solution.
Where an ANN solution is equal to others.
ANNs can provide solutions for problems which are generally
characterized by
Nonlinear and high-dimensional systems.
Noisy, complex, imprecise or imperfect data.
A lack of a clearly stated mathematical solution or algorithm.
Where are Neural Networks Being Used?
Forecast stock market performance.
Detect credit card fraud.
Pattern Recognition.
Control robot motion and manipulators.
Recognize speech and fingerprints in security systems.
Classify blood cell reactions and blood analyses.
Brain modeling.