Azhari, Dr
Computer Science UGM
• The human brain is a densely interconnected
network of approximately 10^11 neurons, each
connected to, on average, 10^4 others.
• Neuron activity is excited or inhibited through
connections to other neurons.
• The fastest neuron switching times are known
to be on the order of 10^-3 sec.
• Gross physical structure:
– There is one axon that branches
– There is a dendritic tree that collects
input from other neurons
• Axons typically contact dendritic trees at
synapses
– A spike of activity in the axon causes
charge to be injected into the post-synaptic neuron
• Spike generation:
– There is an axon hillock that
generates outgoing spikes whenever
enough charge has flowed in at
synapses to depolarize the cell
membrane
[Figure: a neuron with its cell body, branching axon, and dendritic tree.]

The cell itself includes a
nucleus (at the center).

To the right of cell 2, the
dendrites provide input
signals to the cell.

To the right of cell 1, the
axon sends output signals
to cell 2 via the axon
terminals. These axon
terminals connect to the
dendrites of cell 2.
• Signals can be transmitted
unchanged or they can be altered by
synapses. A synapse is able to
increase or decrease the strength
of the connection from one neuron to
another and cause excitation or
inhibition of the subsequent neuron.
This is where information is stored.
• The information processing abilities of
biological neural systems must follow
from highly parallel processes
operating on representations that are
distributed over many neurons. One
motivation for ANN is to capture this
kind of highly parallel computation
based on distributed representations.
• When a spike travels along an axon and arrives at a synapse
it causes vesicles of transmitter chemical to be released
– There are several kinds of transmitter
• The transmitter molecules diffuse across the synaptic cleft and bind to
receptor molecules in the membrane of the post-synaptic neuron thus
changing their shape.
– This opens up holes that allow specific ions in or out.
• The effectiveness of the synapse can be changed
– vary the number of vesicles of transmitter
– vary the number of receptor molecules.
• Synapses are slow, but they have advantages over RAM
– Very small
– They adapt using locally available signals (but how?)
• Each neuron receives inputs from other neurons
– Some neurons also connect to receptors
– Cortical neurons use spikes to communicate
– The timing of spikes is important
• The effect of each input line on the
neuron is controlled by a synaptic weight
– The weights can be positive or negative
• The synaptic weights adapt so that the whole network learns
to perform useful computations
– Recognizing objects, understanding language, making plans, controlling the
body
• You have about 10^11 neurons, each with about 10^3 weights
– A huge number of weights can affect the computation in a very short time.
Much better bandwidth than a Pentium.
• An ANN is composed of
processing elements, called
neurons or perceptrons, organized
in different ways to form the
network’s structure.
• An ANN consists of
perceptrons. Each of the
perceptrons receives inputs,
processes inputs and
delivers a single output.
The input can be raw input data or the output of other perceptrons. The
output can be the final result (e.g. 1 means yes, 0 means no) or it can be
inputs to other perceptrons.
• Inputs:
– Each input corresponds to a single attribute of the
problem.
– For example, for the diagnosis of a disease, each
symptom could represent an input to one node.
– Input could be image (pattern) of skin texture, if we are
looking for the diagnosis of normal or cancerous cells.
• Outputs:
– The outputs of the network represent the solution to a
problem.
– For diagnosis of a disease, the answer could be yes or
no.
• Weights:
– A key element of an ANN is the weight.
– A weight expresses the relative strength of the input signal
from the various connections that transfer data from the input
point to the output point.
• Linear neurons are simple but computationally limited
– If we can make them learn we may get insight
into more complicated neurons
y  b   xi wi
i
y
0
0
b   xi wi
i
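A linear neuron can be sketched in a few lines of Python; the input, weight, and bias values below are illustrative, not taken from the slides:

```python
# Minimal sketch of a linear neuron: y = b + sum_i x_i * w_i.

def linear_neuron(x, w, b):
    """Return the bias plus the weighted sum of the inputs."""
    return b + sum(xi * wi for xi, wi in zip(x, w))

# Illustrative values: two inputs with weights 0.2 and -0.4, bias 0.1.
y = linear_neuron(x=[1.0, 0.5], w=[0.2, -0.4], b=0.1)
print(y)  # 0.1 + (1.0 * 0.2) + (0.5 * -0.4) = 0.1
```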
• McCulloch-Pitts (1943): influenced von Neumann!
– First compute a weighted sum of the inputs from other
neurons
– Then send out a fixed size spike of activity if the
weighted sum exceeds a threshold.
– Maybe each spike is like the truth value of a
proposition and each neuron combines truth values to
compute the truth value of another proposition!
z   xi wi
i
y
1 if
z 
0 otherwise
1
y
0
threshold
z
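The two steps above (a weighted sum, then a fixed-size output if the sum exceeds a threshold) can be sketched as follows; the weights and threshold are illustrative:

```python
# Sketch of a McCulloch-Pitts binary threshold neuron: output 1 when
# the weighted input sum reaches the threshold theta, else 0.

def threshold_neuron(x, w, theta):
    z = sum(xi * wi for xi, wi in zip(x, w))
    return 1 if z >= theta else 0

# With both weights 1 and theta = 2, the unit computes logical AND,
# matching the propositional reading of spikes as truth values:
print(threshold_neuron([1, 1], [1, 1], theta=2))  # 1
print(threshold_neuron([1, 0], [1, 1], theta=2))  # 0
```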
Rectified linear neurons have a confusing name:
they compute a linear weighted sum of their inputs,
but the output is a non-linear function of the total input.
z j  b j   xi wij
i
yj 
z j if z j  0
0 otherwise
y
0
threshold
z
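The rectified linear rule can be sketched directly; the input and weight values below are illustrative:

```python
# Sketch of a rectified linear neuron: pass the weighted input sum
# (plus bias) through unchanged when it is positive, output 0 otherwise.

def relu_neuron(x, w, b):
    z = b + sum(xi * wi for xi, wi in zip(x, w))
    return z if z > 0 else 0.0

print(relu_neuron([1.0, 1.0], [1.0, 0.5], b=0.0))   # 1.5 (z > 0, passed through)
print(relu_neuron([1.0, 1.0], [-1.0, 0.5], b=0.0))  # 0.0 (z < 0, clipped to zero)
```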
• Sigmoid neurons give a
real-valued output that is
a smooth and bounded
function of their total input.
– Typically they use the
logistic function
– They have nice
derivatives which
make learning easy.
• If we treat y as a
probability of producing a
spike, we get stochastic
binary neurons.
z  b   xi wi
i
y
1

z
1 e
1
y
0.5
0
0
z
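The logistic function can be sketched the same way; the illustrative call below checks the midpoint at z = 0:

```python
import math

# Sketch of a logistic ("sigmoid") neuron: the output is a smooth,
# bounded function of the total input z = b + sum_i x_i * w_i.

def logistic_neuron(x, w, b):
    z = b + sum(xi * wi for xi, wi in zip(x, w))
    return 1.0 / (1.0 + math.exp(-z))

# The output is exactly 0.5 at z = 0, approaches 1 for large positive z,
# and approaches 0 for large negative z.
print(logistic_neuron([0.0, 0.0], [1.0, 1.0], b=0.0))  # 0.5
```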
Example: learning the AND function (the output Z is 1 only when both
inputs are 1).

[Figure: inputs X1 and X2, weighted by w1 and w2, feed a summation
unit Σ whose thresholded output is Z.]

• w1 and w2 must be learned, starting from the given initial values of
w1 and w2, together with a threshold and a learning rate.
• The activation function is applied at each step, and training
continues until the weights reach stable values.
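This procedure can be sketched with the standard perceptron rule. The slides' actual starting numbers are not shown, so the initial weights (3, -1), threshold 2, and learning rate 1 below are assumptions chosen to keep the arithmetic in whole numbers:

```python
# Perceptron learning of AND with a fixed threshold. The initial
# weights, threshold, and learning rate are illustrative assumptions.

AND_DATA = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

def step(z, theta):
    """Binary threshold activation function."""
    return 1 if z >= theta else 0

def train_and(w=(3, -1), theta=2, lr=1, max_epochs=50):
    w = list(w)
    for _ in range(max_epochs):
        changed = False
        for x, target in AND_DATA:
            y = step(sum(xi * wi for xi, wi in zip(x, w)), theta)
            error = target - y
            if error != 0:
                changed = True
                # Perceptron rule: nudge each active weight toward the target.
                for i, xi in enumerate(x):
                    w[i] += lr * error * xi
        if not changed:
            break  # stable weights: every training pattern is correct
    return w

print(train_and())  # [1, 1]: w1 + w2 >= theta only when both inputs fire
```

Because AND is linearly separable, the weights stop changing after a few epochs; the early exit when no pattern causes an update is exactly the "stable weights" stopping condition above.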
• ANN learning is well-suited to problems
in which the training data corresponds
to noisy, complex sensor data.
• It is also applicable to problems for
which more symbolic representations
are used.
• It is appropriate for problems with the
following characteristics:
– Input is high-dimensional, discrete or real-valued
(e.g. raw sensor input)
– Output is discrete or real valued
– Output is a vector of values
– Possibly noisy data
– Long training times accepted
– Fast evaluation of the learned function
required.
– Not important for humans to understand
the weights
Examples:
 Speech phoneme
recognition
 Image classification
 Financial prediction
 Medical diagnosis
• A NN is a machine learning approach inspired by the
way in which the brain performs a particular learning
task
• Knowledge about the learning task is given in the
form of examples.
• Inter-neuron connection strengths (weights) are used
to store the acquired information (the training
examples).
• During the learning process the weights are modified
in order to model the particular learning task correctly
on the training examples.