Neural Networks
Neural Networks
• Applications for neural networks
– Speech recognition
– Shape recognition
– Financial prediction
– The list goes on…
• The idea behind neural networks is to make computers recognize patterns the way the brain does
Touko Hallasmaa
Perceptron
• A perceptron takes several binary inputs and produces a single binary output
• The weight (w) represents the importance of its input to the output
– Weights are real numbers
• The perceptron’s output (1 or 0) is determined by whether the weighted sum is less than or greater than the threshold value
[Figure: a perceptron with inputs x1 … xn, weights w1 … wn, a summing unit Σ, and a comparison against the threshold producing the output]
Output = 0 if Σᵢ wᵢxᵢ ≤ threshold, and 1 if Σᵢ wᵢxᵢ > threshold (sum over i = 0 … n)
– Threshold is a real number
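The threshold rule above can be sketched in a few lines of code. This is a minimal illustration, not from the slides; the weights and threshold below are arbitrary example values.

```python
# Minimal perceptron sketch: binary inputs, real-valued weights,
# output 1 only when the weighted sum exceeds the threshold.
def perceptron(inputs, weights, threshold):
    weighted_sum = sum(w * x for w, x in zip(weights, inputs))
    return 1 if weighted_sum > threshold else 0

print(perceptron([1, 0, 1], [0.6, 0.2, 0.3], 0.5))  # 0.9 > 0.5, so 1
print(perceptron([0, 0, 1], [0.6, 0.2, 0.3], 0.5))  # 0.3 <= 0.5, so 0
```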
Perceptrons In Neural Networks
• Neural networks can have several perceptrons
• The output of one perceptron can be used as an input of another perceptron
• The weighted sum Σᵢ wᵢxᵢ can be written as the dot product w · x
– In the dot product, w and x are the vectors of weights and inputs
• Perceptrons can be used to compute any logical function
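As an example of computing a logical function, a single perceptron can implement NAND; the weights −2, −2 and threshold −3 here are one hand-picked choice that works, not values from the slides.

```python
# A perceptron computing NAND: output is 1 except when both inputs are 1.
def perceptron(inputs, weights, threshold):
    return 1 if sum(w * x for w, x in zip(weights, inputs)) > threshold else 0

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", perceptron([a, b], [-2, -2], -3))
# only (1, 1) gives -4 <= -3, producing output 0
```

Since NAND is universal for Boolean logic, this is one way to see that networks of perceptrons can compute any logical function.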
Sigmoid Neurons
• It’s hard to make an accurately learning neural network with perceptrons alone
– For example, if we wanted to make a change (determined by the output) to a weight of a perceptron, it might cause the output of that perceptron to change completely, which may in turn cause the behaviour of the rest of the network to change completely
• Sigmoid neurons are similar to perceptrons, but their inputs can take any value between 0 and 1, and the output is given by the sigmoid function σ(w · x − b)
– b is the bias (it plays the role of the threshold)
– σ(x) = 1 / (1 + e^(−x))
– The behaviour of sigmoid neurons is similar to perceptrons when w · x − b is very large or very small
[Figure: plot of the sigmoid function]
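A short sketch of the sigmoid function and a sigmoid neuron, following the slide’s σ(w · x − b) convention (the specific inputs, weights, and bias are illustrative):

```python
import math

# The sigmoid (logistic) function: smooth, ranging from 0 to 1.
def sigma(x):
    return 1.0 / (1.0 + math.exp(-x))

# A sigmoid neuron: apply sigma to the weighted sum minus the bias b.
def sigmoid_neuron(inputs, weights, b):
    z = sum(w * x for w, x in zip(weights, inputs)) - b
    return sigma(z)

print(sigma(0))    # exactly 0.5
print(sigma(10))   # close to 1: perceptron-like for large positive input
print(sigma(-10))  # close to 0: perceptron-like for large negative input
```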
Sigmoid Neurons
• The shape of the sigmoid function is what makes learning in neural networks possible
– Other similarly shaped functions may also be used in neurons, but the sigmoid function is the most commonly used because it’s easy to work with (its derivatives have a simple form)
• As stated before, with perceptrons even a small change to a weight might cause the output of that perceptron to flip completely; that’s not the case with sigmoid neurons (a small change in a weight of the neuron causes only a small change in the output)
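The contrast can be shown numerically. In this sketch (illustrative values, one input), a tiny weight change across the threshold flips the perceptron’s output while the sigmoid neuron’s output barely moves:

```python
import math

def sigma(x):
    return 1.0 / (1.0 + math.exp(-x))

# One input x = 1.0, threshold/bias b = 0.5; nudge the weight across 0.5.
x, b = 1.0, 0.5
for w in (0.499, 0.501):
    perceptron_out = 1 if w * x > b else 0
    sigmoid_out = sigma(w * x - b)
    print(w, perceptron_out, sigmoid_out)
# the perceptron output flips 0 -> 1; the sigmoid output stays near 0.5
```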
Neural Networks - structure
• Neural networks can be visualized as layers of neurons
– First layer consists of inputs
– The last layer is the output layer
– Every neuron that’s not in the input nor in the output layer is in some hidden layer
• For example, in a shape recognition application we could have an input neuron for every pixel of the pre-processed image (a 256×256 image would therefore need 65536 input neurons)
• There may also be loops; neural networks that contain loops are called recurrent or feedback networks. A network without loops is called a feedforward neural network
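The layered, loop-free structure can be sketched as a forward pass where each layer’s outputs become the next layer’s inputs. All weights and biases below are arbitrary illustrative numbers, using the slide’s σ(w · x − b) convention:

```python
import math

def sigma(x):
    return 1.0 / (1.0 + math.exp(-x))

# One layer of sigmoid neurons: each row of weights is one neuron.
def layer(inputs, weights, biases):
    return [sigma(sum(w * x for w, x in zip(row, inputs)) - b)
            for row, b in zip(weights, biases)]

inputs = [0.5, 0.9]                                          # input layer
hidden = layer(inputs, [[0.8, -0.4], [0.3, 0.7]], [0.1, -0.2])  # hidden layer
output = layer(hidden, [[1.2, -0.6]], [0.05])                # output layer
print(output)
```

Because the data flows strictly input → hidden → output with no loops, this is a feedforward network.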
Neural Networks – Learning
• Associative mapping – the network learns to produce a particular pattern on the set of output units whenever another particular pattern is applied on the set of input units
– Auto-association: the network can reproduce a pattern whenever a portion of it, or a distorted version of it, is presented
– Hetero-association: related to two recall mechanisms
• Nearest-neighbour recall: the stored pattern closest to the input pattern is recalled
• Interpolative recall: the recalled pattern is a combination of the outputs corresponding to the input training patterns nearest to the given input test pattern (input interpolation)
• Regularity detection – units learn to respond to particular properties of the input patterns (whereas in associative mapping the network stores the relationships among patterns)
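Nearest-neighbour recall can be illustrated without a network at all; this sketch (with made-up stored patterns) simply returns the stored pattern closest to a distorted input, measured by Hamming distance:

```python
# Stored binary patterns, keyed by a label (values are invented examples).
stored = {
    "A": [1, 1, 0, 0],
    "B": [0, 0, 1, 1],
}

# Recall the label of the stored pattern closest to the given input.
def recall(pattern):
    def hamming(p, q):
        return sum(a != b for a, b in zip(p, q))
    return min(stored, key=lambda k: hamming(stored[k], pattern))

print(recall([1, 0, 0, 0]))  # a distorted "A" recalls "A"
```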
Neural Networks - Learning
• Two categories of neural networks:
– Fixed: weights cannot be changed (they are fixed according to the problem)
– Adaptive: weights can be changed
• Learning can either be:
– Supervised: each output unit is told what its desired response to input signals is
supposed to be
– Unsupervised: based only on local information
Classification With Neural Networks
• Classification systems make decisions
– Decisions are not pre-programmed
– Rules are derived from data
• Classification will use features of the object to be classified. Such features
may include:
– Size(Length, width, height), color, pattern, shape
• An object’s features usually contain lots of irrelevant data
– Feature extraction has to be performed (we need a preprocessor)
– We will treat features as numeric values
• When teaching the classifier we have to give the classifier the class of the
object along with the feature data, whereas the test data is given without
a class
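The split between labelled training data and unlabelled test data can be sketched as follows. The feature names, values, and classes are invented, and a simple nearest-neighbour rule stands in for the trained classifier:

```python
# Labelled training examples: numeric feature vectors plus a class.
training_data = [
    ([4.2, 1.1], "apple"),   # hypothetical [size_cm, colour_score]
    ([9.8, 0.3], "banana"),
]
# Test examples carry features only; the classifier must supply the class.
test_data = [[4.5, 1.0]]

# Stand-in classifier: predict the class of the nearest training example.
def predict(features):
    def dist(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q))
    return min(training_data, key=lambda ex: dist(ex[0], features))[1]

print(predict(test_data[0]))  # nearest neighbour is "apple"
```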