Financial Informatics –XIII:
Neural Computing Systems
Khurshid Ahmad,
Professor of Computer Science,
Department of Computer Science
Trinity College,
Dublin-2, IRELAND
November 19th, 2008.
https://www.cs.tcd.ie/Khurshid.Ahmad/Teaching.html
1
Neural Networks:
Real Networks
London, Michael and Michael Häusser (2005). Dendritic Computation.
Annual Review of Neuroscience. Vol. 28, pp 503–32
2
Real Neural Networks:
Cells and Processes
A neuron is a cell with appendages; every cell has a nucleus.
One set of appendages, the dendrites, brings in inputs;
another set helps to output signals generated by the cell.
[Figure: a neuron, with the nucleus, dendrites, cell body, and axon labelled]
3
Real Neural Networks:
Cells and Processes
The human brain is mainly
composed of neurons: specialised
cells that exist to transfer
information rapidly from one part
of an animal's body to another.
This communication is achieved
by the transmission (and
reception) of electrical impulses
(and chemicals) from neurons
and other cells of the animal. Like
other cells, neurons have a cell
body that contains a nucleus
enshrouded in a membrane with a
double-layered ultrastructure and
numerous pores.
[Figure: a neuron, with the dendrite, axon terminals, soma, and nucleus labelled]
SOURCE: http://en.wikipedia.org/wiki/Neurons
4
Real Neural Networks
All the neurons of an organism, together with their
supporting cells, constitute a nervous system. The
estimates vary, but it is often reported that there are as
many as 100 billion neurons in a human brain.
Neurobiologists and neuroethologists have argued that
intelligence is roughly proportional to the number of
neurons when different species of animals are compared.
Typically, the nervous system includes:
Spinal Cord: the least differentiated component of the central nervous system; it
includes neuronal connections that provide for spinal reflexes. There are also
pathways conveying sensory data to the brain and pathways conducting impulses,
mainly of motor significance, from the brain to the spinal cord; and,
Medulla Oblongata: the fibre tracts of the spinal cord are continued in the medulla,
which also contains clusters of nerve cells called nuclei.
5
Real Neural Networks
Inputs to and outputs from an animal nervous
system
The cerebellum receives data from most
sensory systems and the cerebral cortex,
and eventually influences the motor
neurons supplying the skeletal
musculature. It produces muscle tonus in
relation to equilibrium, locomotion and
posture, as well as non-stereotyped
movements based on individual
experiences.
6
Real Neural Networks
Processing of some information in the nervous
system takes place in the diencephalon. This forms
the central core of the cerebrum and has influence
over a number of brain functions including
complex mental processes, vision, and the
synthesis of hormones reaching the blood stream.
The diencephalon comprises the thalamus,
epithalamus, hypothalamus, and subthalamus.
The retina is a derivative of the diencephalon; the
optic nerve and the visual system are therefore
intimately related to this part of the brain.
7
Real Neural Networks
Inputs to the nervous system are relayed through
the Telencephalon (Cerebral Hemispheres), which
includes the cerebral cortex, corpus striatum,
and medullary center. Nine-tenths of the human
cerebral cortex is neocortex, a possible result of
evolution, and contains areas for all modalities of
sensation (except smell), motor areas, and large
expanses of association cortex in which,
presumably, intellectual activity takes place. The corpus
striatum, a large mass of gray matter, deals with motor
functions, and the medullary center contains fibres that
connect the cortical areas of the two hemispheres.
8
Real Neural Networks:
Cells and Processes
Neurons have a variety of appendages,
referred to as cytoplasmic processes
or neurites, which end in close
apposition to other cells. In higher
animals, neurites are of two varieties:
axons are processes of generally
uniform diameter that conduct
impulses away from the cell body;
dendrites are short, branched
processes that conduct
impulses towards the cell body.
The ends of the neurites, i.e. axons and
dendrites, are called synaptic
terminals, and the cell-to-cell contacts
they make are known as synapses.
[Figure: a neuron, with the dendrite, axon terminals, soma, and nucleus labelled]
SOURCE: http://en.wikipedia.org/wiki/Neurons
10
Real Neural Networks:
Cells and Processes
10^10 neurons with 10^4 connections and an average of 10 spikes per second
= 10^15 adds/sec. This is a lower bound on the equivalent computational
power of the brain.
[Diagram: a neuron summing c. 10^4 fan-in inputs, with an asynchronous firing
rate of c. 200 per sec, c. 10^4 fan-out, and conduction speeds of 1 - 100 meters per sec.]
11
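The arithmetic behind this lower bound is worth making explicit; a quick check in Python, using only the figures quoted on the slide:

```python
# Lower-bound estimate of the brain's computational power:
# 10^10 neurons, ~10^4 connections per neuron, ~10 spikes per second,
# counting one add per connection per spike.
neurons = 10**10
connections_per_neuron = 10**4
spikes_per_second = 10
adds_per_second = neurons * connections_per_neuron * spikes_per_second
print(f"{adds_per_second:.0e} adds/sec")  # 1e+15 adds/sec
```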
Real Neural Networks:
Cells and Processes
Henry Markram (2006). The Blue
Brain Project. Nature Reviews
Neuroscience Vol. 7, 153-160
12
Neural Networks:
Real and Artificial
[Diagram: Observed Biological Processes (Data) and Theory (Statistical Learning
Theory & Information Theory) meet in Neural Networks & Neurosciences:
biologically plausible mechanisms for neural processing & learning
(biological neural network models)]
http://en.wikipedia.org/wiki/Neural_network#Neural_networks_and_neuroscience
13
Neural Networks &
Neuro-economics
[Diagram: Neural Networks & Neuro-economics — the behaviour of economic actors:
pattern recognition, risk-averse/risk-prone activities, risk/reward — built on
Neural Networks & Neurosciences: biologically plausible mechanisms for neural
processing & learning (biological neural network models), and Theory
(Statistical Learning Theory & Information Theory)]
http://en.wikipedia.org/wiki/Neural_network#Neural_networks_and_neuroscience
14
Neural Networks
Artificial Neural Networks
The study of the behaviour of neurons,
either as 'single' neurons or as clusters of
neurons controlling aspects of perception,
cognition or motor behaviour, in animal
nervous systems is currently being used
to build information systems that are
capable of autonomous and intelligent
behaviour.
15
Neural Networks
Artificial Neural Networks, the uses of
Application: Artificial Neural Networks
Classification: given knowledge of the classes amongst objects, train a
network to recognise the different classes using examples of the
idiosyncratic features of each of the classes.
Categorisation: no class information is available; the network trains
itself by assigning ‘similar’ objects to proximate neurons.
Pattern Association: auto- and hetero-associations between input and
generated output patterns.
Forecasting: training neurons with time-series data for one- or two-step
forecasting; networks that learn to remove noise from time-series data.
16
Neural Networks
Artificial Neural Networks
Artificial neural networks emulate threshold
behaviour, simulate co-operative phenomenon by a
network of 'simple' switches and are used in a
variety of applications, like banking, currency
trading, robotics, and experimental and animal
psychology studies.
These information systems, neural networks or
neuro-computing systems as they are popularly
known, can be simulated by solving first-order
difference or differential equations.
17
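As a sketch of the point about difference equations, a single node can be simulated as a first-order difference equation; the leaky-integrator form, time step, and decay constant below are illustrative assumptions, not a model from the lecture:

```python
# A minimal first-order difference equation for one artificial neuron:
# x[t+1] = x[t] + dt * (-x[t] / tau + input[t]).
# tau (decay constant) and dt (time step) are assumed values.
def simulate(inputs, tau=2.0, dt=0.1):
    """Leaky-integrator activation driven by an input sequence."""
    x = 0.0
    trace = []
    for u in inputs:
        x = x + dt * (-x / tau + u)
        trace.append(x)
    return trace

trace = simulate([1.0] * 50)
# the activation rises towards the steady state tau * input = 2.0
```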
Neural Networks
Artificial Neural Networks
The basic premise of the
course, Neural Networks, is
to introduce our students to
an alternative paradigm of
building information
systems.
18
Neural Networks
Artificial Neural Networks
• ‘Statisticians generally have good mathematical backgrounds with which
to analyse decision-making algorithms theoretically. […] However, they
often pay little or no attention to the applicability of their own theoretical
results’ (Raudys 2001:xi).
• Neural network researchers ‘advocate that one should not make
assumptions concerning the multivariate densities assumed for pattern
classes’. Rather, they argue that ‘one should assume only the structure
of decision making rules’, and hence the emphasis on the
minimization of classification errors, for instance.
• In neural networks, some algorithms have a theoretical
justification and some have no ‘theoretical elucidation’.
• Given that there are strengths and weaknesses in both statistical and
other soft computing algorithms (e.g. neural nets, fuzzy logic), one
should integrate the two classifier design strategies (ibid).
Raudys, Šarūnas. (2001). Statistical and Neural Classifiers: An integrated approach to design. London: Springer-Verlag
19
Neural Networks
Artificial Neural Networks
• Artificial Neural Networks are extensively used in dealing
with problems of classification and pattern recognition. The
complementary methods, albeit of a much older discipline, are
based in statistical learning theory
•Many problems in finance, for example, bankruptcy prediction,
equity return forecasts, have been studied using methods
developed in statistical learning theory:
20
Neural Networks
Artificial Neural Networks
Many problems in finance, for example, bankruptcy prediction,
equity return forecasts, have been studied using methods
developed in statistical learning theory:
“The main goal of statistical learning theory is to provide a framework
for studying the problem of inference, that is of gaining knowledge,
making predictions, making decisions or constructing models from a set
of data. This is studied in a statistical framework, that is there are
assumptions of statistical nature about the underlying phenomena (in
the way the data is generated).” (Bousquet, Boucheron, Lugosi)
Statistical learning theory can be viewed as the study of algorithms that
are designed to learn from observations, instructions, examples and so
on.
Olivier Bousquet, Stephane Boucheron, and Gabor Lugosi. Introduction to Statistical Learning
Theory (http://www.econ.upf.edu/~lugosi/mlss_slt.pdf)
21
Neural Networks
Artificial Neural Networks
“Bankruptcy prediction models have used a variety of statistical
methodologies, resulting in varying outcomes. These methodologies include
linear discriminant analysis, regression analysis, logit regression, and
weighted average maximum likelihood estimation, and more recently
neural networks.”
The five ratios are net cash flow to total assets, total debt to total assets,
exploration expenses to total reserves, current liabilities to total debt, and the
trend in total reserves (over a three-year period, with the change in
reserves computed as the changes between Yrs 1 & 2 and Yrs 2 & 3).
Yang et al. (1999) found that (a variant of) the neural network and Fisher
Discriminant Analysis ‘achieve the best overall estimation’, although
discriminant analysis gave superior results.
Z. R. Yang, Marjorie B. Platt and Harlan D. Platt (1999). Probabilistic Neural Networks in
Bankruptcy Prediction. Journal of Business Research, Vol 44, pp 67–74
22
Neural Networks
Artificial Neural Networks
A neural network can be described as a type of
multiple regression in that it accepts inputs and
processes them to predict some output. Like a multiple
regression, it is a data modeling technique.
Neural networks have been found particularly suitable
in complex pattern recognition compared to statistical
multiple discriminant analysis (MDA) since the
networks are not subject to restrictive assumptions of
MDA models.
Shaikh A. Hamid and Zahid Iqbal (2004). Using neural networks for forecasting volatility
of S&P 500 Index futures prices. Journal of Business Research, Vol 57, pp 1116-1125.
23
Neural Networks
Artificial Neural Networks
Neural Networks for forecasting volatility of S&P 500 Index:
A neural network was trained to learn to generate volatility forecasts for the
S&P 500 Index over different time horizons. The results were compared with
implied volatility computed from S&P 500 Index futures options using an
option pricing model (the Barone-Adesi and Whaley American futures
options pricing model).
Forecasts from neural networks outperform implied volatility
forecasts and are not found to be significantly different from
realized volatility.
Implied volatility forecasts are found to be significantly different from realized
volatility in two of three forecast horizons.
Shaikh A. Hamid and Zahid Iqbal (2004). Using neural networks for forecasting volatility
of S&P 500 Index futures prices. Journal of Business Research, Vol 57, pp 1116-1125.
24
Neural Networks
Artificial Neural Networks
The ‘remarkable qualities’ of neural networks: the dynamics of a
single-layer perceptron progress from the simplest algorithms
to the most complex algorithms:
• Initial training → each pattern class is characterized by its sample mean
vector → the neuron behaves like a E[uclidean] D[istance] C[lassifier];
• Further training → the neuron begins to evaluate correlations and
variances of features → the neuron behaves like a standard linear Fisher
classifier;
• More training → the neuron minimizes the number of incorrectly identified
training patterns → the neuron behaves like a support vector classifier.
Statisticians and engineers usually design decision-making algorithms
from experimental data by progressing from simple algorithms to more
complex ones.
Raudys, Šarūnas. (2001). Statistical and Neural Classifiers: An integrated approach to design. London: Springer-Verlag
25
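The first stage of this progression, the Euclidean distance classifier, is easy to sketch: assign a pattern to the class whose sample mean vector is nearest. The two-dimensional data below are made up for illustration:

```python
# Euclidean distance (nearest-mean) classifier: each class is
# characterized only by its sample mean vector.
import math

def class_means(samples):
    """samples: {label: [(x, y), ...]} -> {label: mean vector}"""
    return {label: tuple(sum(p[i] for p in pts) / len(pts) for i in range(2))
            for label, pts in samples.items()}

def classify(point, means):
    """Assign the point to the class with the nearest mean."""
    return min(means, key=lambda c: math.dist(point, means[c]))

training = {"A": [(0, 0), (1, 1)], "B": [(4, 4), (5, 5)]}
means = class_means(training)
print(classify((0.5, 0.2), means))  # A
```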
Neural Networks:
Real and Artificial
[Diagram: Observed Biological Processes (Data) and Theory (Statistical Learning
Theory & Information Theory) meet in Neural Networks & Neurosciences:
biologically plausible mechanisms for neural processing & learning
(biological neural network models)]
http://en.wikipedia.org/wiki/Neural_network#Neural_networks_and_neuroscience
26
Neural Networks &
Neuro-economics
[Diagram: Neural Networks & Neuro-economics — the behaviour of economic actors:
pattern recognition, risk-averse/risk-prone activities, risk/reward — built on
Neural Networks & Neurosciences: biologically plausible mechanisms for neural
processing & learning (biological neural network models), and Theory
(Statistical Learning Theory & Information Theory)]
http://en.wikipedia.org/wiki/Neural_network#Neural_networks_and_neuroscience
27
Neural Networks:
Neuro-economics
Observed Biological Processes (Data)
Investors systematically deviate from rationality when
making financial decisions, yet the mechanisms
responsible for these deviations have not been identified.
Using event-related fMRI, we examined whether
anticipatory neural activity would predict optimal and
suboptimal choices in a financial decision-making task. []
Two types of deviations from the optimal investment
strategy of a rational risk-neutral agent are examined: risk-seeking
mistakes and risk-aversion mistakes.
Camelia M. Kuhnen and Brian Knutson (2005). “Neural Antecedents of Financial
Decisions.” Neuron, Vol. 47, pp 763–770
28
Neural Networks:
Neuro-economics
Observed Biological Processes (Data)
Nucleus accumbens
[NAcc] activation
preceded risky choices as
well as risk-seeking
mistakes, while anterior
insula activation
preceded riskless choices
as well as risk-aversion
mistakes. These findings
suggest that distinct neural
circuits linked to
anticipatory affect
promote different types of
financial choices and
indicate that excessive
activation of these circuits
may lead to investing
mistakes.
Camelia M. Kuhnen and Brian Knutson (2005). “Neural Antecedents of Financial Decisions.” Neuron, Vol. 47, pp
763–770
29
Neural Networks:
Observed Biological Processes (Data)
Neuro-economics
Association of Anticipatory Neural Activation with Subsequent Choice
The left panel indicates a significant effect of anterior insula activation on the odds of making
riskless (bond) choices and risk-aversion mistakes (RAM) after a stock choice (Stock at t-1). The
right panel indicates a significant effect of NAcc activation on the odds of making risk-aversion
mistakes, risky choices, and risk-seeking mistakes (RSM) after a bond choice (Bond at t-1). The
odds ratio for a given choice is defined as the ratio of the probability of making that choice
divided by the probability of not making that choice. Percent change in odds ratio results from a
0.1% increase in NAcc or anterior insula activation. Error bars indicate the standard errors of
the estimated effect. *coefficient significant at p < 0.05.
Camelia M. Kuhnen and Brian Knutson (2005). “Neural Antecedents of Financial Decisions.” Neuron, Vol. 47, pp
763–770
30
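The odds-ratio definition in the caption can be sketched numerically; the probabilities below are illustrative, not values from the paper:

```python
# Odds ratio as defined in the caption: the probability of making a
# choice divided by the probability of not making it.
def odds(p):
    return p / (1 - p)

def percent_change_in_odds(p_before, p_after):
    """Percent change in the odds ratio between two probabilities."""
    return 100 * (odds(p_after) - odds(p_before)) / odds(p_before)

print(round(odds(0.6), 3))                          # 1.5
print(round(percent_change_in_odds(0.5, 0.6), 1))   # 50.0
```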
Neural Networks:
Observed Biological Processes (Data)
Neuro-economics
Nucleus accumbens [NAcc]
activation preceded risky
choices as well as risk-seeking mistakes, while
anterior insula activation
preceded riskless choices as
well as risk-aversion
mistakes. These findings
suggest that distinct neural
circuits linked to
anticipatory affect promote
different types of financial
choices and indicate that
excessive activation of these
circuits may lead to
investing mistakes.
Camelia M. Kuhnen and Brian Knutson (2005). “Neural Antecedents of Financial Decisions.” Neuron, Vol. 47, pp
763–770
31
Neural Networks:
Neuro-economics
Observed Biological Processes (Data)
To explain investing decisions, financial theorists invoke two
opposing metrics: expected reward and risk. Recent advances in the
spatial and temporal resolution of brain imaging techniques enable
investigators to visualize changes in neural activation before financial
decisions. Research using these methods indicates that although the
ventral striatum plays a role in representation of expected reward,
the insula may play a more prominent role in the representation of
expected risk. Accumulating evidence also suggests that antecedent
neural activation in these regions can be used to predict upcoming
financial decisions. These findings have implications for predicting
choices and for building a physiologically constrained theory of
decision-making.
Brian Knutson and Peter Bossaerts (2007). “Neural Antecedents of Financial Decisions.” The Journal of
Neuroscience, (August 1, 2007), Vol. 27 (No. 31), pp 8174–8177
32
Neural Networks:
Real and Artificial
Organising Principles and Common
Themes:
Association between neurons and
competition amongst the neurons
Two examples: Categorisation and
attentional modulation of conditioning
Levine, Daniel S. (1991). Introduction to Neural and Cognitive Modeling. Hillsdale, NJ &
London: Lawrence Erlbaum Associates, Publishers (See Chapter 1)
33
Neural Networks:
Real and Artificial
Organising Principles and Common Themes:
Association between neurons and competition amongst the neurons
A network for identifying handwritten letters
of the alphabet.
Feature nodes respond to the presence or absence of marks at particular
locations. Category nodes respond to the patterns of activation in the
feature nodes.
[Diagram: feature nodes feeding category nodes via a weighted sum of inputs]
Levine, Daniel S. (1991). Introduction to Neural and Cognitive Modeling. Hillsdale, NJ &
London: Lawrence Erlbaum Associates, Publishers (See Chapter 1)
34
Artificial Neural Networks
Artificial Neural Networks (ANN) are
computational systems, either hardware
or software, which mimic animate
neural systems comprising biological
(real) neurons. An ANN is
architecturally similar to a biological
system in that the ANN also uses a
number of simple, interconnected
artificial neurons.
35
Artificial Neural Networks
In a restricted sense artificial neurons are simple emulations
of biological neurons: the artificial neuron can, in principle,
receive its input from all other artificial neurons in the ANN;
simple operations are performed on the input data; and, the
recipient neuron can, in principle, pass its output onto all
other neurons.
Intelligent behaviour can be simulated through
computation in massively parallel networks of simple
processors that store all their long-term knowledge in the
connection strengths.
36
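A minimal sketch of such an artificial neuron, with an assumed weighted-sum operation and hard threshold (the weights are arbitrary):

```python
# One artificial neuron: receive inputs, perform a simple operation
# (weighted sum plus threshold), and pass the result on.
def neuron(inputs, weights, threshold=0.5):
    total = sum(x * w for x, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

# 0.4 + 0.3 = 0.7 >= 0.5, so the neuron fires
print(neuron([1, 0, 1], [0.4, 0.9, 0.3]))  # 1
```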
Artificial Neural Networks
According to Igor Aleksander, Neural Computing is the
study of cellular networks that have a natural propensity
for storing experiential knowledge.
Neural Computing Systems bear a resemblance to the brain in
the sense that knowledge is acquired through training rather
than programming and is retained due to changes in node
functions.
Functionally, the knowledge takes the form of stable
states or cycles of states in the operation of the net. A
central property of such a net is its ability to recall these states or
cycles in response to the presentation of cues.
37
Neural Networks:
Real and Artificial
Organising Principles and Common Themes:
Association between neurons and competition amongst the neurons
A network for identifying handwritten letters of the
alphabet: e.g. 25 patterns for representing 32 letters.
[Figure: five-bit binary feature vectors for the vowels O, I, E, A, U;
e.g. A = 1 0 0 0 0, E = 1 0 1 0 0, I = 1 0 0 1 0]
The line patterns
(vertical, horizontal,
short and long strokes) and
circular patterns in the
Roman alphabet can be
represented in a binary
system.
Levine, Daniel S. (1991). Introduction to Neural and Cognitive Modeling. Hillsdale, NJ &
London: Lawrence Erlbaum Associates, Publishers (See Chapter 1)
38
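The letter network built from such binary feature vectors can be sketched directly: each category node forms a weighted sum of the feature nodes, and the most active category is reported. The weight vectors below simply copy the letters' own feature vectors, an illustrative assumption:

```python
# Category nodes computing a weighted sum of binary feature nodes;
# the most active category wins. Weights are illustrative: each
# category's weight vector copies its letter's feature vector.
WEIGHTS = {
    "A": [1, 0, 0, 0, 0],
    "E": [1, 0, 1, 0, 0],
    "I": [1, 0, 0, 1, 0],
}

def category_activations(features):
    return {letter: sum(w * f for w, f in zip(ws, features))
            for letter, ws in WEIGHTS.items()}

def recognise(features):
    acts = category_activations(features)
    return max(acts, key=acts.get)

print(recognise([1, 0, 1, 0, 0]))  # E
```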
Neural Networks:
Real and Artificial
Organising Principles and Common Themes:
Association between neurons and competition amongst the neurons
The transformation of linear and circular patterns into
binary patterns requires a degree of pre-processing and
judicious guesswork!
[Figure: mapping of the vowels U, O, I, E, A onto five-bit binary feature vectors]
The line patterns
(vertical, horizontal,
short and long strokes) and
circular patterns in the
Roman alphabet can be
represented in a binary
system.
Levine, Daniel S. (1991). Introduction to Neural and Cognitive Modeling. Hillsdale, NJ &
London: Lawrence Erlbaum Associates, Publishers (See Chapter 1)
39
Neural Networks:
Real and Artificial
Organising Principles and Common Themes:
Association between neurons and competition amongst the neurons
A network for identifying handwritten letters
of the alphabet.
[Diagram: the feature vector 1 0 0 0 0 activating the category node for 'A'
via a weighted sum of inputs]
The association between feature nodes and category nodes should be
allowed to change over time (during training), more specifically as a
result of repeated activation of the connection.
Levine, Daniel S. (1991). Introduction to Neural and Cognitive Modeling. Hillsdale, NJ &
London: Lawrence Erlbaum Associates, Publishers (See Chapter 1)
40
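The principle that repeated activation strengthens a connection can be sketched with a Hebbian-style update; the rule and learning rate below are assumptions, not Levine's exact equations:

```python
# Hebbian-style weight change: a connection grows when the feature
# node and the category node are repeatedly active together.
# The learning rate lr is an assumed value.
def hebbian_update(weights, features, category_active, lr=0.1):
    """Strengthen weights on features that co-fire with the category."""
    return [w + lr * f * category_active for w, f in zip(weights, features)]

w = [0.0] * 5
a_pattern = [1, 0, 0, 0, 0]          # feature vector for 'A'
for _ in range(10):                  # repeated presentations of 'A'
    w = hebbian_update(w, a_pattern, category_active=1)
print(w)  # first weight grown to ~1.0, others unchanged
```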
Neural Networks:
Real and Artificial
Organising Principles and Common Themes:
Association between neurons and competition amongst the neurons
A network for identifying handwritten letters
of the alphabet.
[Diagram: the feature vector 1 0 1 0 0 activating the category nodes 'A' and 'E'
via a weighted sum of inputs]
The association between feature nodes and category nodes should be
allowed to change over time (during training), more specifically as a
result of repeated activation of the connection.
Levine, Daniel S. (1991). Introduction to Neural and Cognitive Modeling. Hillsdale, NJ &
London: Lawrence Erlbaum Associates, Publishers (See Chapter 1)
41
Neural Networks:
Real and Artificial
Organising Principles and Common Themes:
Association between neurons and competition amongst the neurons
A network for identifying handwritten letters
of the alphabet.
[Diagram: feature nodes feeding category nodes via a weighted sum of inputs]
Competition amongst category nodes: each node competes to win over an
individual letter shape and then inhibits other neurons from responding
to that shape. This is especially helpful when there is noise in the
signal, e.g. sloppy writing.
Levine, Daniel S. (1991). Introduction to Neural and Cognitive Modeling. Hillsdale, NJ &
London: Lawrence Erlbaum Associates, Publishers (See Chapter 1)
42
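The competition described here is often implemented as winner-take-all: the most active category node keeps its activity and the rest are suppressed. A minimal sketch with made-up activation values:

```python
# Winner-take-all competition: the strongest category node keeps its
# activity and inhibits (zeroes out) all the others.
def winner_take_all(activations):
    winner = max(activations, key=activations.get)
    return {node: (act if node == winner else 0.0)
            for node, act in activations.items()}

noisy = {"A": 0.7, "E": 0.9, "I": 0.4}   # e.g. from sloppy handwriting
print(winner_take_all(noisy))  # {'A': 0.0, 'E': 0.9, 'I': 0.0}
```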
Neural Networks:
Real and Artificial
Organising Principles and Common Themes:
Association between neurons and competition amongst the neurons
A network for identifying handwritten letters of the
alphabet: e.g. 25 patterns for representing 32 letters.
[Figure: five-bit binary feature vectors for the Greek letters θ, ι, ε, α, υ;
e.g. α = 1 0 0 0 0, ε = 1 0 1 0 0, ι = 1 0 0 1 0]
The line patterns
(vertical, horizontal,
short and long strokes) and
circular patterns in the
Greek alphabet can be
represented in a binary
system.
Levine, Daniel S. (1991). Introduction to Neural and Cognitive Modeling. Hillsdale, NJ &
London: Lawrence Erlbaum Associates, Publishers (See Chapter 1)
43
Neural Networks:
Real and Artificial
Organising Principles and Common Themes:
Association between neurons and competition amongst the neurons
A network for identifying handwritten letters
of the alphabet.
[Diagram: the feature vector 1 0 0 0 0 activating the category node for 'α'
via a weighted sum of inputs]
Since the weights change due to repeated presentation, our system will
learn to ‘identify’ the letters of the Greek alphabet.
Levine, Daniel S. (1991). Introduction to Neural and Cognitive Modeling. Hillsdale, NJ &
London: Lawrence Erlbaum Associates, Publishers (See Chapter 1)
44
Neural Networks:
Real and Artificial
Organising Principles and Common Themes:
Association between neurons and competition amongst the neurons
A network for identifying handwritten letters
of the alphabet.
[Diagram: the feature vector 1 0 1 0 0 activating the category nodes 'α' and 'ε'
via a weighted sum of inputs]
The association between feature nodes and category nodes should be
allowed to change over time (during training), more specifically as a
result of repeated activation of the connection.
Levine, Daniel S. (1991). Introduction to Neural and Cognitive Modeling. Hillsdale, NJ &
London: Lawrence Erlbaum Associates, Publishers (See Chapter 1)
45
Artificial Neural Networks
and Learning
Artificial Neural Networks 'learn' by adapting in
accordance with a training regimen: The network is
subjected to particular information environments on
a particular schedule to achieve the desired end-result.
There are three major types of training regimens
or learning paradigms:
SUPERVISED;
UN-SUPERVISED;
REINFORCEMENT or
GRADED.
46
Biological and Artificial NN’s
Entity | Biological Neural Networks | Artificial Neural Networks
Processing Units | Neurons | Network Nodes
Input | Dendrites (dendrites may form synapses onto other dendrites) | Network Arcs (no interconnection between arcs)
Output | Axons or Processes (axons may form synapses onto other axons) | Network Arcs (no interconnection between arcs)
Inter-linkage | Synaptic Contact (chemical and electrical); plastic connections | Node to Node via Arcs; weighted connections matrix
47
Biological and Artificial NN’s
Entity | Biological Neural Networks | Artificial Neural Networks
Inter-linkage (excitatory) | Asymmetrical membrane specialisation, thicker on the post-synaptic side, with the presynaptic side containing round vesicles | Positive connection weights between nodes
Inter-linkage (inhibitory) | Symmetrical membrane specialisation, with ellipsoidal vesicles | Negative connection weights between nodes
49
Biological and Artificial NN’s
Entity | Biological Neural Networks | Artificial Neural Networks
Output | Dendrites bring inputs from different locations: does the brain wait for all the inputs and then start the summing exercise, or does it perform many different intermediate computations? | All inputs arrive instantaneously and are summed up in the same computational cycle: distance (or location) between neuronal nodes is not an issue.
50
Biological and Artificial NN’s
Entity | Biological Neural Networks | Artificial Neural Networks
Output (threshold) | The neurons, being in a noisy environment, tend not to abide by a fixed, discontinuous threshold, and there is a degree of tolerance of the input | The threshold is usually a discontinuous (step) function; after the threshold is ‘crossed’ the amount of input is immaterial
51
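The contrast in this last comparison can be sketched numerically: a hard step threshold against a graded, tolerant sigmoid response (the steepness parameter is an arbitrary assumption):

```python
# Hard step threshold (typical artificial neuron) versus a graded
# sigmoid response closer to a noisy biological neuron.
import math

def step(x, threshold=0.0):
    return 1.0 if x >= threshold else 0.0

def sigmoid(x, steepness=2.0):
    return 1.0 / (1.0 + math.exp(-steepness * x))

# near the threshold the step flips abruptly; the sigmoid changes gradually
for x in (-1.0, -0.1, 0.1, 1.0):
    print(x, step(x), round(sigmoid(x), 3))
```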