How Neurons Communicate
Computing and the Brain
Dept. Computing Science & Mathematics, Spring 2010

The Synapse
• The synapse converts an action potential (AP) into a postsynaptic potential (PSP)
• The presynaptic AP causes calcium (Ca) entry
• Ca causes vesicles of neurotransmitter to be released
• Neurotransmitter binds to postsynaptic receptors (ion channels), causing them to open
• The resulting ionic current generates the PSP

Excitatory Inputs
• A depolarising postsynaptic potential is excitatory (EPSP)
• The neurotransmitter is usually glutamate
• Postsynaptic receptor ion channels allow entry of Na and Ca
• Sufficient coincident EPSPs can cause a neuron to fire an AP
  – Commonly 20 to 100 coincident EPSPs

Excitatory Inputs (2)
• Excitatory inputs are often on the dendrites
• They travel down the dendrites, summating on the way
• The final site of integration (summation) is the cell body (soma)
• A pyramidal cell may have 10,000 to 30,000 excitatory synapses

Inhibitory Inputs
• Inhibitory inputs are on the dendrites or around the soma
• They may be depolarising, hyperpolarising, or make no change to the membrane potential
• The receptor ion channels pass potassium or chloride ions
  – Equilibrium potential is near threshold (just above or below)
• In any case, an IPSP coincident with an EPSP will reduce the amplitude of the EPSP
  – And so reduce the chance that the neuron will fire an AP
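The coincidence requirement above (tens of EPSPs needed to fire, with coincident IPSPs reducing the response) can be sketched as a simple linear-summation toy model. This is an illustrative sketch, not the lecture's own code: the millivolt values for resting potential, threshold, and per-PSP amplitude are invented for demonstration.

```python
# Toy sketch of linear PSP summation at the soma (illustrative values only).
# With ~0.5 mV per EPSP, roughly 20-100 coincident EPSPs are needed to
# reach threshold, consistent with the figures quoted above.

RESTING_MV = -70.0    # assumed resting membrane potential
THRESHOLD_MV = -55.0  # assumed firing threshold
EPSP_MV = 0.5         # assumed depolarisation per EPSP
IPSP_MV = -0.5        # assumed hyperpolarisation per IPSP

def membrane_potential(n_epsps, n_ipsps=0):
    """Linear summation of coincident PSPs at the soma."""
    return RESTING_MV + n_epsps * EPSP_MV + n_ipsps * IPSP_MV

def fires(n_epsps, n_ipsps=0):
    """The neuron fires an AP if the summed potential reaches threshold."""
    return membrane_potential(n_epsps, n_ipsps) >= THRESHOLD_MV

# 30 coincident EPSPs reach threshold...
print(fires(30))     # True
# ...but a few coincident IPSPs pull the sum back below threshold
print(fires(30, 5))  # False
```

Real dendritic summation is nonlinear and location-dependent, as the next slides note; the linear model is just the simplest way to see the coincidence effect.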
The Summing Neuron
• The conductance, G, and hence the amplitude of a PSP, can be different at each synapse: equivalent to a connection weight
• Excitatory inputs form a sum of positive weights: ΣwE
• Hyperpolarising inhibitory inputs act like negative weights
  – They subtract from the sum of EPSPs: ΣwE - ΣwI
• IPSPs that depolarise or do not change Vm are divisive
  – They scale down the summed EPSPs by dividing the sum: ΣwE / ΣwI
• The output is an AP, or a rate of firing

Local Logic in the Dendrites
• EPSPs and IPSPs can interact locally in the dendrites

Plasticity in the Nervous System
• Many parts of the nervous system change over time
  – Development: over days to years
  – Changes in response to experience: seconds to years

Learning in the Nervous System
• ANNs "learn" by adapting the connection weights
  – Different learning rules
• Synapses change their strength in response to neural activity
  – Long-term potentiation (LTP) and long-term depression (LTD)
• There is indirect evidence that this corresponds to "learning"
  – Associative or "Hebbian" learning (as in the Hopfield net)

Associative Learning
• What learning rules does the brain use?
• Associative learning:
  – Increase synaptic strength when both the pre- and postsynaptic neurons are active (LTP; predicted by Donald Hebb (Hebb, 1949))
  – Decrease synaptic strength when the pre- or postsynaptic neuron is active and the other is silent (LTD)
• This is like the Hopfield network
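The associative rule stated in the final slide (strengthen on coincident activity, weaken when only one side is active) can be sketched as a single weight update. A minimal sketch, assuming binary pre/postsynaptic activity and an invented learning rate; it illustrates the LTP/LTD logic, not the underlying biophysics.

```python
def hebbian_update(w, pre, post, lr=0.1):
    """Associative ("Hebbian") rule from the slide, with an assumed
    learning rate lr and binary activity flags.

    LTP: strengthen when pre and post are both active.
    LTD: weaken when exactly one of them is active.
    """
    if pre and post:
        return w + lr  # LTP: coincident activity
    if pre != post:
        return w - lr  # LTD: one active, the other silent
    return w           # both silent: no change

w = 0.5
w = hebbian_update(w, pre=True, post=True)    # LTP: 0.5 -> 0.6
w = hebbian_update(w, pre=True, post=False)   # LTD: 0.6 -> 0.5
w = hebbian_update(w, pre=False, post=False)  # no change
print(round(w, 2))  # 0.5
```

A Hopfield network uses essentially this rule, applied symmetrically over all pairs of units, to store patterns in its weight matrix.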