Capacity Analysis of Attractor Neural Networks with Binary Neurons and Discrete Synapses

The University of Chicago
Department of Statistics
PhD Dissertation Presentation
YIBI HUANG
Department of Statistics
The University of Chicago
Capacity Analysis of Attractor Neural Networks with Binary
Neurons and Discrete Synapses
TUESDAY, August 3, 2010, at 2:00 PM
110 Eckhart Hall, 5734 S. University Avenue
ABSTRACT
Inspired by the delay activity observed in numerous delayed match-to-sample (DMS)
experiments, the attractor states of neural network dynamics are considered to be the
underlying mechanism of memory storage in neural networks. For the simplest network,
with binary neurons and standard asynchronous dynamics, we show that the dynamics
cannot be stable if all synapses are excitatory. With the introduction of inhibition, the
dynamics can be stabilized, especially when the coding levels of the patterns vary. The
activity of the neurons changes the efficacy of the synapses and hence alters the attractor
states: new memories are thus created and old memories are gradually erased. A
computationally efficient and highly accurate approximation to the retrieval probability
of a learned pattern is first developed for binary synapse models and then extended to all
finite-state synapse models. With the retrieval probabilities, the memory capacity of any
local learning rule can be evaluated. The method is applied to the sequential models
(Fusi and Abbott, 2007) and the metaplasticity models (Fusi et al., 2005; Leibold and
Kempter, 2008). We show that as the number of synaptic states increases, the capacity,
as defined here, either plateaus or decreases; in the few cases where multi-state models
exceed the capacity of binary synapse models, the improvement is small. The method is
verified in the case of binary synapse models, where the approximation is extremely
close to the retrieval probabilities estimated from simulations.
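
To make the setting concrete, the following minimal Python sketch (an illustration, not the dissertation's actual models or method) puts the pieces together: binary {0, 1} neurons updated one at a time under standard asynchronous dynamics, binary synapses trained sequentially by a toy stochastic local rule so that new patterns gradually overwrite old ones, a global inhibition term standing in for detailed inhibitory circuitry, and a Monte Carlo estimate of the retrieval probability of the most recently stored pattern. Every name and parameter value here (N, f, P, q_pot, q_dep, theta, the 5% cue noise) is an illustrative assumption.

import numpy as np

rng = np.random.default_rng(0)

# All parameter values below are illustrative assumptions.
N = 200       # number of binary neurons
f = 0.1       # coding level: fraction of active neurons per pattern
P = 30        # number of patterns stored sequentially
q_pot = 0.5   # potentiation probability when pre and post are both active
q_dep = 0.05  # depression probability when pre is active and post is silent

def store_patterns():
    """Sequentially imprint random sparse patterns on binary synapses
    J[j, i] in {0, 1} (pre-synaptic j -> post-synaptic i) with a toy
    stochastic local rule; later patterns gradually overwrite earlier
    ones, so old memories are erased as new ones are created."""
    patterns = (rng.random((P, N)) < f).astype(int)
    J = np.zeros((N, N), dtype=int)
    for xi in patterns:
        both_active = np.outer(xi, xi) == 1      # candidates for potentiation
        pre_only = np.outer(xi, 1 - xi) == 1     # candidates for depression
        J[both_active & (rng.random((N, N)) < q_pot)] = 1
        J[pre_only & (rng.random((N, N)) < q_dep)] = 0
    np.fill_diagonal(J, 0)                       # no self-connections
    return patterns, J

def run_async(J, s, theta=0.3, sweeps=20):
    """Standard asynchronous dynamics: update one randomly chosen neuron
    at a time.  The global inhibition theta * (total activity) is what
    keeps a purely excitatory network from saturating."""
    for _ in range(sweeps * N):
        i = rng.integers(N)
        h = J[:, i] @ s - theta * s.sum()        # excitation minus inhibition
        s[i] = 1 if h > 0 else 0
    return s

def retrieval_probability(trials=50):
    """Monte Carlo estimate of the probability that the most recently
    stored pattern is recovered from a noisy cue."""
    hits = 0
    for _ in range(trials):
        patterns, J = store_patterns()
        target = patterns[-1]
        cue = (target ^ (rng.random(N) < 0.05)).astype(int)  # flip ~5% of bits
        s = run_async(J, cue)
        hits += (s == target).mean() > 0.95      # count near-perfect recall
    return hits / trials

print(f"estimated retrieval probability: {retrieval_probability():.2f}")

Setting theta = 0 leaves only excitatory input, and the dynamics then typically saturate at the all-active state, which illustrates the abstract's point that inhibition is needed for stability.
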
Information about building access for persons with disabilities may be obtained in advance by calling Sandra Romero
at 773.702.0541 or by email ([email protected]).