The Extended BAM Neural Network Model

National Taiwan Ocean University
Department of Communications, Navigation and Control Engineering
Speaker: 游佳龍   ID: 19967034   Date: 11/24/2010

Outline
• Abstract
• Introduction
• The Extended BAM Neural Network Model
• Proof of the New Model's Stability
• Experiment Results

Abstract
• In this paper we propose an extended bidirectional associative memory (BAM) neural network model that can perform both auto-associative and hetero-associative memory. A theoretical proof of this model's stability is given. Experiments show that this model is much more powerful than the M-P Model, the Discrete Hopfield Neural Network, the Continuous Hopfield Neural Network, the Discrete Bidirectional Associative Memory Neural Network, the Continuous and Adaptive Bidirectional Associative Memory Neural Network, the Back-Propagation Neural Network, and the Optimally Designed Nonlinear Continuous Neural Network. Experimental results also show that, for auto-associative memory, the power of this model matches that of the Loop Neural Network Model, which can only perform auto-associative memory.

Introduction
• Associative memory is an important part of neural network theory and an efficient function in applications such as intelligent control, pattern recognition, and artificial intelligence.
• Many neural network models that can perform associative memory already exist, such as the Loop model, the M-P model, the Discrete and Continuous Hopfield models, Kosko's Discrete BAM, and the Optimally Designed Nonlinear Continuous Neural Network. Each model has its own advantages and disadvantages.
• In practical applications, the more powerful the network is, the better the associative memory results are, so one important task is to find or construct a powerful associative neural network. A neural network model has two aspects: its structure and its training algorithm.
• In this paper we propose an extended bidirectional associative memory (BAM) neural network model. We call the new model an extended BAM because its structure is the same as that of the BAM model; the difference between the BAM and the extended BAM lies in the training algorithm.

The Extended BAM Neural Network Model
• This section introduces the architecture and learning algorithm of the extended BAM. The model can be used to carry out both auto-associative and hetero-associative memory.
• The BAM model (Kosko model) is a memory consisting of two layers. It uses forward and backward information flow to produce an associative search for stored stimulus-response associations.
• The firing function of the neurons in both layers is the bipolar threshold (signum) function.
• The stored association pairs are (X_k, Y_k), k = 1, …, N, with bipolar (+1/−1) components.
• In Kosko's BAM the weight matrix is obtained by the outer-product rule, M = Σ_k Y_k X_k^T.
• For our extended BAM model, the learning algorithm is the delta learning rule:

  learning signal:  r = [d_j − f(W_j^T X)] f′(W_j^T X)
  error:            E = (1/2) Σ_j [d_j − f(W_j^T X)]²
  gradient:         ∂E/∂W_j = −(d_j − y_j) f′(W_j^T X) X,   ∂E/∂w_ji = −(d_j − y_j) f′(net_j) x_i
  weight update:    ΔW_j = η · r[W_j(t), X(t), d_j(t)] · X(t)

• During training we treat this two-layer network as a feed-forward neural network, and the activation function of the output-layer neurons is the sigmoid function.
• After training is finished, we use the bipolar threshold activation function in both layers to perform associative memory.
• By this training method the forward connection weight matrix M is obtained; we use M as the backward connection weight matrix as well. A minimal training and recall sketch follows below.
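To make the training and recall procedure concrete, here is a minimal NumPy sketch. It is an illustrative reconstruction, not the authors' implementation: the learning rate eta, the epoch count, the mapping of bipolar targets to {0, 1} for the sigmoid outputs, and the use of M's transpose for the backward pass are all assumptions made for the example.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_extended_bam(X_pairs, Y_pairs, eta=0.5, epochs=2000):
    """Learn the forward weight matrix M with the delta rule (sigmoid outputs).

    X_pairs, Y_pairs: lists of 1-D NumPy arrays with +1/-1 entries.
    """
    n, p = len(X_pairs[0]), len(Y_pairs[0])
    M = np.zeros((p, n))                        # forward weights: Y-layer x X-layer
    targets = [(t + 1) / 2.0 for t in Y_pairs]  # bipolar targets mapped to {0, 1}
    for _ in range(epochs):
        for x, d in zip(X_pairs, targets):
            y = sigmoid(M @ x)
            # delta rule: dW_j = eta * (d_j - y_j) * f'(net_j) * x
            M += eta * np.outer((d - y) * y * (1.0 - y), x)
    return M

def bipolar_threshold(net, prev):
    """Hard limiter used during recall; a neuron keeps its previous state when net == 0."""
    return np.where(net > 0, 1.0, np.where(net < 0, -1.0, prev))

def recall(M, x0, steps=20):
    """Bidirectional recall: forward through M, backward through M.T (an assumption)."""
    x = x0.astype(float)
    y = np.ones(M.shape[0])                     # arbitrary initial state of the Y layer
    for _ in range(steps):
        y_new = bipolar_threshold(M @ x, y)
        x_new = bipolar_threshold(M.T @ y_new, x)
        if np.array_equal(x_new, x) and np.array_equal(y_new, y):
            break                               # the pair has reached a fixed point
        x, y = x_new, y_new
    return x, y
```

For an auto-associative test the stored pairs would simply have Y_k = X_k; for hetero-association the two pattern sets differ.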
Proof of the New Model's Stability
• We can define the energy function as E = −(1/2) Y^T M X − (1/2) X^T M^T Y. Since Y^T M X is a scalar and therefore equals X^T M^T Y, we get the equivalent form E = −Y^T M X.
• The energy change due to the state change of a single Y-layer neuron y_j is ΔE = −Δy_j · Σ_i m_ji x_i.
• By the BAM theorem, Δy_j has only three values, namely −2, 0, and 2. If Δy_j = 2, the neuron has switched from −1 to +1, which happens only when Σ_i m_ji x_i > 0, so ΔE < 0. If Δy_j = −2, the switch from +1 to −1 happens only when Σ_i m_ji x_i < 0, so again ΔE < 0. Δy_j = 0 is the situation of zero change in state, and we do not consider this case. The energy change due to a state change of an X-layer neuron is analysed in the same way. Hence ΔE < 0 along discrete trajectories, as claimed.
• Since E is bounded below, the associative memory of the extended BAM converges to some stable point; that is, the network is stable.

Experiment Results
• The experimental results show that the new model is much more powerful than the other models at carrying out associative memory.
• In our experiment the network consists of 8 processing units (neurons) in each layer, and a fixed set of vector pairs is stored. The experiments are carried out in four cases.
• Using the same method as above, we obtain the corresponding results for each case.
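To tie the stability proof and the experiment setup together, here is a small, purely illustrative demo built on the sketch earlier in this document (it reuses train_extended_bam and bipolar_threshold). The stored vector pairs used in the paper are not reproduced on the slides, so randomly generated bipolar patterns (8 units per layer, four pairs) stand in for them; the noisy probe, the random seed, and the step limit are likewise assumptions. The energy E = −Y^T M X is recorded after every half-step of recall and, per the proof above, should never increase.

```python
import numpy as np

def energy(M, x, y):
    """BAM energy E = -Y^T M X from the stability proof."""
    return -float(y @ (M @ x))

# Placeholder patterns: 8 neurons per layer as in the experiment, but NOT the
# paper's stored vectors, which are not given on the slides.
rng = np.random.default_rng(0)
X_pairs = [rng.choice([-1.0, 1.0], size=8) for _ in range(4)]
Y_pairs = [rng.choice([-1.0, 1.0], size=8) for _ in range(4)]

M = train_extended_bam(X_pairs, Y_pairs)

# Recall from a noisy probe while tracking the energy after each half-step.
probe = X_pairs[0].copy()
probe[0] *= -1                                  # flip one component
x, y = probe, np.ones(8)
trace = [energy(M, x, y)]
for _ in range(20):
    y_new = bipolar_threshold(M @ x, y)         # forward half-step
    trace.append(energy(M, x, y_new))
    x_new = bipolar_threshold(M.T @ y_new, x)   # backward half-step
    trace.append(energy(M, x_new, y_new))
    if np.array_equal(x_new, x) and np.array_equal(y_new, y):
        break                                   # state has stopped changing
    x, y = x_new, y_new

print("recalled Y-layer state:", y)
print("energy trace non-increasing:", all(a >= b for a, b in zip(trace, trace[1:])))
```

The monotone energy trace depends only on the bipolar threshold recall dynamics with M forward and M^T backward, not on how M was trained, which is why the proof carries over from Kosko's BAM to the extended BAM.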