ASSIGNMENT 3: HEBBIAN LEARNING
J. Elder
PSYC 6256 Principles of Neural Coding
Outline

The assignment requires you to:
- Download the code and data
- Run the code in its default configuration
- Experiment with altering parameters of the model
- Optional (for bonus marks): modify the model, for example to:
  - Incorporate other forms of synaptic normalization (one minimal variant is sketched below)
  - Incorporate other learning rules (e.g., ICA)
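For the first bonus option, here is a minimal sketch of what an alternative normalization might look like (my illustration, not part of the assignment code): replace the Oja-rule update line in hebb with a plain Hebbian step followed by explicit multiplicative renormalization of each row of w.

%Hypothetical replacement for the Oja-rule update line in hebb
w=w+(1/tauw)*(v*u');                 %plain Hebbian step (norm would otherwise grow)
w=w./repmat(sqrt(sum(w.^2,2)),1,Nu); %rescale each weight vector to unit norm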
Dataset
- I will provide you with 15 natural images of foliage from the McGill Calibrated Image dataset.
- These images have been prefiltered using a DoG (difference-of-Gaussians) model of primate LGN selectivity (Hawken & Parker 1991); a minimal sketch of such a filter appears below.
- Thus the input simulates the feedforward input to V1.
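For intuition about the prefiltering, here is a minimal DoG filtering sketch (illustrative only: sigc and sigs are made-up center/surround scales, not the Hawken & Parker 1991 parameters, and this is not the actual preprocessing code):

%Minimal DoG filtering sketch applied to a natural image im
%(sigc, sigs are hypothetical scales, not the published LGN model values)
sigc=1; sigs=2;                      %center and surround std. dev. (pixels)
sz=2*ceil(3*sigs)+1;                 %kernel size large enough to cover the surround
dog=fspecial('gaussian',sz,sigc)-fspecial('gaussian',sz,sigs); %DoG kernel
lgnim=conv2(double(im),dog,'same');  %simulated LGN output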
Submission Details
You will submit a short lab report on your experiments. For each experiment, the report will include:
- Any code you may have developed
- The parameter values you tested
- The graphs you produced
- The observations you made
- The conclusions you drew
Discussion Questions
In particular, I want your reports to answer at least the following questions (the equations sketched after this list may help frame your answers):
- Why does Hebbian learning yield the dominant eigenvector of the stimulus autocorrelation matrix?
- What is the Oja rule and why is it needed?
- How does adding recurrence create diversity?
- What is the effect of varying the learning time constants?
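For reference, my summary of the relevant equations (following Dayan & Abbott Ch. 8; not part of the slides). Averaging the basic Hebbian rule over the stimulus ensemble, with v = w·u, gives

\tau_w \frac{d\mathbf{w}}{dt} = \langle v\,\mathbf{u} \rangle = Q\,\mathbf{w}, \qquad Q = \langle \mathbf{u}\,\mathbf{u}^{\top} \rangle ,

so w grows fastest along the eigenvector of Q with the largest eigenvalue, but its norm diverges. The Oja rule adds a decay term that bounds the norm:

\tau_w \frac{d\mathbf{w}}{dt} = v\,\mathbf{u} - \alpha v^{2}\,\mathbf{w}, \qquad |\mathbf{w}| \rightarrow 1/\sqrt{\alpha} \quad (\alpha = 1 \text{ in the code}).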
Graphs
- The graphs you produce should be as similar as possible to mine.
- Make sure everything is intelligible!
Due Date
The report is due Wed Apr 13.
Eigenvectors
function lgneig(lgnims,neigs,nit)
%Computes and plots the first neigs eigenimages of LGN inputs to V1
%lgnims = cell array of images representing normalized LGN output
%neigs = number of eigenimages to compute
%nit = number of image patches on which to base estimate
dx=1.5; %pixel size in arcmin. This is arbitrary.
v1rad=round(10/dx); %V1 cell radius (pixels)
Nu=(2*v1rad+1)^2; %Number of input units
nim=length(lgnims);
Q=zeros(Nu);
for i=1:nit
    %Draw a random patch from a random image
    %(patch selection was elided in the transcript; one plausible reconstruction)
    im=lgnims{randi(nim)};
    [ny,nx]=size(im);
    x=randi([v1rad+1,nx-v1rad]);
    y=randi([v1rad+1,ny-v1rad]);
    u=im(y-v1rad:y+v1rad,x-v1rad:x+v1rad);
    u=u(:);
    Q=Q+u*u'; %Accumulate autocorrelation matrix
end
Q=Q/Nu; %normalize (overall scale does not affect the eigenvectors)
[v,d]=eigs(Q,neigs); %compute the neigs largest eigenvectors
%Display each eigenvector as an image patch
%(display code not shown in the transcript; a minimal version)
for j=1:neigs
    subplot(1,neigs,j);
    imagesc(reshape(v(:,j),2*v1rad+1,2*v1rad+1));
    axis image off; colormap gray;
end
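As a usage illustration (the parameter values here are mine, not prescribed by the assignment), assuming the prefiltered images have been loaded into a cell array lgnims:

lgneig(lgnims,6,10000); %plot the first 6 eigenimages, estimated from 10,000 random patches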
Output

[Figure: output of lgneig (eigenimages); not reproduced in the transcript]
Hebbian Learning (Feedforward)
function hebb(lgnims,nv1cells,nit)
%Implements a version of Hebbian learning with the Oja rule, running on
%simulated LGN inputs from natural images.
%lgnims = cell array of images representing normalized LGN output
%nv1cells = number of V1 cells to simulate
%nit = number of learning iterations
dx=1.5; %pixel size in arcmin. This is arbitrary.
v1rad=round(60/dx); %V1 cell radius (pixels)
Nu=(2*v1rad+1)^2; %Number of input units
tauw=1e+6; %learning time constant
nim=length(lgnims);
w=normrnd(0,1/Nu,nv1cells,Nu); %random initial weights
for i=1:nit
    %Draw a random patch from a random image
    %(patch selection was elided in the transcript; one plausible reconstruction)
    im=lgnims{randi(nim)};
    [ny,nx]=size(im);
    x=randi([v1rad+1,nx-v1rad]);
    y=randi([v1rad+1,ny-v1rad]);
    u=im(y-v1rad:y+v1rad,x-v1rad:x+v1rad);
    u=u(:);
    %See Dayan & Abbott Section 8.2
    v=w*u; %Output
    %Update feedforward weights using Hebbian learning with the Oja rule
    w=w+(1/tauw)*(v*u'-repmat(v.^2,1,Nu).*w);
end
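To probe the effect of the learning time constant (one of the discussion questions), a simple experiment is to expose tauw as an input argument instead of hardcoding it. The sketch below assumes that hypothetical modified signature hebb(lgnims,nv1cells,nit,tauw), and that hebb is further modified to return the learned weights:

%Hypothetical: assumes hebb modified to accept tauw and return w
for tauw=[1e5 1e6 1e7]    %illustrative values only, not prescribed
    w=hebb(lgnims,1,1e6,tauw);
    %...compare the learned receptive fields across tauw values...
end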
Output

[Figure: output of hebb (learned weights); not reproduced in the transcript]
Hebbian Learning (With Recurrence)
function hebbfoldiak(lgnims,nv1cells,nit)
%Implements a version of Foldiak's 1989 network, running on simulated LGN
%inputs from natural images. Incorporates feedforward Hebbian learning and
%recurrent inhibitory anti-Hebbian learning.
%lgnims = cell array of images representing normalized LGN output
%nv1cells = number of V1 cells to simulate
%nit = number of learning iterations
dx=1.5; %pixel size in arcmin. This is arbitrary.
v1rad=round(60/dx); %V1 cell radius (pixels)
Nu=(2*v1rad+1)^2; %Number of input units
tauw=1e+6; %feedforward learning time constant
taum=1e+6; %recurrent learning time constant
zdiag=(1-eye(nv1cells)); %All 1s but 0 on the diagonal
w=normrnd(0,1/Nu,nv1cells,Nu); %random initial feedforward weights
m=zeros(nv1cells); %recurrent inhibitory weights, initially zero
nim=length(lgnims);
for i=1:nit
    %Draw a random patch from a random image
    %(patch selection was elided in the transcript; one plausible reconstruction)
    im=lgnims{randi(nim)};
    [ny,nx]=size(im);
    x=randi([v1rad+1,nx-v1rad]);
    y=randi([v1rad+1,ny-v1rad]);
    u=im(y-v1rad:y+v1rad,x-v1rad:x+v1rad);
    u=u(:);
    %See Dayan & Abbott pp 301-302, 309-310 and Foldiak 1989
    k=inv(eye(nv1cells)-m); %effective recurrent gain K=(I-M)^-1
    v=k*w*u; %steady-state output for this input
    %Update feedforward weights using Hebbian learning with the Oja rule
    w=w+(1/tauw)*(v*u'-repmat(v.^2,1,Nu).*w);
    %Update inhibitory recurrent weights using anti-Hebbian learning
    %(kept non-positive, with zero self-connections)
    m=min(0,m+zdiag.*((1/taum)*(-v*v')));
end
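For reference, my compact summary of the dynamics this code implements (following Dayan & Abbott pp 301-302, 309-310 and Foldiak 1989; not part of the slides):

\mathbf{v} = K W \mathbf{u}, \qquad K = (I - M)^{-1},
\tau_w \frac{dW}{dt} = \mathbf{v}\,\mathbf{u}^{\top} - \mathrm{diag}(\mathbf{v}^{2})\,W \quad \text{(Hebbian with Oja normalization)},
\tau_m \frac{dM}{dt} = -\,\mathbf{v}\,\mathbf{v}^{\top} \quad \text{(anti-Hebbian; off-diagonal only, clipped so } M \le 0\text{)}.

The anti-Hebbian term decorrelates the outputs: units that fire together acquire stronger mutual inhibition, pushing them toward different features and creating the diversity asked about above.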
Output

[Figure: output of hebbfoldiak (learned weights); not reproduced in the transcript]