CSC 325 / PHI 394 Spring 2011 - Homework
Experimenting with Neural Nets
Numbers marked with  indicate material you need to write up.
BEFORE YOU BEGIN
Download the Java Neural Net Simulator (JavaNNS) from:
http://www.ra.cs.uni-tuebingen.de/software/JavaNNS/welcome_e.html
For Windows, download and extract the contents of the JavaNNS-Win.zip file.
To run the program, double click on the JavaNNS.jar file.
Note: remember, in this program neurons are called “units”.
Practice 1: Training a pre-built neural net to recognize Even/Odd patterns (“Parity”)
1. Download the network (.net) and pattern (.pat) files for this assignment from the class website.
2. Double-click on JavaNNS.jar file to launch the neural network simulator.
3. Pull down the File menu, and select Open…
Navigate to the folder NKU_examples.
Choose the file Network_4_12_1.net (from the course website).
4. A neural network should appear in a window. Resize the window to see all of it.
5. Pull down the View menu, and select Weights.
6. Pull down the View menu again, and select Error Graph.
7. Pull down the Tools menu, and select Control Panel.
8. Drag and resize these windows to make sure you can see them well.
Resize the enclosing window if necessary.
9. Click on the bottom right [>] button on the error graph several times until the last number
on the horizontal axis is 10,000. You might want to stretch the window out to make it wider.
Also, click on the second button in the lower left corner to turn on the grid.
10. Click on the [V] button on the upper left of the error graph several times until the last
number on the vertical axis is 5.
11. Pull down the File menu, and select Open…
Choose the file Parity4.pat (which you downloaded from the course website).
12. On the Control Panel, choose the Initializing tab, and click the Init button. This will set the weights
to their initial values. By default, the weights will be random, between -1 and +1. You can see them
displayed, color coded, in the “Linkweights” window. (The square in column 3, row 15 indicates the
weight from neuron 3 to neuron 15, for example.)
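Under the hood, this default initialization just draws every weight independently from a uniform
distribution. A minimal Python sketch of the idea (numpy assumed; the 4-12-1 shapes match this
exercise, and the variable names are illustrative, not JavaNNS's):

    import numpy as np

    rng = np.random.default_rng()
    # One weight matrix per block of connections: 4 inputs -> 12 hidden -> 1 output.
    W_hidden = rng.uniform(-1.0, 1.0, size=(4, 12))
    W_output = rng.uniform(-1.0, 1.0, size=(12, 1))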
13. See how the neural network classifies patterns before it has learned anything. Go to the Updating
tab of the Control Panel. Click through the 16 input patterns, and see how the neural network responds.
Note that the output neuron will never produce a 0 or a 1 as an output. Why?
Let’s say that if the neuron produces an output > 0.9 when it should say 1, and < 0.1 when it should say
0, that is good enough.
If the neural net understood even vs. odd, the following should happen:
When the number of green inputs is even, the output neuron should be blue.
When the number of green inputs is odd, the output neuron should be green.
But it doesn’t understand this yet! Its brain has just random connections. So we must train it!
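Two things in this step can be made concrete in a few lines of Python. The sketch below assumes the
units use the standard logistic activation (the SNNS/JavaNNS default); it shows why the output can
approach but never reach 0 or 1, and lists the 16 parity targets the net is being asked to learn:

    import itertools
    import numpy as np

    def logistic(x):
        # Squashes any finite input into the open interval (0, 1):
        # an output of exactly 0 or 1 would need an infinite net input.
        return 1.0 / (1.0 + np.exp(-x))

    print(logistic(10.0), logistic(-10.0))   # ~0.99995 and ~0.00005 -- close, never exact

    # The 16 four-bit input patterns and their parity targets
    # (1 if the count of 1-inputs is odd, 0 if it is even).
    for bits in itertools.product([0, 1], repeat=4):
        print(bits, sum(bits) % 2)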
14. Go to the Learning tab. Set Cycles to 10000 (this is the number of passes through the training
set to use). Then click Learn All. Watch the error curve go down (we hope!). If the learning curve
doesn’t go down to (almost) zero, try it again:
Hit Init to reset all the synaptic weights to random values.
Hit Learn All and watch the learning curve. Hopefully it goes down!
(You can hit the clear button in lower left corner of the Error graph window to erase older graphs.)
Do this ten times. Record approximately how many passes (“cycles”) it took before it learned, in a table
like this example:
Pass      Number of cycles (write “>10K” if it did not get near 0 before reaching 10,000 cycles)
1         9500
2         6500
3         >10K
etc.
Just do this visually by watching the error graph curves. There is no point in having precise numbers.
Why is there such a range of different learning times?
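You can reproduce this variability outside the simulator. The following Python sketch (numpy
assumed) trains the same 4-12-1 architecture with plain batch backpropagation; it is a minimal
stand-in for, not a copy of, JavaNNS's learning routine, so the exact cycle counts will differ,
but the spread across random seeds is the phenomenon in question:

    import itertools
    import numpy as np

    # All 16 four-bit patterns and their parity targets.
    X = np.array(list(itertools.product([0, 1], repeat=4)), dtype=float)
    y = (X.sum(axis=1) % 2).reshape(-1, 1)

    def logistic(x):
        return 1.0 / (1.0 + np.exp(-x))

    def train(seed, hidden=12, eta=0.5, max_cycles=10_000):
        rng = np.random.default_rng(seed)
        W1 = rng.uniform(-1, 1, (4, hidden))
        b1 = rng.uniform(-1, 1, hidden)
        W2 = rng.uniform(-1, 1, (hidden, 1))
        b2 = rng.uniform(-1, 1, 1)
        for cycle in range(1, max_cycles + 1):
            h = logistic(X @ W1 + b1)        # hidden activations
            out = logistic(h @ W2 + b2)      # output activation
            err = out - y
            if np.all(np.abs(err) < 0.1):    # the ">0.9 / <0.1" criterion from step 13
                return cycle, (W1, b1, W2, b2)
            # Backpropagate the squared error, one whole-set pass per cycle.
            d_out = err * out * (1 - out)
            d_hid = (d_out @ W2.T) * h * (1 - h)
            W2 -= eta * (h.T @ d_out)
            b2 -= eta * d_out.sum(axis=0)
            W1 -= eta * (X.T @ d_hid)
            b1 -= eta * d_hid.sum(axis=0)
        return None, (W1, b1, W2, b2)        # ">10K": never converged

    for seed in range(10):
        cycles, _ = train(seed)
        print(f"seed {seed}: {cycles if cycles else '>10K'}")

Each seed starts gradient descent from a different random point on the error surface, so some
runs find a good valley quickly while others wander or stall.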
15. After doing a run where the net successfully learned, walk through the training set (as in step 13)
and confirm that it is now getting the answers right. Watch the hidden layers. For how many of the 16
input patterns are the hidden neurons exclusively high or low (>0.9 or <0.1)? Can you make other
observations about how input patterns are represented in the hidden layer?
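If you want to check saturation programmatically, here is a short continuation of the sketch from
step 14; the names reused here (train, logistic, X) come from that sketch, not from JavaNNS:

    # Take the weights from one successful run (pick a seed that converged for you).
    cycles, (W1, b1, W2, b2) = train(seed=0)
    h = logistic(X @ W1 + b1)                          # one row of hidden activations per pattern
    saturated = np.all((h > 0.9) | (h < 0.1), axis=1)
    print(int(saturated.sum()), "of 16 patterns drive every hidden unit past 0.9 or below 0.1")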
16. Time for brain damage! Making sure you have a network that has learned parity, go to the
network widow and right-click on the first (top) hidden neuron, and select delete. Now walk through
the training set again. How many of the 16 patterns does it correctly classify now? When it has an
answer wrong, is it slightly wrong (for example, saying 0.8 when it should be >0.9) or totally misclassified?
Now delete another hidden neuron, and answer the same question.
Now delete a third hidden neuron, and answer the same question.
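In weight-matrix terms, deleting a hidden unit is equivalent to zeroing its outgoing weights, so
it can no longer influence the output neuron. A hedged continuation of the earlier sketch, if you
want to lesion programmatically (lesion and the reused names are from the sketch, not JavaNNS):

    def lesion(W2, unit):
        # "Deleting" hidden unit `unit` amounts to cutting its outgoing weights.
        W2 = W2.copy()
        W2[unit, :] = 0.0
        return W2

    W2_damaged = lesion(W2, unit=0)
    out = logistic(logistic(X @ W1 + b1) @ W2_damaged + b2)
    correct = np.abs(out - y) < 0.1
    print(int(correct.sum()), "of 16 patterns still classified correctly")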
17. At this point, our little brain has suffered a massive stroke: it has lost 3 of its 12 hidden neurons.
Can it recover from this trauma with a little therapy? Let’s see if it can learn the problem with its smaller
brain. Retrain it: Hit Learn All, and watch the learning curve. Does it learn?
18. Free experimentation: What’s the smallest brain that can learn this even/odd problem? Write up
your findings carefully and clearly. Note that you will need to do many experiments, as there is
a lot of random variation here; one way to organize them is sketched below.
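Because of that random variation, a defensible answer needs several trials per brain size. One
way to organize the experiment, reusing the train() function from the step 14 sketch (the
ten-restart count and size range are arbitrary choices, not part of the assignment):

    # Sweep the hidden-layer size, with ten random restarts per size.
    for hidden in range(1, 7):
        outcomes = [train(seed, hidden=hidden)[0] for seed in range(10)]
        successes = sum(c is not None for c in outcomes)
        print(f"{hidden:2d} hidden units: {successes}/10 runs learned parity")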
____________________________________________________________________________________
Practice 2: Building your own network
Now you will build your own network to solve the (harder) 5-parity problem.
1. From the File menu, select New. You do not need to save anything.
2. Pull down the Tools menu, and select Create > Layers….
3. Build the input layer by selecting a height of 5 neurons of type Input. Then hit Create.
4. Build a hidden layer by selecting a height of 12 neurons of type Hidden, with top left position 4.
Then hit Create.
5. Build another hidden layer of 12 hidden neurons, as you did above, with top left position 7.
6. Finally, build the output layer, selecting a height of 1 neuron of type Output, with top left position 10.
7. Those are the neurons; now we need to build the connections. In the Tools menu, select
Create > Connections…. In the box that pops up, select “Connect feed-forward”, then hit the
Connect button. Your network should now show feed-forward connections from each layer to the next.
8. Save this to a file. A good name for it would be Network_5_12_12_1.net
9. Now you can train this network on the 5-parity problem (opening parity5.pat) as you did in the first
exercise. This is a harder problem, so give it 100,000 cycles to learn it. Also, change the horizontal scale
on the Error graph to 0…100,000 and the vertical scale to 0…10, so you can see the learning curves
clearly.
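For reference, “Connect feed-forward” creates full connections between consecutive layers only.
In matrix terms, the network you just built looks like the following sketch (numpy shapes standing
in for JavaNNS's units and links; biases omitted for brevity):

    import numpy as np

    rng = np.random.default_rng()
    # 5 inputs -> 12 hidden -> 12 hidden -> 1 output, each layer fully
    # connected to the next and to nothing else.
    W1 = rng.uniform(-1, 1, (5, 12))
    W2 = rng.uniform(-1, 1, (12, 12))
    W3 = rng.uniform(-1, 1, (12, 1))

    def logistic(x):
        return 1.0 / (1.0 + np.exp(-x))

    def forward(x):
        # One pass through the 5-12-12-1 network.
        return logistic(logistic(logistic(x @ W1) @ W2) @ W3)

    print(forward(np.array([1.0, 0.0, 1.0, 1.0, 0.0])))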
Replicate the table you did for #14 in Practice 1. After experimenting with “Backpropagation” (on the
Learning tab), try out “Backprop-momentum”, experimenting with its parameters to try to get it to
learn; the update rule is sketched below.
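Backprop-momentum differs from plain backpropagation only in its weight update: each step carries
along a fraction of the previous step, which helps the search coast through flat regions of the
error surface and damps oscillation. A sketch of the update rule (eta and mu mirror the
learning-rate and momentum fields on the Learning tab; the values below are illustrative, not
prescriptions):

    import numpy as np

    def momentum_step(W, grad, velocity, eta=0.2, mu=0.9):
        # Plain backpropagation would update with:  W - eta * grad.
        # Momentum keeps a running "velocity" so each step also carries
        # a fraction mu of the previous step along with it.
        velocity = mu * velocity - eta * grad
        return W + velocity, velocity

With mu = 0 this reduces to plain backpropagation, which is a handy sanity check while you
experiment with the parameters.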
Congratulations, you are doing neural smithing!
Write up your experimental results and any conclusions you have reached.