Universität Freiburg
Lehrstuhl für Maschinelles Lernen und natürlichsprachliche Systeme
Machine Learning (SS2011)
Prof. Dr. M. Riedmiller, Dr. Sascha Lange, Manuel Blum
Exercise Sheet 6
Exercise 1: MLP Training
The Java Neural Network Simulator (JavaNNS) is a neural network tool with
a convenient graphical user interface.
Download and install JavaNNS from
http://www.ra.cs.uni-tuebingen.de/software/JavaNNS/ and create a feed-forward
network with two input neurons, one hidden layer with five hidden neurons and one
output neuron. Download the training pattern file from the course website and open
it. Try different learning algorithms with different parameter settings and observe the
results with the Error Graph View and the Projection View.
(a) Use the Backpropagation learning algorithm to train an MLP for the given dataset.
Set dmax = 0 and try different values for the learning rate.
(b) Use Backpropagation with momentum and compare the results.
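Outside of JavaNNS, the experiments in (a) and (b) can be reproduced in a few lines of plain Python. The sketch below is not the JavaNNS implementation; it trains a 2-5-1 sigmoid MLP by online backpropagation with momentum (setting mu = 0 gives plain backpropagation), and it uses XOR as a stand-in for the pattern file from the course website, which is not available here.

```python
import math, random

random.seed(1)

def sigmoid(a):
    return 1.0 / (1.0 + math.exp(-a))

def init_net(n_in=2, n_hid=5, n_out=1):
    rnd = lambda: random.uniform(-0.5, 0.5)
    # one weight row per neuron; the extra column is the bias weight
    w1 = [[rnd() for _ in range(n_in + 1)] for _ in range(n_hid)]
    w2 = [[rnd() for _ in range(n_hid + 1)] for _ in range(n_out)]
    return w1, w2

def forward(x, w1, w2):
    h = [sigmoid(sum(w * v for w, v in zip(row, x + [1.0]))) for row in w1]
    o = [sigmoid(sum(w * v for w, v in zip(row, h + [1.0]))) for row in w2]
    return h, o

def train(patterns, eta=0.5, mu=0.5, epochs=3000):
    """Online backpropagation with momentum (mu = 0 gives plain backprop)."""
    w1, w2 = init_net()
    m1 = [[0.0] * len(r) for r in w1]   # previous weight changes (momentum)
    m2 = [[0.0] * len(r) for r in w2]
    for _ in range(epochs):
        for x, t in patterns:
            h, o = forward(x, w1, w2)
            # deltas: error signal times the sigmoid derivative
            do = [(t[k] - o[k]) * o[k] * (1.0 - o[k]) for k in range(len(o))]
            dh = [h[j] * (1.0 - h[j]) *
                  sum(do[k] * w2[k][j] for k in range(len(o)))
                  for j in range(len(h))]
            for k in range(len(o)):
                for j, v in enumerate(h + [1.0]):
                    m2[k][j] = eta * do[k] * v + mu * m2[k][j]
                    w2[k][j] += m2[k][j]
            for j in range(len(h)):
                for i, v in enumerate(x + [1.0]):
                    m1[j][i] = eta * dh[j] * v + mu * m1[j][i]
                    w1[j][i] += m1[j][i]
    return w1, w2

# XOR stands in for the course's training pattern file (not available here)
patterns = [([0.0, 0.0], [0.0]), ([0.0, 1.0], [1.0]),
            ([1.0, 0.0], [1.0]), ([1.1 - 0.1, 1.0], [0.0])]
w1, w2 = train(patterns)
```

Varying eta and mu here mirrors varying the learning parameters in the JavaNNS control panel: a larger mu smooths the oscillations that a large eta causes in the Error Graph View.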
(c) Compare the performance of the online learning algorithms from (a) and (b) to the
batch mode algorithm Resilient Propagation.
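Resilient Propagation differs from the algorithms in (a) and (b) in that it ignores the magnitude of the partial derivative and adapts a separate step size per weight from the sign of successive gradients. The sketch below shows the update rule only (a simplified variant without weight backtracking), applied to a toy quadratic objective instead of a network's batch error; the parameter values are the common defaults.

```python
def rprop(grad_fn, w, step0=0.1, eta_plus=1.2, eta_minus=0.5,
          step_max=50.0, step_min=1e-6, iters=100):
    """Rprop: per-weight step sizes adapted from gradient signs."""
    step = [step0] * len(w)
    prev = [0.0] * len(w)
    for _ in range(iters):
        grad = grad_fn(w)
        for i in range(len(w)):
            s = grad[i] * prev[i]
            if s > 0:            # same sign as last step: accelerate
                step[i] = min(step[i] * eta_plus, step_max)
            elif s < 0:          # sign flipped: we overshot, slow down
                step[i] = max(step[i] * eta_minus, step_min)
                grad[i] = 0.0    # and skip the update this iteration
            if grad[i] > 0:
                w[i] -= step[i]
            elif grad[i] < 0:
                w[i] += step[i]
        prev = grad
    return w

# toy objective f(w) = (w0 - 3)^2 + (w1 + 1)^2 in place of a batch error;
# its gradient is [2(w0 - 3), 2(w1 + 1)], minimum at (3, -1)
w = rprop(lambda w: [2 * (w[0] - 3), 2 * (w[1] + 1)], [0.0, 0.0])
```

Because only the sign of the gradient is used, the gradient must come from a full pass over the training set, which is why Rprop is a batch-mode algorithm.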
(d) Open the validation pattern set and check the performance of the learned function
during training. Discuss possible regularization techniques and try to use them in
order to avoid overfitting.
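One regularization technique relevant to (d) is early stopping: monitor the error on the validation pattern set during training and stop once it has not improved for some number of epochs. A sketch of the control loop, with a synthetic U-shaped validation curve standing in for a real network and validation set:

```python
def early_stopping(train_epoch, val_error, patience=10, max_epochs=1000):
    """Stop once validation error has not improved for `patience` epochs."""
    best_err, best_epoch, no_improve = float("inf"), -1, 0
    for epoch in range(max_epochs):
        train_epoch()                 # one pass over the training set
        err = val_error()             # error on the validation pattern set
        if err < best_err:
            best_err, best_epoch, no_improve = err, epoch, 0
        else:
            no_improve += 1
            if no_improve >= patience:
                break
    return best_epoch, best_err

# synthetic validation curve: error falls until "epoch" 30, then rises
# again as overfitting sets in (a stand-in for a real validation run)
state = [0]
def fake_train_epoch(): state[0] += 1
def fake_val_error(): return abs(state[0] - 30) / 30.0 + 0.1

epoch, err = early_stopping(fake_train_epoch, fake_val_error)
```

In JavaNNS the same effect is obtained by hand: watch the validation curve in the Error Graph View and stop training when it turns upward while the training error keeps falling.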
Exercise 2: Boosting with Decision Stumps
We subsequently consider the data set specified in Table 1 and apply the AdaBoost
algorithm to train a classifier for the illness problem.
We consider four decision stumps S_N, S_C, S_R, and S_F – one for each of the attributes
– that each classify different instances as positive and negative. For example, the
decision stump belonging to the "coughing" attribute classifies the patterns as
S_C(d_i) = true for i ∈ {1, 2, 6} and S_C(d_i) = false for i ∈ {3, 4, 5}.
(a) Apply T = 4 iterations of the AdaBoost algorithm to the training patterns provided.
Select in each iteration that decision stump that yields the lowest error on the
reweighted pattern distribution.
(b) Verify whether your final classifier H_final correctly classifies all training patterns.
Training   N               C           R                F
Example    (running nose)  (coughing)  (reddened skin)  (fever)   Classification
d1         +               +           +                –         positive (ill)
d2         +               +           –                –         positive (ill)
d3         –               –           +                +         positive (ill)
d4         +               –           –                –         negative (healthy)
d5         –               –           –                –         negative (healthy)
d6         –               +           +                –         negative (healthy)

Table 1: List of training instances.
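The boosting loop of (a) and (b) can be checked with a pure-Python sketch. It assumes the convention that stump S_A outputs positive exactly when attribute A is "+" (consistent with the S_C example above). Note that in the first round all four stumps have weighted error 1/3 on Table 1, so the sequence of selected stumps depends on the tie-breaking rule; here ties are broken by attribute order N, C, R, F.

```python
import math

# Table 1: attributes N, C, R, F per example d1..d6 ("+" -> +1, "-" -> -1);
# labels: positive (ill) = +1, negative (healthy) = -1
X = [
    [+1, +1, +1, -1],  # d1
    [+1, +1, -1, -1],  # d2
    [-1, -1, +1, +1],  # d3
    [+1, -1, -1, -1],  # d4
    [-1, -1, -1, -1],  # d5
    [-1, +1, +1, -1],  # d6
]
y = [+1, +1, +1, -1, -1, -1]
STUMPS = ["S_N", "S_C", "S_R", "S_F"]  # stump j simply outputs attribute j

def adaboost(X, y, T=4):
    """Run T rounds of AdaBoost over the four fixed decision stumps."""
    n = len(y)
    w = [1.0 / n] * n                  # uniform initial distribution
    chosen = []                        # list of (stump index, alpha_t)
    for _ in range(T):
        # weighted error of each stump; ties broken by attribute order
        errs = [sum(w[i] for i in range(n) if X[i][j] != y[i])
                for j in range(4)]
        j = errs.index(min(errs))
        eps = errs[j]
        alpha = 0.5 * math.log((1 - eps) / eps)
        chosen.append((j, alpha))
        # up-weight misclassified examples, then renormalise
        w = [w[i] * math.exp(-alpha * y[i] * X[i][j]) for i in range(n)]
        z = sum(w)
        w = [wi / z for wi in w]
    return chosen

def H_final(x, chosen):
    """Weighted-majority vote of the selected stumps for one example x."""
    score = sum(alpha * x[j] for j, alpha in chosen)
    return 1 if score >= 0 else -1

for j, alpha in adaboost(X, y):
    print(STUMPS[j], round(alpha, 3))
```

For part (b), compare `[H_final(x, chosen) for x in X]` against `y`; whether every training pattern comes out correct depends on which stump is picked at the tied rounds, which is exactly what the exercise asks you to examine.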