Statistical Pattern Recognition Final
2-Hour Exam
Fall 2022
Shiraz University
School of ECE
Dr Z. Azimifar
Question 1. (30% of marks) Determine whether each of the following propositions is True or False:
1. EM algorithms are attractive because for problems such as estimating Gaussian Mixture
Models, they are guaranteed to find a global optimum in likelihood.
2. The EM algorithm does a kind of “gradient descent” in likelihood since both steps are
guaranteed to decrease the negative log-likelihood.
3. You have a 2-D training data set X of 100 instances, in which each feature has 8 possible
values, and a binary label y = ±1. You are asked to learn a Naive Bayes binary classification
model for predicting the label y. You have also found another data set T of 100 instances that
are missing the binary labels y. You want to use an EM algorithm to learn a better
semi-supervised model by incorporating the unlabelled instances and treating the unobserved
labels as latent variables Z. Answer the following questions:
• 3.1 The quantity $w_j = \Pr(Z_j = 1 \mid X_j = x_j)$ for an unlabelled instance $x_j \in T$ is a
parameter of this EM model.
• 3.2 The smallest number of parameters needed to specify a model for this
classification using the EM algorithm is 115.
4. Short Answer: How many parameters are needed to specify a Gaussian Mixture Model
with 4 clusters, data of dimension 5, and diagonal covariances?
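(A parameter-counting sketch, not part of the original exam: under the common convention that
a diagonal-covariance GMM in $d$ dimensions with $K$ components has $Kd$ means, $Kd$ variances,
and $K - 1$ free mixing weights, the count works out as below; it is 44 if all $K$ mixing
weights are counted separately.)
$$Kd + Kd + (K - 1) = 2Kd + K - 1 = 2 \cdot 4 \cdot 5 + 4 - 1 = 43.$$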
Question 2. (30% of marks)
For each of the following clusterings of 2-dimensional points into two clusters (red points for
cluster 1, blue points for cluster 2), write down whether K-means, Gaussian Mixture Models, or
neither of them could have led to the cluster assignments shown.
[The cluster-assignment figures are not reproduced in this transcript.]
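As an aid for reasoning about such assignments, here is a minimal sketch (not from the exam;
it assumes scikit-learn is available and uses made-up data) contrasting the hard
nearest-centroid assignments of K-means with the assignments a full-covariance GMM can produce:

import numpy as np
from sklearn.cluster import KMeans
from sklearn.mixture import GaussianMixture

# Synthetic data: one elongated cloud and one compact cloud (example values).
rng = np.random.default_rng(0)
a = rng.normal(loc=[0.0, 0.0], scale=[3.0, 0.3], size=(200, 2))
b = rng.normal(loc=[0.0, 2.0], scale=[0.3, 0.3], size=(200, 2))
X = np.vstack([a, b])

# K-means assigns each point to the nearest centroid (linear boundaries only),
# while a full-covariance GMM can follow the elongated cluster's shape.
km_labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
gm_labels = GaussianMixture(n_components=2, random_state=0).fit(X).predict(X)
print(km_labels[:10], gm_labels[:10])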
Question 3. (40% of marks)
A game house has K regular players. On each day t, one of the K players comes in and plays $m_t$
rounds of game G for that day and wins $w_t$ of those $m_t$ rounds. The game house agrees to
share with you only the data of how many rounds of game G were played on each day and how many
of those rounds were won by the player, and the fact that there are K players playing the game.
You, however, do not know which of the K players played on which day.
You decide to use a probabilistic model, specifically a mixture model, on this data. For each
player k, you model the probability that he wins any given round by a parameter $p_k$ between
0 and 1. That is, if player k plays a round, the probability that he wins the round is $p_k$.
Hence, on day t, if the kth player played $m_t$ rounds, then the probability that he won $w_t$
of those $m_t$ rounds is given by the binomial distribution
$$\binom{m_t}{w_t} \, p_k^{w_t} (1 - p_k)^{m_t - w_t}.$$
Now, with this model, you shall use a mixture of K binomials with parameters $p_1, \dots, p_K$
to model the data for n days, given by $(m_1, w_1), \dots, (m_n, w_n)$. That is, on day t, we
first pick one player out of the K at random according to the distribution $\pi$, as
$c_t \sim \pi$. Next, the player for day t plays $m_t$ rounds and, given that the player is
$c_t$, the number of wins $w_t$ out of the $m_t$ rounds is given by the binomial distribution
$$\binom{m_t}{w_t} \, p_{c_t}^{w_t} (1 - p_{c_t})^{m_t - w_t}.$$
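To make the generative process concrete, here is a minimal simulation sketch of the model just
described (the parameter values are arbitrary examples, not from the exam):

import numpy as np

rng = np.random.default_rng(0)
K, n = 3, 10                     # K players, n days (example values)
pi = np.array([0.5, 0.3, 0.2])   # mixing distribution over players
p = np.array([0.2, 0.5, 0.8])    # per-round win probabilities p_k

c = rng.choice(K, size=n, p=pi)  # player on day t: c_t ~ pi
m = rng.integers(5, 20, size=n)  # rounds played on day t: m_t
w = rng.binomial(m, p[c])        # wins on day t: w_t ~ Binomial(m_t, p_{c_t})
print(list(zip(m, w)))           # the (m_t, w_t) pairs the game house shares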
Derive the EM algorithm for this problem. Specifically:
3.1. Write down the E-step update for the Q's. That is, write down what $Q_t^{(i)}(k)$ is for
any given iteration i (in terms of the parameters from the previous iteration).
3.2. For any mixture model, the M-step for $\pi$ on iteration i is given by
$$\pi^{(i)}(k) = \frac{\sum_{t=1}^{n} Q_t^{(i)}(k)}{n}.$$
Derive the M-step update for $p_1^{(i)}, \dots, p_K^{(i)}$, the K model parameters on iteration
i, in terms of the data and the $Q_t^{(i)}$'s. First write down the maximization problem for
the M-step, and then solve for $p_1^{(i)}, \dots, p_K^{(i)}$, showing that they are the maxima
of the optimization problem.
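For reference, a minimal implementation sketch of one standard set of resulting updates
(offered as a cross-check, not as the official answer key): the E-step responsibilities drop
the binomial coefficient, since it is constant in k and cancels when normalizing, and the
M-step sets $p_k^{(i)} = \sum_t Q_t^{(i)}(k) w_t / \sum_t Q_t^{(i)}(k) m_t$. The function name
em_binomial_mixture is a hypothetical helper.

import numpy as np

def em_binomial_mixture(m, w, K, iters=200, seed=0):
    # EM sketch for a mixture of K binomials on daily (m_t, w_t) data.
    rng = np.random.default_rng(seed)
    m, w = np.asarray(m, float), np.asarray(w, float)
    pi = np.full(K, 1.0 / K)            # uniform initial mixing weights
    p = rng.uniform(0.2, 0.8, size=K)   # random initial win probabilities
    for _ in range(iters):
        # E-step: Q[t, k] proportional to pi(k) * p_k^{w_t} * (1 - p_k)^{m_t - w_t},
        # computed in log space for numerical stability.
        logq = (np.log(pi)
                + w[:, None] * np.log(p)
                + (m - w)[:, None] * np.log1p(-p))
        logq -= logq.max(axis=1, keepdims=True)
        Q = np.exp(logq)
        Q /= Q.sum(axis=1, keepdims=True)
        # M-step: pi(k) is the mean responsibility; p_k is the
        # responsibility-weighted win rate, clipped away from 0 and 1.
        pi = Q.mean(axis=0)
        p = np.clip((Q * w[:, None]).sum(axis=0) /
                    (Q * m[:, None]).sum(axis=0), 1e-9, 1 - 1e-9)
    return pi, p

On data simulated as in the earlier sketch, the recovered $p_k$'s should approximate the true
per-player win probabilities up to a relabelling of the mixture components.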
Good Luck
Azimifar