Question:

Suppose we have three classes in two dimensions with the following underlying distributions:

- Class ω1: p(x|ω1) = N([0, 0]ᵀ, I).

- Class ω2: p(x|ω2) = N([1, 1]ᵀ, I).

- Class ω3: p(x|ω3) = (1/2) N([0.5, 0.5]ᵀ, I) + (1/2) N([−0.5, 0.5]ᵀ, I).

Here, N(μ, Σ) denotes a two-dimensional Gaussian distribution with mean vector μ and covariance matrix Σ, and I is the identity matrix. Assume equal class prior probabilities Pr(ωi) = 1/3, i = 1, 2, 3.

a. Classify the feature vector x = [0.25, 0.25]ᵀ based on the MAP decision rule.
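As a quick numerical sketch (not part of the original question), the MAP rule with equal priors Pr(ωi) = 1/3 reduces to picking the class with the largest class-conditional density p(x|ωi). The code below assumes the ω3 mixture means are (0.5, 0.5)ᵀ and (−0.5, 0.5)ᵀ and that all covariances are I.

```python
import numpy as np
from scipy.stats import multivariate_normal

I2 = np.eye(2)

def class_densities(x):
    """Class-conditional densities p(x | omega_i) for the three classes."""
    p1 = multivariate_normal.pdf(x, mean=[0.0, 0.0], cov=I2)
    p2 = multivariate_normal.pdf(x, mean=[1.0, 1.0], cov=I2)
    # omega_3 is an equal-weight mixture of two Gaussians (means assumed above).
    p3 = 0.5 * multivariate_normal.pdf(x, mean=[0.5, 0.5], cov=I2) \
       + 0.5 * multivariate_normal.pdf(x, mean=[-0.5, 0.5], cov=I2)
    return np.array([p1, p2, p3])

# Equal priors cancel in the posterior comparison, so MAP = max density.
x = np.array([0.25, 0.25])
d = class_densities(x)
print("densities:", d)
print("MAP decision: omega_%d" % (np.argmax(d) + 1))  # omega_1 at this x
```

At x = (0.25, 0.25)ᵀ the densities come out roughly 0.150, 0.091, and 0.133, so the MAP decision is ω1.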

b. Suppose the first feature is missing. Classify x = [∗, 0.25]ᵀ (only x2 = 0.25 is observed) using the optimal rule derived in Q10.1.

c. Suppose the second feature is missing. Classify x = [0.25, ∗]ᵀ (only x1 = 0.25 is observed) using the optimal rule from Q10.1.
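For parts (b) and (c), the optimal rule with a missing feature integrates that feature out of each class-conditional density. Because every covariance here is I, the two features are independent given the class, so each marginal is a one-dimensional Gaussian (or a 1-D mixture for ω3). The sketch below assumes the ω3 mixture means (0.5, 0.5)ᵀ and (−0.5, 0.5)ᵀ; note both components share the same x2-mean of 0.5, so the x2-marginal of ω3 collapses to a single Gaussian.

```python
import numpy as np
from scipy.stats import norm

def densities_given_x2(x2):
    """Marginal densities p(x2 | omega_i): first feature integrated out."""
    p1 = norm.pdf(x2, loc=0.0, scale=1.0)
    p2 = norm.pdf(x2, loc=1.0, scale=1.0)
    # Both omega_3 components have x2-mean 0.5, so the mixture marginal
    # is just N(0.5, 1) in x2.
    p3 = norm.pdf(x2, loc=0.5, scale=1.0)
    return np.array([p1, p2, p3])

def densities_given_x1(x1):
    """Marginal densities p(x1 | omega_i): second feature integrated out."""
    p1 = norm.pdf(x1, loc=0.0, scale=1.0)
    p2 = norm.pdf(x1, loc=1.0, scale=1.0)
    # The x1-marginal of omega_3 stays a two-component mixture.
    p3 = 0.5 * norm.pdf(x1, loc=0.5, scale=1.0) \
       + 0.5 * norm.pdf(x1, loc=-0.5, scale=1.0)
    return np.array([p1, p2, p3])

print("part b:", densities_given_x2(0.25))
print("part c:", densities_given_x1(0.25))
```

With equal priors, part (b) gives equal marginal densities for ω1 and ω3 (x2 = 0.25 is equidistant from means 0 and 0.5), so the rule ties between them; in part (c), ω1 attains the largest marginal density.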
