Question
PLEASE PART D ONLY :)
2. Assume that x is the outcome of a random variable, X, y is the outcome of a random variable, Y, and that (x, y) are drawn i.i.d. from some data generating process, D, i.e. $(x, y) \sim D$. Here D is characterised by $p_{X,Y}(x, y) = p_{Y|X}(y \mid x)\, p_X(x)$ for some pmf, $p_{Y|X}(y \mid x)$, and some pdf, $p_X(x)$. We evaluate the performance of a prediction function, f, on a particular data point, (x, y), using a loss measure, $\ell(f(x), y)$. We define the generalisation loss: $L(f, D, \ell) = \mathbb{E}_D[\ell(f(X), Y)]$. $f^*$ is said to be Bayes optimal if: $f^* = \arg\min_f \mathbb{E}_D[\ell(f(X), Y)]$.

(c) [7 marks] In a binary classification setting, assume that classes are distributed according to a Bernoulli random variable, Y, with outcomes $y \sim \mathrm{Bern}(\theta)$, with pmf characterised by $p_Y(y = 1) = \theta$. Furthermore we model the class conditional probability distributions for the random variable, X, whose outcomes are given by instances of particular input attributes, $x \in \mathbb{R}$, as (note that we are dealing with a 1-dimensional attribute vector in this case):

$(X \mid y = 0) \sim \mathrm{Poisson}(\lambda_0)$, where $\lambda_0 \in \mathbb{R}$
$(X \mid y = 1) \sim \mathrm{Poisson}(\lambda_1)$, where $\lambda_1 \in \mathbb{R}$

Here $\lambda_0 \neq \lambda_1$. With the aid of a well marked sketch, fully characterise the discriminant boundary between the two classes for each of the following cases:
i) The loss matrix is balanced.
ii) The loss matrix is similar to that in part (a).

Hint: If a random variable Z, with outcomes $z \in \{0, 1, 2, \dots\}$, is Poisson distributed then the associated pmf is given by $p_Z(z; \lambda) = \frac{\lambda^z e^{-\lambda}}{z!}$, for some $\lambda \in \mathbb{R}$.
(d) [6 marks] With the aid of a well marked sketch, and for the case of a balanced loss matrix, compare and contrast the boundary you generated in part (c) with the boundary generated by an appropriate Gaussian Naive Bayes model for each of the following cases:
i) The class contingent variances of the Gaussian Naive Bayes model are equal.
ii) The class contingent variances of the Gaussian Naive Bayes model are unequal.
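As a rough numerical sketch of the comparison asked for in part (d), the snippet below computes the balanced-loss Bayes boundary for the Poisson model alongside the boundaries of a Gaussian Naive Bayes model whose class means are matched to the Poisson rates ($\mu_k = \lambda_k$); case (i) uses a pooled variance and case (ii) uses $\sigma_k^2 = \lambda_k$. This moment-matching choice, and the values $\lambda_0 = 2$, $\lambda_1 = 6$, $\theta = 0.5$, are illustrative assumptions only; the question does not fix them. Under a balanced loss the Bayes rule predicts $y = 1$ whenever $\theta\, p_Z(x; \lambda_1) \ge (1 - \theta)\, p_Z(x; \lambda_0)$, which for $\lambda_1 > \lambda_0$ reduces to the single threshold $x \ge \frac{\lambda_1 - \lambda_0 + \ln\frac{1-\theta}{\theta}}{\ln(\lambda_1/\lambda_0)}$.

```python
import numpy as np

# --- Illustrative parameters (assumed, not given in the question) -------------
lam0, lam1 = 2.0, 6.0   # Poisson rates for class 0 and class 1 (lam1 > lam0 assumed)
theta = 0.5             # prior p(y = 1)

# --- 1. Bayes boundary for the Poisson model, balanced loss -------------------
# Predict y = 1 when theta * Poisson(x; lam1) >= (1 - theta) * Poisson(x; lam0).
# Taking logs and solving for x gives a single threshold.
x_poisson = (lam1 - lam0 + np.log((1 - theta) / theta)) / np.log(lam1 / lam0)

# --- 2. Gaussian Naive Bayes with moment-matched parameters -------------------
# Poisson(lam) has mean lam and variance lam, so take mu_k = lam_k.
mu0, mu1 = lam0, lam1

# (i) Equal class-contingent variances: share a pooled variance.
#     Equal-variance Gaussians give a linear discriminant, i.e. one threshold.
sigma2 = 0.5 * (lam0 + lam1)
x_gnb_equal = 0.5 * (mu0 + mu1) + sigma2 * np.log((1 - theta) / theta) / (mu1 - mu0)

# (ii) Unequal class-contingent variances: sigma_k^2 = lam_k.
#      The log-posterior ratio is quadratic in x, so solve a*x^2 + b*x + c = 0;
#      there can be up to two crossing points.
s0, s1 = lam0, lam1
a = 1.0 / (2 * s0) - 1.0 / (2 * s1)
b = mu1 / s1 - mu0 / s0
c = (mu0 ** 2) / (2 * s0) - (mu1 ** 2) / (2 * s1) \
    + 0.5 * np.log(s0 / s1) + np.log(theta / (1 - theta))
x_gnb_unequal = np.roots([a, b, c]).real

print(f"Poisson Bayes boundary (balanced loss): x* = {x_poisson:.3f}")
print(f"GNB boundary, equal variances:          x* = {x_gnb_equal:.3f}")
print(f"GNB boundaries, unequal variances:      x* = {np.sort(x_gnb_unequal)}")
```

With equal class-contingent variances the Gaussian Naive Bayes discriminant is linear, so, like the Poisson Bayes rule, it gives a single threshold (generally at a slightly different location). With unequal class-contingent variances the discriminant is quadratic, so it can cross zero at up to two points, typically only one of which falls in the region where the Poisson pmfs carry most of their mass.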