Question
Consider a binary classification problem with two-dimensional features, i.e., K = 2, p = 2. Class 1 has multivariate Gaussian distribution N(μ1, Σ) and class 2 has multivariate Gaussian distribution N(μ2, Σ), where μ1 = (1, 2)^T, μ2 = (−1, −2)^T, and the common covariance matrix is

Σ = [ 1    0.5
      0.5  1  ].

Show the decision boundary in each of the following cases: first give the equation of the decision boundary, then plot it together with data points sampled from the two distributions, in a fashion similar to Figure 4.9.

1) Use LDA without dimension reduction.
2) Use reduced-rank LDA, projecting the data onto the direction of greatest centroid spread.
3) Use reduced-rank LDA, projecting the data onto the discriminant direction.
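A minimal sketch of how the three boundaries could be computed is below. It assumes μ2 = (−1, −2)^T (the minus signs appear to have been lost in the problem statement) and equal class priors; with unequal priors the offset would pick up a log-prior-ratio term.

```python
import numpy as np

# Assumed setup: mu2 = (-1, -2)^T and equal class priors.
mu1 = np.array([1.0, 2.0])
mu2 = np.array([-1.0, -2.0])
Sigma = np.array([[1.0, 0.5], [0.5, 1.0]])
Sigma_inv = np.linalg.inv(Sigma)

# 1) Full LDA: classify to class 1 iff w.x > b, so the boundary is w.x = b.
w = Sigma_inv @ (mu1 - mu2)                      # normal vector of the boundary
b = 0.5 * (mu1 + mu2) @ Sigma_inv @ (mu1 - mu2)  # offset, equal priors assumed
print("LDA boundary: %.3f*x1 + %.3f*x2 = %.3f" % (w[0], w[1], b))

# 2) Greatest centroid spread: project onto d = mu1 - mu2 and split the
#    projected data at the midpoint of the two projected centroids.
d = mu1 - mu2
t = 0.5 * (d @ mu1 + d @ mu2)  # threshold on the 1-D projection d.x
print("Centroid-spread boundary: %.3f*x1 + %.3f*x2 = %.3f" % (d[0], d[1], t))

# 3) Discriminant direction: Sigma^{-1}(mu1 - mu2), which is w from case 1,
#    so projecting onto it and thresholding reproduces the full-LDA boundary.

# Sanity check: sample from each class and apply rule 1).
rng = np.random.default_rng(0)
x1 = rng.multivariate_normal(mu1, Sigma, 500)
x2 = rng.multivariate_normal(mu2, Sigma, 500)
acc = ((x1 @ w > b).mean() + (x2 @ w < b).mean()) / 2
print("LDA accuracy on sampled data: %.2f" % acc)
```

Under these assumed centroids the full-LDA boundary simplifies to x2 = 0; the sampled points can then be scattered and the three lines overlaid with any plotting library to reproduce a Figure 4.9-style panel.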