(b) Print the coefficients of the features in the model. Which features contribute most to the prediction? Which ones are positively correlated and which ones are negatively correlated with the SPAM class?

(c) Vary the decision threshold t ∈ {0.25, 0.5, 0.75, 0.9} and report, for each value, the model accuracy, precision, and recall. Comment on how these metrics vary with the choice of threshold.

Problem 2 [Gradient Descent for Logistic Regression] - 25 points

Use your implementation of Gradient Descent from Homework 2 and adapt it for logistic regression. Take 3 values of the learning rate and report the cross-entropy loss objective after 10, 50, and 100 iterations. At 100 iterations, report the accuracy, precision, recall, and F1 score for the 3 learning rates, and compare with the metrics given by the package on the training and testing sets.

Problem 3 [Comparing Classifiers] - 25 points

In this problem, you will use existing packages of your choice for training and testing various classifiers, and then compare them. You will use the same SPAMBASE dataset, and you can use the same training and testing data as in Problem 1. Train the following classifiers using the training data:

1. Logistic regression
2. LDA
3. kNN

(a) Use cross-validation to select the k hyper-parameter for kNN. Show the accuracy, error, precision, and recall metrics on the validation dataset for multiple values of k. Select the value of k that minimizes the average cross-validation error.

(b) Print the accuracy, error, precision, and recall metrics for all 3 classifiers on both the training and testing data. Which model performs best? Which one performs worst? Write down some observations.

(c) Generate a graph that includes the ROC curve for the logistic regression classifier on the testing set, and compute the Area Under the Curve (AUC) metric. You can use a package for this.
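
For concreteness, the sketches below show one way the computational parts above could be set up in Python with scikit-learn; they are illustrative only, and every file path, variable name, and hyper-parameter choice in them is an assumption rather than part of the assignment. This first sketch addresses Problem 1 (b) and (c), assuming spambase.data from the UCI repository (57 features plus a 0/1 spam label in the last column) sits in the working directory.

    # Sketch for Problem 1 (b)-(c): print coefficients and sweep the decision
    # threshold of a fitted logistic regression model. The file path, split
    # ratio, and random seed are assumptions.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import accuracy_score, precision_score, recall_score
    from sklearn.model_selection import train_test_split
    from sklearn.preprocessing import StandardScaler

    data = np.loadtxt("spambase.data", delimiter=",")
    X, y = data[:, :-1], data[:, -1]
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.3, random_state=0)
    scaler = StandardScaler().fit(X_train)
    X_train, X_test = scaler.transform(X_train), scaler.transform(X_test)

    model = LogisticRegression(max_iter=5000).fit(X_train, y_train)

    # (b) coefficients: positive weights push toward SPAM, negative toward non-SPAM
    order = np.argsort(model.coef_[0])
    print("most negatively weighted feature indices:", order[:5])
    print("most positively weighted feature indices:", order[-5:])

    # (c) threshold sweep on the test set
    probs = model.predict_proba(X_test)[:, 1]          # estimated P(SPAM | x)
    for t in [0.25, 0.5, 0.75, 0.9]:
        preds = (probs >= t).astype(int)               # label SPAM when P >= t
        print(f"t={t:.2f}  acc={accuracy_score(y_test, preds):.3f}  "
              f"prec={precision_score(y_test, preds):.3f}  "
              f"rec={recall_score(y_test, preds):.3f}")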
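
For Problem 2, a minimal sketch of a batch gradient-descent loop on the logistic-regression cross-entropy loss; the function names, learning rates, and reporting format are illustrative and do not assume the Homework 2 interface.

    # Sketch for Problem 2: batch gradient descent on the mean cross-entropy
    # loss of logistic regression. The learning rates and iteration counts in
    # the commented usage are example choices, not prescribed values.
    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def cross_entropy(w, X, y, eps=1e-12):
        p = sigmoid(X @ w)
        return -np.mean(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps))

    def gradient_descent(X, y, lr, n_iters=100, report_at=(10, 50, 100)):
        w = np.zeros(X.shape[1])
        for it in range(1, n_iters + 1):
            grad = X.T @ (sigmoid(X @ w) - y) / len(y)   # gradient of the loss
            w -= lr * grad
            if it in report_at:
                print(f"lr={lr}  iter={it}  loss={cross_entropy(w, X, y):.4f}")
        return w

    # Usage on the standardized training data from the previous sketch, with an
    # intercept column appended; the three learning rates are examples.
    # Xb = np.hstack([np.ones((len(X_train), 1)), X_train])
    # for lr in (0.001, 0.01, 0.1):
    #     w = gradient_descent(Xb, y_train, lr)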
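
For Problem 3 (a), one way to run the cross-validation over k; the grid of candidate k values and the 5-fold setup are assumptions, and X_train / y_train refer to the split from the first sketch.

    # Sketch for Problem 3 (a): choose k for kNN by 5-fold cross-validation.
    # The candidate k values are an example grid.
    import numpy as np
    from sklearn.model_selection import cross_validate
    from sklearn.neighbors import KNeighborsClassifier

    best_k, best_err = None, np.inf
    for k in (1, 3, 5, 7, 9, 11, 15, 21):
        scores = cross_validate(
            KNeighborsClassifier(n_neighbors=k), X_train, y_train,
            cv=5, scoring=("accuracy", "precision", "recall"))
        acc = scores["test_accuracy"].mean()
        err = 1.0 - acc                                   # average CV error
        print(f"k={k:<3d} acc={acc:.3f} err={err:.3f} "
              f"prec={scores['test_precision'].mean():.3f} "
              f"rec={scores['test_recall'].mean():.3f}")
        if err < best_err:
            best_k, best_err = k, err
    print("selected k:", best_k)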
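
For Problem 3 (b), a sketch that fits the three classifiers and reports metrics on both splits, reusing the data and the selected best_k from the sketches above.

    # Sketch for Problem 3 (b): train/test metrics for the three classifiers.
    # X_train, X_test, y_train, y_test, and best_k come from the sketches above.
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import accuracy_score, precision_score, recall_score
    from sklearn.neighbors import KNeighborsClassifier

    classifiers = {
        "logistic regression": LogisticRegression(max_iter=5000),
        "LDA": LinearDiscriminantAnalysis(),
        "kNN": KNeighborsClassifier(n_neighbors=best_k),
    }
    for name, clf in classifiers.items():
        clf.fit(X_train, y_train)
        for split, (Xs, ys) in (("train", (X_train, y_train)),
                                ("test", (X_test, y_test))):
            preds = clf.predict(Xs)
            acc = accuracy_score(ys, preds)
            print(f"{name:20s} {split:5s} acc={acc:.3f} err={1 - acc:.3f} "
                  f"prec={precision_score(ys, preds):.3f} "
                  f"rec={recall_score(ys, preds):.3f}")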
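
For Problem 3 (c), a sketch of the ROC curve and AUC computation with scikit-learn and matplotlib, continuing with the fitted logistic regression model and test split defined in the first sketch.

    # Sketch for Problem 3 (c): ROC curve and AUC for the logistic regression
    # classifier on the test set; model, X_test, y_test come from the first sketch.
    import matplotlib.pyplot as plt
    from sklearn.metrics import roc_auc_score, roc_curve

    probs = model.predict_proba(X_test)[:, 1]
    fpr, tpr, _ = roc_curve(y_test, probs)
    auc = roc_auc_score(y_test, probs)

    plt.plot(fpr, tpr, label=f"logistic regression (AUC = {auc:.3f})")
    plt.plot([0, 1], [0, 1], linestyle="--", label="chance")
    plt.xlabel("False positive rate")
    plt.ylabel("True positive rate")
    plt.legend()
    plt.show()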