Hi!
Please help me solve these questions.
Question 3. [8 marks] Consider an arbitrary prediction problem in which we have two features (x1 and x2) which we wish to use to predict the output y, as seen in the table below:

Feature x1 | Feature x2 | Output y
   3.5     |    0.17    |   14.9
   3.7     |    0.45    |   15.4
   3.0     |    0.30    |   13.1

Assume that we've carried out several iterations of gradient descent and we've managed to come up with the parameters

θ = [  2.5 ]
    [  2.4 ]
    [ -0.2 ]

We now want to compute the cost corresponding to these θ parameters obtained. But since we've got multiple variables, i.e. x1 and x2, we want to use the vectorized expression to compute the cost much more efficiently. Remember that the vectorized expression for computing the cost J(θ) is given by:

J(θ) = (1 / (2m)) (Xθ − y)ᵀ (Xθ − y)

Your task in this question is to compute the cost using the vectorized expression above. To do this, you first need to formulate the matrix X (it should be a 3 × 3 matrix, so don't forget the column of 1s for feature x0) and y (a 3 × 1 matrix) from the table above. I've already given you θ. Then I would recommend computing (Xθ − y) first, since it is multiplied by its own transpose in the expression above (so there is no need to compute it twice). Show your working fully and note that you should end up with a single value, which is the cost. If necessary, round off your values and final answer to two decimal places.

Question 5. [6 marks] Given the following matrices:

X = [ 1.0  4.7  5.8 ]      y = [ -1.7 ]
    [ 1.0  3.5  5.2 ]          [ -1.6 ]

The normal equation can be used to compute the best θ parameters as follows:

θ = (XᵀX)⁻¹ Xᵀ y

Given also that (XᵀX)⁻¹ is:

(XᵀX)⁻¹ = [ -0.8   0.5   0.3 ]
          [  0.5   0.3  -0.8 ]
          [  0.3  -0.8  -0.9 ]

Your task in this question is to use the normal equation given to you above to compute the optimal θ parameters for this data. If necessary, round off your final θ values to two decimal places. Also note that you should present your final answer θ as a (3 × 1) matrix that is clearly visible as your final answer.
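Not part of the questions themselves, but the two computations above are easy to sanity-check numerically. The sketch below (my addition, using NumPy) builds the matrices exactly as each question describes and evaluates the vectorized cost for Question 3 and the normal equation for Question 5, so you can verify your hand-worked values before rounding:

```python
import numpy as np

# --- Question 3: vectorized cost J(theta) = (1/(2m)) (X theta - y)^T (X theta - y) ---
X3 = np.array([[1.0, 3.5, 0.17],     # column of 1s for x0, then x1, x2 from the table
               [1.0, 3.7, 0.45],
               [1.0, 3.0, 0.30]])
y3 = np.array([[14.9], [15.4], [13.1]])
theta3 = np.array([[2.5], [2.4], [-0.2]])

m = X3.shape[0]
r = X3 @ theta3 - y3                 # residual vector (X theta - y), computed once
J = float((r.T @ r) / (2 * m))       # scalar cost
print(f"Q3 cost J(theta) = {J:.2f}")

# --- Question 5: normal equation theta = (X^T X)^-1 X^T y ---
X5 = np.array([[1.0, 4.7, 5.8],
               [1.0, 3.5, 5.2]])
y5 = np.array([[-1.7], [-1.6]])
XtX_inv = np.array([[-0.8,  0.5,  0.3],   # (X^T X)^-1 as given in the question
                    [ 0.5,  0.3, -0.8],
                    [ 0.3, -0.8, -0.9]])

theta5 = XtX_inv @ X5.T @ y5         # 3 x 1 result
print("Q5 theta =", np.round(theta5, 2).ravel())
```

This only checks the arithmetic; the marks will be for showing the working (forming X and y, computing the residual, then the final product) by hand.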