Question
Similarly to the previous question on gradient descent, but now with multiple features: extend the code from the previous question so that it is able to find θ₀, θ₁, ..., θₙ for multiple features. Write the function gradient_descent_multi_variable(X, y, learning_rate, n) that returns:

θ₀ - a number representing the bias constant
θ₁, θ₂, ..., θₙ - an (n, 1) NumPy matrix, where each element denotes the weight constant of a certain feature
loss - a list that contains the MSE scores calculated during the gradient descent process

You can use the mean_squared_error function for this task.
def gradient_descent_multi_variable(X, y, learning_rate=1e-5, n=250):
    '''
    Parameters
    ----------
    X (np.ndarray) : (m, n) numpy matrix representing the feature matrix
    y (np.ndarray) : (m, 1) numpy matrix representing the target values
    learning_rate (float) : Learning rate
    n (int) : Number of gradient descent iterations

    Returns
    -------
    bias (float) : The bias constant
    weights (np.ndarray) : A (n, 1) numpy matrix that specifies the weight constants
    loss (list) : A list where the i-th element denotes the MSE score at the i-th iteration
    '''
    # Do not change
    bias = 0
    weights = np.full((X.shape[1], 1), 0).astype(float)
    loss = []

    for _ in range(n):
        # TODO: add your solution here and remove `raise NotImplementedError`
        raise NotImplementedError

    return bias, weights, loss

Finally, compare the pros and cons of using the normal equation and gradient descent for linear regression. Then select the algorithm you think is more suitable for this problem set and explain your choice.
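One possible solution sketch follows, assuming the model is ŷ = X·weights + bias and that mean_squared_error has the sklearn-style signature mean_squared_error(y_true, y_pred). The factor of 2 in the gradients comes from differentiating the squared error; some courses fold it into the learning rate, and whether the loss is logged before or after each update may also differ, so treat this as one plausible reading of the spec rather than the official solution.

import numpy as np
from sklearn.metrics import mean_squared_error

def gradient_descent_multi_variable(X, y, learning_rate=1e-5, n=250):
    m = X.shape[0]
    bias = 0
    weights = np.full((X.shape[1], 1), 0).astype(float)
    loss = []

    for _ in range(n):
        y_pred = X @ weights + bias        # predictions, shape (m, 1)
        error = y_pred - y                 # residuals, shape (m, 1)

        # Gradients of MSE = (1/m) * sum(error^2) w.r.t. bias and weights
        bias_grad = (2 / m) * np.sum(error)
        weights_grad = (2 / m) * (X.T @ error)   # shape (n_features, 1)

        # Simultaneous parameter update
        bias -= learning_rate * bias_grad
        weights -= learning_rate * weights_grad

        # Record the MSE after this iteration's update
        loss.append(mean_squared_error(y, X @ weights + bias))

    return bias, weights, loss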
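For the comparison part, it helps to see the closed-form alternative side by side. The normal equation solves θ = (XᵀX)⁻¹Xᵀy in one step, where X is augmented with a column of ones so that the first entry of θ plays the role of the bias. A minimal sketch (normal_equation is a hypothetical helper name, not part of the assignment):

import numpy as np

def normal_equation(X, y):
    # Prepend a column of ones so theta[0] is the bias term.
    X_aug = np.hstack([np.ones((X.shape[0], 1)), X])
    # Solve (X^T X) theta = X^T y; np.linalg.solve avoids forming an
    # explicit inverse and is more numerically stable.
    theta = np.linalg.solve(X_aug.T @ X_aug, X_aug.T @ y)
    return theta[0, 0], theta[1:]

In broad terms, the normal equation needs no learning rate or iteration count and returns an exact least-squares solution, but its cost grows roughly cubically with the number of features and it requires XᵀX to be (numerically) invertible. Gradient descent scales better to large feature counts and sample sizes, but converges only approximately and depends on a well-chosen learning rate. For the small feature matrices typical of a problem set like this, either choice is defensible, and the trade-off above is the reasoning the question is asking you to articulate.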