Question

How do I revise the gradient descent function below so that it gives a result of mean = 6.2 and standard deviation = 2.4?

import numpy as np
from numpy import random
from scipy import stats

data = np.array([4, 5, 7, 8, 8, 9, 10, 5, 2, 3, 5, 4, 8, 9])

def partial_deriv_mean(data, est_mean, est_var):
    data_sub_est_mean = data - est_mean
    sum_data_sub_est_mean = np.sum(data_sub_est_mean)
    pd_mean = (1/est_var) * sum_data_sub_est_mean
    return pd_mean * -1

def partial_deriv_var(data, est_mean, est_var):
    data_sub_est_mean = data - est_mean
    data_sub_est_mean_squared = data_sub_est_mean * data_sub_est_mean
    sum_data_sub_est_mean_squared = np.sum(data_sub_est_mean_squared)
    pd_var = 1/2 * est_var * (-14 + 1/est_var * sum_data_sub_est_mean_squared)
    return pd_var * -1

def neg_log_likelihood(data, est_mean, est_var):
    ll = 0
    for i in data:
        ll += np.log(stats.norm.pdf(i, est_mean, np.sqrt(est_var)))
    return ll * -1

def gradient_descent(data, est_mean, est_var, learning_rate, epochs):
    prev_w = np.array([est_mean, est_var])
    for k in range(epochs):
        d_mean = partial_deriv_mean(data, est_mean, est_var)
        d_var = partial_deriv_var(data, est_mean, est_var)
        cost = neg_log_likelihood(data, est_mean, est_var)
        est_mean = est_mean - (learning_rate*d_mean)
        est_var = est_var - (learning_rate*d_var)
    return est_mean, est_var, cost

gradient_descent(data, 0, 1, 0.0001, 1000000)
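One possible revision, sketched below under two assumptions: the routine is already converging to the maximum-likelihood fit for this data (sample mean 87/14 ≈ 6.2, population variance ≈ 5.88), and the "2.4" being asked for is the standard deviation, i.e. the square root of the fitted variance rather than the variance itself. The sketch also replaces the hardcoded -14 with len(data), drops the unused prev_w and per-step cost loop, and uses an illustrative learning rate and epoch count (not the only values that work):

```python
import numpy as np

data = np.array([4, 5, 7, 8, 8, 9, 10, 5, 2, 3, 5, 4, 8, 9])

def gradient_descent(data, est_mean, est_var, learning_rate, epochs):
    n = len(data)  # generalizes the hardcoded -14 from the original code
    for _ in range(epochs):
        resid = data - est_mean
        # Gradients of the negative log-likelihood. The variance gradient is
        # scaled by est_var**2 exactly as in the original code; the sign is
        # unchanged, so descent still converges to the same fixed point
        # (est_mean = sample mean, est_var = population variance).
        d_mean = -np.sum(resid) / est_var
        d_var = -(0.5 * est_var * (-n + np.sum(resid**2) / est_var))
        est_mean -= learning_rate * d_mean
        est_var -= learning_rate * d_var
    return est_mean, est_var

mean, var = gradient_descent(data, 0.0, 1.0, 0.001, 50_000)
print(round(mean, 1), round(np.sqrt(var), 1))  # expect roughly 6.2 and 2.4
```

The key change for the question as asked is reporting `np.sqrt(est_var)`: the original function returns the fitted variance (about 5.9 for this data), whose square root is the ≈ 2.4 standard deviation.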


