Question

Q4) Gradient Descent.
Implement the gradient descent algorithm using the given source code for the loss function L = 3x^2 + 2y^2 + 20 cos(x) cos(y).
Modify the given source code only between the lines marked with comments, as described in Q3. Do not modify any other line of the script.
Deliverables:
A) A screenshot of the 3D plot obtained when you execute the Python script after adding your lines of code.
B) A screenshot of the console output obtained when you execute the Python script after adding your lines of code.
C) The equations of the loss function and of the gradient of the loss function on which gradient descent was performed.
D) A snapshot of the lines of code you added in the designated places.
E) A statement of whether the algorithm converged to the global minimum, a local minimum, or failed to converge in your case.
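For reference on deliverable C: differentiating L = 3x^2 + 2y^2 + 20 cos(x) cos(y) gives the gradient

∂L/∂x = 6x - 20 sin(x) cos(y)
∂L/∂y = 4y - 20 cos(x) sin(y)

Each gradient descent step then moves the current point against this gradient, (x, y) ← (x, y) - η ∇L(x, y), where η is the learning rate (0.01 in the script below).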
import numpy as np
import matplotlib.pyplot as plt
import random
##### To be Updated #####
# e.g., if your BITS email id is 023ab12345@wilp.bits-pilani.com
# update the below line as student_id = "023xx12345"
student_id =
#########################
student_id = ''.join([i for i in student_id if i.isdigit()])
random.seed(student_id)
# set the number of iterations and learning rate
iters = random.randint(100, 300)
learning_rate = 0.01
# Evaluate the function at x
def C(x):
    ##### To be Updated #####
    # NOTE: return value of this function will
    # **ALSO** change for Q4 of the assignment
    #########################
    return (x.T @ np.array([[2, 1], [1, 20]]) @ x) - (np.array([5, 3]).reshape(2, 1).T @ x)
# Evaluate the gradient of function at x
def dC(x):
    ##### To be Updated #####
    # 1. Compute and return the gradient
    return
    #########################
def plot_grad_change(X, Y, Z, c, grad_xs0, grad_xs1, grad_ys):
    fig = plt.figure()
    title_str = "Gradient Descent:" + "lr=" + str(learning_rate)
    plt.title(title_str)
    ax = fig.add_subplot(projection='3d')
    ax.plot_surface(X, Y, Z, cmap=plt.cm.YlGnBu_r, alpha=0.7)
    for i in range(len(grad_xs0)):
        ax.plot([grad_xs0[i]], [grad_xs1[i]], grad_ys[i][0], markerfacecolor='r', markeredgecolor='r', marker='o', markersize=7)
    ax.text(grad_xs0[-1], grad_xs1[-1], grad_ys[-1][0][0],
            "(" + str(round(grad_xs0[-1], 2)) + "," +
            str(round(grad_xs1[-1], 2)) + ")," +
            str(round(grad_ys[-1][0][0], 2)))
    plt.show()
def GD(start, x, y, z, c, dc, iters, eta):
    px = start.astype(float)
    py = c(px).astype(float)
    print("GD Start Point:", px, py)
    print("Num steps:", iters)
    grad_xs0, grad_xs1, grad_ys = [px[0][0]], [px[1][0]], [py]
    for iter in range(iters):
        ##### To be Updated #####
        # 2. Update px using gradient descent
        px =
        # 3. Update py
        py =
        #########################
        grad_xs0.append(px[0][0])
        grad_xs1.append(px[1][0])
        grad_ys.append(py)
    print("Converged Point:", px, py)
    plot_grad_change(x, y, z, c, grad_xs0, grad_xs1, grad_ys)
lo = -10
hi = 10
x1 = round(random.uniform(lo, 0), 4)
x2 = round(random.uniform(lo, 0), 4)
x = np.linspace(lo, hi, 100)  # grid over [lo, hi] for the surface plot
y = np.linspace(lo, hi, 100)
X, Y = np.meshgrid(x, y)
Z = np.zeros_like(X)
for i in range(X.shape[0]):
    for j in range(X.shape[1]):
        Z[i][j] = C(np.array([X[i][j], Y[i][j]]).reshape(2, 1))
# start Gradient Descent
GD(np.array([x1, x2]).reshape(2, 1), X, Y, Z, C, dC, iters, learning_rate)
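
One plausible completion of the three marked blocks for Q4 is sketched below. This is an illustrative sketch, not the graded answer: it keeps the script's own conventions (points are (2,1) column vectors, and the loss is returned as a (1,1) array so the plotting code can index it), uses the analytic gradient derived above, and the student ID shown is a placeholder to be replaced with your own.

# Block 1: student ID (placeholder; substitute your own)
student_id = "023xx12345"

# Q4 loss: L = 3x^2 + 2y^2 + 20 cos(x) cos(y), with x a (2,1) column vector
def C(x):
    return (3*x[0]**2 + 2*x[1]**2 + 20*np.cos(x[0])*np.cos(x[1])).reshape(1, 1)

# Analytic gradient of L, returned as a (2,1) column vector
def dC(x):
    return np.array([6*x[0][0] - 20*np.sin(x[0][0])*np.cos(x[1][0]),
                     4*x[1][0] - 20*np.cos(x[0][0])*np.sin(x[1][0])]).reshape(2, 1)

# Blocks 2 and 3, inside the loop of GD (c, dc and eta are the arguments passed in)
px = px - eta * dc(px)  # step against the gradient with step size eta
py = c(px)              # re-evaluate the loss at the new point

On deliverable E: the cosine terms make this loss non-convex, with many local minima riding on the 3x^2 + 2y^2 bowl. Plain gradient descent from the seed-dependent random start in [-10, 0] x [-10, 0] therefore typically converges to a nearby local minimum; whether that happens to be the global minimum depends on your student-ID seed, so report what your own run shows.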
