Question
IMPLEMENTING SIMPLIFIED GRADIENT DESCENT. You are given most of the code in Python. Will upvote if instructions followed correctly and done ASAP:
import numpy as np
import matplotlib.pyplot as plt
from matplotlib import cm

# returns the funkier function over a reasonable range of X, Y values to plot
def funkier(delta=0.01):
    x = np.arange(-3.3, 3.3, delta)
    y = np.arange(-2.8, 2.8, delta)
    X, Y = np.meshgrid(x, y)
    Z1 = np.exp(-X**2 - Y**2)              # centered at (0, 0)
    Z2 = np.exp(-(X - 1)**2 - (Y - 1)**2)  # centered at (1, 1)
    Z3 = np.exp(-(X + 1)**2 - (Y + 1)**2)  # centered at (-1, -1)
    Z = Z1 - Z2 - 0.7*Z3
    return X, Y, Z
# given X and Y, returns Z
# X and Y can be arrays or single values...numpy will handle it!
def funkier_z(X, Y):
    Z1 = np.exp(-X**2 - Y**2)
    Z2 = np.exp(-(X - 1)**2 - (Y - 1)**2)
    Z3 = np.exp(-(X + 1)**2 - (Y + 1)**2)
    Z = Z1 - Z2 - 0.7*Z3
    return Z
X, Y, Z = funkier()
# Setting the figure size and 3D projection
fig = plt.figure(figsize=(12,10))
ax = fig.add_subplot(projection='3d')
# Creating labels for the axes
ax.set_xlabel('X')
ax.set_ylabel('Y')
ax.set_zlabel('Z')
_ = ax.plot_surface(X, Y, Z, cmap=cm.coolwarm)
STEP 1: Implement the gradient function. For a certain x and y, compute dfunkier/dx and dfunkier/dy. funkier_grad should return a tuple with 2 values for the gradient:
def funkier_grad(x, y):
    # Your code here
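One way to fill in funkier_grad is to differentiate each Gaussian term analytically (a sketch; the helper names g1x etc. are just illustrative, and note that Z2 and Z3 enter funkier with minus signs, which flips the sign of their derivatives):

```python
import numpy as np

def funkier_grad(x, y):
    # d/dx of Z1 = exp(-x^2 - y^2): the chain rule pulls down -2x (resp. -2y for d/dy)
    g1x = -2*x * np.exp(-x**2 - y**2)
    g1y = -2*y * np.exp(-x**2 - y**2)
    # Z2 enters as -Z2, so the sign flips: +2(x - 1) * exp(...)
    g2x = 2*(x - 1) * np.exp(-(x - 1)**2 - (y - 1)**2)
    g2y = 2*(y - 1) * np.exp(-(x - 1)**2 - (y - 1)**2)
    # -0.7*Z3 contributes +0.7 * 2(x + 1) = +1.4(x + 1) times its exponential
    g3x = 1.4*(x + 1) * np.exp(-(x + 1)**2 - (y + 1)**2)
    g3y = 1.4*(y + 1) * np.exp(-(x + 1)**2 - (y + 1)**2)
    # tuple (dfunkier/dx, dfunkier/dy)
    return (g1x + g2x + g3x, g1y + g2y + g3y)
```

A quick sanity check is to compare against a central finite difference of funkier_z at a few points; the two should agree to several decimal places.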
STEP 2: Implement descent. Use the code below to implement descent. Simply call your gradient function from the previous step and implement the gradient step. Additionally, add comments before each line in the entire minimizer function:
def funkier_minimize(x0, y0, eta):
    x = np.zeros(len(eta) + 1)
    y = np.zeros(len(eta) + 1)
    x[0] = x0
    y[0] = y0
    print(' Using starting point: ', x[0], y[0])
    for i in range(len(eta)):
        if i % 5 == 0:
            print('{0:2d}: x={1:6.3f} y={2:6.3f} z={3:6.3f}'.format(i, x[i], y[i], funkier_z(x[i], y[i])))
        ### Your code here ###
        if abs(x[i+1] - x[i]) < 1e-6:
            return x[:i+2], y[:i+2]
        if abs(x[i+1]) > 100:
            print('Oh no, diverging?')
            return x[:i+2], y[:i+2]
    return x, y
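The gradient step itself could look like the sketch below (it assumes the funkier_grad from Step 1; the stopping checks are the ones already given in the template, and each line carries the requested comment):

```python
import numpy as np

def funkier_grad(x, y):
    # partial derivatives of the three Gaussian bumps (see Step 1)
    gx = (-2*x * np.exp(-x**2 - y**2)
          + 2*(x - 1) * np.exp(-(x - 1)**2 - (y - 1)**2)
          + 1.4*(x + 1) * np.exp(-(x + 1)**2 - (y + 1)**2))
    gy = (-2*y * np.exp(-x**2 - y**2)
          + 2*(y - 1) * np.exp(-(x - 1)**2 - (y - 1)**2)
          + 1.4*(y + 1) * np.exp(-(x + 1)**2 - (y + 1)**2))
    return gx, gy

def funkier_minimize(x0, y0, eta):
    # allocate one extra slot so x[i+1] exists on the last iteration
    x = np.zeros(len(eta) + 1)
    y = np.zeros(len(eta) + 1)
    # store the starting point in the first slot
    x[0] = x0
    y[0] = y0
    # take one step per entry of eta
    for i in range(len(eta)):
        # evaluate the gradient at the current point
        gx, gy = funkier_grad(x[i], y[i])
        # gradient step: move downhill by eta[i] times the gradient
        x[i+1] = x[i] - eta[i] * gx
        y[i+1] = y[i] - eta[i] * gy
        # stop early if the x-coordinate has essentially converged
        if abs(x[i+1] - x[i]) < 1e-6:
            return x[:i+2], y[:i+2]
        # bail out if the iterates blow up (step size too large)
        if abs(x[i+1]) > 100:
            return x[:i+2], y[:i+2]
    # return the full trajectory
    return x, y
```

With a fixed step size this should steadily decrease funkier_z along the trajectory; the per-iteration print statements from the template are omitted here for brevity.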
STEP 3 PLOTTING: You are given the max number of iterations and step size, as well as a function to 3D plot the trajectory of gradient descent:
max_iter = 30
eta = 0.1 * np.ones(max_iter)
def plot_3D(xs, ys, zs):
    fig = plt.figure(figsize=(12,10))
    ax = fig.add_subplot(projection='3d')
    ax.set_xlabel('X')
    ax.set_ylabel('Y')
    ax.set_zlabel('Z')
    elev = ax.elev
    azim = ax.azim
    ax.view_init(elev=elev, azim=azim)
    _ = ax.plot_surface(X, Y, Z, cmap=cm.coolwarm, alpha=0.5)
    ax.plot(xs, ys, zs, color='orange', markerfacecolor='black', markeredgecolor='k', marker='o', markersize=5)
Using descent to find the optimum:
x_opt, y_opt = funkier_minimize(0.05, 0.05, eta)
z_opt = funkier_z(x_opt, y_opt)
plot_3D(x_opt, y_opt, z_opt)
Now, TRY IT ON YOUR OWN: starting at (x, y) = (0.05, 0.05) and using the same eta, run gradient descent and plot the results:
# Your code here
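One possible completion is below. In the notebook you would only need the last few lines, since funkier_z, funkier_minimize, and plot_3D already exist; they are repeated here (with our sketched gradient from the earlier steps) only so the cell runs standalone, and the plot call is left commented out so the snippet works without a display:

```python
import numpy as np

# -- repeats of earlier cells, included only for self-containment --
def funkier_z(X, Y):
    return (np.exp(-X**2 - Y**2)
            - np.exp(-(X - 1)**2 - (Y - 1)**2)
            - 0.7*np.exp(-(X + 1)**2 - (Y + 1)**2))

def funkier_grad(x, y):
    # analytic partials of the three Gaussian bumps (Step 1 sketch)
    gx = (-2*x*np.exp(-x**2 - y**2)
          + 2*(x - 1)*np.exp(-(x - 1)**2 - (y - 1)**2)
          + 1.4*(x + 1)*np.exp(-(x + 1)**2 - (y + 1)**2))
    gy = (-2*y*np.exp(-x**2 - y**2)
          + 2*(y - 1)*np.exp(-(x - 1)**2 - (y - 1)**2)
          + 1.4*(y + 1)*np.exp(-(x + 1)**2 - (y + 1)**2))
    return gx, gy

def funkier_minimize(x0, y0, eta):
    x = np.zeros(len(eta) + 1)
    y = np.zeros(len(eta) + 1)
    x[0], y[0] = x0, y0
    for i in range(len(eta)):
        # gradient step (Step 2), plus the template's stopping checks
        gx, gy = funkier_grad(x[i], y[i])
        x[i+1] = x[i] - eta[i]*gx
        y[i+1] = y[i] - eta[i]*gy
        if abs(x[i+1] - x[i]) < 1e-6 or abs(x[i+1]) > 100:
            return x[:i+2], y[:i+2]
    return x, y

# -- the actual answer: run descent from (0.05, 0.05) with the same eta --
max_iter = 30
eta = 0.1 * np.ones(max_iter)
x_opt, y_opt = funkier_minimize(0.05, 0.05, eta)
z_opt = funkier_z(x_opt, y_opt)
# plot_3D(x_opt, y_opt, z_opt)  # in the notebook, overlays the path on the surface
```

Starting so close to the peak at (0, 0), the trajectory should slide off toward the valley around (1, 1), with z_opt decreasing along the way.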