
Question



Need help with the following: see log_loss and gradient at the end for reference

Part Five: Weight Update of Gradient Ascent [Graded]

Write code below to implement the weight update of gradient descent on the log_loss function (descending the negative log likelihood is equivalent to gradient ascent on the log likelihood, hence the part's title). Hint: use the gradient and log_loss functions from above. Please use a constant learning rate throughout (i.e. do not decrease the learning rate).
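
Concretely, each iteration applies the constant-step updates

$$\mathbf{w} \leftarrow \mathbf{w} - \alpha \,\frac{\partial\, \mathrm{NLL}(X, y, \mathbf{w}, b)}{\partial \mathbf{w}}, \qquad b \leftarrow b - \alpha \,\frac{\partial\, \mathrm{NLL}(X, y, \mathbf{w}, b)}{\partial b},$$

where $\alpha$ is the learning rate passed in as alpha.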

def logistic_regression(X, y, max_iter, alpha):
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    losses = np.zeros(max_iter)

    for step in range(max_iter):
        # YOUR CODE HERE
        pass
    return w, b, losses
 
weight, b, losses = logistic_regression(features, labels, 1000, 1e-04)
plot(losses)
xlabel('iterations')
ylabel('log_loss')
# your loss should go down :-)
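
One plausible completion of the loop (a sketch, assuming the gradient and log_loss functions reproduced at the end of this page): take a constant-size gradient step against the NLL each iteration and record the loss after the update, which is what test_logistic_regression3 below checks.

import numpy as np

def logistic_regression(X, y, max_iter, alpha):
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    losses = np.zeros(max_iter)

    for step in range(max_iter):
        wgrad, bgrad = gradient(X, y, w, b)  # dNLL/dw and dNLL/db at the current (w, b)
        w = w - alpha * wgrad                # constant learning rate, never decayed
        b = b - alpha * bgrad
        losses[step] = log_loss(X, y, w, b)  # loss recorded after the update
    return w, b, losses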

# The code must pass the following tests:

def test_logistic_regression1():
    XUnit = np.array([[-1,1],[-1,0],[0,-1],[-1,2],[1,-2],[1,-1],[1,0],[0,1],[1,-2],[-1,2]])
    YUnit = np.hstack((np.ones(5), -np.ones(5)))

    w1, b1, _ = logistic_regression(XUnit, YUnit, 30000, 5e-5)
    w2, b2, _ = logistic_regression_grader(XUnit, YUnit, 30000, 5e-5)
    return (np.linalg.norm(w1 - w2) < 1e-5) and (np.linalg.norm(b1 - b2) < 1e-5)
 
def test_logistic_regression2():
    X = np.vstack((np.random.randn(50, 5), np.random.randn(50, 5) + 2))
    Y = np.hstack((np.ones(50), -np.ones(50)))
    max_iter = 300
    alpha = 1e-5
    w1, b1, _ = logistic_regression(X, Y, max_iter, alpha)
    w2, b2, _ = logistic_regression_grader(X, Y, max_iter, alpha)
    return (np.linalg.norm(w1 - w2) < 1e-5) and (np.linalg.norm(b1 - b2) < 1e-5)

def test_logistic_regression3():  # check if losses match predictions
    X = np.vstack((np.random.randn(50, 5), np.random.randn(50, 5) + 2))
    Y = np.hstack((np.ones(50), -np.ones(50)))
    max_iter = 30
    alpha = 1e-5
    w1, b1, losses1 = logistic_regression(X, Y, max_iter, alpha)
    return np.abs(log_loss(X, Y, w1, b1) - losses1[-1]) < 1e-09
 
def test_logistic_regression4():  # check if loss decreases
    X = np.vstack((np.random.randn(50, 5), np.random.randn(50, 5) + 2))
    Y = np.hstack((np.ones(50), -np.ones(50)))
    max_iter = 30
    alpha = 1e-5
    w1, b1, losses1 = logistic_regression(X, Y, max_iter, alpha)
    return losses1[-1] < losses1[0]  # the final loss should be below the initial loss
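
runtest is a helper supplied by the course environment and is not shown on this page; a minimal hypothetical stand-in, if you want to run the tests locally, could be:

def runtest(test, name):
    # Hypothetical stand-in for the course-provided runtest helper:
    # run the test function and report whether it returned True.
    print('%s: %s' % (name, 'passed' if test() else 'failed'))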
 
runtest(test_logistic_regression1, 'test_logistic_regression1')
runtest(test_logistic_regression2, 'test_logistic_regression2')
runtest(test_logistic_regression3, 'test_logistic_regression3')
runtest(test_logistic_regression4, 'test_logistic_regression4')

# Reference: the log_loss and gradient functions below are already solved

def log_loss(X, y, w, b=0):
    # Input:
    # X: nxd matrix
    # y: n-dimensional vector with labels (+1 or -1)
    # w: d-dimensional vector
    # Output:
    # a scalar
    assert np.sum(np.abs(y)) == len(y)  # check if all labels in y are either +1 or -1
    predicted = y_pred(X, w, b)
    nll = -(1/2) * np.sum((y + 1) * np.log(predicted) - (y - 1) * np.log(1 - predicted))
    return nll
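
log_loss calls y_pred, and the gradient code below calls sigmoid; neither is defined on this page. Plausible definitions, consistent with the formulas used here (an assumption, since the originals are not shown):

import numpy as np

def sigmoid(z):
    # logistic function: sigma(z) = 1 / (1 + exp(-z))
    return 1.0 / (1.0 + np.exp(-z))

def y_pred(X, w, b=0):
    # P(y = +1 | x_i) = sigma(w^T x_i + b) for each row x_i of X
    return sigmoid(X.dot(w) + b)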

Part Four: Compute Gradient [Graded]

Now, verify that the gradient of the log-loss with respect to the weight vector $\mathbf{w}$ and the bias $b$ is:

$$\frac{\partial\, \mathrm{NLL}(X, y, \mathbf{w}, b)}{\partial \mathbf{w}} = -\sum_{i=1}^{n} y_i \,\sigma\!\left(-y_i(\mathbf{w}^\top \mathbf{x}_i + b)\right) \mathbf{x}_i,$$

$$\frac{\partial\, \mathrm{NLL}(X, y, \mathbf{w}, b)}{\partial b} = -\sum_{i=1}^{n} y_i \,\sigma\!\left(-y_i(\mathbf{w}^\top \mathbf{x}_i + b)\right).$$

Implement the function gradient which returns the first derivative with respect to w, b for a given X, y, w, b.

Hint: remember that you derived earlier that $\sigma'(z) = \sigma(z)\left(1 - \sigma(z)\right)$.
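
To sketch the verification: the log_loss above is equivalent to

$$\mathrm{NLL}(X, y, \mathbf{w}, b) = \sum_{i=1}^{n} \log\!\left(1 + e^{-y_i(\mathbf{w}^\top \mathbf{x}_i + b)}\right),$$

and with $z_i = y_i(\mathbf{w}^\top \mathbf{x}_i + b)$,

$$\frac{\partial}{\partial \mathbf{w}} \log\!\left(1 + e^{-z_i}\right) = \frac{-e^{-z_i}}{1 + e^{-z_i}}\, y_i \mathbf{x}_i = -\,\sigma(-z_i)\, y_i \mathbf{x}_i,$$

since $\sigma(-z) = e^{-z}/(1 + e^{-z})$. Summing over $i$ gives the formula for $\partial \mathrm{NLL}/\partial \mathbf{w}$; the $b$ case is identical with $\mathbf{x}_i$ replaced by $1$.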

def gradient(X, y, w, b):
    # Input:
    # X: nxd matrix
    # y: n-dimensional vector with labels (+1 or -1)
    # w: d-dimensional vector
    # b: a scalar bias term
    # Output:
    # wgrad: d-dimensional vector with gradient
    # bgrad: a scalar with gradient

    n, d = X.shape
    # sigma(-y_i (w^T x_i + b)) for every example, as an n-vector
    A = sigmoid(-y * (X.dot(w) + b))
    # dNLL/dw = -sum_i y_i * A_i * x_i  (no 1/n factor, matching the formula above)
    wgrad = -X.T.dot(y * A)
    # dNLL/db = -sum_i y_i * A_i
    bgrad = -np.sum(y * A)
    return wgrad, bgrad
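
As a quick sanity check (a sketch, not part of the graded tests), the analytic gradient can be compared against central finite differences of log_loss on random data:

import numpy as np

def numeric_gradient_check(eps=1e-6):
    # Compare gradient(X, y, w, b) against central finite differences of log_loss.
    rng = np.random.RandomState(0)
    X = rng.randn(20, 3)
    y = np.sign(rng.randn(20))  # random +1/-1 labels
    w = rng.randn(3)
    b = 0.5

    wgrad, bgrad = gradient(X, y, w, b)
    for k in range(len(w)):
        e = np.zeros(len(w))
        e[k] = eps
        approx = (log_loss(X, y, w + e, b) - log_loss(X, y, w - e, b)) / (2 * eps)
        assert abs(wgrad[k] - approx) < 1e-4
    approx_b = (log_loss(X, y, w, b + eps) - log_loss(X, y, w, b - eps)) / (2 * eps)
    assert abs(bgrad - approx_b) < 1e-4
    print('gradient matches finite differences')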
