
Question


Please solve this using Python. Please show the code and explain!


Question 3. Practice with logistic regression

Let's first load the textbook's implementation of logistic regression with gradient descent.

In [6]:

    class LogisticRegressionGD(object):
        """Logistic Regression Classifier using gradient descent.

        Parameters
        ----------
        eta : float
            Learning rate (between 0.0 and 1.0)
        n_iter : int
            Passes over the training dataset.
        random_state : int
            Random number generator seed for random weight initialization.

        Attributes
        ----------
        w_ : 1d-array
            Weights after fitting.
        cost_ : list
            Logistic cost function value in each epoch.
        """
        def __init__(self, eta=0.05, n_iter=100, random_state=1):
            self.eta = eta
            self.n_iter = n_iter
            self.random_state = random_state

        def fit(self, X, y):
            """Fit training data.

            Parameters
            ----------
            X : {array-like}, shape = [n_examples, n_features]
                Training vectors, where n_examples is the number of
                examples and n_features is the number of features.
            y : array-like, shape = [n_examples]
                Target values.

            Returns
            -------
            self : object
            """
            rgen = np.random.RandomState(self.random_state)
            self.w_ = rgen.normal(loc=0.0, scale=0.01, size=1 + X.shape[1])
            self.cost_ = []

            for i in range(self.n_iter):
                net_input = self.net_input(X)
                output = self.activation(net_input)
                errors = (y - output)
                self.w_[1:] += self.eta * X.T.dot(errors)
                self.w_[0] += self.eta * errors.sum()
                # note that we compute the logistic cost now
                # instead of the sum of squared errors cost
                cost = (-y.dot(np.log(output)) -
                        ((1 - y).dot(np.log(1 - output))))
                self.cost_.append(cost)
            return self

        def net_input(self, X):
            """Calculate net input"""
            return np.dot(X, self.w_[1:]) + self.w_[0]

        def activation(self, z):
            """Compute logistic sigmoid activation"""
            return 1. / (1. + np.exp(-np.clip(z, -250, 250)))

        def predict(self, X):
            """Return class label after unit step"""
            return np.where(self.net_input(X) >= 0.0, 1, 0)
            # equivalent to:
            # return np.where(self.activation(self.net_input(X)) >= 0.5, 1, 0)

Below you can see the first 3 data points of the data set, all labeled as 'setosa'. Let's set the numerical value for 'setosa' to 1 (i.e. y = 1).

In [5]: X[0:3]
Out[5]: array([[5.1, 1.4],
               [4.9, 1.4]])

Suppose the initial weights of the logistic neuron are w0 = 0.1, w1 = -0.2, w2 = 0.1.

(i) Write the weights after processing data points 0, 1, 2 with η = 0.1, and show your calculations. (This is similar to the previous assignment, only done now for the logistic neuron.)

(ii) Using LogisticRegressionGD, check whether the data set you constructed in Question 2 also forces logistic regression to fail. You can experiment with the number of iterations and the learning rate η.

(iii) (optional) If logistic regression does not fail for your data set, can you construct another (linearly separable) data set which causes it to fail?

(Please insert cells below for your answers. Clearly identify the part of the question you answer.)
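For part (i), here is a minimal sketch of the single-example (stochastic) update rule of the logistic neuron, applied with the given initial weights w0 = 0.1, w1 = -0.2, w2 = 0.1 and η = 0.1. The helper names `sigmoid` and `online_update` are illustrative, not from the textbook, and only the two setosa points visible in the `X[0:3]` output above are processed; the third point would be handled the same way.

```python
import numpy as np

def sigmoid(z):
    """Logistic sigmoid, clipped for numerical stability."""
    return 1.0 / (1.0 + np.exp(-np.clip(z, -250, 250)))

def online_update(w, x, y, eta):
    """One stochastic-gradient step of the logistic neuron.

    w[0] is the bias; w[1:] are the feature weights.
    """
    z = w[0] + np.dot(w[1:], x)   # net input
    output = sigmoid(z)           # sigmoid activation
    error = y - output            # (target - output)
    w = w.copy()
    w[0] += eta * error           # bias update
    w[1:] += eta * error * x      # feature-weight update
    return w

# initial weights w0=0.1, w1=-0.2, w2=0.1; setosa -> y = 1
w = np.array([0.1, -0.2, 0.1])
# the two data points visible in the transcribed X[0:3] output
for x in [np.array([5.1, 1.4]), np.array([4.9, 1.4])]:
    w = online_update(w, x, y=1, eta=0.1)
    print(w)
# after point 0 the weights are roughly [0.1686, 0.1497, 0.1960]
```

To show the calculation by hand for point 0: z = 0.1 - 0.2·5.1 + 0.1·1.4 = -0.78, sigmoid(-0.78) ≈ 0.3143, so the error is 1 - 0.3143 = 0.6857 and each weight moves by η·error·(its input).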
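For parts (ii) and (iii), here is a sketch of how one might check whether batch gradient descent converges on a linearly separable data set. The data set below is a made-up stand-in (the actual set from Question 2 is not shown in the question); the training loop mirrors the batch update inside `LogisticRegressionGD.fit` so the snippet is self-contained.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-np.clip(z, -250, 250)))

# a small linearly separable 2-D data set (hypothetical stand-in
# for the data set constructed in Question 2)
X = np.array([[1.0, 1.0], [2.0, 1.5], [1.5, 2.0],        # class 1
              [-1.0, -1.0], [-2.0, -1.5], [-1.5, -2.0]])  # class 0
y = np.array([1, 1, 1, 0, 0, 0])

# batch gradient descent, mirroring LogisticRegressionGD.fit
eta, n_iter = 0.1, 200
rgen = np.random.RandomState(1)
w = rgen.normal(loc=0.0, scale=0.01, size=1 + X.shape[1])
cost_ = []
for _ in range(n_iter):
    output = sigmoid(np.dot(X, w[1:]) + w[0])
    errors = y - output
    w[1:] += eta * X.T.dot(errors)
    w[0] += eta * errors.sum()
    cost_.append(-y.dot(np.log(output)) - (1 - y).dot(np.log(1 - output)))

pred = np.where(np.dot(X, w[1:]) + w[0] >= 0.0, 1, 0)
print("training accuracy:", (pred == y).mean())
print("cost decreased:", cost_[-1] < cost_[0])
```

If logistic regression "fails" on a particular data set, you would see the cost stall (or diverge) and the training accuracy stay below 1.0; experimenting with `eta` and `n_iter`, as the question suggests, changes this behavior.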

