
Question


Machine Learning (Python)

I need help with:

1.) Writing a ridge regression based on the stochastic gradient descent code below, and writing the objective function for ridge regression.

2.) Implementing least-squares linear regression (LSRL), ridge regression, lasso regression, and elastic net regression in Python using its existing libraries.
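For question 2.), scikit-learn already provides all four estimators. Below is a minimal sketch; it uses a synthetic dataset from `make_regression` (since `load_boston` was removed in recent scikit-learn releases), and the `alpha` and `l1_ratio` values are illustrative choices, not tuned:

```python
# Sketch: LSRL, Ridge, Lasso, and Elastic Net via scikit-learn.
# The synthetic data and the alpha / l1_ratio values are illustrative only.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import LinearRegression, Ridge, Lasso, ElasticNet
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

X, y = make_regression(n_samples=500, n_features=13, noise=10.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

models = {
    "LSRL": LinearRegression(),               # plain least squares
    "Ridge": Ridge(alpha=1.0),                # L2 penalty
    "Lasso": Lasso(alpha=0.1),                # L1 penalty
    "ElasticNet": ElasticNet(alpha=0.1, l1_ratio=0.5),  # mix of L1 and L2
}
for name, model in models.items():
    model.fit(X_train, y_train)
    mse = mean_squared_error(y_test, model.predict(X_test))
    print(f"{name}: test MSE = {mse:.2f}")
```

In practice the penalty strengths would be chosen by cross-validation (e.g. `RidgeCV`, `LassoCV`, `ElasticNetCV`) rather than fixed by hand.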

Here is the code for loading the dataset and testing stochastic gradient descent:

  • We use the Boston housing dataset, loaded here through scikit-learn's load_boston (the same data is also distributed as a CSV file).
  • The sgd() function implements mini-batch stochastic gradient descent with the hyperparameter values specified in the code.
  • In the last step we plot a comparison between the actual and predicted results.
import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split
from sklearn.datasets import load_boston  # note: removed in scikit-learn >= 1.2
from sklearn import preprocessing

# Load and split the Boston housing data, then standardize the features.
X = load_boston().data
Y = load_boston().target
X_train, X_test, y_train, y_test = train_test_split(X, Y, test_size=0.3, random_state=0)
scaler = preprocessing.StandardScaler()
X_train = scaler.fit_transform(X_train)
X_test = scaler.transform(X_test)

# Keep the target in a 'cost' column so sgd() can sample features and labels together.
X_train = pd.DataFrame(data=X_train, columns=load_boston().feature_names)
X_train['cost'] = list(y_train)
X_test = pd.DataFrame(data=X_test, columns=load_boston().feature_names)
X_test['cost'] = list(y_test)

def sgd(data, learning_rate=0.3, n_epochs=1000, k=30):
    """Mini-batch SGD for least-squares regression on a DataFrame whose
    first 13 columns are features and whose last column ('cost') is the target."""
    w = np.random.randn(1, 13)
    b = np.random.randn(1, 1)
    for epoch in range(n_epochs):
        batch = data.sample(k)            # draw a random mini-batch
        X_tr = batch.iloc[:, 0:13].values
        y_tr = batch.iloc[:, -1].values
        for i in range(k):
            # Per-sample gradient of the squared error, scaled by batch size.
            error = y_tr[i] - np.dot(X_tr[i], w.T) - b
            Lw = (-2 / k * X_tr[i]) * error
            Lb = (-2 / k) * error
            w = w - learning_rate * Lw
            b = b - learning_rate * Lb
        learning_rate = learning_rate / 1.03  # decay the step size each epoch
    return w, b

def predict_using_sgd(data, w, b):
    X_vals = data.iloc[:, 0:13].values
    # np.asscalar was removed from NumPy; .item() extracts the scalar instead.
    return np.array([(np.dot(w, X_vals[i]) + b).item() for i in range(len(X_vals))])

w, b = sgd(X_train)
y_predicted_sgd = predict_using_sgd(X_test, w, b)

plt.figure(figsize=(30, 6))
plt.plot(y_test, label='actual_result')
plt.plot(y_predicted_sgd, label='Predicted_result')
plt.legend(prop={'size': 15})
plt.show()
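For question 1.), ridge regression adds an L2 penalty to the mean-squared-error objective, J(w, b) = (1/n) Σᵢ (yᵢ − w·xᵢ − b)² + λ‖w‖², so the only change to the SGD update is an extra 2λw term in the weight gradient. Below is a minimal sketch, written with vectorized NumPy mini-batches rather than the per-sample loop above; lam, learning_rate, n_epochs, and k are illustrative values, not tuned:

```python
# Sketch: ridge regression via mini-batch SGD. Same structure as sgd()
# above, plus the L2 penalty; hyperparameter values are illustrative.
import numpy as np

def ridge_objective(X, y, w, b, lam):
    """Ridge objective: mean squared error plus L2 penalty on the weights."""
    residuals = y - X @ w - b
    return np.mean(residuals ** 2) + lam * np.sum(w ** 2)

def ridge_sgd(X, y, lam=0.1, learning_rate=0.01, n_epochs=200, k=30, seed=0):
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = rng.standard_normal(d)
    b = 0.0
    for epoch in range(n_epochs):
        idx = rng.choice(n, size=k, replace=False)  # sample a mini-batch
        X_b, y_b = X[idx], y[idx]
        residuals = y_b - X_b @ w - b
        # Gradient of the MSE term plus the gradient of lam * ||w||^2.
        grad_w = (-2.0 / k) * X_b.T @ residuals + 2.0 * lam * w
        grad_b = (-2.0 / k) * residuals.sum()   # penalty does not touch b
        w -= learning_rate * grad_w
        b -= learning_rate * grad_b
    return w, b
```

With lam=0 this reduces to ordinary least-squares SGD; larger lam shrinks the weights toward zero, which is the regularizing effect ridge regression is used for.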

