Question
Machine learning question, use JupyterLab, no AI please.
Import these modules:
import sys
from packaging import version
import sklearn
import matplotlib.pyplot as plt
import numpy as np
from sklearn.preprocessing import add_dummy_feature
from sklearn import metrics  # import scikit-learn metrics module for accuracy calculation
from sklearn.model_selection import train_test_split
from sklearn.metrics import confusion_matrix, ConfusionMatrixDisplay
from sklearn.datasets import make_blobs

print("Python version", sys.version_info)
print("Sklearn package", sklearn.__version__)
assert sys.version_info >= (3, 7)
# assert version.parse(sklearn.__version__) >= version.parse("1.0.1")

# rc sizes below are reasonable defaults; the original values were stripped
plt.rc('font', size=12)
plt.rc('axes', labelsize=12, titlesize=12)
plt.rc('legend', fontsize=12)
plt.rc('xtick', labelsize=10)
plt.rc('ytick', labelsize=10)
Questions:
Q: Early Stopping
Modify the gradient descent code for linear regression so that gradient descent stops updating the weights when the change in the loss is too small.
You can stop updating when abs(new_loss - old_loss) is less than a small tolerance threshold, tolerance (e.g. 1e-5).
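The stopping test itself is just a comparison of two consecutive loss values. A minimal sketch (the concrete loss values and the threshold below are made up for illustration):

```python
tolerance = 1e-5           # example threshold, chosen for illustration
old_loss = 0.123456        # loss after the previous epoch (hypothetical)
new_loss = 0.123451        # loss after the current epoch: barely improved
# The loop should break when the absolute change falls below the threshold.
stop = abs(new_loss - old_loss) < tolerance
```

Here the loss changed by only 5e-6, which is below the 1e-5 threshold, so `stop` is true and the update loop should terminate.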
def f(x, w):
    return (x.T @ w).item()

def J_vectorized(X, y, w):
    m = X.shape[0]
    V = X @ w - y
    sum_squared_error = V.T @ V
    return (sum_squared_error / m).item()

def J_delta_vectorized(X, y, w):
    m = X.shape[0]
    sum_w = X.T @ (X @ w - y)
    return sum_w / m

def simple_gradient_vectorized(X, y, theta, n_epochs, eta):
    theta_path = [theta]
    for epoch in range(n_epochs):
        gradients = J_delta_vectorized(X, y, theta)
        theta = theta - eta * gradients
        theta_path.append(theta)
    return theta_path
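One way to sanity-check the vectorized loss and gradient above is to evaluate them at the closed-form least-squares solution, where both should be (near) zero. This sketch re-declares the two functions so it runs standalone; the three data points are made up:

```python
import numpy as np

def J_vectorized(X, y, w):
    # mean squared error computed with a single matrix product
    m = X.shape[0]
    V = X @ w - y
    return ((V.T @ V) / m).item()

def J_delta_vectorized(X, y, w):
    # gradient of the MSE above: X^T (Xw - y) / m
    m = X.shape[0]
    return X.T @ (X @ w - y) / m

# Tiny hypothetical dataset: three points lying exactly on y = 1 + 2x,
# with the dummy feature (column of ones) already added.
X = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]])
y = np.array([[1.0], [3.0], [5.0]])

# Closed-form least-squares fit; gradient descent should converge toward this.
w_star = np.linalg.lstsq(X, y, rcond=None)[0]
```

Since the points are noise-free, `w_star` recovers the intercept 1 and slope 2, the loss at `w_star` is essentially zero, and the gradient vanishes there.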
np.random.seed(42)  # to make this code example reproducible
m = 100  # number of instances
X = 2 * np.random.rand(m, 1)  # column vector
y = 4 + 3 * X + np.random.randn(m, 1)  # column vector
X_orig = X
X = add_dummy_feature(X)  # add x0 = 1 to each instance
print("Before adding a 1: X =", X_orig)
print("After adding a 1: X =", X)
plt.plot(X_orig, y, "b.")
plt.grid()
plt.xlim(0, 2)
plt.ylim(0, 15)
## Without early stopping
eta = 0.1  # learning rate
n_epochs = 1000
m = len(X)  # number of instances
np.random.seed(42)
theta = np.random.randn(2, 1)  # randomly initialized model parameters
theta_path = simple_gradient_vectorized(X, y, theta, n_epochs, eta)
print("Loss at final theta", J_vectorized(X, y, theta_path[-1]))
print("Number of updates made", len(theta_path))
def simple_gradient_vectorized_early_stopping(X, y, theta, n_epochs, eta, tolerance):
    theta_path = [theta]
    current_loss = J_vectorized(X, y, theta)  # initial loss
    # Iterate over n_epochs, update the gradient, calculate the new loss, and break
    # from the loop if the loss does not improve compared to the previous iteration:
    # if np.abs(current_loss - prev_loss) < tolerance: ...
    return theta_path
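For reference, here is one possible way to fill in that loop — a sketch, not necessarily the expected solution. It is self-contained: the loss and gradient functions are re-declared, and the data generation mirrors the setup above but uses NumPy's `Generator` API:

```python
import numpy as np

def J_vectorized(X, y, w):
    m = X.shape[0]
    V = X @ w - y
    return ((V.T @ V) / m).item()

def J_delta_vectorized(X, y, w):
    m = X.shape[0]
    return X.T @ (X @ w - y) / m

def gd_early_stopping(X, y, theta, n_epochs, eta, tolerance):
    theta_path = [theta]
    current_loss = J_vectorized(X, y, theta)  # loss before any update
    for epoch in range(n_epochs):
        theta = theta - eta * J_delta_vectorized(X, y, theta)
        theta_path.append(theta)
        new_loss = J_vectorized(X, y, theta)
        if np.abs(new_loss - current_loss) < tolerance:
            break  # loss barely changed between epochs: stop early
        current_loss = new_loss
    return theta_path

# Synthetic linear data, matching the setup above (y = 4 + 3x + noise).
rng = np.random.default_rng(42)
X1 = 2 * rng.random((100, 1))
X = np.hstack([np.ones((100, 1)), X1])  # add dummy feature x0 = 1
y = 4 + 3 * X1 + rng.standard_normal((100, 1))

theta0 = rng.standard_normal((2, 1))
path = gd_early_stopping(X, y, theta0, n_epochs=1000, eta=0.1, tolerance=1e-5)
```

With these settings the loop breaks well before the full 1000 epochs, so `len(path)` is noticeably smaller than the 1001 entries the no-early-stopping version would produce.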
The code should stop after around ... iterations.
eta = 0.1  # learning rate
n_epochs = 1000
m = len(X)  # number of instances
np.random.seed(42)
theta = np.random.randn(2, 1)  # randomly initialized model parameters
theta_path = simple_gradient_vectorized_early_stopping(X, y, theta, n_epochs, eta, 1e-5)
print("Loss at final theta", J_vectorized(X, y, theta_path[-1]))
print("Number of updates made", len(theta_path))