### Task 2.2 - Backward propagation
In this task, you will start with random weights for `w0` and `w1`, then iteratively perform forward passes and backward propagation until the model converges on a solution.
Submit your values of `w0`, `w1`, and the final loss onto Coursemology. Your loss value should be below the threshold stated in the task.
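
The training template below calls the `forward_pass(x, w0, w1, torch.relu)` helper from the earlier task and reuses the training data `x`, `y` set up there. For reference only, a minimal one-hidden-layer sketch of such a helper might look like this (the body is an assumption, not the assignment's reference implementation):

```python
import torch

def forward_pass(x, w0, w1, activation):
    # One-hidden-layer perceptron: input -> hidden (with activation) -> output.
    # Illustrative sketch only; use your own implementation from the earlier task.
    hidden = activation(x @ w0)   # (N, 1) @ (1, H) -> (N, H)
    return hidden @ w1            # (N, H) @ (H, 1) -> (N, 1)
```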
```python
import torch
import matplotlib.pyplot as plt

# x, y (the training data) and forward_pass are assumed to be defined
# in the earlier cells of the notebook.

torch.manual_seed(0)  # Set seed to some fixed value

# Shapes must match your forward_pass; hidden_size is illustrative
# (abs(x) can be represented exactly with two ReLU units).
hidden_size = 2
w0 = torch.randn(1, hidden_size, requires_grad=True)
w1 = torch.randn(hidden_size, 1, requires_grad=True)

learning_rate = 1e-3  # example value; tune as needed

print('iter', 'loss', sep='\t')
for t in range(1000):  # number of iterations; tune as needed
    # Forward pass: compute predicted y
    y_pred = forward_pass(x, w0, w1, torch.relu)
    loss = torch.mean(torch.square(y - y_pred))
    loss.backward()
    if t % 100 == 0:  # print progress periodically (interval is illustrative)
        print(t, loss.item(), sep='\t')
    with torch.no_grad():
        # Update weights and then reset the gradients to zero
        raise NotImplementedError

print('w0', w0, sep='\n')
print('w1', w1, sep='\n')

y_pred = forward_pass(x, w0, w1, torch.relu)
plt.plot(x, y, linestyle='solid', label='abs(x)')
plt.plot(x, y_pred.detach().numpy(), linestyle='dashed', label='perceptron')
plt.axis('equal')
plt.title('Fit NN on abs function')
plt.legend()
plt.show()
```
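
The `raise NotImplementedError` marks the weight-update step you have to write. As a generic manual gradient-descent sketch (not the assignment's reference solution), the update inside `torch.no_grad()` typically subtracts each weight's gradient scaled by the learning rate and then clears the accumulated gradients:

```python
# Generic manual SGD sketch; would replace the NotImplementedError inside
# the `with torch.no_grad():` block above.
for w in (w0, w1):
    w -= learning_rate * w.grad  # gradient-descent step
    w.grad.zero_()               # clear gradients so they do not accumulate
```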
```python
from numpy import isclose  # assuming NumPy's isclose (atol/rtol keywords)
import torch

# Task 2.2: Submit the values of w0, w1, and the loss value after fitting.
# Note: An acceptable loss value should be less than the threshold stated in the task.
# You should try adjusting the random seed, learning rate, or
# number of iterations to improve your model.
w0 = None    # to be computed
w1 = None    # to be computed
loss = None  # to be computed

w0 = torch.tensor(w0)
w1 = torch.tensor(w1)

x = torch.linspace(-1, 1, 100).reshape(-1, 1)  # evaluation grid; endpoints/resolution are illustrative
y = torch.abs(x)

# IMPORTANT: Your forward pass above has to be correctly implemented.
y_pred = forward_pass(x, w0, w1, torch.relu)
computed_mse_loss = torch.mean(torch.square(y - y_pred)).item()

assert loss < 1.0  # placeholder; the actual acceptable threshold is given in the task
assert isclose(computed_mse_loss, loss, atol=1e-5, rtol=1e-5)  # tolerance values are placeholders
```
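
One way to populate the three submission variables above, assuming the trained tensors `w0`, `w1` and the final `loss` from the training loop are still in scope (this is an illustrative convenience, not part of the original template):

```python
# Illustrative: turn the trained tensors into plain Python values that can be
# pasted into the submission cell (they are converted back with torch.tensor above).
w0 = w0.detach().tolist()   # nested list of trained first-layer weights
w1 = w1.detach().tolist()   # nested list of trained second-layer weights
loss = loss.item()          # final MSE as a Python float
```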