Question:
DEEP LEARNING
# Notice that you don't need any other packages for this midterm
import numpy as np
import pandas as pd
import random
from matplotlib import pyplot as plt
random.seed(...)  # NEVER change this line; this is for grading
# Reading the dataset
data = pd.read_csv('fashion_data.csv')
# The data preprocessing is done for you. Please do NOT edit the cell
# However, you should understand what this code is doing
data = np.array(data)
m, n = data.shape
np.random.shuffle(data)  # shuffle before splitting into dev and training sets
data_dev = data[0:1000].T  # hold out the first 1000 shuffled examples for the dev set
Y_dev = data_dev[0]
X_dev = data_dev[1:n]
X_dev = X_dev / 255.  # normalize pixel values to [0, 1]
data_train = data[1000:m].T
Y_train = data_train[0]
X_train = data_train[1:n]
X_train = X_train / 255.
_, m_train = X_train.shape
Part 1: Building your own neural network
In [ ]:
# define a global variable specifying the number of hidden neurons after the first layer
# not the best practice, but we will do it for this midterm project
num_hidden_neurons = ...
This is the main part of the midterm. You must not change the definitions of the functions. In fact, the comments will help you work through the implementation, and they are all you need.
Initialize the parameters in the neural network
In [ ]:
# Initialize the parameters in the neural network
# Based on the figure above, we need the weight and bias matrices.
# W1, b1 are the matrices for the first layer
# W2, b2 are the matrices for the second layer
# You should think about the sizes of the matrices,
# then initialize the elements in each matrix to random numbers between -0.5 and 0.5
def init_params():
    W1 = # Your code here
    b1 = # Your code here
    W2 = # Your code here
    b2 = # Your code here
    return W1, b1, W2, b2
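One possible way to fill in these blanks (a sketch, not the graded solution — the layer sizes of 784 inputs, 10 hidden neurons, and 10 output classes are assumptions based on 28x28 grayscale images with 10 labels):

```python
import numpy as np

num_hidden_neurons = 10  # assumed hidden-layer width for this sketch

def init_params(n_inputs=784, n_classes=10):
    # n_inputs=784 (28x28 images) and n_classes=10 are assumptions for this dataset
    W1 = np.random.rand(num_hidden_neurons, n_inputs) - 0.5   # entries in [-0.5, 0.5)
    b1 = np.random.rand(num_hidden_neurons, 1) - 0.5
    W2 = np.random.rand(n_classes, num_hidden_neurons) - 0.5
    b2 = np.random.rand(n_classes, 1) - 0.5
    return W1, b1, W2, b2
```

Each bias is a column vector so that it broadcasts across all examples when added to `W.dot(X)`.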
Implement the nonlinearity functions and their derivatives
In [ ]:
# As a starting point, you only need a ReLU function, its derivative, and the softmax function
def ReLU(Z):
    # Your code here
def ReLU_deriv(Z):
    # Your code here
def softmax(Z):
    # Your code here
    return A
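A sketch of these three functions, assuming the notebook's column-per-example layout (classes along rows); the max-subtraction in `softmax` is a standard numerical-stability trick, not something the scaffold requires:

```python
import numpy as np

def ReLU(Z):
    # elementwise max(0, z)
    return np.maximum(Z, 0)

def ReLU_deriv(Z):
    # derivative of ReLU: 1 where Z > 0, else 0
    return (Z > 0).astype(float)

def softmax(Z):
    # subtract the column-wise max before exponentiating for numerical stability
    expZ = np.exp(Z - Z.max(axis=0, keepdims=True))
    A = expZ / expZ.sum(axis=0, keepdims=True)
    return A
```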
Implement the forward propagation function
In [ ]:
# In the forward propagation function, X is the input (the images in vector form), and we pass in all the weights and biases
def forward_prop(W1, b1, W2, b2, X):
    Z1 = # Your code here
    A1 = # Your code here
    Z2 = # Your code here
    A2 = # Your code here
    return Z1, A1, Z2, A2
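The forward pass could look like the following sketch (the `ReLU` and `softmax` helpers are repeated here only so the block is self-contained; in the notebook they come from the previous cell):

```python
import numpy as np

def ReLU(Z):
    return np.maximum(Z, 0)

def softmax(Z):
    expZ = np.exp(Z - Z.max(axis=0, keepdims=True))
    return expZ / expZ.sum(axis=0, keepdims=True)

def forward_prop(W1, b1, W2, b2, X):
    Z1 = W1.dot(X) + b1   # pre-activation of the hidden layer
    A1 = ReLU(Z1)         # hidden-layer activations
    Z2 = W2.dot(A1) + b2  # pre-activation of the output layer
    A2 = softmax(Z2)      # class probabilities, one column per example
    return Z1, A1, Z2, A2
```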
Implement the backward propagation function
In [ ]:
# This one_hot function converts a numeric label into a one-hot vector
def one_hot(Y):
    # Your code here
    return one_hot_Y
# Now perform the backward propagation
# Each step is only one line of code, but there is a lot of calculus behind it
def backward_prop(Z1, A1, Z2, A2, W1, W2, X, Y):
    one_hot_Y = one_hot(Y)
    dZ2 = # Your code here
    dW2 = # Your code here
    db2 = # Your code here
    dZ1 = # Your code here
    dW1 = # Your code here
    db1 = # Your code here
    return dW1, db1, dW2, db2
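One way the calculus works out, as a sketch: with softmax outputs and cross-entropy loss, the output-layer error simplifies to `A2 - one_hot(Y)`, and each earlier gradient follows from the chain rule (the `ReLU_deriv` helper is repeated so the block runs on its own):

```python
import numpy as np

def one_hot(Y):
    # rows = classes, columns = examples, matching the layout of A2
    one_hot_Y = np.zeros((Y.max() + 1, Y.size))
    one_hot_Y[Y, np.arange(Y.size)] = 1
    return one_hot_Y

def ReLU_deriv(Z):
    return (Z > 0).astype(float)

def backward_prop(Z1, A1, Z2, A2, W1, W2, X, Y):
    m = Y.size
    one_hot_Y = one_hot(Y)
    dZ2 = A2 - one_hot_Y                        # softmax + cross-entropy gradient w.r.t. Z2
    dW2 = (1 / m) * dZ2.dot(A1.T)
    db2 = (1 / m) * dZ2.sum(axis=1, keepdims=True)
    dZ1 = W2.T.dot(dZ2) * ReLU_deriv(Z1)        # chain rule back through the hidden layer
    dW1 = (1 / m) * dZ1.dot(X.T)
    db1 = (1 / m) * dZ1.sum(axis=1, keepdims=True)
    return dW1, db1, dW2, db2
```

Averaging by `1 / m` keeps the gradient magnitude independent of the batch size.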
# Finally, we are ready to update the parameters
def update_params(W1, b1, W2, b2, dW1, db1, dW2, db2, alpha):
    W1 = # Your code here
    b1 = # Your code here
    W2 = # Your code here
    b2 = # Your code here
    return W1, b1, W2, b2
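A minimal sketch of the update step, assuming plain (vanilla) gradient descent with learning rate `alpha`:

```python
import numpy as np

def update_params(W1, b1, W2, b2, dW1, db1, dW2, db2, alpha):
    # vanilla gradient-descent step: parameter -= learning_rate * gradient
    W1 = W1 - alpha * dW1
    b1 = b1 - alpha * db1
    W2 = W2 - alpha * dW2
    b2 = b2 - alpha * db2
    return W1, b1, W2, b2
```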
Performing gradient descent
In [ ]:
# Implement the helper functions. We need to convert the softmax output into a numeric label
# This is done through the get_predictions function
def get_predictions(A2):
    # Your code here
# We also want a simple function to compute the accuracy. Notice that "predictions" and Y have the same shape
def get_accuracy(predictions, Y):
    return # Your code here
# Finally, we are ready to implement gradient descent
def gradient_descent(X, Y, alpha, iterations):
    W1, b1, W2, b2 = # Your code here, using the function you have implemented
    for i in range(iterations):
        Z1, A1, Z2, A2 = # Your code here, using the function you have implemented
        dW1, db1, dW2, db2 = # Your code here, using the function you have implemented
        W1, b1, W2, b2 = # Your code here, using the function you have implemented
        if i % 10 == 0:
            print("Iteration:", i)
            predictions = get_predictions(A2)
            print(get_accuracy(predictions, Y))
    return W1, b1, W2, b2
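The two helpers could be sketched as follows; the `gradient_descent` body then just chains `init_params`, `forward_prop`, `backward_prop`, and `update_params` in the order the comments indicate:

```python
import numpy as np

def get_predictions(A2):
    # the predicted class is the row with the highest softmax probability
    return np.argmax(A2, axis=0)

def get_accuracy(predictions, Y):
    # fraction of examples where the predicted label matches the true label
    return np.mean(predictions == Y)
```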
In [ ]:
W1, b1, W2, b2 = gradient_descent(X_train, Y_train, ...)  # pass your learning rate and number of iterations
Validation Set Performance
In [ ]:
def make_predictions(X, W1, b1, W2, b2):
    _, _, _, A2 = forward_prop(W1, b1, W2, b2, X)
    predictions = get_predictions(A2)
    return predictions
In [ ]:
dev_predictions = make_predictions(X_dev, W1, b1, W2, b2)
get_accuracy(dev_predictions, Y_dev)
Exploring some samples
In [ ]:
def test_prediction(index, W1, b1, W2, b2):
    current_image = X_train[:, index, None]
    prediction = make_predictions(X_train[:, index, None], W1, b1, W2, b2)
    label = Y_train[index]
    print("Prediction:", prediction)
    print("Label:", label)