Question
Please only replace the question marks without adding or altering any existing code:

# Define the activation function.
sigma = np.tanh

# Let's use a random initial weight and bias.
W = np.array(...)
b = np.array(...)

# define our feed forward function
def a(a0):
    # Notice the next line is almost the same as previously,
    # except we are using matrix multiplication rather than scalar multiplication
    z = ???
    # Everything else is the same though,
    return sigma(z)

# Next, if a training example is
x = np.array(...)
y = np.array(...)

# Then the cost function is
d = ???  # Vector difference between observed and expected activation
C = ???  # Absolute value squared of the difference.
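A minimal sketch of one way the blanks above could be filled, assuming the vector feed-forward setup the comments describe (weight matrix W, bias vector b, tanh activation, squared-error cost). The numeric values for W, b, x and y below are made up purely for illustration; the original exercise supplies its own.

import numpy as np

# Activation function.
sigma = np.tanh

# Illustrative weight matrix (2 outputs from 3 inputs) and bias vector.
W = np.array([[0.5, -0.2, 0.1],
              [1.0,  0.3, -0.7]])
b = np.array([0.1, -0.4])

# Feed-forward: matrix multiplication instead of scalar multiplication.
def a(a0):
    z = W @ a0 + b      # weighted sum (one possible fill for z = ???)
    return sigma(z)

# A made-up training example.
x = np.array([0.3, 0.6, -0.1])
y = np.array([0.5, 0.2])

# Cost as the squared length of the difference vector.
d = a(x) - y            # difference between observed and expected activation
C = d @ d               # absolute value squared of the difference

Here d @ d is just the squared Euclidean norm |a(x) - y| squared, which matches the comment "Absolute value squared of the difference."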
sigma = np.tanh

# Next define the feedforward equation.
def a(w, b, a0):
    z = w * a0 + b
    return sigma(z)

# This function returns the derivative of the cost function with
# respect to the weight.
def dCdw(w, b, x, y):
    dCda = 2 * (a(w, b, x) - y)       # Derivative of cost with activation
    dadz = 1 / np.cosh(w * x + b)**2  # derivative of activation with weighted sum z
    J = dCda * dadz
    dzdw = x                          # derivative of weighted sum z with weight
    J = J * dzdw
    return J                          # Return the chain rule product.

# This function returns the derivative of the cost function with
# respect to the bias.
# It is very similar to the previous function.
# You should complete this function.
def dCdb(w, b, x, y):
    dCda = 2 * (a(w, b, x) - y)
    dadz = 1 / np.cosh(w * x + b)**2
    # Change the next line to give the derivative of
    # the weighted sum, z, with respect to the bias, b
    dzdb = ???
    return dCda * dadz * dzdb
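For the remaining blank in dCdb, the only change from dCdw is the last factor of the chain rule: with z = w*x + b, the derivative of z with respect to the bias b is 1. A short sketch under that assumption, with a finite-difference check (the sample values for w, b, x, y are made up):

import numpy as np

sigma = np.tanh

def a(w, b, a0):
    z = w * a0 + b
    return sigma(z)

# Derivative of the cost C = (a(w, b, x) - y)**2 with respect to the bias.
def dCdb(w, b, x, y):
    dCda = 2 * (a(w, b, x) - y)       # dC/da
    dadz = 1 / np.cosh(w * x + b)**2  # da/dz, since d(tanh z)/dz = 1/cosh(z)**2
    dzdb = 1                          # dz/db, since z = w*x + b (the ??? line)
    return dCda * dadz * dzdb         # chain rule product dC/db

# Quick numerical check with made-up values.
w, b, x, y = 1.3, -0.1, 0.7, 0.25
h = 1e-6
C = lambda b_: (a(w, b_, x) - y)**2
print(dCdb(w, b, x, y))               # analytic derivative
print((C(b + h) - C(b - h)) / (2*h))  # central difference, should closely agree

The same pattern gives dzdw = x in dCdw, which is why the two functions differ only in that single line.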