Question
Please fill in the question marks:

import numpy as np

# First we set the state of the network.
sigma = np.tanh
w = 1
b = ???

# Then we define the neuron activation.
def a(a0):
    z = w * a0 + b
    return sigma(z)

# Experiment with different values of x below.
x = ???
print(a(x))
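Filled in, the first snippet might look like the sketch below. Only the tanh activation and the structure z = w·a0 + b are fixed by the question; the values chosen for b and x here are placeholders, not the originals.

```python
import numpy as np

# Set the state of the network: tanh activation, weight and bias.
sigma = np.tanh
w = 1
b = 0    # placeholder value; the original is not shown
x = 0.5  # placeholder input; experiment with different values

# The neuron activation: a weighted sum passed through sigma.
def a(a0):
    z = w * a0 + b
    return sigma(z)

print(a(x))
```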
# First define our sigma function.
sigma = np.tanh

# Next define the feedforward equation.
def a(w, b, a0):
    z = w * a0 + b
    return sigma(z)

# The individual cost function is the square of the difference between
# the network output and the training data output.
def C(w, b, x, y):
    return (a(w, b, x) - y) ** 2

# This function returns the derivative of the cost function with
# respect to the weight.
def dCdw(w, b, x, y):
    z = w * x + b
    dCda = 2 * (a(w, b, x) - y)  # Derivative of cost with activation
    dadz = 1 / np.cosh(z) ** 2   # Derivative of activation with weighted sum z
    dzdw = x                     # Derivative of weighted sum z with weight
    return dCda * dadz * dzdw    # Return the chain rule product.

# This function returns the derivative of the cost function with
# respect to the bias.
# It is very similar to the previous function.
# You should complete this function.
def dCdb(w, b, x, y):
    z = ???
    dCda = ???
    dadz = ???
    # Change the next line to give the derivative of
    # the weighted sum, z, with respect to the bias, b.
    dzdb = ???
    return ???

w = ???
b = ???

# We can test on a single data point pair of x and y.
x = ???
y = ???

# Output how the cost would change
# in proportion to a small change in the bias.
print(dCdb(w, b, x, y))
You should get
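A completed version of the exercise might look like the sketch below. The derivative lines follow from the chain rule: for C = (a − y)², dC/da = 2(a − y); for a = tanh z, da/dz = 1/cosh²z; and for z = w·x + b, dz/dw = x and dz/db = 1. The numeric values of w, b, x, and y are placeholders, since the question does not show them, so the printed number is illustrative rather than the expected answer.

```python
import numpy as np

# Activation function and feedforward equation.
sigma = np.tanh

def a(w, b, a0):
    z = w * a0 + b
    return sigma(z)

# Cost: squared difference between network output and training output.
def C(w, b, x, y):
    return (a(w, b, x) - y) ** 2

# Derivative of the cost with respect to the weight (chain rule).
def dCdw(w, b, x, y):
    z = w * x + b
    dCda = 2 * (a(w, b, x) - y)  # dC/da for C = (a - y)^2
    dadz = 1 / np.cosh(z) ** 2   # d(tanh z)/dz = 1/cosh^2(z)
    dzdw = x                     # dz/dw for z = w*x + b
    return dCda * dadz * dzdw

# Derivative of the cost with respect to the bias.
def dCdb(w, b, x, y):
    z = w * x + b
    dCda = 2 * (a(w, b, x) - y)
    dadz = 1 / np.cosh(z) ** 2
    dzdb = 1                     # dz/db for z = w*x + b
    return dCda * dadz * dzdb

# Placeholder values (the originals are not shown in the question).
w, b = 2.3, -1.2
x, y = 0.0, 1.0
print(dCdb(w, b, x, y))
```

A quick sanity check is to compare each analytic derivative against a central finite difference of C; the two should agree to several decimal places.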