
Question



Back-propagation Algorithm

2. Back-propagation Algorithm. Once we set up the architecture of our (feedforward) neural network, our goal will be to find weight parameters that minimize our loss function. We will use the stochastic gradient descent algorithm (introduced in Lecture 4 and revisited in Lecture 5) to carry out the optimization. This involves computing the gradient of the loss function with respect to the weight parameters. Since the loss function is a long chain of compositions of activation functions, with the weight parameters entering at different stages, we break the computation of the gradient into pieces via the chain rule; this way of computing the gradient is called the back-propagation algorithm.

In the following problems, we will explore the main step of stochastic gradient descent for training the simple neural network from the video. This network is made up of L hidden layers, but each layer consists of only one unit, and each unit has activation function f. As usual, x is the input and z_i is the weighted combination of the inputs to the i-th hidden layer. In this one-dimensional case, the weighted combination reduces to a product:

z_1 = x w_1
z_i = f_{i-1} w_i   for i = 2, ..., L,   where f_{i-1} = f(z_{i-1})
We will use the squared loss between the network output f_L and the true label y:

\mathcal{L}(y, f_L) = (y - f_L)^2

Let \delta_i = \partial \mathcal{L} / \partial z_i. In this problem, we derive a recurrence relation between \delta_i and \delta_{i+1}. Assume that f is the hyperbolic tangent function:
f(x) = tanh(x)
f'(x) = 1 - tanh^2(x)
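A sketch of how the recurrence follows from the chain rule and the definitions above (for 1 <= i < L, using z_{i+1} = f(z_i) w_{i+1} and f_i = f(z_i)):

```latex
\delta_i
  = \frac{\partial \mathcal{L}}{\partial z_i}
  = \frac{\partial \mathcal{L}}{\partial z_{i+1}}
    \cdot \frac{\partial z_{i+1}}{\partial z_i}
  = \delta_{i+1} \cdot \frac{\partial}{\partial z_i}\bigl(f(z_i)\,w_{i+1}\bigr)
  = \delta_{i+1}\, w_{i+1}\, f'(z_i)
  = \bigl(1 - f_i^2\bigr)\, w_{i+1}\, \delta_{i+1}
```

The last step substitutes f'(z_i) = 1 - tanh^2(z_i) = 1 - f_i^2.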
Which of the following options is the correct expression for \delta_1 in terms of \delta_2?

\delta_1 = (1 - f_1^2) w_2 \delta_2
\delta_1 = (1 - f_1^2) w_1 \delta_2
\delta_1 = (1 - f_2^2) w_2 \delta_2
\delta_2 = (1 - f_1^2) w_2 \delta_1
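The recurrence can be checked numerically. The sketch below (not part of the original problem; the specific values of x, y, and the weights are arbitrary assumptions for illustration) implements the forward pass and the backward recurrence \delta_i = (1 - f_i^2) w_{i+1} \delta_{i+1} for the one-unit-per-layer tanh network, then compares \delta_1 against a finite-difference estimate of the gradient:

```python
import math

def forward(x, w):
    """Forward pass: z_1 = x*w_1, z_i = f_{i-1}*w_i, f_i = tanh(z_i)."""
    z, f = [], []
    a = x
    for wi in w:
        zi = a * wi
        a = math.tanh(zi)
        z.append(zi)
        f.append(a)
    return z, f

def loss(x, w, y):
    """Squared loss (y - f_L)^2 at the network output."""
    _, f = forward(x, w)
    return (y - f[-1]) ** 2

def deltas(x, w, y):
    """delta_i = dLoss/dz_i, computed by the backward recurrence."""
    z, f = forward(x, w)
    L = len(w)
    d = [0.0] * L
    # Base case at the output: d/dz_L of (y - tanh(z_L))^2.
    d[-1] = -2.0 * (y - f[-1]) * (1 - f[-1] ** 2)
    # Recurrence: delta_i = (1 - f_i^2) * w_{i+1} * delta_{i+1}.
    for i in range(L - 2, -1, -1):
        d[i] = (1 - f[i] ** 2) * w[i + 1] * d[i + 1]
    return d

x, y = 0.7, 0.3
w = [0.5, -1.2, 0.8]   # L = 3 hidden layers, one unit each (arbitrary values)
d = deltas(x, w, y)

# Check delta_1 against a central finite difference in z_1
# (perturbing w_1 by eps/x perturbs z_1 = x*w_1 by eps).
eps = 1e-6
num = (loss(x, [w[0] + eps / x] + w[1:], y)
       - loss(x, [w[0] - eps / x] + w[1:], y)) / (2 * eps)
print(abs(d[0] - num) < 1e-6)
```

Note that the base case \delta_L falls out of the same chain rule applied once to the loss, and each earlier \delta_i reuses the already-computed \delta_{i+1}; this reuse is what makes back-propagation efficient.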


