Question

Consider the following: a trivial neural network that takes a single scalar x as input and produces a scalar y as output. It has a first layer with weight w1=1.5, bias b1=1, and nonlinearity σ(x) = x^2, and a second layer with weight w2=4, bias b2=450, and no nonlinearity. You can assume that all weights and biases are scalars, so the output y is a scalar as well. The loss function is L(y', y) = (y'-y)^2. One compute graph for this neural network and loss is the following:

h = x*w1 + b1

o = h^2

y' = o * w2 + b2

L = (y'-y)^2
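
As a sanity check, the graph can be evaluated in plain Python and the gradients asked for below estimated numerically with central finite differences. This is only a sketch built from the values stated in the problem (the nonlinearity is taken to be σ(x) = x^2 as given):

# Evaluate the compute graph above for given first-layer parameters.
def loss(w1, b1, w2=4.0, b2=450.0, x=2.0, y=520.0):
    h = x * w1 + b1        # first layer
    o = h ** 2             # nonlinearity sigma(h) = h^2
    y_hat = o * w2 + b2    # second layer (no nonlinearity)
    return (y_hat - y) ** 2

# Central finite differences around w1=1.5, b1=1.
eps = 1e-6
dL_dw1 = (loss(1.5 + eps, 1.0) - loss(1.5 - eps, 1.0)) / (2 * eps)
dL_db1 = (loss(1.5, 1.0 + eps) - loss(1.5, 1.0 - eps)) / (2 * eps)
print(dL_dw1, dL_db1)      # approximately -768 and -384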

Now use back-propagation to compute the gradients of the loss for the training sample (x1=2, y1=520). The gradient of the loss relative to w1 is

a. -192

b. -96

c. -1314

d. -768

The gradient of the loss for the same training sample relative to b1 is

a. -12

b. -6

c. -288

d. -384

Step by Step Solution
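
A sketch of the chain-rule computation, worked directly from the values given in the problem:

Step: 1 (forward pass)

h = x*w1 + b1 = 2*1.5 + 1 = 4
o = h^2 = 16
y' = o*w2 + b2 = 16*4 + 450 = 514
L = (y' - y)^2 = (514 - 520)^2 = 36

Step: 2 (backward pass through the loss and the second layer)

dL/dy' = 2*(y' - y) = -12
dL/do = dL/dy' * w2 = -12 * 4 = -48
dL/dh = dL/do * 2*h = -48 * 8 = -384

Step: 3 (gradients of the first-layer parameters)

dL/dw1 = dL/dh * x = -384 * 2 = -768 (option d)
dL/db1 = dL/dh * 1 = -384 (option d)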
