Question
Consider the simplest deep linear neural network, described by the following equations: z = w₁x, y = w₂z, with x, w₁, w₂, y ∈ ℝ. This is indeed a very simple DNN: it receives a one-dimensional input, has one hidden layer with one unit, and one output. This simplicity ensures the calculations are all easy. Consider the squared error loss function L(y, t) = (y − t)².

Part (a) Show that one can replace this 2-layer NN with a 1-layer NN (show the relation of the input x to the output y). [2 MARKS]

Part (b) Compute the gradient of the loss of the 2-layer NN with respect to w₁ and w₂. [4 MARKS]

Part (c) Is the loss function of the 2-layer NN convex with respect to w₁ and w₂ or not? Prove your claim. [4 MARKS]
Step by Step Solution
There are 3 steps involved in it
Step: 1
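One possible derivation for Part (a) (a sketch of the standard argument, not the site's verified answer): composing the two linear layers collapses the network into a single linear map from x to y.

```latex
% Part (a): substitute the hidden-layer equation into the output equation.
y = w_2 z = w_2 (w_1 x) = (w_1 w_2)\, x
```

Defining a single weight w = w₁w₂ gives the equivalent 1-layer network y = wx: the two-layer linear network can represent exactly the same input-output relations as a one-layer linear network.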
Step: 2
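A sketch for Part (b), using the composed form y = w₁w₂x from Part (a) and the chain rule on L = (y − t)²:

```latex
% Part (b): gradient of L(w_1, w_2) = (w_1 w_2 x - t)^2.
\frac{\partial L}{\partial w_1}
  = 2\,(w_1 w_2 x - t)\, w_2 x,
\qquad
\frac{\partial L}{\partial w_2}
  = 2\,(w_1 w_2 x - t)\, w_1 x
```

Equivalently, backpropagation gives the same result: dL/dy = 2(y − t), then ∂y/∂w₂ = z = w₁x and ∂y/∂w₁ = w₂x.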
Step: 3
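A sketch for Part (c): the loss is not convex in (w₁, w₂) jointly, which can be shown by exhibiting a point where the Hessian is indefinite.

```latex
% Part (c): Hessian of L(w_1, w_2) = (w_1 w_2 x - t)^2.
H =
\begin{pmatrix}
  2 w_2^2 x^2 & 4 w_1 w_2 x^2 - 2 t x \\
  4 w_1 w_2 x^2 - 2 t x & 2 w_1^2 x^2
\end{pmatrix}
```

At (w₁, w₂) = (0, 0) with x = 1 and t = 1, this becomes H = [[0, −2], [−2, 0]], whose eigenvalues are +2 and −2. Since the Hessian is indefinite at this point, the loss is not convex in (w₁, w₂). (It is convex in each weight separately when the other is held fixed, but not jointly.)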
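As a sanity check on the Part (b) gradients, here is a minimal numerical sketch (my own illustration, not part of the original solution) comparing the analytic chain-rule gradients against central finite differences:

```python
# Numerical check of the gradients of the 2-layer linear NN loss:
# z = w1*x, y = w2*z, L = (y - t)**2.

def loss(w1, w2, x, t):
    """Squared-error loss of the composed network y = w1*w2*x."""
    return (w1 * w2 * x - t) ** 2

def grad(w1, w2, x, t):
    """Analytic gradients from the chain rule:
    dL/dw1 = 2*(w1*w2*x - t)*w2*x,  dL/dw2 = 2*(w1*w2*x - t)*w1*x."""
    r = w1 * w2 * x - t
    return 2 * r * w2 * x, 2 * r * w1 * x

def finite_diff(w1, w2, x, t, eps=1e-6):
    """Central finite-difference approximation of both partial derivatives."""
    g1 = (loss(w1 + eps, w2, x, t) - loss(w1 - eps, w2, x, t)) / (2 * eps)
    g2 = (loss(w1, w2 + eps, x, t) - loss(w1, w2 - eps, x, t)) / (2 * eps)
    return g1, g2

# Arbitrary test point (values chosen only for illustration).
w1, w2, x, t = 0.7, -1.3, 2.0, 0.5
analytic = grad(w1, w2, x, t)
numeric = finite_diff(w1, w2, x, t)
print(analytic, numeric)  # the two pairs should agree closely
```

If the two pairs agree to several decimal places, the derived formulas are consistent with the loss definition.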