Question

Consider the following network structure. You can assume the initial weights. Assume the bias to be zero for easier computation. Given that <x1, x2, y1, y2> = <1, 1, 0, 1>, where y1 and y2 are the target values. Assume β = 0.9 and η = 0.01.
x1====>h1====>y1
x1====>h2====>y1
x1====>h1====>y2
x2====>h1====>y2
x2====>h2====>y1
x2====>h2====>y2
(a) Compute the forward propagation and generate the outputs. Use ReLU for the hidden layer and the sigmoid activation function for the output layer.
(b) Compute the softmax loss function for both outputs.
(c) Let the assumed initial weights be the weights at time (t-1). Compute the weights v21, w12 and w22 at time t using SGD.
(d) Let the weights at time t be the ones computed in part (c). Compute the weights v21, w12 and w22 at time (t+1) when momentum is used.
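
The four parts can be checked numerically. The sketch below is one possible working, not the graded answer: the specific initial weights (0.1 through 0.8), the indexing convention (wij connects input xi to hidden hj, vjk connects hidden hj to output yk), and the classical momentum rule used for part (d) are all assumptions, which the problem permits since it lets you choose the initial weights and fixes the biases at zero.

```python
import numpy as np

def relu(z):
    return np.maximum(0.0, z)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, W, V):
    """Forward pass with zero biases: ReLU hidden layer, sigmoid output layer."""
    h_in = x @ W                 # net input to h1, h2
    h = relu(h_in)
    y_in = h @ V                 # net input to y1, y2
    y = sigmoid(y_in)
    return h_in, h, y

def softmax_loss(y, t):
    """Softmax cross-entropy of the outputs against the one-hot target."""
    p = np.exp(y) / np.exp(y).sum()
    return p, -np.sum(t * np.log(p))

def gradients(x, t, W, V):
    """Back-propagate the softmax loss through the sigmoid and ReLU layers."""
    h_in, h, y = forward(x, W, V)
    p, _ = softmax_loss(y, t)
    delta_out = (p - t) * y * (1.0 - y)            # softmax-CE gradient times sigmoid derivative
    grad_V = np.outer(h, delta_out)                # dL/dV (rows h1, h2; cols y1, y2)
    delta_hid = (V @ delta_out) * (h_in > 0)       # ReLU derivative gates the back-propagated error
    grad_W = np.outer(x, delta_hid)                # dL/dW (rows x1, x2; cols h1, h2)
    return grad_W, grad_V

# Data from the question, plus assumed initial weights (any choice is allowed).
x = np.array([1.0, 1.0])                 # <x1, x2>
t = np.array([0.0, 1.0])                 # targets <y1, y2>
eta, beta = 0.01, 0.9                    # learning rate and momentum coefficient
W = np.array([[0.1, 0.2],                # w11 w12  (input -> hidden)
              [0.3, 0.4]])               # w21 w22
V = np.array([[0.5, 0.6],                # v11 v12  (hidden -> output)
              [0.7, 0.8]])               # v21 v22

# (a) forward propagation and (b) softmax loss
_, h, y = forward(x, W, V)
p, loss = softmax_loss(y, t)
print("hidden:", h, "output:", y, "softmax:", p, "loss:", loss)

# (c) one plain SGD step gives the weights at time t
gW, gV = gradients(x, t, W, V)
W_t, V_t = W - eta * gW, V - eta * gV
print("v21(t):", V_t[1, 0], "w12(t):", W_t[0, 1], "w22(t):", W_t[1, 1])

# (d) momentum step: w(t+1) = w(t) + beta * (w(t) - w(t-1)) - eta * grad evaluated at time t
gW_t, gV_t = gradients(x, t, W_t, V_t)
W_t1 = W_t + beta * (W_t - W) - eta * gW_t
V_t1 = V_t + beta * (V_t - V) - eta * gV_t
print("v21(t+1):", V_t1[1, 0], "w12(t+1):", W_t1[0, 1], "w22(t+1):", W_t1[1, 1])
```

With different assumed initial weights the numbers change, but the chain of operations (ReLU hidden layer, sigmoid outputs, softmax cross-entropy) and the two update rules for parts (c) and (d) stay the same.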
