Question
Question 4 (40 points): For a sequential input x = {x1, ..., xt} and output y = {y1, ..., yt}, a recurrent neuron is defined with hidden state zi and output ai = sigmoid(Wy zi + by), where Wx, Wh, and Wy are weight matrices, bh and by are bias vectors, zi is the hidden state and ai is the output at time step i, and the error is computed as the difference between ai and yi.

4.1) For an input sequence x1, x2, x3 and an output sequence y1, y2, y3, write the forward propagation.

4.2) To compute the back-propagation gradients, you need to compute three error values: loss(y1, a1), loss(y2, a2), and loss(y3, a3). You may assume the loss function is the mean squared error. Write the back-propagation chain rule in terms of the weight matrices.
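Below is a minimal sketch of what the forward pass and the back-propagation chain rule could look like for this setup. The question does not specify how the hidden state is computed, so the update zi = tanh(Wx xi + Wh zi-1 + bh) with initial state z0 = 0 is an assumption here; σ denotes the sigmoid.

% 4.1) Forward propagation for the three time steps (assumed: z_0 = 0, tanh hidden update)
\begin{align*}
z_1 &= \tanh(W_x x_1 + W_h z_0 + b_h), & a_1 &= \sigma(W_y z_1 + b_y),\\
z_2 &= \tanh(W_x x_2 + W_h z_1 + b_h), & a_2 &= \sigma(W_y z_2 + b_y),\\
z_3 &= \tanh(W_x x_3 + W_h z_2 + b_h), & a_3 &= \sigma(W_y z_3 + b_y).
\end{align*}

% 4.2) Mean-squared-error loss per step and total loss (the 1/2 factor is a common convention)
\begin{align*}
L_i = \tfrac{1}{2}\,(a_i - y_i)^2, \qquad L = L_1 + L_2 + L_3.
\end{align*}

% Chain rule for each weight matrix; the inner sum and product give back-propagation through time
\begin{align*}
\frac{\partial L}{\partial W_y} &= \sum_{i=1}^{3} \frac{\partial L_i}{\partial a_i}\,\frac{\partial a_i}{\partial W_y}
 = \sum_{i=1}^{3} (a_i - y_i)\,\sigma'(W_y z_i + b_y)\, z_i^{\top},\\
\frac{\partial L}{\partial W_h} &= \sum_{i=1}^{3} \frac{\partial L_i}{\partial a_i}\,\frac{\partial a_i}{\partial z_i}
 \sum_{k=1}^{i}\Bigl(\prod_{j=k+1}^{i} \frac{\partial z_j}{\partial z_{j-1}}\Bigr)\frac{\partial z_k}{\partial W_h},\\
\frac{\partial L}{\partial W_x} &= \sum_{i=1}^{3} \frac{\partial L_i}{\partial a_i}\,\frac{\partial a_i}{\partial z_i}
 \sum_{k=1}^{i}\Bigl(\prod_{j=k+1}^{i} \frac{\partial z_j}{\partial z_{j-1}}\Bigr)\frac{\partial z_k}{\partial W_x}.
\end{align*}

Here ∂Li/∂ai = (ai − yi), ∂ai/∂zi = σ'(Wy zi + by) Wy, and, under the assumed tanh update, ∂zj/∂zj-1 = diag(1 − zj²) Wh (up to transpose/layout conventions; the expressions are written for scalar outputs, and for vector outputs the scalar derivatives become diagonal matrices and outer products). The product over j from k+1 to i is what distinguishes back-propagation through time from an ordinary single-step chain rule: each loss term Li receives gradient contributions from every earlier hidden state zk.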