Question:
Forward Mode Differentiation: The backpropagation algorithm needs to compute node-to-node derivatives of output nodes with respect to all other nodes, and therefore computing gradients in the backward direction makes sense. Consequently, the pseudocode on page 228 propagates gradients in the backward direction. However, consider the case where we want to compute the node-to-node derivatives of all nodes with respect to the source (input) nodes s1 ... sk. In other words, we want to compute ∂x/∂si for each non-input node variable x and each input node si in the network. Propose a variation of the pseudocode of page 228 that computes node-to-node gradients in the forward direction.
Step by Step Answer:
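The key observation is that node-to-node derivatives with respect to the inputs can be accumulated in topological (forward) order: for each non-input node x, ∂x/∂si = Σ over predecessors y of x of (∂x/∂y)(∂y/∂si), with the base cases ∂si/∂si = 1 and ∂sj/∂si = 0 for j ≠ i. Below is a minimal Python sketch of this forward-mode dynamic program. It is not a reproduction of the book's page-228 pseudocode; the graph representation (a predecessor list and a local_partial(x, y) function returning ∂x/∂y at the current forward values) is assumed purely for illustration.

```python
# Illustrative sketch of forward-mode accumulation of node-to-node
# derivatives over a computational graph (assumed representation,
# not the book's page-228 pseudocode).
from collections import defaultdict

def forward_mode_derivatives(nodes, predecessors, local_partial, sources):
    """
    nodes         : node ids in topological order (inputs first)
    predecessors  : dict mapping node id -> list of its predecessor ids
    local_partial : function (x, y) -> value of d x / d y for an edge
                    y -> x, evaluated after the forward pass
    sources       : input node ids s1 ... sk

    Returns deriv[x][s] = d x / d s for every node x and every source s.
    """
    deriv = defaultdict(lambda: defaultdict(float))
    # Base case: each source has derivative 1 with respect to itself
    # (and implicitly 0 with respect to every other source).
    for s in sources:
        deriv[s][s] = 1.0
    # Sweep the graph in topological (forward) order, accumulating
    #   d x / d s = sum over predecessors y of (d x / d y) * (d y / d s)
    for x in nodes:
        for y in predecessors.get(x, []):
            edge = local_partial(x, y)
            for s in sources:
                deriv[x][s] += edge * deriv[y][s]
    return {x: dict(d) for x, d in deriv.items()}

# Tiny example: s1, s2 -> h = s1 * s2 -> o = h + s1, at s1 = 2, s2 = 3.
# Expected: do/ds1 = s2 + 1 = 4, do/ds2 = s1 = 2.
vals = {"s1": 2.0, "s2": 3.0}
vals["h"] = vals["s1"] * vals["s2"]
vals["o"] = vals["h"] + vals["s1"]
preds = {"h": ["s1", "s2"], "o": ["h", "s1"]}

def lp(x, y):
    # Local partial derivatives for this particular example graph.
    if x == "h":                      # h = s1 * s2
        return vals["s2"] if y == "s1" else vals["s1"]
    return 1.0                        # o = h + s1

d = forward_mode_derivatives(["s1", "s2", "h", "o"], preds, lp, ["s1", "s2"])
# d["o"]["s1"] == 4.0 and d["o"]["s2"] == 2.0
```

Note that, unlike the backward pass, this forward sweep carries one accumulator per source node, so its cost grows with the number of inputs k rather than with the number of outputs.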