
Question:

5.26 Consider a multilayer perceptron with arbitrary feed-forward topology, which is to be trained by minimizing the tangent propagation error function (5.127), in which the regularizing function is given by (5.128). Show that the regularization term Ω can be written as a sum over patterns of terms of the form

$$\Omega_n = \frac{1}{2} \sum_k \left( G y_k \right)^2 \tag{5.201}$$

where G is a differential operator defined by

$$G \equiv \sum_i \tau_i \frac{\partial}{\partial x_i}. \tag{5.202}$$
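To connect (5.128) with (5.201): assuming (5.128) is the standard tangent propagation regularizer, i.e. half the sum over patterns and outputs of the squared directional derivative of $y_k$ along the tangent vector with components $\tau_i$, each per-pattern contribution is

$$\Omega_n = \frac{1}{2} \sum_k \left( \sum_i \tau_i \frac{\partial y_k}{\partial x_i} \right)^2 = \frac{1}{2} \sum_k \left( G y_k \right)^2,$$

so $\Omega = \sum_n \Omega_n$ follows directly from the definition (5.202) of $G$.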

By acting on the forward propagation equations

$$z_j = h(a_j), \qquad a_j = \sum_i w_{ji} z_i \tag{5.203}$$

with the operator G, show that Ωn can be evaluated by forward propagation using the following equations:

$$\alpha_j = h'(a_j)\, \beta_j, \qquad \beta_j = \sum_i w_{ji} \alpha_i \tag{5.204}$$

where we have defined the new variables

$$\alpha_j \equiv G z_j, \qquad \beta_j \equiv G a_j. \tag{5.205}$$
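A sketch of the derivation: $G$ is a linear first-order differential operator acting on the inputs, so it passes through the weighted sum in (5.203) and obeys the chain rule on $h$:

$$\beta_j = G a_j = \sum_i w_{ji}\, G z_i = \sum_i w_{ji} \alpha_i, \qquad \alpha_j = G z_j = h'(a_j)\, G a_j = h'(a_j)\, \beta_j.$$

The recursion is seeded at the input units, where $\alpha_i = G x_i = \tau_i$; a single forward sweep then delivers $G y_k$ at the outputs, and hence $\Omega_n$ via (5.201).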

Now show that the derivatives of Ωn with respect to a weight wrs in the network can be written in the form

$$\frac{\partial \Omega_n}{\partial w_{rs}} = \sum_k \alpha_k \left\{ \phi_{kr} z_s + \delta_{kr} \alpha_s \right\} \tag{5.206}$$

where we have defined

$$\delta_{kr} \equiv \frac{\partial y_k}{\partial a_r}, \qquad \phi_{kr} \equiv G \delta_{kr}. \tag{5.207}$$
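A sketch of where (5.206) comes from: differentiating (5.201), using the standard backpropagation identity $\partial y_k / \partial w_{rs} = \delta_{kr} z_s$, and noting that $G$ commutes with $\partial / \partial w_{rs}$ (the weights do not depend on the inputs), we have

$$\frac{\partial \Omega_n}{\partial w_{rs}} = \sum_k \left( G y_k \right) G\!\left( \frac{\partial y_k}{\partial w_{rs}} \right) = \sum_k \alpha_k\, G\!\left( \delta_{kr} z_s \right) = \sum_k \alpha_k \left\{ \phi_{kr} z_s + \delta_{kr} \alpha_s \right\},$$

where $\alpha_k = G y_k$ is the value of $\alpha$ at output unit $k$.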

Write down the backpropagation equations for δkr, and hence derive a set of backpropagation equations for the evaluation of the φkr.
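For reference, one consistent set of such equations (a sketch, not necessarily the book's official solution): applying the chain rule over the units $l$ to which unit $r$ sends connections gives the usual backpropagation recursion for $\delta_{kr}$, and acting on it with $G$ (using $G a_r = \beta_r$) gives the recursion for $\phi_{kr}$:

$$\delta_{kr} = h'(a_r) \sum_l w_{lr}\, \delta_{kl}, \qquad \phi_{kr} = h''(a_r)\, \beta_r \sum_l w_{lr}\, \delta_{kl} + h'(a_r) \sum_l w_{lr}\, \phi_{kl},$$

initialized at the output units by $\delta_{kk'} = I_{kk'}$ and $\phi_{kk'} = 0$ when the outputs are linear ($y_k = a_k$).

The whole recipe can be checked numerically. Below is a minimal NumPy sketch, assuming a two-layer network with tanh hidden units, linear outputs, and no biases; all variable and function names are illustrative:

```python
import numpy as np

h = np.tanh
h1 = lambda a: 1.0 - np.tanh(a) ** 2       # h'(a)
h2 = lambda a: -2.0 * np.tanh(a) * h1(a)   # h''(a)

def omega_n(W1, W2, x, tau):
    """Omega_n = (1/2) sum_k (G y_k)^2 by forward propagation, eqs. (5.201)-(5.205)."""
    a = W1 @ x               # a_j = sum_i w_ji z_i  (5.203), biases omitted
    z = h(a)
    beta = W1 @ tau          # beta_j = sum_i w_ji alpha_i, with alpha_i = tau_i at the inputs
    alpha = h1(a) * beta     # alpha_j = h'(a_j) beta_j  (5.204)
    Gy = W2 @ alpha          # linear outputs, so G y_k = sum_j w_kj alpha_j
    return 0.5 * np.sum(Gy ** 2), (a, z, beta, alpha, Gy)

def grad_omega_n(W1, W2, x, tau):
    """Gradient via (5.206): dOmega_n/dw_rs = sum_k alpha_k {phi_kr z_s + delta_kr alpha_s}."""
    _, (a, z, beta, alpha, Gy) = omega_n(W1, W2, x, tau)
    delta = W2 * h1(a)           # delta_kj = w_kj h'(a_j) for hidden units j
    phi = W2 * (h2(a) * beta)    # phi_kj = G delta_kj = w_kj h''(a_j) beta_j
    # First-layer weights: at the inputs z_s = x_s and alpha_s = tau_s
    dW1 = np.einsum('k,kj,i->ji', Gy, phi, x) + np.einsum('k,kj,i->ji', Gy, delta, tau)
    # Second-layer weights: delta_kk' = I and phi_kk' = 0 at linear outputs
    dW2 = np.outer(Gy, alpha)
    return dW1, dW2

# Finite-difference check of one first-layer weight
rng = np.random.default_rng(0)
W1, W2 = rng.normal(size=(4, 3)), rng.normal(size=(2, 4))
x, tau = rng.normal(size=3), rng.normal(size=3)
dW1, _ = grad_omega_n(W1, W2, x, tau)
eps = 1e-6
W1p = W1.copy(); W1p[1, 2] += eps
num = (omega_n(W1p, W2, x, tau)[0] - omega_n(W1, W2, x, tau)[0]) / eps
print(np.isclose(num, dW1[1, 2], atol=1e-5))  # expect True
```

The agreement with the finite difference confirms that the forward equations (5.204) and the gradient formula (5.206) fit together as the exercise claims.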
