Question:

3.11 Rademacher complexity of regularized neural networks. Let the input space be $X = \mathbb{R}^{n_1}$. In this problem, we consider the family of regularized neural networks defined by the following set of functions mapping $X$ to $\mathbb{R}$:

$$H = \left\{\, x \mapsto \sum_{j=1}^{n_2} w_j\, \varphi(u_j \cdot x) \;:\; \|w\|_1 \le \Lambda',\ \|u_j\|_2 \le \Lambda,\ \forall j \in [n_2] \,\right\},$$

where $\varphi$ is an $L$-Lipschitz function. As an example, $\varphi$ could be the sigmoid function, which is 1-Lipschitz.

(a) Show that
$$\widehat{\mathfrak{R}}_S(H) = \frac{\Lambda'}{m}\, \mathbb{E}_{\sigma}\left[\, \sup_{\|u\|_2 \le \Lambda} \left| \sum_{i=1}^m \sigma_i\, \varphi(u \cdot x_i) \right| \,\right].$$
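One step a solution to (a) likely uses, writing $\varphi$ for the $L$-Lipschitz activation and $v_j = \sum_{i=1}^m \sigma_i \varphi(u_j \cdot x_i)$ (a sketch, not the book's official solution): the supremum of a linear function of $w$ over the $\ell_1$-ball is attained at a signed vertex, so

```latex
\sup_{\|w\|_1 \le \Lambda'} \sum_{j=1}^{n_2} w_j v_j
  \;=\; \Lambda' \max_{1 \le j \le n_2} |v_j|.
```

Taking the supremum over the $u_j$'s then collapses the maximum over $j$ to a single supremum over $\|u\|_2 \le \Lambda$, which produces the absolute value in the claimed expression.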

(b) Use the following form of Talagrand's lemma, valid for all hypothesis sets $H$ and any $L$-Lipschitz function $\Phi$:
$$\frac{1}{m}\, \mathbb{E}_{\sigma}\left[ \sup_{h \in H} \sum_{i=1}^m \sigma_i\, (\Phi \circ h)(x_i) \right] \le \frac{L}{m}\, \mathbb{E}_{\sigma}\left[ \sup_{h \in H} \sum_{i=1}^m \sigma_i\, h(x_i) \right],$$
to upper bound $\widehat{\mathfrak{R}}_S(H)$ in terms of the empirical Rademacher complexity of $H'$, where $H'$ is defined by
$$H' = \left\{\, x \mapsto s\,(u \cdot x) \;:\; \|u\|_2 \le \Lambda,\ s \in \{-1, +1\} \,\right\}.$$
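A useful observation for (b) (a sketch): by definition of $H'$ and the identity $|z| = \max_{s \in \{-1,+1\}} sz$,

```latex
\widehat{\mathfrak{R}}_S(H')
  = \frac{1}{m}\,\mathbb{E}_{\sigma}\Big[\sup_{\|u\|_2 \le \Lambda,\; s \in \{-1,+1\}}
      s \sum_{i=1}^m \sigma_i\,(u \cdot x_i)\Big]
  = \frac{1}{m}\,\mathbb{E}_{\sigma}\Big[\sup_{\|u\|_2 \le \Lambda}
      \Big|\sum_{i=1}^m \sigma_i\,(u \cdot x_i)\Big|\Big],
```

so, combining (a) with the lemma, the bound to aim for is $\widehat{\mathfrak{R}}_S(H) \le \Lambda' L\, \widehat{\mathfrak{R}}_S(H')$.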

(c) Use the Cauchy-Schwarz inequality to show that
$$\widehat{\mathfrak{R}}_S(H') = \frac{\Lambda}{m}\, \mathbb{E}_{\sigma}\left[ \left\| \sum_{i=1}^m \sigma_i\, x_i \right\|_2 \right].$$
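The key equality behind (c) (a sketch): for any vector $v$, Cauchy-Schwarz gives $u \cdot v \le \|u\|_2 \|v\|_2 \le \Lambda \|v\|_2$, with equality at $u = \Lambda v / \|v\|_2$. Applied with $v = \sum_{i=1}^m \sigma_i x_i$:

```latex
\sup_{\|u\|_2 \le \Lambda,\ s \in \{-1,+1\}}
  s\,\Big( u \cdot \sum_{i=1}^m \sigma_i x_i \Big)
  \;=\; \Lambda \Big\| \sum_{i=1}^m \sigma_i x_i \Big\|_2.
```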

(d) Use the inequality $\mathbb{E}_v[\|v\|_2] \le \sqrt{\mathbb{E}_v[\|v\|_2^2]}$, which holds by Jensen's inequality, to upper bound $\widehat{\mathfrak{R}}_S(H')$.
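For (d), the second moment simplifies because the Rademacher variables are independent, with $\mathbb{E}[\sigma_i \sigma_j] = 1_{i=j}$ (a sketch):

```latex
\mathbb{E}_{\sigma}\Big[\Big\|\sum_{i=1}^m \sigma_i x_i\Big\|_2^2\Big]
  = \sum_{i,j=1}^m \mathbb{E}[\sigma_i \sigma_j]\,(x_i \cdot x_j)
  = \sum_{i=1}^m \|x_i\|_2^2,
\qquad\text{hence}\qquad
\widehat{\mathfrak{R}}_S(H') \le \frac{\Lambda}{m}\sqrt{\sum_{i=1}^m \|x_i\|_2^2}.
```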

(e) Assume that for all $x \in S$, $\|x\|_2 \le r$ for some $r > 0$. Use the previous questions to derive an upper bound on the Rademacher complexity of $H$ in terms of $r$.


Related book: Foundations of Machine Learning, 2nd Edition, by Mehryar Mohri and Afshin Rostamizadeh. ISBN: 9780262351362.