
Question:

6.19 ( ) Another viewpoint on kernel regression comes from a consideration of regression problems in which the input variables as well as the target variables are corrupted with additive noise. Suppose each target value tn is generated as usual by taking a function y(zn) evaluated at a point zn, and adding Gaussian noise. The value of zn is not directly observed, however, but only a noise corrupted version xn = zn + ξn where the random variable ξ is governed by some distribution g(ξ).

Consider a set of observations {xn, tn}, where n = 1, . . . , N, together with a corresponding sum-of-squares error function defined by averaging over the distribution of input noise to give

E = \frac{1}{2} \sum_{n=1}^{N} \int \{y(x_n - \xi_n) - t_n\}^2 \, g(\xi_n) \, d\xi_n.   (6.99)

By minimizing E with respect to the function y(z) using the calculus of variations (Appendix D), show that the optimal solution for y(x) is given by a Nadaraya-Watson kernel regression solution of the form (6.45) with a kernel of the form (6.46).


Step by Step Answer:
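A sketch of the variational argument (an outline of one standard route, not a verbatim textbook solution). Setting the functional derivative of E in (6.99) with respect to y(z) to zero gives

\frac{\delta E}{\delta y(z)}
= \sum_{n=1}^{N} \int \{y(x_n - \xi_n) - t_n\} \, \delta(x_n - \xi_n - z) \, g(\xi_n) \, d\xi_n
= \sum_{n=1}^{N} \{y(z) - t_n\} \, g(x_n - z) = 0,

where the delta function collapses the integral by evaluating the integrand at \xi_n = x_n - z. Solving for y(z) gives

y(z) = \frac{\sum_{n=1}^{N} t_n \, g(x_n - z)}{\sum_{m=1}^{N} g(x_m - z)},

which is the Nadaraya-Watson form (6.45) with the kernel

k(z, x_n) = \frac{g(x_n - z)}{\sum_{m=1}^{N} g(x_m - z)},

matching (6.46).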

Related Book For  book-img-for-question
Question Posted:
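As a numerical sanity check, here is a minimal sketch of a Nadaraya-Watson estimator, assuming a Gaussian form for the noise density g and hypothetical toy data (the function name, bandwidth `sigma`, and data are illustrative choices, not part of the exercise):

```python
import numpy as np

def nadaraya_watson(x_train, t_train, x_query, sigma=0.05):
    """Nadaraya-Watson regression with an (assumed) Gaussian kernel g."""
    # g(x_n - x): unnormalised Gaussian density of the input noise
    diffs = x_train[:, None] - x_query[None, :]   # shape (N, Q)
    g = np.exp(-0.5 * (diffs / sigma) ** 2)
    # Kernel of the form (6.46): normalise over the data points n
    k = g / g.sum(axis=0)
    # Prediction (6.45): y(x) = sum_n k(x, x_n) t_n
    return k.T @ t_train

# Toy data: noisy samples of sin(2*pi*x) on [0, 1]
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, 50)
t = np.sin(2 * np.pi * x) + 0.1 * rng.normal(size=50)

# Near x = 0.25 the true function is +1; near x = 0.75 it is -1
y_hat = nadaraya_watson(x, t, np.array([0.25, 0.75]))
```

The estimator is a local weighted average: each kernel weight k(x, xn) is non-negative and the weights sum to one at every query point, so the prediction interpolates smoothly between the observed targets.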