Question
1. Support Vector Regression

Support vector regression (SVR) is a method for regression analogous to the support vector classifier. Let (x_i, y_i) \in \mathbb{R}^d \times \mathbb{R}, i = 1, ..., n, be training data for a regression problem. In the case of linear regression, SVR solves

    \min_{w, b, \xi, \xi^*} \; \frac{1}{2} \|w\|_2^2 + C \sum_{i=1}^{n} (\xi_i + \xi_i^*)
    \text{s.t.} \quad w^\top x_i + b - y_i \le \epsilon + \xi_i, \quad i = 1, ..., n,
    \qquad\; y_i - w^\top x_i - b \le \epsilon + \xi_i^*, \quad i = 1, ..., n,
    \qquad\; \xi_i \ge 0, \; \xi_i^* \ge 0, \quad i = 1, ..., n,

where C > 0 and \epsilon > 0 are fixed, and \|\cdot\|_2 is the Euclidean norm.

a. (5 pts) Show that for an appropriate choice of \lambda, SVR solves

    \min_{w, b} \; \sum_{i=1}^{n} \ell_\epsilon(y_i, w^\top x_i + b) + \lambda \|w\|_2^2,

where \ell_\epsilon(y, t) = \max\{0, |y - t| - \epsilon\} is the so-called \epsilon-insensitive loss, which does not penalize prediction errors below a level of \epsilon. Note: this part does not play a role in the subsequent parts.

b. (5 pts) The optimization problem is convex with affine constraints, and therefore strong duality holds. Use the KKT conditions to derive the dual optimization problem in a manner analogous to the support vector classifier. As in the SVC, you should eliminate the dual variables corresponding to the constraints \xi_i \ge 0, \xi_i^* \ge 0.

c. (3 pts) Explain how to kernelize SVR. Be sure to explain how to determine b^* and how to evaluate the final prediction function.

d. (2 pts) Argue that the final predictor will depend on only a subset of the training examples, and characterize those training examples. Your characterization should be analogous to the characterization of support vectors as being "on the wrong side of the margin" in support vector classifiers.
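(A derivation sketch added for clarity, not part of the posted question: the key step in part (a) is that at any fixed (w, b), the optimal slacks have the closed form

    \xi_i = \max\{0, \; w^\top x_i + b - y_i - \epsilon\}, \qquad \xi_i^* = \max\{0, \; y_i - w^\top x_i - b - \epsilon\},

and since at most one of the two residual terms can be positive,

    \xi_i + \xi_i^* = \max\{0, \; |y_i - (w^\top x_i + b)| - \epsilon\} = \ell_\epsilon(y_i, \; w^\top x_i + b).

Dividing the primal objective by C then suggests the correspondence \lambda = 1/(2C).)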
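Also not part of the original post: a minimal numerical sketch of the primal above can help sanity-check a derivation. The version below uses cvxpy and synthetic 1-D data (the library choice, the data, and the names C and eps are assumptions, not from the question) and then counts the points on or outside the \epsilon-tube, which is exactly the set of examples part (d) asks you to characterize.

```python
# Hypothetical illustration of the SVR primal; not the original poster's solution.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
n, d = 50, 1
X = rng.uniform(-3, 3, size=(n, d))
y = 2.0 * X[:, 0] + 0.5 + 0.3 * rng.standard_normal(n)

C, eps = 1.0, 0.2                      # fixed hyperparameters C > 0, epsilon > 0
w = cp.Variable(d)
b = cp.Variable()
xi = cp.Variable(n, nonneg=True)       # slack for w^T x_i + b - y_i <= eps + xi_i
xi_star = cp.Variable(n, nonneg=True)  # slack for y_i - w^T x_i - b <= eps + xi_star_i

resid = X @ w + b - y
constraints = [resid <= eps + xi, -resid <= eps + xi_star]
objective = cp.Minimize(0.5 * cp.sum_squares(w) + C * cp.sum(xi + xi_star))
cp.Problem(objective, constraints).solve()

print("w* =", w.value, " b* =", b.value)
# Points with |residual| >= eps sit on or outside the eps-tube; these are the
# SVR analogue of support vectors "on the wrong side of the margin" (part d).
support = np.abs(resid.value) >= eps - 1e-6
print("number of support vectors:", support.sum(), "of", n)
```

In practice only the residuals of these support points carry nonzero dual variables, so the fitted predictor (and its kernelized form in part (c)) is a sum over this subset alone.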