Question
Please solve for a, b
Estimating equations for L1 loss

4. (10 points) In lecture, we derived the estimating equations for SLR, which we obtained by differentiating the mean squared error with respect to $a$ and $b$. Now, suppose that we choose the same model, $y = a + bx_i$, but we minimize the L1 loss instead. That is, we minimize the mean absolute error, defined as

$$\mathrm{MAE}(a, b) = \frac{1}{n} \sum_{i=1}^{n} |y_i - a - b x_i|.$$

(a) (2 points) Show that the partial derivatives of $|y_i - a - b x_i|$ with respect to $a$ and $b$ are

$$\frac{\partial}{\partial a} |y_i - a - b x_i| = -\operatorname{sgn}(y_i - a - b x_i) \quad \text{and} \quad \frac{\partial}{\partial b} |y_i - a - b x_i| = -x_i \cdot \operatorname{sgn}(y_i - a - b x_i),$$

and undefined if $y_i = a + b x_i$, where sgn is the sign function defined by

$$\operatorname{sgn}(z) = \begin{cases} +1 & \text{if } z > 0, \\ -1 & \text{if } z < 0. \end{cases}$$

(b) Let $S^- = \{i : y_i < a + b x_i\}$ and $S^+ = \{i : y_i > a + b x_i\}$, and let $\#(S^-)$ and $\#(S^+)$ represent the sizes of the two sets. Then any MAE-minimizing solution $(\hat{a}, \hat{b})$ must approximately solve the two equations

$$\#(S^-) = \#(S^+) \quad \text{and} \quad \sum_{i \in S^-} x_i = \sum_{i \in S^+} x_i. \qquad (1)$$

More precisely, we must have

$$\bigl|\#(S^-) - \#(S^+)\bigr| \le \#(S^0) \quad \text{and} \quad \Bigl| \sum_{i \in S^-} x_i - \sum_{i \in S^+} x_i \Bigr| \le \sum_{i \in S^0} |x_i|, \qquad (2)$$

where $S^0$ is the set of data points with zero residual. This would typically be only one or two data points, whereas $S^+$ and $S^-$ each have about $n/2$ data points, so the bounds in (2) are typically small compared to the quantities in (1). You do not need to show the more complicated result (2). Instead, just show that if there is any MAE-minimizing solution with $S^0 = \emptyset$ (no residuals are exactly zero), then we have the two equalities in (1).
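As a numerical sanity check on part (b) (not part of the original problem), the sketch below fits the L1-optimal line to synthetic data and verifies that the positive- and negative-residual sets balance as in equations (1) and (2). It relies on the fact that the MAE is piecewise linear and convex in $(a, b)$, so an optimal line can be chosen to pass through two data points; the data, the tolerance `tol`, and all variable names are illustrative assumptions.

```python
# Numerical check of the estimating equations for L1 (least absolute
# deviations) regression: brute-force the optimal line over all pairs
# of data points, then verify the sign-balance conditions (1)-(2).
import numpy as np

rng = np.random.default_rng(0)
n = 25
x = rng.normal(size=n)
y = 2.0 + 3.0 * x + rng.laplace(scale=0.5, size=n)  # synthetic SLR data

def mae(a, b):
    """Mean absolute error of the line y = a + b*x on the data."""
    return np.mean(np.abs(y - a - b * x))

# The MAE is piecewise linear and convex in (a, b), so some minimizer
# corresponds to a line through two data points; search all pairs.
best = None
for i in range(n):
    for j in range(i + 1, n):
        if x[i] == x[j]:
            continue
        b = (y[j] - y[i]) / (x[j] - x[i])
        a = y[i] - b * x[i]
        m = mae(a, b)
        if best is None or m < best[0]:
            best = (m, a, b)

_, a_hat, b_hat = best
r = y - a_hat - b_hat * x                 # residuals at the optimum
tol = 1e-9 * max(1.0, np.max(np.abs(y)))  # classify |r| < tol as S^0
S_minus = int(np.sum(r < -tol))           # y_i < a + b x_i
S_plus = int(np.sum(r > tol))             # y_i > a + b x_i
S_zero = n - S_minus - S_plus             # interpolated points

print(S_minus, S_plus, S_zero)
print(abs(x[r < -tol].sum() - x[r > tol].sum()), np.abs(x[np.abs(r) <= tol]).sum())
```

At the optimum, $S^0$ holds the (typically two) interpolated points, and the printed count imbalance and sum imbalance stay within the bounds of equation (2).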