Question:

In the linear model y = Xβ + ε, suppose εᵢ has the Laplace density f(ε) = (1/(2b)) exp(−|ε|/b). Show that the ML estimate minimizes ∑ᵢ |yᵢ − μᵢ|.


Step by Step Answer:
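A sketch of the argument, writing μᵢ = xᵢ′β for the i-th fitted mean (the residuals εᵢ = yᵢ − μᵢ are assumed independent):

```latex
\begin{align*}
% Likelihood of n independent Laplace errors with scale b:
L(\beta, b) &= \prod_{i=1}^{n} \frac{1}{2b}\,
              \exp\!\left(-\frac{|y_i - \mu_i|}{b}\right) \\
% Take logs: the product becomes a sum.
\ell(\beta, b) &= -n \log(2b) \;-\; \frac{1}{b}\sum_{i=1}^{n} |y_i - \mu_i|
\end{align*}
```

For any fixed scale b > 0, the first term does not involve β, and the second enters with a negative sign. Maximizing ℓ over β is therefore equivalent to minimizing ∑ᵢ |yᵢ − μᵢ|, i.e. the ML estimate under Laplace errors is the least-absolute-deviations (LAD) fit, just as Gaussian errors yield least squares.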
