Question
Multi-task regression (by Andrew Ng). Thus far, we have only considered regression with scalar-valued responses. In some applications, the response is itself a vector: $y_i \in \mathbb{R}^p$. We posit that the relationship between the features and the vector-valued response is linear: $y_i^T \approx x_i^T B$, where $B \in \mathbb{R}^{d \times p}$ is a matrix of regression coefficients.

(a) Express the sum of squared residuals (SSR) in matrix notation (i.e. without using any summations). Hint: work out how to express the SSR in terms of
$$X = \begin{bmatrix} x_1^T \\ \vdots \\ x_n^T \end{bmatrix} \in \mathbb{R}^{n \times d}, \qquad Y = \begin{bmatrix} y_1^T \\ \vdots \\ y_n^T \end{bmatrix} \in \mathbb{R}^{n \times p}.$$

(b) Find the matrix of regression coefficients that minimizes the SSR.

(c) Instead of minimizing the SSR, we break the problem up into $p$ regression problems with scalar-valued responses. That is, we fit $p$ linear models of the form $(y_i)_k \approx x_i^T \beta_k$, where $\beta_k \in \mathbb{R}^d$. How do the regression coefficients from the $p$ separate regressions compare to the matrix of regression coefficients that minimizes the SSR?
Step by Step Solution
There are 3 steps in this solution, one for each part of the question.
Step: 1
Part (a). Stack the feature vectors and responses row-wise as in the hint, so that $X \in \mathbb{R}^{n \times d}$ has rows $x_i^T$ and $Y \in \mathbb{R}^{n \times p}$ has rows $y_i^T$. The $i$-th row of the residual matrix $Y - XB$ is $y_i^T - x_i^T B$, so the SSR is the sum of the squared entries of this matrix. In matrix notation,
$$\mathrm{SSR}(B) = \|Y - XB\|_F^2 = \operatorname{tr}\big((Y - XB)^T (Y - XB)\big).$$
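As a quick numerical sanity check (not part of the original exercise), the sketch below compares the summation form of the SSR with the matrix form $\|Y - XB\|_F^2$ on random data. The dimensions, seed, and variable names are illustrative assumptions, not values from the problem.

```python
import numpy as np

# Sanity check: the per-example summation form of the SSR and the
# Frobenius-norm form ||Y - XB||_F^2 agree on arbitrary data.
rng = np.random.default_rng(0)
n, d, p = 50, 4, 3                      # illustrative sizes
X = rng.normal(size=(n, d))
Y = rng.normal(size=(n, p))
B = rng.normal(size=(d, p))

ssr_sum = sum(np.sum((Y[i] - X[i] @ B) ** 2) for i in range(n))  # sum_i ||y_i^T - x_i^T B||^2
ssr_mat = np.linalg.norm(Y - X @ B, "fro") ** 2                  # ||Y - XB||_F^2

print(np.isclose(ssr_sum, ssr_mat))     # True
```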
Step: 2
Part (b). Setting the gradient of the SSR with respect to $B$ to zero gives the normal equations:
$$\nabla_B\,\mathrm{SSR}(B) = -2 X^T (Y - XB) = 0 \quad\Longrightarrow\quad X^T X B = X^T Y.$$
Assuming $X$ has full column rank (so $X^T X$ is invertible), the minimizer is
$$\hat{B} = (X^T X)^{-1} X^T Y.$$
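A minimal numpy sketch of this closed form, assuming $X$ has full column rank: it solves the normal equations and checks the result against numpy's own least-squares solver. The synthetic data and names below are illustrative assumptions only.

```python
import numpy as np

# B_hat from the normal equations should match numpy's least-squares solution.
rng = np.random.default_rng(1)
n, d, p = 50, 4, 3                      # illustrative sizes
X = rng.normal(size=(n, d))
Y = X @ rng.normal(size=(d, p)) + 0.1 * rng.normal(size=(n, p))

B_hat = np.linalg.solve(X.T @ X, X.T @ Y)        # (X^T X)^{-1} X^T Y via normal equations
B_lstsq, *_ = np.linalg.lstsq(X, Y, rcond=None)  # direct least-squares fit

print(np.allclose(B_hat, B_lstsq))      # True
```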
Step: 3
Part (c). The squared Frobenius norm decomposes column by column:
$$\|Y - XB\|_F^2 = \sum_{k=1}^p \|Y_{:,k} - X B_{:,k}\|_2^2,$$
where $Y_{:,k}$ collects the $k$-th component of every response and $B_{:,k}$ is the $k$-th column of $B$. Each term depends on only one column of $B$, so minimizing the SSR jointly is equivalent to minimizing each term separately. The $k$-th separate regression yields $\hat{\beta}_k = (X^T X)^{-1} X^T Y_{:,k}$, which is exactly the $k$-th column of $\hat{B}$. The $p$ separate regressions therefore produce the same coefficients as minimizing the SSR: stacking $\hat{\beta}_1, \dots, \hat{\beta}_p$ as columns recovers $\hat{B}$.
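The following sketch illustrates part (c) numerically: the columns of the joint minimizer $\hat{B}$ coincide with the coefficients from $p$ separate scalar regressions. Again, the data, sizes, and variable names are illustrative assumptions.

```python
import numpy as np

# Fitting p separate scalar regressions, one per response component,
# reproduces the columns of the joint SSR minimizer B_hat.
rng = np.random.default_rng(2)
n, d, p = 50, 4, 3                      # illustrative sizes
X = rng.normal(size=(n, d))
Y = X @ rng.normal(size=(d, p)) + 0.1 * rng.normal(size=(n, p))

B_hat = np.linalg.solve(X.T @ X, X.T @ Y)                            # joint fit
betas = [np.linalg.solve(X.T @ X, X.T @ Y[:, k]) for k in range(p)]  # p separate fits

print(np.allclose(B_hat, np.column_stack(betas)))  # True
```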