
Question


General linear regression with regularization

Let $A \in \mathbb{R}^{N \times N}$ and $B \in \mathbb{R}^{D \times D}$ be symmetric, positive definite matrices. From the lectures, we can use symmetric positive definite matrices to define a corresponding inner product, as shown below. We can also define a norm using the inner products.

$$\langle x, y \rangle_A := x^\top A y, \quad \|x\|_A := \sqrt{\langle x, x \rangle_A}, \qquad \langle x, y \rangle_B := x^\top B y, \quad \|x\|_B := \sqrt{\langle x, x \rangle_B}$$

Suppose we are performing linear regression with a training set $\{(x_1, y_1), \ldots, (x_N, y_N)\}$, where for each $i$, $x_i \in \mathbb{R}^D$ and $y_i \in \mathbb{R}$. We define the matrix $X = [x_1, \ldots, x_N]^\top \in \mathbb{R}^{N \times D}$ and the vector $y = [y_1, \ldots, y_N]^\top \in \mathbb{R}^N$.

We would like to find $\theta \in \mathbb{R}^D$ and $c \in \mathbb{R}^N$ such that $y \approx X\theta + c$, where the error is measured using $\|\cdot\|_A$. We avoid overfitting by adding a weighted regularization term, measured using $\|\cdot\|_B$. We define the loss function with regularizer:

$$\mathcal{L}_{A,B,y,X}(\theta, c) = \|y - X\theta - c\|_A^2 + \|\theta\|_B^2 + \|c\|_A^2$$

For the sake of brevity we write $\mathcal{L}(\theta, c)$ for $\mathcal{L}_{A,B,y,X}(\theta, c)$.

HINTS:
- You may use (without proof) the property that a symmetric positive definite matrix is invertible.
- We assume that there are sufficiently many non-redundant data points for $X$ to be full rank. In particular, you may assume that the null space of $X$ is trivial (that is, the only solution to $Xz = 0$ is the trivial solution, $z = 0$).
- You may use identities of gradients from the lecture slides, so long as you mention them as such.

1. Find the gradient $\nabla_\theta \mathcal{L}(\theta, c)$.
2. Set $\nabla_\theta \mathcal{L}(\theta, c) = 0$ and solve for $\theta$. If you need to invert a matrix to solve for $\theta$, you should prove the inverse exists.
3. We now compute the gradient with respect to $c$: find the gradient $\nabla_c \mathcal{L}(\theta, c)$.
4. Set $\nabla_c \mathcal{L}(\theta, c) = 0$ and solve for $c$. If you need to invert a matrix to solve for $c$, you should prove the inverse exists.
5. Show that if we set $A = I$, $c = 0$, and $B = \lambda I$, where $\lambda \in \mathbb{R}$, your answer for part 2 agrees with the analytic solution for the standard least squares regression problem with L2 regularization, given by $\theta = (X^\top X + \lambda I)^{-1} X^\top y$.
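No worked solution is visible on this page, so the following is a minimal numerical sketch, not the verified expert answer. It assumes the loss uses squared norms (as written above, and as required for part 5 to reduce to ridge regression), and it assumes the gradients $\nabla_\theta \mathcal{L} = -2X^\top A(y - X\theta - c) + 2B\theta$ and $\nabla_c \mathcal{L} = -2A(y - X\theta - c) + 2Ac$. All names in the code (`loss`, `theta`, `c`, the random data) are illustrative and not part of the original question.

```python
import numpy as np

# Illustrative sketch (assumed derivation, not the page's expert answer):
# numerically check that setting the gradients of
#     L(theta, c) = ||y - X theta - c||_A^2 + ||theta||_B^2 + ||c||_A^2
# to zero gives a stationary point, and that A = I, c = 0, B = lambda*I
# reduces the theta-equation to ridge regression (part 5).

rng = np.random.default_rng(0)
N, D = 8, 3

# Random data, and random symmetric positive definite A (N x N) and B (D x D).
X = rng.normal(size=(N, D))
y = rng.normal(size=N)
MA = rng.normal(size=(N, N))
MB = rng.normal(size=(D, D))
A = MA @ MA.T + N * np.eye(N)
B = MB @ MB.T + D * np.eye(D)

def loss(theta, c):
    r = y - X @ theta - c
    return r @ A @ r + theta @ B @ theta + c @ A @ c

# Assumed stationary conditions:
#   grad_theta L = -2 X^T A (y - X theta - c) + 2 B theta = 0
#       => theta = (X^T A X + B)^{-1} X^T A (y - c)
#   grad_c     L = -2 A (y - X theta - c) + 2 A c = 0
#       => c = (y - X theta) / 2        (A is invertible)
# Substituting c back into the theta-equation gives (X^T A X + 2B) theta = X^T A y.
theta = np.linalg.solve(X.T @ A @ X + 2 * B, X.T @ A @ y)
c = (y - X @ theta) / 2

# Finite-difference gradient check: both should vanish up to rounding error.
eps = 1e-6
g_theta = [(loss(theta + eps * e, c) - loss(theta - eps * e, c)) / (2 * eps) for e in np.eye(D)]
g_c = [(loss(theta, c + eps * e) - loss(theta, c - eps * e)) / (2 * eps) for e in np.eye(N)]
print(max(abs(g) for g in g_theta), max(abs(g) for g in g_c))  # both ~0

# Part 5: with A = I, c = 0, B = lambda*I the theta-equation
# (X^T A X + B) theta = X^T A (y - c) becomes (X^T X + lambda*I) theta = X^T y.
lam = 0.7
theta_ridge = np.linalg.solve(X.T @ X + lam * np.eye(D), X.T @ y)
theta_general = np.linalg.solve(X.T @ np.eye(N) @ X + lam * np.eye(D), X.T @ np.eye(N) @ y)
print(np.allclose(theta_ridge, theta_general))  # True
```

If the assumed gradients are correct, the finite-difference check prints values at the level of floating-point noise, and the final `print` outputs `True`, matching the reduction requested in part 5.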

