Question
Please help with these questions, thank you.
Geometric Perspective of Simple Linear Regression

1. (7 points) In Lecture 12, we viewed both the simple linear regression model and the multiple linear regression model through the lens of linear algebra. The key geometric insight was that if we train a model on some design matrix $\mathbb{X}$ and true response vector $\mathbb{Y}$, our predicted response $\hat{\mathbb{Y}} = \mathbb{X}\hat{\theta}$ is the vector in $\text{span}(\mathbb{X})$ that is closest to $\mathbb{Y}$.

In the simple linear regression case, our optimal vector $\hat{\theta}$ is $\hat{\theta} = [\hat{\theta}_0, \hat{\theta}_1]^T$, and our design matrix is

$$\mathbb{X} = \begin{bmatrix} 1 & x_1 \\ 1 & x_2 \\ \vdots & \vdots \\ 1 & x_n \end{bmatrix} = \begin{bmatrix} \mathbb{1}_n & \mathbb{X}_{:,1} \end{bmatrix}$$

This means we can write our predicted response vector as

$$\hat{\mathbb{Y}} = \mathbb{X} \begin{bmatrix} \hat{\theta}_0 \\ \hat{\theta}_1 \end{bmatrix} = \hat{\theta}_0 \mathbb{1}_n + \hat{\theta}_1 \mathbb{X}_{:,1}.$$

In this problem, $\mathbb{1}_n$ is the $n$-vector of all 1's and $\mathbb{X}_{:,1}$ refers to the $n$-length vector $[x_1, x_2, \ldots, x_n]^T$. Note, $\mathbb{X}_{:,1}$ is a feature, not an observation.

For this problem, assume we are working with the simple linear regression model, though the properties we establish here hold for any linear regression model that contains an intercept term.

(a) (3 points) Recall in the last assignment, we showed that $\sum_{i=1}^{n} e_i = 0$ algebraically. In this question, explain why $\sum_{i=1}^{n} e_i = 0$ using a geometric property. (Hint: $e = \mathbb{Y} - \hat{\mathbb{Y}}$, and $e = [e_1, e_2, \ldots, e_n]^T$.)

(b) (2 points) Similarly, explain why $\sum_{i=1}^{n} e_i x_i = 0$ using a geometric property. (Hint: Your answer should be very similar to the above.)

(c) (2 points) Briefly explain why the vector $\hat{\mathbb{Y}}$ must also be orthogonal to the residual vector $e$.
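For intuition, here is a minimal numerical sketch of the orthogonality property the three parts rely on: because $\hat{\mathbb{Y}}$ is the projection of $\mathbb{Y}$ onto $\text{span}(\mathbb{X})$, the residual $e$ is orthogonal to every vector in that span, including both columns of $\mathbb{X}$ and $\hat{\mathbb{Y}}$ itself. This is not the assignment's official solution; it assumes NumPy, and the data and variable names are made up for illustration.

```python
import numpy as np

# Illustrative sketch only: verify the geometric properties on random data.
rng = np.random.default_rng(0)
n = 100
x = rng.normal(size=n)
y = 2.0 + 3.0 * x + rng.normal(size=n)

# Design matrix [1_n, x] for simple linear regression with an intercept.
X = np.column_stack([np.ones(n), x])

# Least-squares fit: y_hat = X @ theta_hat is the projection of y onto span(X).
theta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ theta_hat
e = y - y_hat  # residual vector

# e is orthogonal to every column of X, and y_hat lies in span(X),
# so all three dot products below are (numerically) zero.
print(np.sum(e))      # ~0 -> part (a): 1_n . e = 0, i.e. sum of e_i is 0
print(np.sum(e * x))  # ~0 -> part (b): x . e = 0, i.e. sum of e_i x_i is 0
print(y_hat @ e)      # ~0 -> part (c): y_hat . e = 0
```

The printed values are on the order of machine precision rather than exactly zero, which is the numerical signature of the exact orthogonality the problem asks you to justify geometrically.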