Question
Let $\theta$ be an unknown population parameter associated with the collection of observations $(X_1, Y_1), (X_2, Y_2), \ldots, (X_n, Y_n)$ where, for $\varepsilon_1, \varepsilon_2, \ldots, \varepsilon_n \sim N(0, \sigma^2)$,
$$Y_i = \theta X_i + \varepsilon_i, \quad i = 1, 2, \ldots, n.$$
This is similar to the regression model we examined in class, but without the intercept term! In this question, we are going to derive the regression estimator $\hat{\theta}$ for the model, using the same approach from the lecture.

1. (5 points) Write down the log-likelihood function for $\varepsilon_1, \varepsilon_2, \ldots, \varepsilon_n$ based on the pdf of $N(0, \sigma^2)$, i.e., derive the expression for
$$\ell(\sigma^2) = \sum_{i=1}^{n} \log f(\varepsilon_i; \sigma^2).$$

2. Using this answer and the fact that $\varepsilon_i = Y_i - \theta X_i$, derive the expression for the log-likelihood function w.r.t. the unknown parameters $(\theta, \sigma^2)$, i.e.,
$$\ell(\theta, \sigma^2) = \sum_{i=1}^{n} \log f(X_i, Y_i; \theta, \sigma^2).$$

3. (8 points) Using your answer for $\ell(\theta, \sigma^2)$ from part #2, maximize the log-likelihood function w.r.t. $\theta$ and $\sigma^2$ to derive the maximum likelihood estimators
$$(\hat{\theta}_{\mathrm{MLE}}, \hat{\sigma}^2_{\mathrm{MLE}}) = \arg\max_{\theta, \sigma^2} \ell(\theta, \sigma^2).$$

4. (2 points) Using your answer for $\hat{\theta}_{\mathrm{MLE}}$, consider the estimated regression line, given by $\hat{y} = \hat{\theta}_{\mathrm{MLE}} \, x$. Which of the following statements is true regarding the estimated regression line:
   1. The regression line always passes through the point $(\bar{X}, \bar{Y})$
   2. The regression line always passes through the origin $(0, 0)$
   3. The regression line always passes through the point $(X_{\min}, Y_{\min})$
   4. The regression line always passes through the point $(X_{\max}, Y_{\max})$

   where $X_{\min} = \min(X_1, X_2, \ldots, X_n)$, $X_{\max} = \max(X_1, X_2, \ldots, X_n)$ (mutatis mutandis for $Y_{\min}$, $Y_{\max}$).
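As a quick numerical sanity check, here is a minimal Python sketch, assuming made-up values for the slope, noise level, and sample size, of how the closed-form estimators for this no-intercept Gaussian model behave on simulated data. Maximizing $\ell(\theta, \sigma^2)$ for this model gives $\hat{\theta}_{\mathrm{MLE}} = \sum_i X_i Y_i / \sum_i X_i^2$ and $\hat{\sigma}^2_{\mathrm{MLE}} = \frac{1}{n}\sum_i (Y_i - \hat{\theta}_{\mathrm{MLE}} X_i)^2$, and since the fitted line $\hat{y} = \hat{\theta}_{\mathrm{MLE}} x$ has no intercept, its value at $x = 0$ is always $0$.

```python
import numpy as np

# Illustrative, made-up values for the simulation (not from the question).
rng = np.random.default_rng(0)
n, theta_true, sigma_true = 200, 1.5, 0.7

X = rng.uniform(-3, 3, size=n)
eps = rng.normal(0.0, sigma_true, size=n)   # eps_i ~ N(0, sigma^2)
Y = theta_true * X + eps                    # Y_i = theta * X_i + eps_i (no intercept)

# Closed-form MLEs for the no-intercept Gaussian model:
#   theta_hat  = sum(X_i * Y_i) / sum(X_i^2)
#   sigma2_hat = (1/n) * sum((Y_i - theta_hat * X_i)^2)
theta_hat = np.sum(X * Y) / np.sum(X ** 2)
sigma2_hat = np.mean((Y - theta_hat * X) ** 2)

print(f"theta_hat  = {theta_hat:.4f}  (true {theta_true})")
print(f"sigma2_hat = {sigma2_hat:.4f}  (true {sigma_true ** 2:.4f})")

# The fitted line y = theta_hat * x has no intercept, so its value at x = 0
# is exactly 0 regardless of the data.
assert theta_hat * 0.0 == 0.0
```

Rerunning with other seeds or larger $n$ should show both estimates concentrating around the true values used in the simulation.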