
Question


Kindly solve the following problems.

1. Consider the simple regression model

$$y_i = \beta_0 + \beta_1 x_i + u_i, \qquad i = 1, \ldots, n,$$

with $E(u_i x_i) \neq 0$, and let $z_i$ be a dummy instrumental variable for $x_i$, such that we can write

$$x_i = \pi_0 + \pi_1 z_i + v_i,$$

with $E(u_i \mid z_i) = 0$ and $E(v_i \mid z_i) = 0$.

(c) Denote by $n_0$ the number of observations for which $z_i = 0$ and by $n_1$ the number of observations for which $z_i = 1$. Show that

$$\sum_{i=1}^{n} (z_i - \bar{z})^2 = \frac{n_1}{n}\,(n - n_1),$$

and that

$$\sum_{i=1}^{n} (z_i - \bar{z})(y_i - \bar{y}) = \frac{n_1}{n}\,(n - n_1)\,(\bar{y}_1 - \bar{y}_0),$$

where $\bar{y}_0$ and $\bar{y}_1$ are the sample means of $y$ for $z$ equal to 0 and 1, respectively. (Hint: Use the fact that $n = n_1 + n_0$ and that $\bar{z} = n_1/n$.)

(d) Now we regress $y$ on $\hat{x}$, the fitted values from the first-stage regression of $x$ on $z$, to obtain an estimator of $\beta_1$. From the standard formula for the slope estimator of an OLS regression, and using the results in (c), show that

$$\hat{\beta}_1 = \frac{\bar{y}_1 - \bar{y}_0}{\bar{x}_1 - \bar{x}_0},$$

where $\bar{x}_0$ and $\bar{x}_1$ are defined analogously to $\bar{y}_0$ and $\bar{y}_1$. This estimator is called the Wald estimator.

Sometimes it is known in advance that the least-squares regression line must go through the origin, i.e., the regression model is of the form

$$Y_i = \beta X_i + \varepsilon_i, \qquad i = 1, 2, \ldots, n,$$

where the $\varepsilon_i$ are i.i.d. $N(0, \sigma^2)$, and the equation of the regression line is $y = \beta x$. In this case, finding the least-squares line reduces to finding the value of $\beta$ that minimizes

$$f(\beta) = \sum_{i=1}^{n} (y_i - \beta x_i)^2.$$

Use the derivative of $f$ with respect to $\beta$ to derive the formula for the slope of the least-squares regression line in this case.

Prob 2 - Mixed Integer Program (MIP) Formulation (33%)

Review the problem formulations in Sections 11.2 through 11.4 of Hillier and Lieberman (2010) again. Then consider the following mathematical model:

Minimize $Z = f_1(x_1) + f_2(x_2)$

subject to the following restrictions, considered one at a time:

(1) Either $x_1 \geq 3$ or $x_2 \geq 3$.
(2) At least one of the following inequalities holds: $2x_1 + x_2 \geq 7$, $x_1 + 2x_2 \geq 7$.
(3) $|x_1 - x_2| = 0$, or 3, or 6.
(4) $x_1 \geq 0$, $x_2 \geq 0$;

where

$$f_1(x_1) = \begin{cases} 7 + 5x_1 & \text{if } x_1 > 0 \\ 0 & \text{if } x_1 = 0, \end{cases} \qquad f_2(x_2) = \begin{cases} 5 + 6x_2 & \text{if } x_2 > 0 \\ 0 & \text{if } x_2 = 0. \end{cases}$$

Formulate each problem as an MIP problem.

The point of this exercise is to show that tests for functional form cannot be relied on as a general test for omitted variables.
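For restriction (1) of the MIP exercise above, a common device is to introduce a binary variable and a large constant $M$. The sketch below shows the standard big-$M$ reformulation of an either/or constraint; it is offered as a hint under the usual textbook conventions, not as the complete required formulation:

```latex
% Either x_1 >= 3 or x_2 >= 3, using a binary y and a large constant M:
x_1 \ge 3 - M y, \qquad x_2 \ge 3 - M(1 - y), \qquad y \in \{0, 1\}
% y = 0 forces x_1 >= 3 (the x_2 constraint is relaxed);
% y = 1 forces x_2 >= 3 (the x_1 constraint is relaxed).
```

Restrictions (2) and (3) yield to the same device, with one binary variable per alternative and a linking constraint that at least one (or exactly one) alternative is enforced.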
Suppose that, conditional on the explanatory variables $x_1$ and $x_2$, a linear model relating $y$ to $x_1$ and $x_2$ satisfies the Gauss–Markov assumptions:

$$y = \beta_0 + \beta_1 x_1 + \beta_2 x_2 + u, \qquad E(u \mid x_1, x_2) = 0, \qquad \mathrm{Var}(u \mid x_1, x_2) = \sigma^2.$$

To make the question interesting, assume $\beta_2 \neq 0$. Suppose further that $x_2$ has a simple linear relationship with $x_1$:

$$x_2 = \delta_0 + \delta_1 x_1 + r, \qquad E(r \mid x_1) = 0, \qquad \mathrm{Var}(r \mid x_1) = \tau^2.$$

(i) Show that $E(y \mid x_1) = (\beta_0 + \beta_2 \delta_0) + (\beta_1 + \beta_2 \delta_1) x_1$. Under random sampling, what is the probability limit of the OLS estimator from the simple regression of $y$ on $x_1$? Is the simple regression estimator generally consistent for $\beta_1$?

(ii) If you run the regression of $y$ on $x_1, x_1^2$, what will be the probability limit of the OLS estimator of the coefficient on $x_1^2$? Explain.

(iii) Using substitution, show that we can write

$$y = (\beta_0 + \beta_2 \delta_0) + (\beta_1 + \beta_2 \delta_1) x_1 + u + \beta_2 r.$$

It can be shown that, if we define $v = u + \beta_2 r$, then $E(v \mid x_1) = 0$ and $\mathrm{Var}(v \mid x_1) = \sigma^2 + \beta_2^2 \tau^2$. What consequences does this have for the $t$ statistic on $x_1^2$ from the regression in part (ii)?

3. [10 pts] Consider the simple linear regression model

$$y_i = \beta_0 + \beta_1 (x_i - \bar{x}) + \varepsilon_i, \qquad i = 1, 2, \ldots, n, \qquad \text{where } \bar{x} = \frac{1}{n} \sum_{i=1}^{n} x_i.$$

From the least-squares criterion $S(\beta_0, \beta_1)$, find the least-squares estimators of $\beta_0$ and $\beta_1$ for this model. Hint: Do not expand the term $(x_i - \bar{x})$. That is, do not expand $\sum_{i=1}^{n} y_i (x_i - \bar{x})$ as $\sum_{i=1}^{n} y_i x_i - \bar{x} \sum_{i=1}^{n} y_i$ for easier computations. Also remember what $\sum_{i=1}^{n} (x_i - \bar{x})$ is.

2. Again, consider the general linear model $Y = X\beta + \varepsilon$, with $\varepsilon \sim N_n(0, \sigma^2 I)$, where the first column of $X$ consists of all ones.
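For part (i) of the functional-form exercise above, the probability limit $\beta_1 + \beta_2 \delta_1$ of the simple-regression slope can be illustrated by simulation. The particular parameter values and normal distributions below are arbitrary illustrative choices, not part of the problem:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Arbitrary illustrative parameters (not from the problem statement).
b0, b1, b2 = 1.0, 2.0, 3.0      # beta_0, beta_1, beta_2
d0, d1 = 0.5, -0.7              # delta_0, delta_1

x1 = rng.normal(size=n)
r = rng.normal(size=n)          # satisfies E(r | x1) = 0
x2 = d0 + d1 * x1 + r
u = rng.normal(size=n)
y = b0 + b1 * x1 + b2 * x2 + u

# OLS slope from the simple regression of y on x1.
slope = np.cov(x1, y)[0, 1] / np.var(x1)

# In large samples the slope approaches beta_1 + beta_2 * delta_1,
# not beta_1 itself, so the simple regression is inconsistent for beta_1.
print(slope, b1 + b2 * d1)
```

With these values $\beta_1 + \beta_2 \delta_1 = -0.1$, far from $\beta_1 = 2$, which makes the inconsistency easy to see.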
(a) Using facts about the mean and variance–covariance of random vectors given in lecture, show that the least-squares estimate from multiple linear regression satisfies

$$E(\hat{\beta}) = \beta \qquad \text{and} \qquad \mathrm{Var}(\hat{\beta}) = \sigma^2 (X^T X)^{-1}.$$

(b) Let $H = X(X^T X)^{-1} X^T$ be the hat matrix, $\hat{Y} = HY$ be the fitted values, and $e = (I - H)Y$ be the residuals. Using properties derived in class, show that

$$\sum_{i=1}^{n} e_i \hat{Y}_i = 0.$$

This fact is used to provide the ANOVA decomposition SSTO = SSE + SSR for multiple linear regression. (Hint: The sum above can be written as $e^T \hat{Y}$. Apply properties of $H$.)

1. Some (More) Math Review

a) Let $N = 3$. Expand out all the terms in this expression:

$$\mathrm{Cov}\left(\sum_{i=1}^{3} X_i, \sum_{j=1}^{3} X_j\right).$$

b) Now write out all the terms using the formula from last class for the variance of a sum:

$$\mathrm{Var}\left(\sum_{i=1}^{N} X_i\right) = \sum_{i=1}^{N} \mathrm{Var}(X_i) + \sum_{i=1}^{N} \sum_{\substack{j=1 \\ j \neq i}}^{N} \mathrm{Cov}(X_i, X_j).$$

Verify that (a) and (b) are giving you the same thing. Hint: $\mathrm{Cov}(X, X) = \mathrm{Var}(X)$.

c) Suppose that $D$ is a roulette wheel that takes on the values $\{1, 2, 3\}$, all with probability $1/3$. What is the expected value of this random variable? What about the variance?

d) Now suppose that $M$ is another roulette wheel that takes on the values $\{1, 2, 3\}$, all with probability $1/3$. Solve for the expected value of $1/M$.

e) Finally, suppose that $D$ and $M$ are independent. Solve for $E(D/M)$. Hint: You do not need to do any new calculations here. Just remember that for independent RVs, $E(XY) = E(X)E(Y)$.

f) Does $E(D/M) = E(D)/E(M)$?
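The expectations in parts (c)–(f) of the math-review exercise can be checked by exact enumeration over the nine equally likely $(D, M)$ pairs; the sketch below uses exact rational arithmetic so that no rounding obscures the comparison:

```python
from fractions import Fraction

vals = [1, 2, 3]            # each outcome has probability 1/3
p = Fraction(1, 3)

E_D = sum(p * d for d in vals)                    # E(D) = 2
Var_D = sum(p * (d - E_D) ** 2 for d in vals)     # Var(D) = 2/3
E_invM = sum(p * Fraction(1, m) for m in vals)    # E(1/M) = 11/18

# Independence lets us also write E(D/M) = E(D) * E(1/M);
# here we enumerate all nine equally likely pairs directly.
E_D_over_M = sum(p * p * Fraction(d, m) for d in vals for m in vals)

E_M = sum(p * m for m in vals)
print(E_D, Var_D, E_invM, E_D_over_M, E_D / E_M)
# E(D/M) = 11/9, while E(D)/E(M) = 1, so the two are not equal.
```

This confirms the answer to part (f): in general $E(D/M) \neq E(D)/E(M)$, because $E(1/M) \neq 1/E(M)$ (Jensen's inequality).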

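Returning to the general linear model exercise, the identities behind part (b) — that $H$ is symmetric and idempotent, hence $e^T \hat{Y} = 0$ — can be verified numerically on simulated data. The design matrix below is an arbitrary example, except that its first column is all ones as the exercise requires:

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 50, 3

# Design matrix whose first column is all ones, as in the exercise.
X = np.column_stack([np.ones(n), rng.normal(size=(n, p))])
Y = rng.normal(size=n)

H = X @ np.linalg.inv(X.T @ X) @ X.T    # hat matrix
Y_hat = H @ Y                           # fitted values
e = Y - Y_hat                           # residuals, (I - H) Y

# Since H' = H and H @ H = H, we get e' Y_hat = Y'(I - H) H Y = 0,
# and the ones column additionally forces sum(e) = 0.
print(abs(e @ Y_hat))   # ~ 0 up to floating-point error
print(abs(e.sum()))     # ~ 0 up to floating-point error
```

Neither quantity is exactly zero in floating point, but both sit at the level of rounding error, matching the algebraic identities.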
