Question
Fitting a wrong model: Suppose you have 100 data points that arose from the following model: y = 3 + 0.1 x1 + 0.5 x2 + error, with independent errors drawn from a t distribution with mean 0, scale 5, and 4 degrees of freedom. We shall explore the implications of fitting a standard linear regression to these data.
(a) Simulate data from this model. For simplicity, suppose the values of x1 are simply the integers from 1 to 100, and that the values of x2 are random and equally likely to be 0 or 1. In R, you can define x_1 <- 1:100, simulate x_2 using rbinom, then create the linear predictor, and finally simulate the random errors in y using the rt function. Fit a linear regression (with normal errors) to these data and see if the 68% confidence intervals for the regression coefficients (for each, the estimate ±1 standard error) cover the true values.
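Here is a minimal sketch in R of what part (a) asks for. It assumes "scale 5" means multiplying standard t draws by 5, since R's rt function has no scale argument; the seed and object names (fit, covered, etc.) are illustrative choices, not part of the question.

set.seed(123)                        # arbitrary seed, for reproducibility only
n  <- 100
x1 <- 1:100                          # integers 1 to 100
x2 <- rbinom(n, 1, 0.5)              # equally likely 0 or 1
error <- 5 * rt(n, df = 4)           # t errors with 4 df, scaled by 5 (assumed meaning of "scale 5")
y <- 3 + 0.1 * x1 + 0.5 * x2 + error # true model
fit <- lm(y ~ x1 + x2)               # standard linear regression with normal errors
est <- coef(fit)
se  <- summary(fit)$coefficients[, "Std. Error"]
true <- c(3, 0.1, 0.5)               # true intercept and slopes
covered <- (true > est - se) & (true < est + se)  # 68% intervals: estimate +/- 1 SE
print(covered)

Each element of covered indicates whether the corresponding 68% interval contains the true coefficient in this one simulated dataset.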
(b) Put the above step in a loop and repeat 1000 times. Calculate the confidence coverage for the 68% intervals for each of the three coefficients in the model.
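For part (b), a sketch under the same assumptions: wrap the simulation and fit in a loop, store the coverage indicators, and average them. The empirical coverage proportions come out of colMeans; names like n_sims and the matrix layout are illustrative.

n_sims <- 1000
true <- c(3, 0.1, 0.5)
covered <- matrix(NA, nrow = n_sims, ncol = 3)
for (s in 1:n_sims) {
  x1 <- 1:100
  x2 <- rbinom(100, 1, 0.5)
  y  <- 3 + 0.1 * x1 + 0.5 * x2 + 5 * rt(100, df = 4)  # t_4 errors scaled by 5
  fit <- lm(y ~ x1 + x2)
  est <- coef(fit)
  se  <- summary(fit)$coefficients[, "Std. Error"]
  covered[s, ] <- (true > est - se) & (true < est + se)  # coverage of each 68% interval
}
colMeans(covered)  # empirical coverage for intercept, x1 slope, x2 slope

Comparing these proportions to the nominal 68% shows how well the normal-error intervals hold up when the true errors are heavy-tailed t draws.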