Question:

An experiment was carried out by geologists to see how the time necessary to drill a distance of 5 feet in rock (y, in minutes) depended on the depth at which the drilling began (x, in feet, between 0 and 400). We show part of the Minitab output obtained from fitting the simple linear regression model (“Mining Information,” American Statistician [1991]: 4–9).

The regression equation is
Time = 4.79 + 0.0144 depth

Predictor      Coef        Stdev       t-ratio     p
Constant       4.7896      0.6663      7.19        0.000
depth          0.014388    0.002847    5.05        0.000

s = 1.432    R-sq = 63.0%    R-sq(adj) = 60.5%

Analysis of Variance

Source        DF      SS        MS
Regression     1      52.378    52.378
Error         15      30.768     2.051
Total         16      83.146
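As a quick consistency check on this output (and as background for parts a and b below), here is a minimal Python sketch that recomputes the derived summary quantities from the reported sums of squares and coefficient estimates; the variable names are mine, and the only inputs are the numbers shown above.

```python
# Recompute the derived summary quantities from the reported output.
ss_regression = 52.378      # SS for Regression
ss_total = 83.146           # Total SS
df_total = 16               # total df from the ANOVA table (n - 1)
df_error = df_total - 1     # one df used by the slope, so error df = n - 2

ss_error = ss_total - ss_regression     # 30.768
ms_error = ss_error / df_error          # about 2.051
s = ms_error ** 0.5                     # about 1.432, matches "s = 1.432"

r_sq = ss_regression / ss_total                    # about 0.630 -> "R-sq = 63.0%"
r_sq_adj = 1 - ms_error / (ss_total / df_total)    # about 0.605 -> "R-sq(adj) = 60.5%"

# t-ratios are coefficient estimates divided by their standard errors.
t_constant = 4.7896 / 0.6663            # about 7.19
t_depth = 0.014388 / 0.002847           # about 5.05

# Model utility F ratio (relevant to part b): MS Regression / MS Error.
f_ratio = ss_regression / ms_error      # about 25.5 on (1, 15) df

print(s, r_sq, r_sq_adj, t_constant, t_depth, f_ratio)
```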

a. What proportion of observed variation in time can be explained by the simple linear regression model?

b. Does the simple linear regression model appear to be useful?

c. Minitab reported that s_{a+b(200)} = 0.347. Calculate a 95% confidence interval for the mean time when depth = 200 feet.

d. A single observation on time is to be made when drilling starts at a depth of 200 feet. Use a 95% prediction interval to predict the resulting value of time.

e. Minitab gave (8.147, 10.065) as a 95% confidence interval for the mean time when depth = 300 feet. Calculate a 99% confidence interval for this mean.
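One way to carry out the interval computations in parts c through e is sketched below in Python, using scipy only for the t quantiles. The formulas assumed are the usual simple linear regression intervals, ŷ ± t · s_ŷ for a mean response and ŷ ± t · sqrt(s² + s_ŷ²) for a single new observation; all variable names are mine.

```python
from scipy import stats

# Quantities reported in the Minitab output above.
a, b = 4.7896, 0.014388      # intercept and slope estimates
s = 1.432                    # residual standard deviation
df_error = 15                # error df (n - 2)

t95 = stats.t.ppf(0.975, df_error)   # about 2.131
t99 = stats.t.ppf(0.995, df_error)   # about 2.947

# Part c: 95% CI for the mean time at depth = 200,
# using the reported s_{a+b(200)} = 0.347.
y_hat_200 = a + b * 200              # about 7.67 minutes
se_mean_200 = 0.347
ci_mean = (y_hat_200 - t95 * se_mean_200,
           y_hat_200 + t95 * se_mean_200)           # roughly (6.9, 8.4)

# Part d: 95% prediction interval for a single observation at depth = 200;
# the prediction standard error combines s with the standard error of the mean.
se_pred_200 = (s**2 + se_mean_200**2) ** 0.5
pi = (y_hat_200 - t95 * se_pred_200,
      y_hat_200 + t95 * se_pred_200)                # roughly (4.5, 10.8)

# Part e: rescale the reported 95% CI at depth = 300 into a 99% CI by
# backing out the standard error from the 95% half-width.
lo95, hi95 = 8.147, 10.065
center = (lo95 + hi95) / 2           # fitted mean time at depth = 300
se_mean_300 = (hi95 - lo95) / 2 / t95
ci99 = (center - t99 * se_mean_300,
        center + t99 * se_mean_300)                 # roughly (7.8, 10.4)

print(ci_mean, pi, ci99)
```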
