
Question




6.1. In Output 6.1: (a) What information suggests that we might have a problem of collinearity? (b) How does multicollinearity affect results? (c) What is the adjusted R² and what does it mean?

I have attached screenshots from the textbook showing the SPSS outputs for 6.1.

Interpretation of Output 6.1a

The correlation matrix indicates large correlations between motivation and competence and between mother's education and father's education. If predictor variables are highly correlated and conceptually related to one another, we would usually aggregate them, not only to reduce the likelihood of multicollinearity but also to reduce the number of predictors, which typically increases power. If predictor variables are highly correlated but conceptually are distinctly different (so aggregation does not seem appropriate), we might decide to eliminate the less important predictor before running the regression. However, if we have reasons for wanting to include both variables as separate predictors, we should run collinearity diagnostics to see whether collinearity actually is a problem. For this problem, we also want to show how the collinearity problems created by these highly correlated predictors affect the Tolerance values and the significance of the beta coefficients, so we will run the regression without altering the variables.

To run the regression, follow the steps below:
- Click on the following: Analyze -> Regression -> Linear... The Linear Regression window (Fig. 6.1) should appear.
- Select math achievement and click it over to the Dependent box (dependent variable).
- Next, select the variables motivation scale, competence scale, pleasure scale, grades in h.s., father's education, mother's education, and gender and click them over to the Independent(s) box (independent variables).
- Under Method, be sure that Enter is selected.

Descriptive Statistics
                         Mean       Std. Deviation
math achievement test    12.7536    6.66293
motivation scale          2.8913     .62676
competence scale          3.3188     .62262
pleasure scale            3.1486     .60984
grades in h.s.            5.71      1.573
father's education        4.65      2.764
mother's education        4.07      2.185
gender                     .54       .502
(N is 69 for each variable because six participants have some missing data.)

Interpretation of Output 6.1b

First,
the output provides the usual Descriptive Statistics for all eight variables. Note that N is 69 because six participants are missing a score on one or more variables; multiple regression uses only the participants who have complete data for all the variables. The next table is a correlation matrix similar to the one in Output 6.1a. Note that the first column shows the correlations of the other variables with math achievement, and that motivation, competence, grades in high school, father's education, mother's education, and gender are all significantly correlated with math achievement. As we observed before, several of the predictor (independent) variables are highly correlated with each other: competence and motivation (.517) and mother's education and father's education (.649).

The Model Summary table shows that the multiple correlation coefficient (R), using all the predictors simultaneously, is .65 (R² = .43), and the adjusted R² is .36, meaning that 36% of the variance in math achievement can be predicted from gender, competence, etc. combined. Note that the adjusted R² is lower than the unadjusted R². This is, in part, related to the number of variables in the equation. The adjustment is also affected by the magnitude of the effect and the sample size. Because so many independent variables were used, especially given the difficulties with collinearity, a reduction in the number of variables might help us find an equation that explains more of the variance in the dependent variable. It is helpful to apply the concept of parsimony to multiple regression and use the smallest number of predictors needed.

ANOVA(b)
Model         Sum of Squares    df    Mean Square       F      Sig.
Regression        1285.701       7        183.672    6.465    .000(a)
Residual          1733.137      61         28.412
Total             3018.838      68
a. Predictors: (Constant), gender, competence scale, father's education, pleasure scale, grades in h.s., motivation scale, mother's education
b.
Dependent Variable: math achievement test
(The footnote indicates that the combination of these variables significantly, p < .001, predicts the dependent variable.)

Coefficients
[The Coefficients table reports, for each predictor, the unstandardized B and Std. Error, the standardized Beta, t, Sig., the zero-order, partial, and part correlations, and the collinearity statistics Tolerance and VIF; most of the values are illegible in this transcription.]

Tolerance and VIF give the same information (Tolerance = 1/VIF), so you should report one or the other. They tell us whether there is multicollinearity: if a Tolerance value is low (less than 1 − R²), there is probably a problem with multicollinearity. In this case, since the adjusted R² is .36 and 1 − .36 = .64, the low Tolerances for the highly intercorrelated predictors indicate that we have multicollinearity.

Interpretation of Output 6.1 continued

The ANOVA table shows that F = 6.47 and is significant. This indicates that the combination of the predictors significantly predicts math achievement. One of the most important tables is the Coefficients table. It provides the standardized beta coefficients, which are interpreted similarly to correlation coefficients or factor weights. The t value and the Sig. opposite each independent variable indicate whether that variable is significantly contributing to the equation for predicting math achievement from the whole set of predictors. Thus, grades in h.s. and gender, in this example, are the only variables that significantly add anything to the prediction when the other five variables are also entered. Note that all the variables are considered together when these values are computed; therefore, deleting one of the predictors, even a nonsignificant one, can affect the sizes of the betas and the levels of significance. This is particularly true if high collinearity exists.

Another aspect of the output to consider is the three types of correlations reported: zero-order, partial, and part. The part correlation values, when squared, indicate the amount of unique variance (variance in the outcome predicted by that predictor and not by any of the others) explained by each predictor. In this output, we can see which predictor explains the least unique variance, and that grades in high school explains the most.

Moreover, as the Tolerances suggest, the results are somewhat misleading. Although the two parent-education measures were significantly correlated with math achievement, neither contributed much to the regression. What has happened is that these predictors are so highly correlated that multiple regression eliminates the overlap between them, so neither mother's education nor father's education had much unique variance left with which to predict math achievement: too much multicollinearity. One way to handle multicollinearity is to combine related variables when that makes conceptual sense; here we could make a new variable called parent education.

[The transcription is truncated here; the remaining text mentions competence, motivation, and pleasure, and the Collinearity Diagnostics output (dimension, eigenvalue, and condition index).]
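The numeric claims in the interpretation above can be verified with a short script. This is an illustrative sketch, not the textbook's SPSS procedure: the adjusted R² and F values are recomputed from the figures reported in the Model Summary and ANOVA tables, while the Tolerance/VIF relationship is demonstrated on synthetic data (the predictors x1, x2, x3 and all generated numbers are made up for illustration).

```python
import numpy as np

# --- 1. Adjusted R^2, recomputed from the Model Summary figures ---
def adjusted_r2(r2: float, n: int, k: int) -> float:
    """Adjusted R^2 = 1 - (1 - R^2) * (n - 1) / (n - k - 1)."""
    return 1 - (1 - r2) * (n - 1) / (n - k - 1)

# R^2 = .43, N = 69 complete cases, k = 7 predictors (from the output)
print(round(adjusted_r2(0.43, 69, 7), 2))   # -> 0.36, as reported

# --- 2. F ratio, recomputed from the ANOVA table ---
f = 183.672 / 28.412                        # MS_regression / MS_residual
print(round(f, 3))                          # -> 6.465, as reported

# --- 3. Tolerance and VIF: Tolerance_j = 1 - R^2_j, VIF_j = 1/Tolerance_j,
#        where R^2_j comes from regressing predictor j on all the others ---
def tolerance_vif(X: np.ndarray) -> list[tuple[float, float]]:
    """Return (tolerance, VIF) for each column of the predictor matrix X."""
    out = []
    for j in range(X.shape[1]):
        y = X[:, j]
        # design matrix: intercept plus the other predictors
        A = np.column_stack([np.ones(len(y)), np.delete(X, j, axis=1)])
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)
        r2_j = 1 - (y - A @ beta).var() / y.var()
        out.append((1 - r2_j, 1 / (1 - r2_j)))
    return out

# Synthetic predictors: x1 and x2 are built to be collinear, x3 is not
rng = np.random.default_rng(1)
x1 = rng.normal(size=200)
x2 = x1 + rng.normal(scale=0.5, size=200)
x3 = rng.normal(size=200)
for tol, vif in tolerance_vif(np.column_stack([x1, x2, x3])):
    print(f"tolerance={tol:.2f}  VIF={vif:.2f}")
```

Under the rule of thumb quoted above (flag Tolerance values below 1 − R²), the collinear pair x1 and x2 would be flagged in this synthetic example, while the independent x3 keeps a Tolerance near 1.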

