HW #2 Exam 2 Review (10 points)
Question 1
For a given set of data, the larger the explained variance,
Question 1 options:
The larger the total variance
The larger is the R-squared value
The greater the slope of the regression line
The greater the total of the sum of the dependent variable values
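A minimal worked sketch of the idea behind Question 1, using made-up sums of squares: with the total variance held fixed, a larger explained variance yields a larger R-squared.

```python
# Illustrative sketch with made-up numbers: for a fixed total variance,
# a larger explained variance gives a larger R-squared.
ss_total = 100.0      # sum of squares total (held fixed for this data set)
ss_explained = 64.0   # sum of squares explained by the regression

r_squared = ss_explained / ss_total
print(r_squared)      # 0.64 -- grows as ss_explained grows while ss_total is fixed
```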
Question 2
In logistic regression, the Y intercept
Question 2 options:
Is equal to the square of (y-bar)
Is equal to 1 (one) over the slope of Y
Is a value between 0 (zero) and 1 (one)
Any value (it can't be determined from the question)
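A short sketch of how a logistic model turns a linear predictor into a probability, using hypothetical (made-up) intercept and slope values: the linear predictor can take any real value, while the predicted probability always falls between 0 and 1.

```python
import math

# Hypothetical coefficients, chosen only for illustration.
intercept, slope = -2.0, 0.8

def predicted_probability(x):
    # The linear predictor can be any real number...
    linear_predictor = intercept + slope * x
    # ...but the logistic function maps it into (0, 1).
    return 1.0 / (1.0 + math.exp(-linear_predictor))

for x in (-5, 0, 5, 10):
    print(x, round(predicted_probability(x), 3))  # every output lies between 0 and 1
```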
Question 3
Temperature, where the differences between any two degrees represents the same change (difference) as the difference between any other two degrees, would be an example of
Question 3 options:
Ratio scale
Interval scale
Partial correlation
Question 4
In a traditional regression analysis, the predictors are
Question 4 options:
The values obtained after subtracting the mean of Y from the Y values
The X values
The beta coefficients
The expected values of Y given any value of X
Question 5
In multiple regression, beta coefficients
Question 5 options:
Can be different for each variable
Can only be positive values or 0 (zero)
Become larger in value as the number of predictor variables increases
Question 6
Assume a partial correlation is computed (with X, Y, and Z). If the correlation between the original X and Y values remains the same (as compared to the outcome of the partial correlation analysis), then the
Question 6 options:
Controlled variable (third variable) explains all of the original correlation
Correlation between the original X and Y values must be 1.0
Correlation between either X and Z or Y and Z must be 1.0
Influence of the third variable is 0 (zero)
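A sketch of the usual first-order partial correlation formula, r_xy.z = (r_xy - r_xz*r_yz) / sqrt((1 - r_xz^2)(1 - r_yz^2)), evaluated on made-up correlations: when Z is uncorrelated with both X and Y, the partial correlation comes out equal to the original r_xy.

```python
import math

def partial_correlation(r_xy, r_xz, r_yz):
    # Standard first-order partial correlation of X and Y, controlling for Z.
    return (r_xy - r_xz * r_yz) / math.sqrt((1 - r_xz**2) * (1 - r_yz**2))

# Made-up correlation values, for illustration only.
print(partial_correlation(0.50, 0.0, 0.0))  # 0.5     -- Z has no influence, r is unchanged
print(partial_correlation(0.50, 0.6, 0.6))  # 0.21875 -- Z accounts for part of the original r
```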
Question 7
If a regression line improves upon predicting the mean of Y for any X value, then, based on the regression line, the
Question 7 options:
Sum of squares explained increases
Sum of squares total increases
The slope of the regression line increases
The beta weights approach 0 (zero)
Question 8
R-squared (r² or R²) describes
Question 8 options:
Explained variance is about 0 (zero)
The total sum of squares (sum of squares total)
The strength of a regression model
Question 9
A regression analysis is run and a best fit line is computed. If all of the actual Y values are located on the regression line,
Question 9 options:
Explained variance is about 0 (zero)
Unexplained variance is positive infinity
The unexplained variance is about negative infinity
None of the above
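A tiny made-up example for the scenario in Question 9: if every actual Y value lies exactly on the regression line, every residual is 0, so the residual (unexplained) sum of squares is 0.

```python
# Made-up data where every actual Y lies exactly on the line Y = 2X + 1.
xs = [1, 2, 3, 4]
ys = [3, 5, 7, 9]                     # exactly 2*x + 1 for each x

predicted = [2 * x + 1 for x in xs]
ss_residual = sum((y - p) ** 2 for y, p in zip(ys, predicted))
print(ss_residual)                    # 0 -- no unexplained variance
```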
Question 10
As compared to a correlation analysis, an advantage of a scientific experiment is
Question 10 options:
Experiments can be completed in any set of circumstances while correlations are dependent on randomizing participants
Experiments produce computed beta/coefficient weights while with correlation these values are only estimated
Causality
Question 11
The Y intercept is
Question 11 options:
0 (zero)
1 (one)
Equal to the sum of squares total
The point where a regression line crosses the Y axis
Question 12
In a regression analysis, as the predicted outcome (value) of a variable increases in distance from what was actually measured for the variable,
Question 12 options:
The residual value increases
The explained variance increases
The mean of the outcome variable increases
The modal (the mode) value of the outcome variable approaches 0 (zero)
Question 13
In a regression analysis, the dependent variable
Question 13 options:
Reflects what the regression model is trying to predict
Has a beta weight between the values of 0 (zero) and 1 (one)
Is the confound variable
Question 14
If the sum of squares for the residuals is 0 (zero)
Question 14 options:
The residuals will each be 1 (one) standard deviation (of the independent variable) from the regression line
The difference between the predicted values of Y and the actual values is also 0 (zero)
None of the actual Y values equal any of the predicted values for Y
Question 15
If the regression line is equal to the mean of the Y values (the intercept is the Y mean and the slope is 0 (zero))
Question 15 options:
The regressor values must each also equal 0 (zero)
The sum of squares residuals (explained) equals the sum of squares unexplained
The slope of the regression line is not computable
Question 16
In a partial correlation of three variables (for example, X, Y and Z)
Question 16 options:
The partial correlation is simply the original r value divided by 3
All three variables are considered outcome variables
The computed partial correlation can be any value between -1.0 and 1.0
Question 17
According to the concept "regression to the mean"
Question 17 options:
The likelihood of an extreme value occurring is 0 (zero)
If an event produces two outcomes, those outcomes are correlated
An extreme outcome of any event is likely to be followed by a less extreme outcome
A perfect regression line is one that is fit to the mean of the Y values
Question 18
In the ordinal scale, the difference between any two points of the scale
Question 18 options:
Is equal to 0 (zero)
Is as large as the number of points making up the scale
Can vary
Is consistent (can't vary)
Question 19
In the following "rxy.z", the possible confound is
Question 19 options:
X
Y
Z
r
Question 20
After a linear regression analysis has been completed and a linear model has been produced,
Question 20 options:
The slope of the regression line must be equal to the mean of the Y variable
The intercept of the regression line is always where Y = 0 (zero)
If, for any X, Y is a negative number, the linear model is invalid
A dependent variable value can be computed for any value of the independent variable
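A minimal least-squares sketch on made-up data, illustrating the ideas in Questions 11, 12, and 20: the fitted intercept is where the line crosses the Y axis, a predicted Y can be computed for any X, and each residual is the gap between an actual Y and its predicted Y.

```python
# Made-up (X, Y) data used only for illustration.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 8.1, 9.8]

x_bar = sum(xs) / len(xs)
y_bar = sum(ys) / len(ys)

# Ordinary least-squares slope and intercept.
slope = (sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
         / sum((x - x_bar) ** 2 for x in xs))
intercept = y_bar - slope * x_bar      # where the line crosses the Y axis

def predict(x):
    # A predicted Y can be computed for any X, even outside the observed data.
    return intercept + slope * x

# Each residual is the actual Y minus the predicted Y for that X.
residuals = [y - predict(x) for x, y in zip(xs, ys)]
print(round(slope, 3), round(intercept, 3))
print([round(r, 3) for r in residuals])
```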
Question 21
In correlational analyses (as discussed in class), the term "control"
Question 21 options:
Refers to randomization of participants into different groups
Is synonymous (means the same) with "causality"
Refers to factoring out any impact of a confound
Is synonymous (means the same) with "independent variable"
Question 22
In a linear regression analysis, a goal is to
Question 22 options:
Separate data points into one of two different outcome conditions
Find the mean of the Y variable
Produce a model that predicts the values of the independent variable
Produce a model that predicts the values of the dependent variable
Question 23
An attempt to predict the probability of correctly placing an observation into one of two different conditions
Question 23 options:
Is an example of logistic regression
Involves grouping values defined by nominal scale measurements
Can only occur if the slope of a best fit regression line is a non-zero value
Requires partial correlations
Question 24
Regressor values are
Question 24 options:
The independent variable values
The dependent variable values
In any linear model, those values with the smallest residuals
Any dependent variable values that are greater than the mean of the Y variable
Question 25
The (numeric) value of the Coefficient of Determination
Question 25 options:
Can never be 0 (zero)
Can be any value
Must be between 0 (zero) and 1 (one)
Will always be higher than the value of r
Question 26
The measure of central tendency for the Ratio scale is
Question 26 options:
The coefficient of determination
The mean
The mode
0 (zero)
Question 27
Unexplained variance can be directly computed using
Question 27 options:
1 (one) and R² (R-squared)
X values and the mean of X
The predicted Y values and the mean of X
The Y intercept and the mean of Y (Ȳ)
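A short sketch of the relationship behind Question 27, with made-up numbers: the unexplained proportion of variance is 1 - R², and multiplying that proportion by the sum of squares total gives the residual (unexplained) sum of squares.

```python
# Made-up values, for illustration only.
r_squared = 0.64
ss_total = 100.0

unexplained_proportion = 1 - r_squared           # 0.36
ss_residual = unexplained_proportion * ss_total  # 36.0
print(unexplained_proportion, ss_residual)
```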
Question 28
In measurement scales, the value 0 (zero) can be part of
Question 28 options:
Ordinal scales
Interval scales
All of the scales
Only the ratio scale
Question 29
Ȳ (Y-bar) is
Question 29 options:
Involved with calculating the regressors
0 (zero) when the slope of a regression line is 0 (zero)
The symbol for the mean of the Y variable values
Question 30
Which of the following involves a bivariate condition
Question 30 options:
Pearson's r
The mean of the Y-axis values
Values associated with a measurement based on the interval scale
Regression predictors
Question 31
In correlation, a confound is
Question 31 options:
Another name for the correlation coefficient
A computational error (e.g. r = 5)
A possible explanation for a potential association that extends beyond the variables used in the original correlational analysis
Any value among the data that is considered an outlier and artificially alters the computed value of the correlational analysis
Question 32
In the sums of squares (sum of squares total, sum of squares residual, and sum of squares explained), the values are squared because
Question 32 options:
R-squared is also a square | |
To keep the sum of squares between the values of 0 (zero) and 1 (one) | |
Otherwise all of the values involved in the sum of squares would be negative values | |
All of the relevant values would cancel out and add up to a total of 0 (zero) |
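A quick demonstration of the point behind Question 32, using made-up scores: raw deviations from the mean cancel out to zero, which is why the deviations are squared before being summed.

```python
# Made-up scores, for illustration only.
ys = [2.0, 4.0, 9.0, 5.0]
y_bar = sum(ys) / len(ys)               # 5.0

deviations = [y - y_bar for y in ys]
print(sum(deviations))                  # 0.0  -- positive and negative deviations cancel
print(sum(d ** 2 for d in deviations))  # 26.0 -- the sum of squares does not
```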
Question 33
A regression line
Question 33 options:
Represents all of the predicted X values
Has a slope equal to the explained variance
Can't be a horizontal line (parallel to the X axis)
Represents the predicted Y values for any value of X