Brooklyn College, CUNY, STAT MISC

TRUE/FALSE QUESTIONS

CHAPTER 7

1.            The regression line is essentially an equation that expresses X as a function of Y. [opposite: it expresses Y as a function of X] -
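
As a rough illustration of the corrected statement in item 1, the following sketch (not from the text; data and variable names are made up) fits a simple regression line with NumPy, expressing predicted Y as a function of X:

```python
import numpy as np

# Hypothetical data: the regression line expresses Y as a function of X.
rng = np.random.default_rng(0)
x = rng.normal(size=50)
y = 2.0 + 0.5 * x + rng.normal(scale=0.3, size=50)

# Least-squares fit: Y-hat = a + b*X (np.polyfit returns the slope first).
b, a = np.polyfit(x, y, deg=1)
print(f"Y-hat = {a:.2f} + {b:.2f} * X")
```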

 

2.            Tolerance is a measure of collinearity among IVs, where possible values range from 0 to 1. -
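
A quick way to see what tolerance captures (item 2): regress one IV on the remaining IVs and take 1 minus that R-squared. The sketch below uses made-up predictors x1, x2, x3 and plain NumPy; it is only meant to show where the 0-to-1 range comes from.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100
x1 = rng.normal(size=n)
x2 = 0.8 * x1 + rng.normal(scale=0.5, size=n)   # collinear with x1
x3 = rng.normal(size=n)

# Tolerance of x1 = 1 - R^2 from regressing x1 on the other IVs.
Z = np.column_stack([np.ones(n), x2, x3])
coef, *_ = np.linalg.lstsq(Z, x1, rcond=None)
resid = x1 - Z @ coef
r2 = 1 - resid.var() / x1.var()
tolerance = 1 - r2   # near 0 = severe collinearity, near 1 = little collinearity
print(f"tolerance(x1) = {tolerance:.2f}")
```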

 

3.            The multiple correlation (R) is a Pearson correlation coefficient between the predicted and actual scores of the IVs. [DVs - pg 182] -
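
For item 3 (as corrected), R can be checked directly as the Pearson correlation between the actual DV scores and the DV scores predicted from the IVs. The sketch below assumes made-up data and an ordinary least-squares fit:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])   # intercept + 2 IVs
y = X @ np.array([1.0, 0.6, -0.4]) + rng.normal(scale=0.5, size=n)

coef, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ coef

# Multiple correlation R: Pearson r between predicted and actual DV scores.
R = np.corrcoef(y, y_hat)[0, 1]
print(f"R = {R:.3f}, R^2 = {R**2:.3f}")   # R^2 = proportion of DV variance explained
```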

 

4.            Interpretation of multiple regression focuses on determining the inadequacy of the regression model that has been developed. [adequacy] -

 

5.            A secondary purpose is to use regression analysis as a means of explaining causal relationships among variables. -

 

6.            Stepwise multiple regression is often used in studies that are explanatory in nature. [exploratory] -

 

7.            In order to make predictions, three important facts about the regression line must be known. One of them is: The point at which the line crosses the X-axis. [Y-axis] -

 

8.            Multicollinearity tends to increase the variances in regression coefficients, which ultimately results in a more stable prediction equation. [unstable] -

 

9.            The coefficient of determination in multiple regression is the proportion of DV variance that can be explained by at least one IV. [by the combination of the IVs] -

 

10.          The F test in multiple regression examines the degree to which the relationship between the IVs is linear. [between the DV and IVs - pg 182] -
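
For item 10 (as corrected), the overall F test asks whether the DV is linearly related to the set of IVs taken together. A common textbook form is F = (R^2 / k) / ((1 - R^2) / (n - k - 1)), which also ties in with items 9 and 11. The sketch below computes it by hand on made-up data; the p-value line assumes scipy is available.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n, k = 100, 3                                    # n cases, k IVs
X = np.column_stack([np.ones(n), rng.normal(size=(n, k))])
y = X @ np.array([1.0, 0.5, 0.3, 0.0]) + rng.normal(size=n)

coef, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ coef
R2 = 1 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)

# Overall F test for the regression model as a whole.
F = (R2 / k) / ((1 - R2) / (n - k - 1))
p = stats.f.sf(F, k, n - k - 1)
print(f"R^2 = {R2:.3f}, F({k}, {n - k - 1}) = {F:.2f}, p = {p:.4f}")
```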

 

11.          The other main calculation in multiple regression is the determination of the value for R² and its associated significance test. -

 

12.          Multiple regression can be very sensitive to extreme outliers. -

 

13.          Model validation, sometimes called model cross-validation, is an important issue in multiple regression. -

 

14.          One of the assumptions in multiple regression with regard to the raw scale variables is that the IVs are normally distributed. [fixed or measured without error or that relationship between DV and IVs is linear - pg 177] -

 

15.          Multicollinearity is desirable in multiple regression. -

 

16.          Another assumption in multiple regression with regard to the residuals is that the errors are correlated with the IVs. [not correlated - pg 177] -

 

17.          Sequential multiple regression is also sometimes referred to as statistical multiple regression. [hierarchical - pg 175] -

 

18.          Multiple regression is used to predict the value of a single DV from a weighted, linear combination of IVs. [pg172] -

 

19.          Regression analysis procedures have as their primary purpose the development of an equation that can be used for predicting values on some DV for all members of a population. [pg 169] -

 

20.          Partial correlation is a measure of the relationship between an IV and a DV, holding all other IVs constant. [pg 181] -
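
Item 20 can be illustrated with the residual definition of partial correlation: regress both the IV of interest and the DV on the remaining IVs, then correlate the two sets of residuals. The sketch below is a minimal NumPy version on made-up data (x1, x2, and y are hypothetical names):

```python
import numpy as np

def residualize(v, z):
    """Residuals of v after a least-squares regression on z (with intercept)."""
    Z = np.column_stack([np.ones(len(v)), z])
    coef, *_ = np.linalg.lstsq(Z, v, rcond=None)
    return v - Z @ coef

rng = np.random.default_rng(4)
n = 200
x1, x2 = rng.normal(size=n), rng.normal(size=n)
y = 0.5 * x1 + 0.5 * x2 + rng.normal(size=n)

# Partial correlation of x1 with y, holding x2 constant.
r_partial = np.corrcoef(residualize(x1, x2), residualize(y, x2))[0, 1]
print(f"partial r(x1, y | x2) = {r_partial:.3f}")
```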

 

21.          In standard multiple regression, the IV that has the highest correlation with the DV is entered into the analysis first. [all IVs are entered into the analysis simultaneously - pg 175] -

 

22.          Residuals (errors of prediction) are essentially calculated as the difference between the actual value and the predicted value for the IV. [DV - pg 171] -

 

23.          The reason that we obtain the best-fitting line as our regression equation is that we mathematically calculate the line with the smallest amount of total squared error. [pg 171] -
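
Items 22 and 23 go together: a residual is the actual DV value minus the predicted DV value, and the least-squares line is the one whose residuals have the smallest total squared error. The sketch below (made-up data) compares the fitted line's sum of squared errors with that of a slightly perturbed line:

```python
import numpy as np

rng = np.random.default_rng(5)
x = rng.normal(size=80)
y = 3.0 + 1.2 * x + rng.normal(scale=0.4, size=80)

b, a = np.polyfit(x, y, deg=1)                  # least-squares slope and intercept

def sse(a_, b_):
    resid = y - (a_ + b_ * x)                   # residual = actual DV - predicted DV
    return np.sum(resid ** 2)

print("SSE of the least-squares line:", round(sse(a, b), 3))
print("SSE of a perturbed line      :", round(sse(a + 0.1, b - 0.1), 3))   # larger
```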

 

24.          The variance inflation factor (VIF) for a given predictor “indicates whether there exists a strong linear association between it and all remaining predictors” (Stevens, 2001). [pg 174] -
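
For item 24, the VIF of a predictor is usually computed as 1 / (1 - R^2_j), where R^2_j comes from regressing that predictor on all of the remaining predictors (so VIF is the reciprocal of tolerance). The sketch below uses the statsmodels helper for this; treating that library as available is an assumption, not something stated in the text.

```python
import numpy as np
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(6)
n = 150
x1 = rng.normal(size=n)
x2 = 0.9 * x1 + rng.normal(scale=0.3, size=n)   # strongly related to x1
x3 = rng.normal(size=n)
X = np.column_stack([np.ones(n), x1, x2, x3])   # include the constant column

# VIF_j = 1 / (1 - R^2_j): a large value flags a strong linear association
# between predictor j and the remaining predictors.
for j, name in enumerate(["const", "x1", "x2", "x3"]):
    print(name, round(variance_inflation_factor(X, j), 2))
```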

 

25.          In cases that involve moderate violations of linearity and homoscedasticity, one should be aware that these violations weaken and invalidate the regression analysis. [merely weaken, and they do not invalidate it - pg 179] -

 
