File: Ch14, Chapter 14: Advanced Multiple Regression Analysis
True/False
1. Regression models in which the highest power of any predictor variable is 1 and in which there
are no cross product terms are referred to as first-order models.
Ans: True
Response: See section 14.1 Nonlinear Models: Mathematical Transformation
Difficulty: Easy
2. The regression model y = β0 + β1x1 + β2x2 + β3x1x2 + ε is a first-order model.
Ans: False
Response: See section 14.1 Nonlinear Models: Mathematical Transformation
Difficulty: Easy
3. The regression model y = β0 + β1x1 + β2x2 + β3x3 + ε is a third-order model.
Ans: False
Response: See section 14.1 Nonlinear Models: Mathematical Transformation
Difficulty: Easy
4. The regression model y = β0 + β1x1 + β2x1^2 + ε is called a quadratic model.
Ans: True
Response: See section 14.1 Nonlinear Models: Mathematical Transformation
Difficulty: Easy
5. A linear regression model cannot be used to explore the possibility that a quadratic
relationship may exist between two variables.
Ans: False
Response: See section 14.1 Nonlinear Models: Mathematical Transformation
Difficulty: Medium
6. A linear regression model can be used to explore the possibility that a quadratic relationship
may exist between two variables by suitably transforming the independent variable.
Ans: True
Response: See section 14.1 Nonlinear Models: Mathematical Transformation
Difficulty: Medium
7. A useful tool in improving the regression model fit is recoding data.
Ans: True
Response: See section 14.1 Nonlinear Models: Mathematical Transformation
Difficulty: Easy
8. A logarithmic transformation may be applied to both positive and negative numbers.
Ans: False
Response: See section 14.1 Nonlinear Models: Mathematical Transformation
Difficulty: Medium
9. If a square root transformation is applied to a series of positive numbers, the numerical values
of the numbers in the transformed series will be smaller than the corresponding numbers in the
original series.
Ans: True
Response: See section 14.1 Nonlinear Models: Mathematical Transformation
Difficulty: Medium
10. If a square transformation is applied to a series of positive numbers, the numerical values of
the numbers in the transformed series will be smaller than the corresponding numbers in the
original series.
Ans: False
Response: See section 14.1 Nonlinear Models: Mathematical Transformation
Difficulty: Medium
11. If the effect of an independent variable (e.g., humidity) on a dependent variable (e.g.,
hardness) is affected by different ranges of values for a second independent variable (e.g.,
temperature), the two independent variables are said to interact.
Ans: True
Response: See section 14.1 Nonlinear Models: Mathematical Transformation
Difficulty: Medium
12. The interaction between two independent variables can be examined by including a new
variable, which is the sum of the two independent variables, in the regression model.
Ans: False
Response: See section 14.1 Nonlinear Models: Mathematical Transformation
Difficulty: Medium
13. Qualitative data cannot be incorporated into linear regression models.
Ans: False
Response: See section 14.2 Indicator (Dummy) Variables
Difficulty: Medium
14. A qualitative variable which represents categories such as geographical territories or job
classifications may be included in a regression model by using indicator or dummy variables.
Ans: True
Response: See section 14.2 Indicator (Dummy) Variables
Difficulty: Medium
15. If a qualitative variable has c categories, then c dummy variables must be included in the
regression model, one for each category.
Ans: False
Response: See section 14.2 Indicator (Dummy) Variables
Difficulty: Medium
16. If a qualitative variable has c categories, then only (c – 1) dummy variables must be included
in the regression model.
Ans: True
Response: See section 14.2 Indicator (Dummy) Variables
Difficulty: Medium
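The c − 1 coding rule from questions 15 and 16 can be demonstrated directly. A minimal pure-Python sketch (the function name and the 3-category region variable are hypothetical), dropping the first category as the reference level:

```python
def dummy_code(value, categories):
    """Return c-1 indicator variables for `value`, dropping the
    first category as the reference level."""
    return [1 if value == cat else 0 for cat in categories[1:]]

# Hypothetical 3-category qualitative variable: needs 3 - 1 = 2 dummies.
regions = ["urban", "suburban", "rural"]
print(dummy_code("suburban", regions))  # -> [1, 0]
print(dummy_code("urban", regions))     # reference level -> [0, 0]
```

The reference category is identified by all dummies being zero, which is why a c-th dummy would be redundant.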
17. If a data set contains k independent variables, the “all possible regression” search procedure
will determine 2^k different models.
Ans: False
Response: See section 14.3 Model-Building: Search Procedures
Difficulty: Medium
18. If a data set contains k independent variables, the “all possible regression” search procedure
will determine 2^k – 1 different models.
Ans: True
Response: See section 14.3 Model-Building: Search Procedures
Difficulty: Medium
19. If two or more independent variables are highly correlated, the regression analysis might
suffer from the problem of multicollinearity.
Ans: True
Response: See section 14.4 Multicollinearity
Difficulty: Easy
20. Stepwise regression is one of the ways to prevent the problem of multicollinearity.
Ans: True
Response: See section 14.3 Model-Building: Search Procedures
Difficulty: Medium
Multiple Choice
21. Multiple linear regression models can handle certain nonlinear relationships by ________.
a) biasing the sample
b) recoding or transforming variables
c) adjusting the resultant ANOVA table
d) adjusting the observed t and F values
e) performing nonlinear regression
Ans: b
Response: See section 14.1 Nonlinear Models: Mathematical Transformation
Difficulty: Medium
22. In multiple regression analysis, qualitative variables are sometimes referred to as ___.
a) dummy variables
b) quantitative variables
c) dependent variables
d) performance variables
e) cardinal variables
Ans: a
Response: See section 14.2 Indicator (Dummy) Variables
Difficulty: Medium
23. If a qualitative variable has 4 categories, how many dummy variables must be created and
used in the regression analysis?
a) 3
b) 4
c) 5
d) 6
e) 7
Ans: a
Response: See section 14.2 Indicator (Dummy) Variables
Difficulty: Medium
24. If a qualitative variable has "c" categories, how many dummy variables must be created and
used in the regression analysis?
a) c - 1
b) c
c) c + 1
d) c - 2
e) 4 + c
Ans: a
Response: See section 14.2 Indicator (Dummy) Variables
Difficulty: Medium
25. The following scatter plot indicates that _________.
[scatter plot not reproduced in this transcript]
a) a log x transform may be useful
b) a y^2 transform may be useful
c) an x^2 transform may be useful
d) no transform is needed
e) a 1/x transform may be useful
Ans: c
Response: See section 14.1 Nonlinear Models: Mathematical Transformation
Difficulty: Medium
26. The following scatter plot indicates that _________.
[scatter plot not reproduced in this transcript]
a) a log x transform may be useful
b) a log y transform may be useful
c) an x^2 transform may be useful
d) no transform is needed
e) a 1/x transform may be useful
Ans: a
Response: See section 14.1 Nonlinear Models: Mathematical Transformation
Difficulty: Medium
27. The following scatter plot indicates that _________.
[scatter plot: y runs from about 420 to 520 as x runs from 0 to 3]
a) a log x transform may be useful
b) a log y transform may be useful
c) an x^2 transform may be useful
d) no transform is needed
e) a (–x) transform may be useful
Ans: c
Response: See section 14.1 Nonlinear Models: Mathematical Transformation
Difficulty: Medium
28. The following scatter plot indicates that _________.
[scatter plot: y runs from about 490 to 580 as x runs from –2 to 0]
a) an x^2 transform may be useful
b) a log y transform may be useful
c) an x^4 transform may be useful
d) no transform is needed
e) an x^3 transform may be useful
Ans: b
Response: See section 14.1 Nonlinear Models: Mathematical Transformation
Difficulty: Medium
29. Yvonne Yang, VP of Finance at Discrete Components, Inc. (DCI), wants a regression model
which predicts the average collection period on credit sales. Her data set includes two qualitative
variables: sales discount rates (0%, 2%, 4%, and 6%), and total assets of credit customers
(small, medium, and large). The number of dummy variables needed for "sales discount rate" in
Yvonne's regression model is ________.
a) 1
b) 2
c) 3
d) 4
e) 7
Ans: c
Response: See section 14.2 Indicator (Dummy) Variables
Difficulty: Medium
30. Yvonne Yang, VP of Finance at Discrete Components, Inc. (DCI), wants a regression model
which predicts the average collection period on credit sales. Her data set includes two qualitative
variables: sales discount rates (0%, 2%, 4%, and 6%), and total assets of credit customers
(small, medium, and large). The number of dummy variables needed for "total assets of credit
customer" in Yvonne's regression model is ________.
a) 1
b) 2
c) 3
d) 4
e) 7
Ans: b
Response: See section 14.2 Indicator (Dummy) Variables
Difficulty: Medium
31. Hope Hernandez, Marketing Manager of People's Pharmacy, Inc., wants a regression model
to predict sales in the greeting card department. Her data set includes two qualitative variables:
the pharmacy neighborhood (urban, suburban, and rural), and lighting level in the greeting card
department (soft, medium, and bright). The number of dummy variables needed for "lighting
level" in Hope's regression model is ______.
a) 1
b) 2
c) 3
d) 4
e) 5
Ans: b
Response: See section 14.2 Indicator (Dummy) Variables
Difficulty: Medium
32. Hope Hernandez, Marketing Manager of People's Pharmacy, Inc., wants a regression model
to predict sales in the greeting card department. Her data set includes two qualitative variables:
the pharmacy neighborhood (urban, suburban, and rural), and lighting level in the greeting card
department (soft, medium, and bright). The number of dummy variables needed for Hope's
regression model is ______.
a) 2
b) 4
c) 6
d) 8
e) 9
Ans: b
Response: See section 14.2 Indicator (Dummy) Variables
Difficulty: Medium
33. Alan Bissell, a market analyst for City Sound Mart, is analyzing sales of heavy metal CD’s.
Alan’s dependent variable is annual heavy metal CD sales (in $1,000,000's), and his independent
variables are teenage population (in 1,000's) and type of sales district (0 = urban, 1 = rural).
Regression analysis of the data yielded the following tables.
                Coefficients  Standard Error  t Statistic  p-value
Intercept       1.7           0.384212        4.424638     0.00166
x1 (teenagers)  0.04          0.014029        2.851146     0.019054
x2 (district)   -1.5666667    0.20518         -7.63558     3.21E-05
Alan's model is ________________.
a) y = 1.7 + 0.384212 x1 + 4.424638 x2 + 0.00166 x3
b) y = 1.7 + 0.04 x1 + 1.5666667 x2
c) y = 0.384212 + 0.014029 x1 + 0.20518 x2
d) y = 4.424638 + 2.851146 x1 - 7.63558 x2
e) y = 1.7 + 0.04 x1 - 1.5666667 x2
Ans: e
Response: See section 14.2 Indicator (Dummy) Variables
Difficulty: Easy
34. Alan Bissell, a market analyst for City Sound Mart, is analyzing sales of heavy metal CD’s.
Alan’s dependent variable is annual heavy metal CD sales (in $1,000,000's), and his independent
variables are teenage population (in 1,000's) and type of sales district (0 = urban, 1 = rural).
Regression analysis of the data yielded the following tables.
                Coefficients  Standard Error  t Statistic  p-value
Intercept       1.7           0.384212        4.424638     0.00166
x1 (teenagers)  0.04          0.014029        2.851146     0.019054
x2 (district)   -1.5666667    0.20518         -7.63558     3.21E-05
For an urban sales district with 10,000 teenagers, Alan's model predicts annual heavy metal CD
sales of ________________.
a) $2,100,000
b) $524,507
c) $533,333
d) $729,683
e) $21,000,000
Ans: a
Response: See section 14.2 Indicator (Dummy) Variables
Difficulty: Easy
35. Alan Bissell, a market analyst for City Sound Mart, is analyzing sales of heavy metal CD’s.
Alan’s dependent variable is annual heavy metal CD sales (in $1,000,000's), and his independent
variables are teenage population (in 1,000's) and type of sales district (0 = urban, 1 = rural).
Regression analysis of the data yielded the following tables.
                Coefficients  Standard Error  t Statistic  p-value
Intercept       1.7           0.384212        4.424638     0.00166
x1 (teenagers)  0.04          0.014029        2.851146     0.019054
x2 (district)   -1.5666667    0.20518         -7.63558     3.21E-05
For a rural sales district with 10,000 teenagers, Alan's model predicts annual heavy metal CD
sales of ________________.
a) $2,100,000
b) $524,507
c) $533,333
d) $729,683
e) $210,000
Ans: c
Response: See section 14.2 Indicator (Dummy) Variables
Difficulty: Easy
36. Alan Bissell, a market analyst for City Sound Mart, is analyzing sales of heavy metal CD’s.
Alan’s dependent variable is annual heavy metal CD sales (in $1,000,000's), and his independent
variables are teenage population (in 1,000's) and type of sales district (0 = urban, 1 = rural).
Regression analysis of the data yielded the following tables.
                Coefficients  Standard Error  t Statistic  p-value
Intercept       1.7           0.384212        4.424638     0.00166
x1 (teenagers)  0.04          0.014029        2.851146     0.019054
x2 (district)   -1.5666667    0.20518         -7.63558     3.21E-05
For two sales districts with the same number of teenagers, one urban and one rural, Alan's model
predicts _______.
a) $1,566,666 higher sales in the rural district
b) the same sales in both districts
c) $1,566,666 lower sales in the rural district
d) $1,700,000 higher sales in the urban district
e) $ 1,700,000 lower sales in the rural district
Ans: c
Response: See section 14.2 Indicator (Dummy) Variables
Difficulty: Medium
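The fitted model in questions 33–36 can be verified with a short calculation. A minimal Python sketch (the function name is ours, not part of the original):

```python
def predict_sales(teenagers_thousands, district):
    """Fitted model from the table above:
    y-hat = 1.7 + 0.04*x1 - 1.5666667*x2,
    where x1 is teenage population (in 1,000's) and x2 = 1 for rural."""
    return 1.7 + 0.04 * teenagers_thousands - 1.5666667 * district

print(round(predict_sales(10, 0), 4))  # urban -> 2.1, i.e. $2,100,000
print(round(predict_sales(10, 1), 4))  # rural -> 0.5333, i.e. $533,333
```

The rural dummy coefficient (−1.5666667) is exactly the predicted sales gap between districts with equal teenage populations, matching question 36.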
37. Abby Kratz, a market specialist at the market research firm of Saez, Sikes, and Spitz, is
analyzing household budget data collected by her firm. Abby's dependent variable is monthly
household expenditures on groceries (in $'s), and her independent variables are annual household
income (in $1,000's) and household neighborhood (0 = suburban, 1 = rural). Regression analysis
of the data yielded the following table.
                   Coefficients  Standard Error  t Statistic  p-value
Intercept          19.68247      10.01176        1.965934     0.077667
x1 (income)        1.735272      0.174564        9.940612     1.68E-06
x2 (neighborhood)  49.12456      7.655776        6.416667     7.67E-05
Abby's model is ________________.
a) y = 19.68247 + 10.01176 x1 + 1.965934 x2
b) y = 1.965934 + 9.940612 x1 + 6.416667 x2
c) y = 10.01176 + 0.174564 x1 + 7.655776 x2
d) y = 19.68247 - 1.735272 x1 + 49.12456 x2
e) y = 19.68247 + 1.735272 x1 + 49.12456 x2
Ans: e
Response: See section 14.2 Indicator (Dummy) Variables
Difficulty: Medium
38. Abby Kratz, a market specialist at the market research firm of Saez, Sikes, and Spitz, is
analyzing household budget data collected by her firm. Abby's dependent variable is monthly
household expenditures on groceries (in $'s), and her independent variables are annual household
income (in $1,000's) and household neighborhood (0 = suburban, 1 = rural). Regression analysis
of the data yielded the following table.
                   Coefficients  Standard Error  t Statistic  p-value
Intercept          19.68247      10.01176        1.965934     0.077667
x1 (income)        1.735272      0.174564        9.940612     1.68E-06
x2 (neighborhood)  49.12456      7.655776        6.416667     7.67E-05
For a rural household with $70,000 annual income, Abby's model predicts monthly grocery
expenditure of ________________.
a) $141.15
b) $190.28
c) $164.52
d) $122.67
e) $132.28
Ans: b
Response: See section 14.2 Indicator (Dummy) Variables
Difficulty: Medium
39. Abby Kratz, a market specialist at the market research firm of Saez, Sikes, and Spitz, is
analyzing household budget data collected by her firm. Abby's dependent variable is monthly
household expenditures on groceries (in $'s), and her independent variables are annual household
income (in $1,000's) and household neighborhood (0 = suburban, 1 = rural). Regression analysis
of the data yielded the following table.
                   Coefficients  Standard Error  t Statistic  p-value
Intercept          19.68247      10.01176        1.965934     0.077667
x1 (income)        1.735272      0.174564        9.940612     1.68E-06
x2 (neighborhood)  49.12456      7.655776        6.416667     7.67E-05
For a suburban household with $70,000 annual income, Abby's model predicts monthly grocery
expenditure of ________________.
a) $141.15
b) $190.28
c) $164.52
d) $122.67
e) $241.15
Ans: a
Response: See section 14.2 Indicator (Dummy) Variables
Difficulty: Medium
40. Abby Kratz, a market specialist at the market research firm of Saez, Sikes, and Spitz, is
analyzing household budget data collected by her firm. Abby's dependent variable is monthly
household expenditures on groceries (in $'s), and her independent variables are annual household
income (in $1,000's) and household neighborhood (0 = suburban, 1 = rural). Regression analysis
of the data yielded the following table.
                   Coefficients  Standard Error  t Statistic  p-value
Intercept          19.68247      10.01176        1.965934     0.077667
x1 (income)        1.735272      0.174564        9.940612     1.68E-06
x2 (neighborhood)  49.12456      7.655776        6.416667     7.67E-05
For two households, one suburban and one rural, Abby's model predicts ________.
a) equal monthly expenditures for groceries
b) the suburban household's monthly expenditures for groceries will be $49 more
c) the rural household's monthly expenditures for groceries will be $49 more
d) the suburban household's monthly expenditures for groceries will be $8 more
e) the rural household's monthly expenditures for groceries will be $49 less
Ans: c
Response: See section 14.2 Indicator (Dummy) Variables
Difficulty: Medium
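The grocery-expenditure predictions in questions 38–40 follow the same pattern. A minimal Python check (the function name is ours):

```python
def predict_grocery(income_thousands, rural):
    """Fitted model from the table above:
    y-hat = 19.68247 + 1.735272*x1 + 49.12456*x2,
    where x1 is income (in $1,000's) and x2 = 1 for rural."""
    return 19.68247 + 1.735272 * income_thousands + 49.12456 * rural

print(round(predict_grocery(70, 1), 2))  # rural, $70,000 income -> 190.28
print(round(predict_grocery(70, 0), 2))  # suburban, $70,000 income -> 141.15
```

The dummy coefficient (about $49) is the rural-vs-suburban difference at any fixed income, matching question 40.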
41. A multiple regression analysis produced the following tables.
              Coefficients  Standard Error  t Statistic  p-value
Intercept     707.9144      435.1183        1.626947     0.114567
x1            2.903307      81.62802        0.035568     0.971871
x1^2          11.91297      3.806211        3.129878     0.003967

              df   SS         MS         F         p-value
Regression    2    32055153   16027577   47.34557  1.49E-09
Residual      27   9140128    338523.3
Total         29   41195281

The regression equation for this analysis is ____________.
a) y = 707.9144 + 2.903307 x1 + 11.91297 x1^2
b) y = 707.9144 + 435.1183 x1 + 1.626947 x1^2
c) y = 435.1183 + 81.62802 x1 + 3.806211 x1^2
d) y = 1.626947 + 0.035568 x1 + 3.129878 x1^2
e) y = 1.626947 + 0.035568 x1 - 3.129878 x1^2
Ans: a
Response: See section 14.1 Nonlinear Models: Mathematical Transformation
Difficulty: Medium
42. A multiple regression analysis produced the following tables.
              Coefficients  Standard Error  t Statistic  p-value
Intercept     707.9144      435.1183        1.626947     0.114567
x1            2.903307      81.62802        0.035568     0.971871
x1^2          11.91297      3.806211        3.129878     0.003967

              df   SS         MS         F         p-value
Regression    2    32055153   16027577   47.34557  1.49E-09
Residual      27   9140128    338523.3
Total         29   41195281
The sample size for this analysis is ____________.
a) 27
b) 29
c) 30
d) 25
e) 28
Ans: c
Response: See section 14.1 Nonlinear Models: Mathematical Transformation
Difficulty: Medium
43. A multiple regression analysis produced the following tables.
              Coefficients  Standard Error  t Statistic  p-value
Intercept     707.9144      435.1183        1.626947     0.114567
x1            2.903307      81.62802        0.035568     0.971871
x1^2          11.91297      3.806211        3.129878     0.003967

              df   SS         MS         F         p-value
Regression    2    32055153   16027577   47.34557  1.49E-09
Residual      27   9140128    338523.3
Total         29   41195281
Using  = 0.01 to test the null hypothesis H0: 1 = 2 = 0, the critical F value is ____.
a) 5.42
b) 5.49
c) 7.60
d) 3.35
e) 2.49
Ans: b
Response: See section 14.1 Nonlinear Models: Mathematical Transformation
Difficulty: Medium
44. A multiple regression analysis produced the following tables.
              Coefficients  Standard Error  t Statistic  p-value
Intercept     707.9144      435.1183        1.626947     0.114567
x1            2.903307      81.62802        0.035568     0.971871
x1^2          11.91297      3.806211        3.129878     0.003967

              df   SS         MS         F         p-value
Regression    2    32055153   16027577   47.34557  1.49E-09
Residual      27   9140128    338523.3
Total         29   41195281
Using  = 0.05 to test the null hypothesis H0: 1 = 0, the critical t value is ____.
a) ± 1.311
b) ± 1.699
c) ± 1.703
d) ± 2.502
e) ± 2.052
Ans: e
Response: See section 14.1 Nonlinear Models: Mathematical Transformation
Difficulty: Medium
45. A multiple regression analysis produced the following tables.
              Coefficients  Standard Error  t Statistic  p-value
Intercept     707.9144      435.1183        1.626947     0.114567
x1            2.903307      81.62802        0.035568     0.971871
x1^2          11.91297      3.806211        3.129878     0.003967

              df   SS         MS         F         p-value
Regression    2    32055153   16027577   47.34557  1.49E-09
Residual      27   9140128    338523.3
Total         29   41195281
Using  = 0.05 to test the null hypothesis H0: 2 = 0, the critical t value is ____.
a) ± 1.311
b) ± 1.699
c) ± 1.703
d) ± 2.052
e) ± 2.502
Ans: d
Response: See section 14.1 Nonlinear Models: Mathematical Transformation
Difficulty: Medium
46. A multiple regression analysis produced the following tables.
              Coefficients  Standard Error  t Statistic  p-value
Intercept     707.9144      435.1183        1.626947     0.114567
x1            2.903307      81.62802        0.035568     0.971871
x1^2          11.91297      3.806211        3.129878     0.003967

              df   SS         MS         F         p-value
Regression    2    32055153   16027577   47.34557  1.49E-09
Residual      27   9140128    338523.3
Total         29   41195281
These results indicate that ____________.
a) none of the predictor variables is significant at the 5% level
b) each predictor variable is significant at the 5% level
c) x1 is the only predictor variable significant at the 5% level
d) x1^2 is the only predictor variable significant at the 5% level
e) each predictor variable is insignificant at the 5% level
Ans: d
Response: See section 14.1 Nonlinear Models: Mathematical Transformation
Difficulty: Easy
47. A multiple regression analysis produced the following tables.
              Coefficients  Standard Error  t Statistic  p-value
Intercept     707.9144      435.1183        1.626947     0.114567
x1            2.903307      81.62802        0.035568     0.971871
x1^2          11.91297      3.806211        3.129878     0.003967

              df   SS         MS         F         p-value
Regression    2    32055153   16027577   47.34557  1.49E-09
Residual      27   9140128    338523.3
Total         29   41195281
For x1= 10, the predicted value of y is ____________.
a) 1,632.02
b) 1,928.25
c) 10.23
d) 314.97
e) 938.35
Ans: b
Response: See section 14.1 Nonlinear Models: Mathematical Transformation
Difficulty: Easy
48. A multiple regression analysis produced the following tables.
              Coefficients  Standard Error  t Statistic  p-value
Intercept     707.9144      435.1183        1.626947     0.114567
x1            2.903307      81.62802        0.035568     0.971871
x1^2          11.91297      3.806211        3.129878     0.003967

              df   SS         MS         F         p-value
Regression    2    32055153   16027577   47.34557  1.49E-09
Residual      27   9140128    338523.3
Total         29   41195281
For x1= 20, the predicted value of y is ____________.
a) 5531.15
b) 1,928.25
c) 1023.05
d) 3149.75
e) 9380.35
Ans: a
Response: See section 14.1 Nonlinear Models: Mathematical Transformation
Difficulty: Easy
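The quadratic predictions in questions 47 and 48 follow directly from the coefficient table. A minimal Python check (the function name is ours; the small last-digit differences from the printed answer choices come from rounding of the reported coefficients):

```python
def predict_quadratic(x):
    """Fitted quadratic model from the table above:
    y-hat = 707.9144 + 2.903307*x + 11.91297*x**2."""
    return 707.9144 + 2.903307 * x + 11.91297 * x ** 2

print(round(predict_quadratic(10), 2))  # -> 1928.24 (choice b, 1,928.25)
print(round(predict_quadratic(20), 2))  # -> 5531.17 (choice a, 5,531.15)
```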
49. A multiple regression analysis produced the following tables.
              Coefficients  Standard Error  t Statistic  p-value
Intercept     1411.876      762.1533        1.852483     0.074919
x1            35.18215      96.8433         0.363289     0.719218
x1^2          7.721648      3.007943        2.567086     0.016115

              df   SS          MS          F
Regression    2    58567032    29283516    57.34861
Residual      25   12765573    510622.9
Total         27   71332605
The regression equation for this analysis is ____________.
a) y = 762.1533 + 96.8433 x1 + 3.007943 x1^2
b) y = 1411.876 + 762.1533 x1 + 1.852483 x1^2
c) y = 1411.876 + 35.18215 x1 + 7.721648 x1^2
d) y = 762.1533 + 1.852483 x1 + 0.074919 x1^2
e) y = 762.1533 - 1.852483 x1 + 0.074919 x1^2
Ans: c
Response: See section 14.1 Nonlinear Models: Mathematical Transformation
Difficulty: Easy
50. A multiple regression analysis produced the following tables.
              Coefficients  Standard Error  t Statistic  p-value
Intercept     1411.876      762.1533        1.852483     0.074919
x1            35.18215      96.8433         0.363289     0.719218
x1^2          7.721648      3.007943        2.567086     0.016115

              df   SS          MS          F
Regression    2    58567032    29283516    57.34861
Residual      25   12765573    510622.9
Total         27   71332605
The sample size for this analysis is ____________.
a) 28
b) 25
c) 30
d) 27
e) 2
Ans: a
Response: See section 14.1 Nonlinear Models: Mathematical Transformation
Difficulty: Easy
51. A multiple regression analysis produced the following tables.
              Coefficients  Standard Error  t Statistic  p-value
Intercept     1411.876      762.1533        1.852483     0.074919
x1            35.18215      96.8433         0.363289     0.719218
x1^2          7.721648      3.007943        2.567086     0.016115

              df   SS          MS          F
Regression    2    58567032    29283516    57.34861
Residual      25   12765573    510622.9
Total         27   71332605
Using  = 0.05 to test the null hypothesis H0: 1 = 2 = 0, the critical F value is ____.
a) 4.24
b) 3.39
c) 5.57
d) 3.35
e) 2.35
Ans: b
Response: See section 14.1 Nonlinear Models: Mathematical Transformation
Difficulty: Medium
52. A multiple regression analysis produced the following tables.
              Coefficients  Standard Error  t Statistic  p-value
Intercept     1411.876      762.1533        1.852483     0.074919
x1            35.18215      96.8433         0.363289     0.719218
x1^2          7.721648      3.007943        2.567086     0.016115

              df   SS          MS          F
Regression    2    58567032    29283516    57.34861
Residual      25   12765573    510622.9
Total         27   71332605
Using  = 0.10 to test the null hypothesis H0: 1 = 0, the critical t value is ____.
a) ± 1.316
b) ± 1.314
c) ± 1.703
d) ± 1.780
e) ± 1.708
Ans: e
Response: See section 14.1 Nonlinear Models: Mathematical Transformation
Difficulty: Medium
53. A multiple regression analysis produced the following tables.
              Coefficients  Standard Error  t Statistic  p-value
Intercept     1411.876      762.1533        1.852483     0.074919
x1            35.18215      96.8433         0.363289     0.719218
x1^2          7.721648      3.007943        2.567086     0.016115

              df   SS          MS          F
Regression    2    58567032    29283516    57.34861
Residual      25   12765573    510622.9
Total         27   71332605
Using  = 0.10 to test the null hypothesis H0: 2 = 0, the critical t value is ____.
a) ± 1.316
b) ± 1.314
c) ± 1.703
d) ± 1.780
e) ± 1.708
Ans: e
Response: See section 14.1 Nonlinear Models: Mathematical Transformation
Difficulty: Medium
54. A multiple regression analysis produced the following tables.
              Coefficients  Standard Error  t Statistic  p-value
Intercept     1411.876      762.1533        1.852483     0.074919
x1            35.18215      96.8433         0.363289     0.719218
x1^2          7.721648      3.007943        2.567086     0.016115

              df   SS          MS          F
Regression    2    58567032    29283516    57.34861
Residual      25   12765573    510622.9
Total         27   71332605
For x1= 10, the predicted value of y is ____________.
a) 8.88.
b) 2,031.38
c) 2,53.86
d) 262.19
e) 2,535.86
Ans: e
Response: See section 14.1 Nonlinear Models: Mathematical Transformation
Difficulty: Medium
55. A multiple regression analysis produced the following tables.
              Coefficients  Standard Error  t Statistic  p-value
Intercept     1411.876      762.1533        1.852483     0.074919
x1            35.18215      96.8433         0.363289     0.719218
x1^2          7.721648      3.007943        2.567086     0.016115

              df   SS          MS          F
Regression    2    58567032    29283516    57.34861
Residual      25   12765573    510622.9
Total         27   71332605
For x1= 20, the predicted value of y is ____________.
a) 5,204.18.
b) 2,031.38
c) 2,538.86
d) 6262.19
e) 6,535.86
Ans: a
Response: See section 14.1 Nonlinear Models: Mathematical Transformation
Difficulty: Medium
56. After transforming the y-variable values into log y, a regression analysis produced the
following tables.

              Coefficients  Standard Error  t Statistic  p-value
Intercept     2.005349      0.097351        20.59923     4.81E-18
x             0.027126      0.009518        2.849843     0.008275

              df   SS         MS         F         p-value
Regression    1    0.196642   0.196642   8.121607  0.008447
Residual      26   0.629517   0.024212
Total         27   0.826159

For x = 10, the predicted value of y is ____________.
a) 155.79
b) 1.25
c) 2.42
d) 189.06
e) 18.90
Ans: d
Response: See section 14.1 Nonlinear Models: Mathematical Transformation
Difficulty: Easy
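Question 56's back-transformation can be checked numerically, assuming the transform was a base-10 logarithm of y (an assumption, but one consistent with the answer choices):

```python
def predict_logy_model(x):
    """Fitted model on log10(y): log10(y-hat) = 2.005349 + 0.027126*x,
    so the prediction on the original scale is 10 ** (...)."""
    return 10 ** (2.005349 + 0.027126 * x)

print(round(predict_logy_model(10), 2))  # -> 189.06 (choice d)
```

Forgetting to undo the transform would give 2.28 (close to distractor c), which is why the back-transformation step matters.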
57. Which of the following iterative search procedures for model-building in a multiple
regression analysis reevaluates the contribution of variables previously included in the model
after entering a new independent variable?
a) Backward elimination
b) Stepwise regression
c) Forward selection
d) All possible regressions
e) Backward selection
Ans: b
Response: See section 14.3 Model-Building: Search Procedures
Difficulty: Medium
58. Which of the following iterative search procedures for model-building in a multiple
regression analysis starts with all independent variables in the model and then drops
nonsignificant independent variables in a step-by-step manner?
a) Backward elimination
b) Stepwise regression
c) Forward selection
d) All possible regressions
e) Backward selection
Ans: a
Response: See section 14.3 Model-Building: Search Procedures
Difficulty: Medium
59. Which of the following iterative search procedures for model-building in a multiple
regression analysis adds variables to the model as it proceeds, but does not reevaluate the
contribution of previously entered variables?
a) Backward elimination
b) Stepwise regression
c) Forward selection
d) All possible regressions
e) Forward elimination
Ans: c
Response: See section 14.3 Model-Building: Search Procedures
Difficulty: Medium
60. An "all possible regressions" search of a data set containing 7 independent variables will
produce ______ regressions.
a) 13
b) 127
c) 48
d) 64
e) 97
Ans: b
Response: See section 14.3 Model-Building: Search Procedures
Difficulty: Easy
61. An "all possible regressions" search of a data set containing 4 independent variables will
produce ______ regressions.
a) 15
b) 12
c) 8
d) 4
e) 2
Ans: a
Response: See section 14.3 Model-Building: Search Procedures
Difficulty: Easy
62. An "all possible regressions" search of a data set containing 9 independent variables will
produce ______ regressions.
a) 9
b) 18
c) 115
d) 151
e) 511
Ans: e
Response: See section 14.3 Model-Building: Search Procedures
Difficulty: Easy
63. An "all possible regressions" search of a data set containing "k" independent variables will
produce __________ regressions.
a) 2k - 1
b) 2^(k-1)
c) k^2 - 1
d) 2^k - 1
e) 2^k
Ans: d
Response: See section 14.3 Model-Building: Search Procedures
Difficulty: Medium
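The count used in questions 60–63 is simply 2^k − 1 (every nonempty subset of the k predictors defines one model). A one-line Python check of the three cases above:

```python
def num_all_possible_regressions(k):
    """Number of distinct models using at least one of k predictors."""
    return 2 ** k - 1

for k in (4, 7, 9):
    print(k, num_all_possible_regressions(k))  # -> 15, 127, 511
```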
64. Inspection of the following table of correlation coefficients for variables in a multiple
regression analysis reveals that the first independent variable entered by the forward selection
procedure will be ___________.
       y         x1        x2        x3        x4        x5
y      1
x1     -0.1661   1
x2     0.231849  -0.51728  1
x3     0.423522  -0.22264  -0.00734  1
x4     -0.33227  0.028957  -0.49869  0.260586  1
x5     0.199796  -0.20467  0.078916  0.207477  0.023839  1
a) x2
b) x3
c) x4
d) x5
e) x1
Ans: b
Response: See section 14.3 Model-Building: Search Procedures
Difficulty: Medium
65. Inspection of the following table of correlation coefficients for variables in a multiple
regression analysis reveals that the first independent variable entered by the forward selection
procedure will be ___________.
       y         x1        x2        x3        x4        x5
y      1
x1     -0.44008  1
x2     0.566053  -0.51728  1
x3     0.064919  -0.22264  -0.00734  1
x4     -0.35711  0.028957  -0.49869  0.260586  1
x5     0.426363  -0.20467  0.078916  0.207477  0.023839  1
a) x1
b) x2
c) x3
d) x4
e) x5
Ans: b
Response: See section 14.3 Model-Building: Search Procedures
Difficulty: Easy
66. Inspection of the following table of correlation coefficients for variables in a multiple
regression analysis reveals that the first independent variable that will be entered into the
regression model by the forward selection procedure will be ___________.
       y         x1        x2        x3        x4        x5
y      1
x1     -0.0857   1
x2     -0.20246  0.868358  1
x3     -0.22631  -0.10604  -0.14853  1
x4     -0.28175  -0.0685   0.41468   -0.14151  1
x5     0.271105  0.150796  0.129388  -0.15243  0.00821   1
a) x1
b) x2
c) x3
d) x4
e) x5
Ans: d
Response: See section 14.3 Model-Building: Search Procedures
Difficulty: Easy
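Forward selection enters first the predictor with the largest absolute correlation with y, which is how questions 64–67 are solved by inspection. A minimal Python sketch (the function name is ours), using the y-column correlations from question 66:

```python
def first_forward_selection(corr_with_y):
    """Pick the predictor whose correlation with y has the
    largest absolute value (the first forward-selection entry)."""
    return max(corr_with_y, key=lambda name: abs(corr_with_y[name]))

# Correlations of each predictor with y, from the table in question 66.
r = {"x1": -0.0857, "x2": -0.20246, "x3": -0.22631,
     "x4": -0.28175, "x5": 0.271105}
print(first_forward_selection(r))  # -> x4
```

Note that the sign is irrelevant: |−0.28175| beats 0.271105, so x4 enters first even though its correlation is negative.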
67. Inspection of the following table of correlation coefficients for variables in a multiple
regression analysis reveals that the first independent variable that will be entered into the
regression model by the forward selection procedure will be ___________.
       y         x1        x2        x3        x4        x5
y      1
x1     0.854168  1
x2     -0.11828  -0.00383  1
x3     -0.12003  -0.08499  -0.14523  1
x4     0.525901  0.118169  -0.14876  0.050042  1
x5     -0.18105  -0.07371  0.995886  -0.14151  -0.16934  1
a) x1
b) x2
c) x3
d) x4
e) x5
Ans: a
Response: See section 14.3 Model-Building: Search Procedures
Difficulty: Easy
68. Large correlations between two or more independent variables in a multiple regression model
could result in the problem of ________.
a) multicollinearity
b) autocorrelation
c) partial correlation
d) rank correlation
e) non-normality
Ans: a
Response: See section 14.4 Multicollinearity
Difficulty: Medium
69. An appropriate method to identify multicollinearity in a regression model is to ____.
a) examine a residual plot
b) examine the ANOVA table
c) examine a correlation matrix
d) examine the partial regression coefficients
e) examine the R^2 of the regression model
Ans: c
Response: See section 14.4 Multicollinearity
Difficulty: Medium
70. An acceptable method of managing multicollinearity in a regression model is to ____.
a) use the forward selection procedure
b) use the backward elimination procedure
c) use the forward elimination procedure
d) use the stepwise regression procedure
e) use all possible regressions
Ans: d
Response: See section 14.4 Multicollinearity
Difficulty: Medium
71. A useful technique in controlling multicollinearity involves the _________.
a) use of variance inflation factors
b) use of the backward elimination procedure
c) use of the forward elimination procedure
d) use of the forward selection procedure
e) use of all possible regressions
Ans: a
Response: See section 14.4 Multicollinearity
Difficulty: Medium
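The variance inflation factor named in question 71 is VIF_j = 1 / (1 − R²_j), where R²_j comes from regressing x_j on the remaining predictors. A minimal sketch with hypothetical R² values (values above about 10 are commonly read as a multicollinearity signal):

```python
def vif(r_squared_j):
    """Variance inflation factor for predictor j, given the R**2
    from regressing x_j on the other predictors."""
    return 1.0 / (1.0 - r_squared_j)

# Hypothetical R**2 values, for illustration only.
print(round(vif(0.50), 2))  # -> 2.0, little cause for concern
print(round(vif(0.95), 2))  # -> 20.0, a strong multicollinearity signal
```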
72. Inspection of the following table of correlation coefficients for variables in a multiple
regression analysis reveals potential multicollinearity with variables ___________.
       y         x1        x2        x3        x4        x5
y      1
x1     -0.0857   1
x2     -0.20246  0.868358  1
x3     -0.22631  -0.10604  -0.14853  1
x4     -0.28175  -0.0685   0.41468   -0.14151  1
x5     0.271105  0.150796  0.129388  -0.15243  0.00821   1
a) x1 and x2
b) x1 and x4
c) x4 and x5
d) x4 and x3
e) x5 and y
Ans: a
Response: See section 14.4 Multicollinearity
Difficulty: Medium
73. Inspection of the following table of correlation coefficients for variables in a multiple
regression analysis reveals potential multicollinearity with variables ___________.
       y         x1        x2        x3        x4        x5
y      1
x1     -0.08301  1
x2     0.236745  -0.51728  1
x3     0.155149  -0.22264  -0.00734  1
x4     0.022234  -0.58079  0.884216  0.131956  1
x5     0.4808    -0.20467  0.078916  0.207477  0.103831  1
a) x1 and x5
b) x2 and x3
c) x4 and x2
d) x4 and x3
e) x4 and y
Ans: c
Response: See section 14.4 Multicollinearity
Difficulty: Medium
74. Inspection of the following table of correlation coefficients for variables in a multiple
regression analysis reveals potential multicollinearity with variables ___________.
       y         x1        x2        x3        x4        x5
y      1
x1     0.854168  1
x2     -0.11828  -0.00383  1
x3     -0.12003  -0.08499  -0.14523  1
x4     0.525901  0.118169  -0.14876  0.050042  1
x5     -0.18105  -0.07371  0.995886  -0.14151  -0.16934  1
a) x1 and x2
b) x1 and x5
c) x3 and x4
d) x2 and x5
e) x3 and x5
Ans: d
Response: See section 14.4 Multicollinearity
Difficulty: Medium
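Screening a correlation matrix for multicollinearity, as in questions 72–74, amounts to flagging predictor pairs with large absolute correlations. A minimal Python sketch (the function name and the 0.8 cutoff are ours), using the off-diagonal values from question 74:

```python
def flag_collinear_pairs(pairwise_r, threshold=0.8):
    """Return predictor pairs whose absolute correlation exceeds
    the threshold -- a common screen for multicollinearity."""
    return [(a, b) for a, b, r in pairwise_r if abs(r) > threshold]

# Off-diagonal predictor correlations from the table in question 74.
pairs = [("x1", "x2", -0.00383), ("x1", "x3", -0.08499),
         ("x1", "x4", 0.118169), ("x1", "x5", -0.07371),
         ("x2", "x3", -0.14523), ("x2", "x4", -0.14876),
         ("x2", "x5", 0.995886), ("x3", "x4", 0.050042),
         ("x3", "x5", -0.14151), ("x4", "x5", -0.16934)]
print(flag_collinear_pairs(pairs))  # -> [('x2', 'x5')]
```

Only x2 and x5 are flagged, matching the answer to question 74; correlations with y itself are not a multicollinearity concern.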
75. Carlos Cavazos, Director of Human Resources, is exploring employee absenteeism at the
Plano Piano Plant. A multiple regression analysis was performed using the following variables.
The results are presented below.
Variable   Description
y          number of days absent last fiscal year
x1         commuting distance (in miles)
x2         employee's age (in years)
x3         single-parent household (0 = yes, 1 = no)
x4         length of employment at PPP (in years)
x5         shift (0 = day, 1 = night)

              Coefficients  Standard Error  t Statistic  p-value
Intercept     6.594146      3.273005        2.014707     0.047671
x1            -0.18019      0.141949        -1.26939     0.208391
x2            0.268156      0.260643        1.028828     0.307005
x3            -2.31068      0.962056        -2.40182     0.018896
x4            -0.50579      0.270872        -1.86725     0.065937
x5            2.329513      0.940321        2.47736      0.015584

              df   SS         MS        F         p-value
Regression    5    279.358    55.8716   4.423755  0.001532
Residual      67   846.2036   12.6299
Total         72   1125.562

R = 0.498191   R^2 = 0.248194   Adj R^2 = 0.192089
se = 3.553858  n = 73
Which of the following conclusions can be drawn from the above results?
a) All the independent variables in the regression are significant at 5% level.
b) Commuting distance is a highly significant (<1%) variable in explaining absenteeism.
c) Age of the employees tends to have a very significant (<1%) effect on absenteeism.
d) This model explains a little over 49% of the variability in absenteeism data.
e) An employee from a single-parent household is expected to be absent fewer days, all other
variables held constant, than one who is not from a single-parent household.
Ans: e
Response: See section 14.3 Model-Building: Search Procedures
Difficulty: Hard