

Question

(a) Regress CM on FLR (Dependent Variable: CM; Sample: 64):

Variable     Coefficient    s.e.        t statistic    Prob
Constant     263.8635       12.22499    21.58395       0.0000
FLR          -2.3905        0.213263    -11.2092       0.0000
R-squared = 0.66959

(b) Regress CM on FLR and PGNP:

Variable     Coefficient    s.e.        t statistic    Prob
Constant     263.6416       11.59318    22.74109       0.0000
FLR          -2.23159       0.209947    -10.6293       0.0000
PGNP         0.00565        0.002003    2.8187         0.0065
R-squared = 0.707665

(c) Regress CM on FLR, PGNP, and TFR:

Variable     Coefficient    s.e.        t statistic    Prob
Constant     263.6416       11.59318    22.74109       0.0000
FLR          -2.23159       0.209947    -10.6293       0.0000
PGNP         0.00565        0.002003    2.8187         0.0065
TFR          12.86864       4.190533    3.070883       0.0032
R-squared = 0.747372

(1) Given the various regression results, which model would you choose and why?
(2) Test the overall significance of each estimated regression model.
(3) If the regression model in (c) is the correct model, but you estimate (a) and (b), what are the consequences?
(4) Suppose you have regressed CM on FLR only, as in (a): how would you decide whether it is worth adding the variables PGNP and TFR to the model? Which test would you use? Show the necessary calculations.

Explanation / Answer

1)

Among the three specifications, model (c) is the preferred choice: every predictor (FLR, PGNP, and TFR) is statistically significant (each p-value is below α = 0.05), and it has the highest R² (0.747372).
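
Because R² can only rise as regressors are added, a fairer comparison is the adjusted R². A minimal sketch of that check in Python, using only the sample size (n = 64) and the R² values reported above:

```python
# Adjusted R^2 = 1 - (1 - R^2) * (n - 1) / (n - k - 1),
# where n is the sample size and k the number of slope coefficients.
n = 64
models = {"(a)": (0.66959, 1), "(b)": (0.707665, 2), "(c)": (0.747372, 3)}

for label, (r2, k) in models.items():
    adj_r2 = 1 - (1 - r2) * (n - 1) / (n - k - 1)
    print(f"Model {label}: R2 = {r2:.4f}, adjusted R2 = {adj_r2:.4f}")
```

Model (c) also has the highest adjusted R² (roughly 0.73, versus about 0.70 for (b) and 0.66 for (a)), so the extra regressors earn their place.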

2)

The overall significance of each model is tested with an F test of the null hypothesis that all slope coefficients are jointly zero. If F > F_crit (equivalently, if the p-value is below α = 0.05), we reject the null and conclude that at least one of the regressors has a significant effect on CM. The F statistic can be computed directly from the reported R² as F = (R²/k) / ((1 - R²)/(n - k - 1)), where k is the number of slope coefficients and n = 64; since every slope is already individually significant (all p-values below 0.05), it is no surprise that all three models pass this joint test, as the calculation below confirms.
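
A minimal sketch of that computation in Python (assuming SciPy is available for the p-values):

```python
# Overall F statistic from R^2: F = (R^2 / k) / ((1 - R^2) / (n - k - 1)),
# where k is the number of slope coefficients and n the sample size.
from scipy.stats import f

def overall_f(r2, n, k):
    """Return the overall F statistic and its p-value."""
    f_stat = (r2 / k) / ((1 - r2) / (n - k - 1))
    return f_stat, f.sf(f_stat, k, n - k - 1)

n = 64
for label, r2, k in [("(a)", 0.66959, 1), ("(b)", 0.707665, 2), ("(c)", 0.747372, 3)]:
    f_stat, p = overall_f(r2, n, k)
    print(f"Model {label}: F({k}, {n - k - 1}) = {f_stat:.2f}, p = {p:.2e}")
```

This gives roughly F(1, 62) ≈ 125.6 for (a), F(2, 61) ≈ 73.8 for (b), and F(3, 60) ≈ 59.2 for (c), all far above the corresponding 5% critical values (about 4.00, 3.15, and 2.76), so every model is significant overall.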

3)

In that case we are committing a specification error by omitting relevant variables: TFR in (b), and both PGNP and TFR in (a). To the extent that the omitted variables are correlated with the included ones, the OLS estimates of the remaining coefficients are biased and inconsistent, the estimated error variance and standard errors are distorted, and the model explains less of the variation in CM (lower R²). Hypothesis tests and forecasts based on (a) or (b) can therefore be misleading.
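
For intuition, here is a small illustrative simulation with synthetic data (not the CM data set, and purely hypothetical coefficients) showing how omitting a regressor that is correlated with an included one biases the estimated slope:

```python
# Synthetic illustration of omitted-variable bias.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
x1 = rng.normal(size=n)
x2 = 0.8 * x1 + rng.normal(size=n)          # x2 is correlated with x1
y = 2.0 + 1.5 * x1 - 3.0 * x2 + rng.normal(size=n)

# Correct model: slope on x1 recovered close to its true value of 1.5.
X_full = np.column_stack([np.ones(n), x1, x2])
b_full = np.linalg.lstsq(X_full, y, rcond=None)[0]

# Misspecified model omitting x2: the slope on x1 absorbs -3.0 * 0.8 = -2.4
# of bias and ends up near -0.9 instead of 1.5.
X_short = np.column_stack([np.ones(n), x1])
b_short = np.linalg.lstsq(X_short, y, rcond=None)[0]

print("slope on x1, full model:  ", round(b_full[1], 3))
print("slope on x1, x2 omitted:  ", round(b_short[1], 3))
```

The same mechanism applies here: dropping TFR (and PGNP) pushes part of their effect into the FLR coefficient.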

4)

To decide whether it is worth adding PGNP and TFR, compare the restricted model (a) (CM on FLR only) with the unrestricted model (c) (CM on FLR, PGNP, and TFR) using the incremental (partial) F test; this is also the criterion applied at each step of a forward-selection procedure, where variables keep being added only as long as they significantly improve the fit. The test statistic is F = ((R²_UR - R²_R)/m) / ((1 - R²_UR)/(n - k - 1)), where m = 2 is the number of added regressors, k = 3 is the number of slope coefficients in the unrestricted model, and n = 64; the added variables are worth keeping if this F exceeds the critical value.
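
A minimal sketch of the calculation in Python (assuming SciPy is available for the p-value), using the R² values of models (a) and (c):

```python
# Incremental (partial) F test for adding PGNP and TFR to the FLR-only model:
# F = ((R2_new - R2_old) / m) / ((1 - R2_new) / (n - k - 1)),
# where m is the number of added regressors and k the number of slope
# coefficients in the larger (unrestricted) model.
from scipy.stats import f

def partial_f(r2_old, r2_new, n, k, m):
    """Return the partial F statistic and its p-value."""
    f_stat = ((r2_new - r2_old) / m) / ((1 - r2_new) / (n - k - 1))
    return f_stat, f.sf(f_stat, m, n - k - 1)

# R^2 of model (a) = 0.66959, R^2 of model (c) = 0.747372, n = 64, k = 3, m = 2
f_stat, p = partial_f(r2_old=0.66959, r2_new=0.747372, n=64, k=3, m=2)
print(f"F(2, 60) = {f_stat:.2f}, p = {p:.4f}")
```

The statistic works out to roughly F(2, 60) ≈ 9.24, well above the 5% critical value of about 3.15 (p ≈ 0.0003), so adding PGNP and TFR to the FLR-only model is clearly worthwhile.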