Table of contents for Basic econometrics / Damodar N. Gujarati, Dawn C. Porter.

Bibliographic record and links to related information available from the Library of Congress catalog.

Note: Contents data are machine generated from pre-publication information provided by the publisher. Contents may vary from the printed book, be incomplete, or contain other coding.


Contents 
PREFACE xvi
ACKNOWLEDGMENTS xix
Introduction 1
I.1 WHAT IS ECONOMETRICS? 1
I.2 WHY A SEPARATE DISCIPLINE? 2
I.3 METHODOLOGY OF ECONOMETRICS 3
1. Statement of Theory or Hypothesis 4
2. Specification of the Mathematical Model of Consumption 4
3. Specification of the Econometric Model of Consumption 5
4. Obtaining Data 6
5. Estimation of the Econometric Model 7
6. Hypothesis Testing 8
7. Forecasting or Prediction 8
8. Use of the Model for Control or Policy Purposes 9
Choosing among Competing Models 10
I.4 TYPES OF ECONOMETRICS 12
I.5 MATHEMATICAL AND STATISTICAL PREREQUISITES 12
I.6 THE ROLE OF THE COMPUTER 13
I.7 SUGGESTIONS FOR FURTHER READING 13
PART I SINGLE-EQUATION REGRESSION MODELS 15
1 The Nature of Regression Analysis 17
1.1 HISTORICAL ORIGIN OF THE TERM REGRESSION 17
1.2 THE MODERN INTERPRETATION OF REGRESSION 18
Examples 18
1.3 STATISTICAL VERSUS DETERMINISTIC RELATIONSHIPS 22
1.4 REGRESSION VERSUS CAUSATION 22
1.5 REGRESSION VERSUS CORRELATION 23
1.6 TERMINOLOGY AND NOTATION 24
1.7 THE NATURE AND SOURCES OF DATA FOR
ECONOMIC ANALYSIS 25
Types of Data 25
The Sources of Data 29
The Accuracy of Data 29
A Note on the Measurement Scales of Variables 30
1.8 SUMMARY AND CONCLUSIONS 31
EXERCISES 32
2 Two-Variable Regression Analysis:
Some Basic Ideas 37
2.1 A HYPOTHETICAL EXAMPLE 37
2.2 THE CONCEPT OF POPULATION REGRESSION
FUNCTION (PRF) 41
2.3 THE MEANING OF THE TERM LINEAR 42
Linearity in the Variables 42
Linearity in the Parameters 42
2.4 STOCHASTIC SPECIFICATION OF PRF 43
2.5 THE SIGNIFICANCE OF THE STOCHASTIC
DISTURBANCE TERM 45
2.6 THE SAMPLE REGRESSION FUNCTION (SRF) 47
2.7 AN ILLUSTRATIVE EXAMPLE 51
2.8 SUMMARY AND CONCLUSIONS 52
EXERCISES 52
3 Two-Variable Regression Model: The Problem
of Estimation 58
3.1 THE METHOD OF ORDINARY LEAST SQUARES 58
3.2 THE CLASSICAL LINEAR REGRESSION MODEL:
THE ASSUMPTIONS UNDERLYING THE METHOD
OF LEAST SQUARES 65
A Word about These Assumptions 75
3.3 PRECISION OR STANDARD ERRORS OF LEAST-SQUARES
ESTIMATES 76
3.4 PROPERTIES OF LEAST-SQUARES ESTIMATORS:
THE GAUSS–MARKOV THEOREM 79
3.5 THE COEFFICIENT OF DETERMINATION r2: A MEASURE
OF "GOODNESS OF FIT" 81
3.6 A NUMERICAL EXAMPLE 87
3.7 ILLUSTRATIVE EXAMPLES 90
3.8 A NOTE ON MONTE CARLO EXPERIMENTS 91
3.9 SUMMARY AND CONCLUSIONS 93
EXERCISES 94
APPENDIX 3A 100
3A.1 DERIVATION OF LEAST-SQUARES ESTIMATES 100
3A.2 LINEARITY AND UNBIASEDNESS PROPERTIES OF
LEAST-SQUARES ESTIMATORS 100
3A.3 VARIANCES AND STANDARD ERRORS OF
LEAST-SQUARES ESTIMATORS 101
3A.4 Covariance Between β̂1 and β̂2 93
3A.5 The Least-Squares Estimator of σ² 93
3A.6 Minimum-Variance Property 
of Least-Squares Estimators 95 
3A.7 Consistency of Least-Squares Estimators 96 
CHAPTER 4 
Classical Normal Linear Regression 
Model (CNLRM) 97 
4.1 The Probability Distribution 
of Disturbances ui 97 
4.2 The Normality Assumption for ui 98 
Why the Normality Assumption? 99 
4.3 Properties of OLS Estimators under 
the Normality Assumption 100 
4.4 The Method of Maximum 
Likelihood (ML) 102 
Summary and Conclusions 102 
Appendix 4A 103 
4A.1 Maximum Likelihood Estimation 
of Two-Variable Regression Model 103 
4A.2 Maximum Likelihood Estimation 
of Food Expenditure in India 105 
Appendix 4A Exercises 105 
CHAPTER 5 
Two-Variable Regression: Interval 
Estimation and Hypothesis Testing 107 
5.1 Statistical Prerequisites 107 
5.2 Interval Estimation: Some Basic Ideas 108 
5.3 Confidence Intervals for Regression 
Coefficients β1 and β2 109
Confidence Interval for β2 109
Confidence Interval for β1 and β2
Simultaneously 111
5.4 Confidence Interval for σ² 111
5.5 Hypothesis Testing: General 
Comments 113 
5.6 Hypothesis Testing: 
The Confidence-Interval Approach 113 
Two-Sided or Two-Tail Test 113 
One-Sided or One-Tail Test 115 
5.7 Hypothesis Testing: 
The Test-of-Significance Approach 115 
Testing the Significance of Regression 
Coefficients: The t Test 115 
Testing the Significance of σ²: The χ² Test 118
5.8 Hypothesis Testing: Some Practical 
Aspects 119 
The Meaning of "Accepting" or "Rejecting" a
Hypothesis 119 
The "Zero" Null Hypothesis and the "2-t" Rule
of Thumb 120 
Forming the Null and Alternative 
Hypotheses 121 
Choosing α, the Level of Significance 121
The Exact Level of Significance: 
The p Value 122 
Statistical Significance versus Practical 
Significance 123 
The Choice between Confidence-Interval 
and Test-of-Significance Approaches 
to Hypothesis Testing 124 
5.9 Regression Analysis and Analysis 
of Variance 124 
5.10 Application of Regression Analysis: 
The Problem of Prediction 126 
Mean Prediction 127 
Individual Prediction 128 
5.11 Reporting the Results of Regression 
Analysis 129 
5.12 Evaluating the Results of Regression 
Analysis 130 
Normality Tests 130 
Other Tests of Model Adequacy 132 
Summary and Conclusions 134 
Exercises 135 
Appendix 5A 143 
5A.1 Probability Distributions Related 
to the Normal Distribution 143 
5A.2 Derivation of Equation (5.3.2) 145 
5A.3 Derivation of Equation (5.9.1) 145 
5A.4 Derivations of Equations (5.10.2) 
and (5.10.6) 145 
Variance of Mean Prediction 145 
Variance of Individual Prediction 146 
CHAPTER 6 
Extensions of the Two-Variable Linear 
Regression Model 147 
6.1 Regression through the Origin 147 
r2 for Regression-through-Origin Model 150 
6.2 Scaling and Units of Measurement 154 
A Word about Interpretation 157 
6.3 Regression on Standardized Variables 157 
6.4 Functional Forms of Regression Models 159 
6.5 How to Measure Elasticity: The Log-Linear 
Model 159 
6.6 Semilog Models: Log–Lin and Lin–Log
Models 162 
How to Measure the Growth Rate: 
The Log–Lin Model 162
The Lin–Log Model 164
6.7 Reciprocal Models 166 
Log Hyperbola or Logarithmic Reciprocal 
Model 172 
6.8 Choice of Functional Form 172 
6.9 A Note on the Nature of the Stochastic Error 
Term: Additive versus Multiplicative 
Stochastic Error Term 174 
Summary and Conclusions 175 
Exercises 176 
Appendix 6A 182 
6A.1 Derivation of Least-Squares Estimators 
for Regression through the Origin 182 
6A.2 Proof That a Standardized Variable 
Has Zero Mean and Unit Variance 183 
6A.3 Logarithms 184 
6A.4 Growth Rate Formulas 186 
6A.5 Box-Cox Regression Model 187
CHAPTER 7 
Multiple Regression Analysis: 
The Problem of Estimation 188 
7.1 The Three-Variable Model: Notation 
and Assumptions 188 
7.2 Interpretation of Multiple Regression 
Equation 191 
7.3 The Meaning of Partial Regression 
Coefficients 191 
7.4 OLS and ML Estimation of the Partial 
Regression Coefficients 192 
OLS Estimators 192 
Variances and Standard Errors 
of OLS Estimators 194 
Properties of OLS Estimators 195 
Maximum Likelihood Estimators 196 
7.5 The Multiple Coefficient of Determination R2 
and the Multiple Coefficient 
of Correlation R 196 
7.6 An Illustrative Example 198 
Regression on Standardized Variables 199 
7.7 Simple Regression in the Context 
of Multiple Regression: Introduction to 
Specification Bias 200 
7.8 R2 and the Adjusted R2 201 
Comparing Two R2 Values 203 
Allocating R2 among Regressors 206 
The "Game" of Maximizing R̄2 206
7.9 The Cobb–Douglas Production Function:
More on Functional Form 207 
7.10 Polynomial Regression Models 210 
7.11 Partial Correlation Coefficients 213 
Explanation of Simple and Partial 
Correlation Coefficients 213 
Interpretation of Simple and Partial 
Correlation Coefficients 214 
Summary and Conclusions 215 
Exercises 216 
Appendix 7A 227 
7A.1 Derivation of OLS Estimators 
Given in Equations (7.4.3) to (7.4.5) 227 
7A.2 Equality between the Coefficients of PGNP 
in Equations (7.3.5) and (7.6.2) 229 
7A.3 Derivation of Equation (7.4.19) 229 
7A.4 Maximum Likelihood Estimation 
of the Multiple Regression Model 230 
7A.5 EViews Output of the Cobb–Douglas
Production Function in Equation (7.9.4) 231 
CHAPTER 8 
Multiple Regression Analysis: The Problem 
of Inference 233 
8.1 The Normality Assumption Once Again 233 
8.2 Hypothesis Testing in Multiple Regression: 
General Comments 234 
8.3 Hypothesis Testing about Individual 
Regression Coefficients 235 
8.4 Testing the Overall Significance of the Sample 
Regression 237 
The Analysis of Variance Approach to Testing the 
Overall Significance of an Observed Multiple 
Regression: The F Test 238 
Testing the Overall Significance of a Multiple 
Regression: The F Test 240 
An Important Relationship between 
R2 and F 241 
Testing the Overall Significance of a Multiple 
Regression in Terms of R2 242 
The "Incremental" or "Marginal" Contribution
of an Explanatory Variable 243 
8.5 Testing the Equality of Two Regression 
Coefficients 246 
8.6 Restricted Least Squares: Testing Linear 
Equality Restrictions 248 
The t-Test Approach 249 
The F-Test Approach: Restricted Least 
Squares 249 
General F Testing 252 
8.7 Testing for Structural or Parameter Stability 
of Regression Models: The Chow Test 254 
8.8 Prediction with Multiple Regression 259 
8.9 The Troika of Hypothesis Tests: The 
Likelihood Ratio (LR), Wald (W), and 
Lagrange Multiplier (LM) Tests 259 
8.10 Testing the Functional Form of Regression: 
Choosing between Linear and Log–Linear
Regression Models 260 
Summary and Conclusions 262 
Exercises 262 
Appendix 8A: Likelihood 
Ratio (LR) Test 274 
CHAPTER 9 
Dummy Variable Regression Models 277 
9.1 The Nature of Dummy Variables 277 
9.2 ANOVA Models 278
Caution in the Use of Dummy Variables 281 
9.3 ANOVA Models with Two Qualitative
Variables 283 
9.4 Regression with a Mixture of Quantitative 
and Qualitative Regressors: The ANCOVA 
Models 283 
9.5 The Dummy Variable Alternative 
to the Chow Test 285 
9.6 Interaction Effects Using Dummy 
Variables 288 
9.7 The Use of Dummy Variables in Seasonal 
Analysis 290 
9.8 Piecewise Linear Regression 295 
9.9 Panel Data Regression Models 297 
9.10 Some Technical Aspects of the Dummy 
Variable Technique 297 
The Interpretation of Dummy Variables 
in Semilogarithmic Regressions 297 
Dummy Variables and Heteroscedasticity 298 
Dummy Variables and Autocorrelation 299 
What Happens If the Dependent Variable 
Is a Dummy Variable? 299 
9.11 Topics for Further Study 300 
9.12 A Concluding Example 300 
Summary and Conclusions 304 
Exercises 305 
Appendix 9A: Semilogarithmic Regression 
with Dummy Regressor 314 
PART TWO 
RELAXING THE ASSUMPTIONS OF THE 
CLASSICAL MODEL 315 
CHAPTER 10 
Multicollinearity: What Happens 
If the Regressors Are Correlated? 320 
10.1 The Nature of Multicollinearity 321 
10.2 Estimation in the Presence of Perfect 
Multicollinearity 324 
10.3 Estimation in the Presence of "High"
but "Imperfect" Multicollinearity 325
10.4 Multicollinearity: Much Ado about Nothing? 
Theoretical Consequences 
of Multicollinearity 326 
10.5 Practical Consequences 
of Multicollinearity 327 
Large Variances and Covariances 
of OLS Estimators 328 
Wider Confidence Intervals 330 
"Insignificant" t Ratios 330
A High R2 but Few Significant t Ratios 331 
Sensitivity of OLS Estimators and Their 
Standard Errors to Small Changes in Data 331 
Consequences of Micronumerosity 332 
10.6 An Illustrative Example 332 
10.7 Detection of Multicollinearity 337 
10.8 Remedial Measures 342 
Do Nothing 342 
Rule-of-Thumb Procedures 342 
10.9 Is Multicollinearity Necessarily Bad? Maybe 
Not, If the Objective Is Prediction Only 347 
10.10 An Extended Example: The Longley 
Data 347 
Summary and Conclusions 350 
Exercises 351 
CHAPTER 11 
Heteroscedasticity: What Happens If 
the Error Variance Is Nonconstant? 365 
11.1 The Nature of Heteroscedasticity 365 
11.2 OLS Estimation in the Presence 
of Heteroscedasticity 370 
11.3 The Method of Generalized Least 
Squares (GLS) 371 
Difference between OLS and GLS 373 
11.4 Consequences of Using OLS in the Presence 
of Heteroscedasticity 374 
OLS Estimation Allowing for 
Heteroscedasticity 374 
OLS Estimation Disregarding 
Heteroscedasticity 374 
A Technical Note 376 
11.5 Detection of Heteroscedasticity 376 
Informal Methods 376 
Formal Methods 378 
11.6 Remedial Measures 389 
When σᵢ² Is Known: The Method of Weighted
Least Squares 389
When σᵢ² Is Not Known 391
11.7 Concluding Examples 395 
11.8 A Caution about Overreacting 
to Heteroscedasticity 400 
Summary and Conclusions 400 
Exercises 401 
Appendix 11A 409 
11A.1 Proof of Equation (11.2.2) 409 
11A.2 The Method of Weighted Least 
Squares 409 
11A.3 Proof That E(σ̂²) ≠ σ² in the Presence
of Heteroscedasticity 410
11A.4 White's Robust Standard Errors 411
CHAPTER 12 
Autocorrelation: What Happens If the Error 
Terms Are Correlated? 412 
12.1 The Nature of the Problem 413 
12.2 OLS Estimation in the Presence 
of Autocorrelation 418 
12.3 The BLUE Estimator in the Presence 
of Autocorrelation 422 
12.4 Consequences of Using OLS 
in the Presence of Autocorrelation 423 
OLS Estimation Allowing 
for Autocorrelation 423 
OLS Estimation Disregarding 
Autocorrelation 423 
12.5 Relationship between Wages and Productivity 
in the Business Sector of the United States, 
1960–2005 428
12.6 Detecting Autocorrelation 429 
I. Graphical Method 429 
II. The Runs Test 431 
III. Durbin–Watson d Test 434
IV. A General Test of Autocorrelation:
The Breusch–Godfrey (BG) Test 438
Why So Many Tests of Autocorrelation? 440 
12.7 What to Do When You Find Autocorrelation: 
Remedial Measures 440 
12.8 Model Mis-Specification versus Pure 
Autocorrelation 441 
12.9 Correcting for (Pure) Autocorrelation: 
The Method of Generalized Least 
Squares (GLS) 442 
When ρ Is Known 442
When ρ Is Not Known 443
12.10 The Newey–West Method of Correcting
the OLS Standard Errors 447 
12.11 OLS versus FGLS and HAC 448 
12.12 Additional Aspects of Autocorrelation 449 
Dummy Variables and Autocorrelation 449 
ARCH and GARCH Models 449 
Coexistence of Autocorrelation 
and Heteroscedasticity 450 
12.13 A Concluding Example 450 
Summary and Conclusions 452 
Exercises 453 
Appendix 12A 466 
12A.1 Proof That the Error Term vt in
Equation (12.1.11) Is Autocorrelated 466 
12A.2 Proof of Equations (12.2.3), (12.2.4), 
and (12.2.5) 466 
CHAPTER 13 
Econometric Modeling: Model Specification 
and Diagnostic Testing 467 
13.1 Model Selection Criteria 468 
13.2 Types of Specification Errors 468 
13.3 Consequences of Model Specification 
Errors 470 
Underfitting a Model (Omitting a Relevant 
Variable) 471 
Inclusion of an Irrelevant Variable 
(Overfitting a Model) 473 
13.4 Tests of Specification Errors 474 
Detecting the Presence of Unnecessary Variables 
(Overfitting a Model) 475 
Tests for Omitted Variables and Incorrect 
Functional Form 477 
13.5 Errors of Measurement 482 
Errors of Measurement in the Dependent 
Variable Y 482 
Errors of Measurement in the Explanatory 
Variable X 483 
13.6 Incorrect Specification of the Stochastic 
Error Term 486 
13.7 Nested versus Non-Nested Models 487 
13.8 Tests of Non-Nested Hypotheses 488 
The Discrimination Approach 488 
The Discerning Approach 488 
13.9 Model Selection Criteria 493 
The R2 Criterion 493 
Adjusted R2 493 
Akaike's Information Criterion (AIC) 494
Schwarz's Information Criterion (SIC) 494
Mallows's Cp Criterion 494
A Word of Caution about Model 
Selection Criteria 495 
Forecast Chi-Square (χ²) 496
13.10 Additional Topics in Econometric 
Modeling 496 
Outliers, Leverage, and Influence 496 
Recursive Least Squares 498 
Chow's Prediction Failure Test 498
Missing Data 499 
13.11 Concluding Examples 500 
1. A Model of Hourly Wage Determination 500 
2. Real Consumption Function for the United 
States, 1947–2000 505
13.12 Non-Normal Errors and Stochastic 
Regressors 509 
1. What Happens If the Error Term Is Not 
Normally Distributed? 509 
2. Stochastic Explanatory Variables 510 
13.13 A Word to the Practitioner 511 
Summary and Conclusions 512 
Exercises 513 
Appendix 13A 520 
13A.1 The Proof That E(b12) = β2 + β3b32
[Equation (13.3.3)] 520 
13A.2 The Consequences of Including an Irrelevant 
Variable: The Unbiasedness Property 520 
13A.3 The Proof of Equation (13.5.10) 521 
13A.4 The Proof of Equation (13.6.2) 522 
PART 3 
TOPICS IN ECONOMETRICS 523 
CHAPTER 14 
Nonlinear Regression Models 525 
14.1 Intrinsically Linear and Intrinsically 
Nonlinear Regression Models 525 
14.2 Estimation of Linear and Nonlinear 
Regression Models 527 
14.3 Estimating Nonlinear Regression Models: 
The Trial-and-Error Method 527 
14.4 Approaches to Estimating Nonlinear 
Regression Models 529 
Direct Search or Trial-and-Error 
or Derivative-Free Method 529 
Direct Optimization 529 
Iterative Linearization Method 530 
14.5 Illustrative Examples 530 
Summary and Conclusions 535 
Exercises 535 
Appendix 14A 537 
14A.1 Derivation of Equations (14.2.4) 
and (14.2.5) 537 
14A.2 The Linearization Method 537 
14A.3 Linear Approximation of the Exponential 
Function Given in Equation (14.2.2) 538 
CHAPTER 15 
Qualitative Response Regression Models 541 
15.1 The Nature of Qualitative Response 
Models 541 
15.2 The Linear Probability Model (LPM) 543 
Non-Normality of the Disturbances ui 544 
Heteroscedastic Variances 
of the Disturbances 544 
Nonfulfillment of 0 ≤ E(Yi | Xi) ≤ 1 545
Questionable Value of R2 as a Measure 
of Goodness of Fit 546 
15.3 Applications of LPM 549 
15.4 Alternatives to LPM 552 
15.5 The Logit Model 553 
15.6 Estimation of the Logit Model 555 
Data at the Individual Level 556 
Grouped or Replicated Data 556 
15.7 The Grouped Logit (Glogit) Model: A 
Numerical Example 558 
Interpretation of the Estimated Logit 
Model 558 
15.8 The Logit Model for Ungrouped 
or Individual Data 561 
15.9 The Probit Model 566 
Probit Estimation with Grouped 
Data: gprobit 567 
The Probit Model for Ungrouped 
or Individual Data 570 
The Marginal Effect of a Unit Change 
in the Value of a Regressor in the Various 
Regression Models 571 
15.10 Logit and Probit Models 571 
15.11 The Tobit Model 574 
Illustration of the Tobit Model: Ray Fair's Model
of Extramarital Affairs 575 
15.12 Modeling Count Data: The Poisson 
Regression Model 576 
15.13 Further Topics in Qualitative Response 
Regression Models 579 
Ordinal Logit and Probit Models 580 
Multinomial Logit and Probit Models 580 
Duration Models 580 
Summary and Conclusions 581 
Exercises 582 
Appendix 15A 589 
15A.1 Maximum Likelihood Estimation of the Logit 
and Probit Models for Individual (Ungrouped) 
Data 589 
CHAPTER 16 
Panel Data Regression Models 591 
16.1 Why Panel Data? 592 
16.2 Panel Data: An Illustrative Example 593 
16.3 Pooled OLS Regression or Constant 
Coefficients Model 594 
16.4 The Fixed Effect Least-Squares Dummy 
Variable (LSDV) Model 596 
A Caution in the Use of the Fixed Effect 
LSDV Model 598 
16.5 The Fixed-Effect Within-Group (WG) 
Estimator 599 
16.6 The Random Effects Model (REM) 602 
Breusch and Pagan Lagrange 
Multiplier Test 605 
16.7 Properties of Various Estimators 605 
16.8 Fixed Effects versus Random Effects Model: 
Some Guidelines 606 
16.9 Panel Data Regressions: Some Concluding 
Comments 607 
16.10 Some Illustrative Examples 607 
Summary and Conclusions 612 
Exercises 613 
CHAPTER 17 
Dynamic Econometric Models: 
Autoregressive and Distributed-Lag 
Models 617 
17.1 The Role of "Time," or "Lag,"
in Economics 618 
17.2 The Reasons for Lags 622 
17.3 Estimation of Distributed-Lag Models 623 
Ad Hoc Estimation of Distributed-Lag 
Models 623 
17.4 The Koyck Approach to Distributed-Lag 
Models 624 
The Median Lag 627 
The Mean Lag 627 
17.5 Rationalization of the Koyck Model: The 
Adaptive Expectations Model 629 
17.6 Another Rationalization of the Koyck Model: 
The Stock Adjustment, or Partial Adjustment, 
Model 632 
17.7 Combination of Adaptive Expectations 
and Partial Adjustment Models 634 
17.8 Estimation of Autoregressive Models 634 
17.9 The Method of Instrumental 
Variables (IV) 636 
17.10 Detecting Autocorrelation in Autoregressive 
Models: Durbin h Test 637 
17.11 A Numerical Example: The Demand for 
Money in Canada, 1979–I to 1988–IV 639
17.12 Illustrative Examples 642 
17.13 The Almon Approach to Distributed-Lag 
Models: The Almon or Polynomial Distributed 
Lag (PDL) 645 
17.14 Causality in Economics: The Granger 
Causality Test 652 
The Granger Test 653 
A Note on Causality and Exogeneity 657 
Summary and Conclusions 658 
Exercises 59 
Appendix 17A 669 
17A.1 The Sargan Test for the Validity 
of Instruments 669 
PART 4 
SIMULTANEOUS-EQUATION 
MODELS 671 
CHAPTER 18 
Simultaneous-Equation Models 673 
18.1 The Nature of Simultaneous-Equation 
Models 673 
18.2 Examples of Simultaneous-Equation 
Models 674 
18.3 The Simultaneous-Equation Bias: 
Inconsistency of OLS Estimators 679 
18.4 The Simultaneous-Equation Bias: A Numerical 
Example 682 
Summary and Conclusions 684 
Exercises 684 
CHAPTER 19 
The Identification Problem 689 
19.1 Notations and Definitions 689 
19.2 The Identification Problem 692 
Underidentification 692 
Just, or Exact, Identification 694 
Overidentification 697 
19.3 Rules for Identification 699 
The Order Condition of Identifiability 699 
The Rank Condition of Identifiability 700 
19.4 A Test of Simultaneity 703 
Hausman Specification Test 703 
19.5 Tests for Exogeneity 705 
Summary and Conclusions 706 
Exercises 706 
CHAPTER 20 
Simultaneous-Equation Methods 711 
20.1 Approaches to Estimation 711 
20.2 Recursive Models and Ordinary 
Least Squares 712 
20.3 Estimation of a Just Identified Equation: The 
Method of Indirect Least Squares (ILS) 715 
An Illustrative Example 715 
Properties of ILS Estimators 718 
20.4 Estimation of an Overidentified Equation: 
The Method of Two-Stage Least Squares 
(2SLS) 718 
20.5 2SLS: A Numerical Example 721 
20.6 Illustrative Examples 724 
Summary and Conclusions 730 
Exercises 730 
Appendix 20A 735 
20A.1 Bias in the Indirect Least-Squares 
Estimators 735 
20A.2 Estimation of Standard Errors of 2SLS 
Estimators 736 
CHAPTER 21 
Time Series Econometrics: 
Some Basic Concepts 737 
21.1 A Look at Selected U.S. Economic Time 
Series 738 
21.2 Key Concepts 739 
21.3 Stochastic Processes 740 
Stationary Stochastic Processes 740 
Nonstationary Stochastic Processes 741 
21.4 Unit Root Stochastic Process 744 
21.5 Trend Stationary (TS) and Difference 
Stationary (DS) Stochastic Processes 745 
21.6 Integrated Stochastic Processes 746 
Properties of Integrated Series 747 
21.7 The Phenomenon of Spurious 
Regression 747 
21.8 Tests of Stationarity 748 
1. Graphical Analysis 749 
2. Autocorrelation Function (ACF) 
and Correlogram 749 
Statistical Significance of Autocorrelation 
Coefficients 753 
21.9 The Unit Root Test 754 
The Augmented Dickey–Fuller (ADF)
Test 757 
Testing the Significance of More Than One 
Coefficient: The F Test 758 
The Phillips–Perron (PP) Unit
Root Tests 758 
Testing for Structural Changes 758 
A Critique of the Unit Root Tests 759 
21.10 Transforming Nonstationary Time Series 760 
Difference-Stationary Processes 760 
Trend-Stationary Processes 761 
21.11 Cointegration: Regression of a Unit 
Root Time Series on Another Unit Root 
Time Series 762 
Testing for Cointegration 763 
Cointegration and Error Correction 
Mechanism (ECM) 764 
21.12 Some Economic Applications 765 
Summary and Conclusions 768 
Exercises 769 
CHAPTER 22 
Time Series Econometrics: 
Forecasting 773 
22.1 Approaches to Economic Forecasting 773 
Exponential Smoothing Methods 774 
Single-Equation Regression Models 774 
Simultaneous-Equation Regression 
Models 774 
ARIMA Models 774 
VAR Models 775 
22.2 AR, MA, and ARIMA Modeling of Time 
Series Data 775 
An Autoregressive (AR) Process 775 
A Moving Average (MA) Process 776 
An Autoregressive and Moving Average (ARMA) 
Process 776 
An Autoregressive Integrated Moving 
Average (ARIMA) Process 776 
22.3 The Box–Jenkins (BJ) Methodology 777
22.4 Identification 778 
22.5 Estimation of the ARIMA Model 782 
22.6 Diagnostic Checking 782 
22.7 Forecasting 782 
22.8 Further Aspects of the BJ Methodology 784 
22.9 Vector Autoregression (VAR) 784 
Estimation of VAR 785
Forecasting with VAR 786 
VAR and Causality 787 
Some Problems with VAR Modeling 788 
An Application of VAR: A VAR Model of the Texas 
Economy 789 
22.10 Measuring Volatility in Financial Time Series: 
The ARCH and GARCH Models 791 
What to Do If ARCH Is Present 795 
A Word on the Durbin–Watson d and the ARCH
Effect 796 
A Note on the GARCH Model 796 
22.11 Concluding Examples 796 
Summary and Conclusions 798 
Exercises 799 
APPENDIX A 
A Review of Some Statistical Concepts 801 
A.1 Summation and Product Operators 801 
A.2 Sample Space, Sample Points, 
and Events 802 
A.3 Probability and Random Variables 802 
Probability 802 
Random Variables 803 
A.4 Probability Density Function (PDF) 803 
Probability Density Function of a Discrete 
Random Variable 803 
Probability Density Function of a Continuous 
Random Variable 804 
Joint Probability Density Functions 805 
Marginal Probability Density Function 805 
Statistical Independence 806 
A.5 Characteristics of Probability 
Distributions 808 
Expected Value 808 
Properties of Expected Values 809 
Variance 810 
Properties of Variance 811 
Covariance 811 
Properties of Covariance 812 
Correlation Coefficient 812 
Conditional Expectation and Conditional 
Variance 813 
Properties of Conditional Expectation 
and Conditional Variance 814 
Higher Moments of Probability 
Distributions 815 
A.6 Some Important Theoretical Probability 
Distributions 816 
Normal Distribution 816 
The χ² (Chi-Square) Distribution 819
Student's t Distribution 820
The F Distribution 821 
The Bernoulli Binomial Distribution 822 
Binomial Distribution 822 
The Poisson Distribution 823 
A.7 Statistical Inference: Estimation 823 
Point Estimation 823 
Interval Estimation 824 
Methods of Estimation 825 
Small-Sample Properties 826 
Large-Sample Properties 828 
A.8 Statistical Inference: Hypothesis Testing 831 
The Confidence Interval Approach 832 
The Test of Significance Approach 836 
References 837 
APPENDIX B 
Rudiments of Matrix Algebra 838 
B.1 Definitions 838 
Matrix 838 
Column Vector 838 
Row Vector 839 
Transposition 839 
Submatrix 839 
B.2 Types of Matrices 839 
Square Matrix 839 
Diagonal Matrix 839 
Scalar Matrix 840 
Identity, or Unit, Matrix 840 
Symmetric Matrix 840 
Null Matrix 840 
Null Vector 840 
Equal Matrices 840 
B.3 Matrix Operations 840 
Matrix Addition 840 
Matrix Subtraction 841 
Scalar Multiplication 841 
Matrix Multiplication 841 
Properties of Matrix Multiplication 842 
Matrix Transposition 843 
Matrix Inversion 843 
B.4 Determinants 843 
Evaluation of a Determinant 844 
Properties of Determinants 844 
Rank of a Matrix 845 
Minor 846 
Cofactor 846 
B.5 Finding the Inverse of a Square Matrix 847 
B.6 Matrix Differentiation 848 
References 848 
APPENDIX C 
The Matrix Approach to Linear Regression 
Model 849 
C.1 The k-Variable Linear Regression Model 849 
C.2 Assumptions of the Classical Linear 
Regression Model in Matrix Notation 851 
C.3 OLS Estimation 853 
An Illustration 855 
Variance–Covariance Matrix of β̂ 856
Properties of OLS Vector β̂ 858
C.4 The Coefficient of Determination R2 in Matrix 
Notation 858 
C.5 The Correlation Matrix 859 
C.6 Hypothesis Testing about Individual 
Regression Coefficients in Matrix 
Notation 859 
C.7 Testing the Overall Significance of Regression: 
Analysis of Variance in Matrix Notation 860 
C.8 Testing Linear Restrictions: General F Testing 
Using Matrix Notation 861 
C.9 Prediction Using Multiple Regression: Matrix 
Formulation 861 
Mean Prediction 861 
Variance of Mean Prediction 862 
Individual Prediction 862 
Variance of Individual Prediction 862 
C.10 Summary of the Matrix Approach: An 
Illustrative Example 863 
C.11 Generalized Least Squares (GLS) 867 
C.12 Summary and Conclusions 868 
Exercises 869 
Appendix CA 
CA.1 Derivation of k Normal or Simultaneous 
Equations 874 
CA.2 Matrix Derivation of Normal Equations 875 
CA.3 Variance–Covariance Matrix of β̂ 875
CA.4 BLUE Property of OLS Estimators 875
APPENDIX D 
Statistical Tables 877 
APPENDIX E 
Computer Output of EViews, MINITAB, 
Excel, and STATA 894 
E.1 EViews 894 
E.2 MINITAB 896 
E.3 Excel 897 
E.4 STATA 898 
E.5 Concluding Comments 898 
References 899 
APPENDIX F 
Economic Data on the World 
Wide Web 900 
Selected Bibliography 902 
INDEX

Library of Congress Subject Headings for this publication:

Econometrics.