The Hausman test is widely used to examine the endogeneity of explanatory variables in a regression model. To derive a well-defined asymptotic distribution for the Hausman test, the correlation between the instrumental variables and the error term must converge to zero. However, considerable correlation between the instruments and the error may remain in finite samples even though it eventually converges to zero. This article investigates the potential problems that such "pseudo-exogenous" instruments may create. Through Monte Carlo simulations, we show that the performance of the Hausman test deteriorates when the instruments are asymptotically exogenous but endogenous in finite samples.
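A minimal sketch of the setting the abstract describes, not the authors' actual simulation design: a single endogenous regressor, a single instrument whose loading on the error shrinks at a hypothetical 1/sqrt(n) rate (so it is asymptotically exogenous but correlated with the error in any finite sample), and a scalar Hausman statistic comparing OLS and IV estimates.

```python
import numpy as np


def hausman_stat(y, x, z):
    """Scalar Hausman statistic for one regressor and one instrument.

    Illustrative helper, not the paper's implementation. Compares the
    OLS and IV slope estimates; under exogeneity of x the statistic is
    asymptotically chi-square with 1 degree of freedom.
    """
    b_ols = (x @ y) / (x @ x)
    b_iv = (z @ y) / (z @ x)
    # Residual variance from the IV fit (consistent under the null
    # and the alternative in this simple setting).
    s2 = np.mean((y - b_iv * x) ** 2)
    var_ols = s2 / (x @ x)
    var_iv = s2 * (z @ z) / (z @ x) ** 2
    # By Cauchy-Schwarz, var_iv >= var_ols, so the denominator is >= 0.
    return (b_iv - b_ols) ** 2 / (var_iv - var_ols)


rng = np.random.default_rng(0)
n, beta = 200, 1.0
e = rng.normal(size=n)
# "Pseudo-exogenous" instrument: its correlation with the error e is
# nonzero in this finite sample but vanishes as n grows, via an
# illustrative 1/sqrt(n) loading (an assumption for this sketch).
z = rng.normal(size=n) + e / np.sqrt(n)
# Endogenous regressor: correlated with both the instrument and the error.
x = 0.8 * z + 0.5 * e + rng.normal(size=n)
y = beta * x + e

H = hausman_stat(y, x, z)
# Compare H to the chi-square(1) critical value (3.84 at the 5% level);
# finite-sample correlation between z and e distorts the test's size.
```

Repeating the draw many times and recording how often H exceeds 3.84 gives the empirical rejection rate that a Monte Carlo study of this kind would tabulate.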
Number of pages: 7
Journal: Communications in Statistics: Simulation and Computation
Publication status: Published - 2010 Feb 1
All Science Journal Classification (ASJC) codes
- Statistics and Probability
- Modelling and Simulation