James-Stein-type estimators in large samples with application to the least absolute deviations estimator

Tae Hwan Kim, Halbert White

Research output: Contribution to journal › Article

14 Citations (Scopus)

Abstract

We explore the extension of James–Stein–type estimators in a direction that enables them to preserve their superiority when the sample size goes to infinity. Instead of shrinking a base estimator toward a fixed point, we shrink it toward a data-dependent point. We provide an analytic expression for the asymptotic risk and bias of James–Stein–type estimators shrunk toward a data-dependent point and prove that they have smaller asymptotic risk than the base estimator. Shrinking an estimator toward a data-dependent point turns out to be equivalent to combining two random variables using the James–Stein rule. We propose a general combination scheme that includes random combination (the James–Stein combination) and the usual nonrandom combination as special cases. As an example, we apply our method to combine the least absolute deviations estimator and the least squares estimator. Our simulation study indicates that the resulting combination estimators have desirable finite-sample properties when errors are drawn from symmetric distributions. Finally, using stock return data, we present some empirical evidence that the combination estimators have the potential to improve out-of-sample prediction in terms of both mean squared error and mean absolute error.
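The combination idea described above can be illustrated with a schematic sketch: estimate a linear model by least absolute deviations (LAD), then shrink the LAD estimate toward the data-dependent least squares (OLS) point with a James-Stein-type weight. This is a minimal illustration, not the paper's derivation; the shrinkage constant `c`, the simple IRLS stand-in for a proper LAD solver, and the form of the shrinkage factor are hypothetical placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated linear model y = X @ beta + e with symmetric (Laplace) errors,
# the setting in which the paper's simulations favor the combination.
n, p = 200, 3
X = rng.normal(size=(n, p))
beta = np.array([1.0, -0.5, 2.0])
y = X @ beta + rng.laplace(scale=1.0, size=n)

# Base estimator: LAD, approximated here by iteratively reweighted
# least squares (a crude stand-in for a real LAD/quantile solver).
b_lad = np.linalg.lstsq(X, y, rcond=None)[0]
for _ in range(50):
    w = 1.0 / np.maximum(np.abs(y - X @ b_lad), 1e-6)
    Xw = X * w[:, None]
    b_lad = np.linalg.solve(X.T @ Xw, Xw.T @ y)

# Data-dependent shrinkage target: the OLS estimator.
b_ols = np.linalg.lstsq(X, y, rcond=None)[0]

# James-Stein-type combination: shrink LAD toward OLS.  The constant
# `c` and the positive-part factor below are illustrative only; the
# paper derives the risk-based choice, which is not reproduced here.
diff = b_lad - b_ols
c = 0.5                                        # hypothetical constant
shrink = max(0.0, 1.0 - c / (n * (diff @ diff)))
b_js = b_ols + shrink * diff                   # combined estimator
```

With a positive-part weight in [0, 1], the combined estimate is a convex combination of the two base estimates, componentwise between LAD and OLS, which is the "random combination" interpretation mentioned in the abstract.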

Original language: English
Pages (from-to): 697-705
Number of pages: 9
Journal: Journal of the American Statistical Association
Volume: 96
Issue number: 454
Publication status: Published - 2001 Jun 1

All Science Journal Classification (ASJC) codes

  • Statistics and Probability
  • Statistics, Probability and Uncertainty
