James-Stein-type estimators in large samples with application to the least absolute deviations estimator

Tae Hwan Kim, Halbert White

Research output: Contribution to journal › Article

14 Citations (Scopus)

Abstract

We explore the extension of James–Stein–type estimators in a direction that enables them to preserve their superiority when the sample size goes to infinity. Instead of shrinking a base estimator toward a fixed point, we shrink it toward a data-dependent point. We provide an analytic expression for the asymptotic risk and bias of James–Stein–type estimators shrunk toward a data-dependent point and prove that they have smaller asymptotic risk than the base estimator. Shrinking an estimator toward a data-dependent point turns out to be equivalent to combining two random variables using the James–Stein rule. We propose a general combination scheme that includes random combination (the James–Stein combination) and the usual nonrandom combination as special cases. As an example, we apply our method to combine the least absolute deviations estimator and the least squares estimator. Our simulation study indicates that the resulting combination estimators have desirable finite-sample properties when errors are drawn from symmetric distributions. Finally, using stock return data, we present some empirical evidence that the combination estimators have the potential to improve out-of-sample prediction in terms of both mean squared error and mean absolute error.
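
To make the combination idea concrete, the following minimal Python sketch shrinks an LAD estimate toward the data-dependent point given by the OLS estimate using a positive-part James-Stein weight. The simulated design, the choice of k - 2 as the shrinkage constant, and the particular quadratic form used to scale the weight are illustrative assumptions, not the exact construction of the paper.

```python
# Illustrative sketch only: a positive-part James-Stein combination of the
# LAD and OLS estimators.  The shrinkage constant (k - 2) and the quadratic
# form below are assumptions for illustration, not the paper's exact rule.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Simulated regression with heavy-tailed (Student-t) errors.
n, k = 200, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, k - 1))])
beta_true = np.array([1.0, 2.0, -1.0])
y = X @ beta_true + rng.standard_t(df=3, size=n)

# Least squares estimate (the data-dependent shrinkage point).
beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]

# Least absolute deviations estimate (the base estimator), found numerically.
def lad_obj(b):
    return np.abs(y - X @ b).sum()

beta_lad = minimize(lad_obj, beta_ols, method="Nelder-Mead").x

# Positive-part James-Stein weight: shrink beta_lad toward beta_ols.
diff = beta_lad - beta_ols
resid_ols = y - X @ beta_ols
sigma2 = float(resid_ols @ resid_ols) / (n - k)      # OLS error variance
d = float(diff @ (X.T @ X) @ diff) / sigma2          # scaled distance between estimates
c = k - 2
w = max(0.0, 1.0 - c / d) if d > 0 else 0.0

# Random (James-Stein) combination: w * LAD + (1 - w) * OLS.
beta_js = beta_ols + w * diff
print("OLS:", beta_ols)
print("LAD:", beta_lad)
print("JS combination:", beta_js)
```

When the two estimates are close relative to sampling noise, d is small, the weight collapses to zero, and the combination reduces to the OLS point; when they differ sharply, the weight stays near one and the combination remains close to the base LAD estimate. This mirrors the random combination described in the abstract, with a fixed weight recovering the usual nonrandom combination as a special case.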

Original language: English
Pages (from-to): 697-705
Number of pages: 9
Journal: Journal of the American Statistical Association
Volume: 96
Issue number: 454
DOIs: 10.1198/016214501753168352
Publication status: Published - 2001 Jun 1

Fingerprint

  • Stein-type Estimator
  • Least Absolute Deviation
  • Estimator
  • Dependent Data
  • Shrinking
  • Stock Returns
  • Symmetric Distributions
  • Deviation
  • Least Squares Estimator
  • Mean Squared Error
  • Sample Size
  • Random variable
  • Fixed point
  • Infinity
  • Simulation Study

All Science Journal Classification (ASJC) codes

  • Statistics and Probability
  • Statistics, Probability and Uncertainty

Cite this

@article{90760abf1b09423eab1c751597e4706d,
title = "James-stein-type estimators in large samples with application to the least absolute deviations estimator",
abstract = "We explore the extension of James–Stein–type estimators in a direction that enables them to preserve their superiority when the sample size goes to infinity. Instead of shrinking a base estimator toward a fixed point, we shrink it toward a data-dependent point. We provide an analytic expression for the asymptotic risk and bias of James–Stein–type estimators shrunk toward a data-dependent point and prove that they have smaller asymptotic risk than the base estimator. Shrinking an estimator toward a data-dependent point turns out to be equivalent to combining two random variables using the James–Stein rule. We propose a general combination scheme that includes random combination (the James–Stein combination) and the usual nonrandom combination as special cases. As an example, we apply our method to combine the least absolute deviations estimator and the least squares estimator. Our simulation study indicates that the resulting combination estimators have desirable finite-sample properties when errors are drawn from symmetric distributions. Finally, using stock return data, we present some empirical evidence that the combination estimators have the potential to improve out-of-sample prediction in terms of both mean squared error and mean absolute error.",
author = "Kim, {Tae Hwan} and Halbert White",
year = "2001",
month = "6",
day = "1",
doi = "10.1198/016214501753168352",
language = "English",
volume = "96",
pages = "697--705",
journal = "Journal of the American Statistical Association",
issn = "0162-1459",
publisher = "Taylor and Francis Ltd.",
number = "454",

}

James-Stein-type estimators in large samples with application to the least absolute deviations estimator. / Kim, Tae Hwan; White, Halbert.

In: Journal of the American Statistical Association, Vol. 96, No. 454, 01.06.2001, p. 697-705.

Research output: Contribution to journal › Article

TY - JOUR

T1 - James-Stein-type estimators in large samples with application to the least absolute deviations estimator

AU - Kim, Tae Hwan

AU - White, Halbert

PY - 2001/6/1

Y1 - 2001/6/1

N2 - We explore the extension of James–Stein–type estimators in a direction that enables them to preserve their superiority when the sample size goes to infinity. Instead of shrinking a base estimator toward a fixed point, we shrink it toward a data-dependent point. We provide an analytic expression for the asymptotic risk and bias of James–Stein–type estimators shrunk toward a data-dependent point and prove that they have smaller asymptotic risk than the base estimator. Shrinking an estimator toward a data-dependent point turns out to be equivalent to combining two random variables using the James–Stein rule. We propose a general combination scheme that includes random combination (the James–Stein combination) and the usual nonrandom combination as special cases. As an example, we apply our method to combine the least absolute deviations estimator and the least squares estimator. Our simulation study indicates that the resulting combination estimators have desirable finite-sample properties when errors are drawn from symmetric distributions. Finally, using stock return data, we present some empirical evidence that the combination estimators have the potential to improve out-of-sample prediction in terms of both mean squared error and mean absolute error.

AB - We explore the extension of James–Stein–type estimators in a direction that enables them to preserve their superiority when the sample size goes to infinity. Instead of shrinking a base estimator toward a fixed point, we shrink it toward a data-dependent point. We provide an analytic expression for the asymptotic risk and bias of James–Stein–type estimators shrunk toward a data-dependent point and prove that they have smaller asymptotic risk than the base estimator. Shrinking an estimator toward a data-dependent point turns out to be equivalent to combining two random variables using the James–Stein rule. We propose a general combination scheme that includes random combination (the James–Stein combination) and the usual nonrandom combination as special cases. As an example, we apply our method to combine the least absolute deviations estimator and the least squares estimator. Our simulation study indicates that the resulting combination estimators have desirable finite-sample properties when errors are drawn from symmetric distributions. Finally, using stock return data, we present some empirical evidence that the combination estimators have the potential to improve out-of-sample prediction in terms of both mean squared error and mean absolute error.

UR - http://www.scopus.com/inward/record.url?scp=1542678344&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=1542678344&partnerID=8YFLogxK

U2 - 10.1198/016214501753168352

DO - 10.1198/016214501753168352

M3 - Article

AN - SCOPUS:1542678344

VL - 96

SP - 697

EP - 705

JO - Journal of the American Statistical Association

JF - Journal of the American Statistical Association

SN - 0162-1459

IS - 454

ER -