Revisiting tests for neglected nonlinearity using artificial neural networks

Jin Seo Cho, Isao Ishida, Halbert White

Research output: Contribution to journal › Article

12 Citations (Scopus)

Abstract

Tests for regression neglected nonlinearity based on artificial neural networks (ANNs) have so far been studied by separately analyzing the two ways in which the null of regression linearity can hold. This implies that the asymptotic behavior of general ANN-based tests for neglected nonlinearity is still an open question. Here we analyze a convenient ANN-based quasi-likelihood ratio statistic for testing neglected nonlinearity, paying careful attention to both components of the null. We derive the asymptotic null distribution under each component separately and analyze their interaction. Somewhat remarkably, it turns out that the previously known asymptotic null distribution for the type 1 case still applies, but under somewhat stronger conditions than previously recognized. We present Monte Carlo experiments corroborating our theoretical results and showing that standard methods can yield misleading inference when our new, stronger regularity conditions are violated.
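As a rough illustration of the general idea behind ANN-based neglected-nonlinearity testing (not the paper's quasi-likelihood ratio statistic), the sketch below implements a Lagrange-multiplier-style neural network test in the spirit of the earlier literature the abstract refers to: fit a linear model, regress its residuals on hidden-unit activations with randomly drawn input weights, and use n·R² as the statistic. The function name, the logistic activation, and the uniform weight distribution are assumptions made for this sketch.

```python
import numpy as np

def nn_linearity_test(y, X, q=2, seed=0):
    """LM-style neural network test for neglected nonlinearity (sketch).

    Fits y on [1, X] by OLS, then regresses the residuals on q
    hidden-unit (logistic) activations with random input weights.
    Returns n * R^2 from that auxiliary regression, which under the
    null of linearity is asymptotically chi-square(q), given suitable
    regularity conditions.
    """
    rng = np.random.default_rng(seed)
    n = len(y)
    Z = np.column_stack([np.ones(n), X])           # linear regressors
    beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
    e = y - Z @ beta                               # linear-fit residuals
    # Random hidden-unit activations: logistic(Z @ Gamma)
    Gamma = rng.uniform(-2.0, 2.0, size=(Z.shape[1], q))
    Psi = 1.0 / (1.0 + np.exp(-(Z @ Gamma)))
    # Auxiliary regression of residuals on [Z, Psi]
    W = np.column_stack([Z, Psi])
    coef, *_ = np.linalg.lstsq(W, e, rcond=None)
    u = e - W @ coef
    r2 = 1.0 - (u @ u) / (e @ e)
    return n * r2    # compare to a chi-square(q) critical value
```

With a linear data-generating process the statistic stays near its chi-square null range, while a strongly nonlinear one (e.g. a quadratic mean) inflates it, which is the behavior the test exploits.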

Original language: English
Pages (from-to): 1133-1186
Number of pages: 54
Journal: Neural Computation
Volume: 23
Issue number: 5
DOI: 10.1162/NECO_a_00117
Publication status: Published - 2011 May 1


All Science Journal Classification (ASJC) codes

  • Arts and Humanities (miscellaneous)
  • Cognitive Neuroscience

Cite this

Cho, Jin Seo; Ishida, Isao; White, Halbert. / Revisiting tests for neglected nonlinearity using artificial neural networks. In: Neural Computation. 2011; Vol. 23, No. 5. pp. 1133-1186.
@article{5536502ca8c14d00ac7b7385faa4c3d3,
title = "Revisiting tests for neglected nonlinearity using artificial neural networks",
abstract = "Tests for regression neglected nonlinearity based on artificial neural networks (ANNs) have so far been studied by separately analyzing the two ways in which the null of regression linearity can hold. This implies that the asymptotic behavior of general ANN-based tests for neglected nonlinearity is still an open question. Here we analyze a convenient ANN-based quasi-likelihood ratio statistic for testing neglected nonlinearity, paying careful attention to both components of the null. We derive the asymptotic null distribution under each component separately and analyze their interaction. Somewhat remarkably, it turns out that the previously known asymptotic null distribution for the type 1 case still applies, but under somewhat stronger conditions than previously recognized. We present Monte Carlo experiments corroborating our theoretical results and showing that standard methods can yield misleading inference when our new, stronger regularity conditions are violated.",
author = "Cho, {Jin Seo} and Isao Ishida and Halbert White",
year = "2011",
month = "5",
day = "1",
doi = "10.1162/NECO_a_00117",
language = "English",
volume = "23",
pages = "1133--1186",
journal = "Neural Computation",
issn = "0899-7667",
publisher = "MIT Press Journals",
number = "5",

}

Revisiting tests for neglected nonlinearity using artificial neural networks. / Cho, Jin Seo; Ishida, Isao; White, Halbert.

In: Neural Computation, Vol. 23, No. 5, 01.05.2011, p. 1133-1186.

Research output: Contribution to journal › Article

TY - JOUR

T1 - Revisiting tests for neglected nonlinearity using artificial neural networks

AU - Cho, Jin Seo

AU - Ishida, Isao

AU - White, Halbert

PY - 2011/5/1

Y1 - 2011/5/1

N2 - Tests for regression neglected nonlinearity based on artificial neural networks (ANNs) have so far been studied by separately analyzing the two ways in which the null of regression linearity can hold. This implies that the asymptotic behavior of general ANN-based tests for neglected nonlinearity is still an open question. Here we analyze a convenient ANN-based quasi-likelihood ratio statistic for testing neglected nonlinearity, paying careful attention to both components of the null. We derive the asymptotic null distribution under each component separately and analyze their interaction. Somewhat remarkably, it turns out that the previously known asymptotic null distribution for the type 1 case still applies, but under somewhat stronger conditions than previously recognized. We present Monte Carlo experiments corroborating our theoretical results and showing that standard methods can yield misleading inference when our new, stronger regularity conditions are violated.

AB - Tests for regression neglected nonlinearity based on artificial neural networks (ANNs) have so far been studied by separately analyzing the two ways in which the null of regression linearity can hold. This implies that the asymptotic behavior of general ANN-based tests for neglected nonlinearity is still an open question. Here we analyze a convenient ANN-based quasi-likelihood ratio statistic for testing neglected nonlinearity, paying careful attention to both components of the null. We derive the asymptotic null distribution under each component separately and analyze their interaction. Somewhat remarkably, it turns out that the previously known asymptotic null distribution for the type 1 case still applies, but under somewhat stronger conditions than previously recognized. We present Monte Carlo experiments corroborating our theoretical results and showing that standard methods can yield misleading inference when our new, stronger regularity conditions are violated.

UR - http://www.scopus.com/inward/record.url?scp=79958266867&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=79958266867&partnerID=8YFLogxK

U2 - 10.1162/NECO_a_00117

DO - 10.1162/NECO_a_00117

M3 - Article

VL - 23

SP - 1133

EP - 1186

JO - Neural Computation

JF - Neural Computation

SN - 0899-7667

IS - 5

ER -