Penalized Orthogonal Iteration for Sparse Estimation of Generalized Eigenvalue Problem

Sungkyu Jung, Jeongyoun Ahn, Yongho Jeon

Research output: Contribution to journal › Article

Abstract

We propose a new algorithm for sparse estimation of eigenvectors in generalized eigenvalue problems (GEPs). The GEP arises in a number of modern data-analytic situations and statistical methods, including principal component analysis (PCA), multiclass linear discriminant analysis (LDA), canonical correlation analysis (CCA), sufficient dimension reduction (SDR), and invariant co-ordinate selection. We propose to modify the standard generalized orthogonal iteration with a sparsity-inducing penalty for the eigenvectors. To achieve this goal, we generalize the equation-solving step of orthogonal iteration to a penalized convex optimization problem. The resulting algorithm, called penalized orthogonal iteration, provides accurate estimation of the true eigenspace, when it is sparse. Also proposed is a computationally more efficient alternative, which works well for PCA and LDA problems. Numerical studies reveal that the proposed algorithms are competitive, and that our tuning procedure works well. We demonstrate applications of the proposed algorithm to obtain sparse estimates for PCA, multiclass LDA, CCA, and SDR. Supplementary materials for this article are available online.
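
The abstract describes the algorithmic idea only at a high level: in each step of generalized orthogonal iteration for the GEP Av = λBv, replace the linear solve BV = AQ with a sparsity-penalized convex problem, then re-orthonormalize. The Python sketch below illustrates that idea under stated assumptions; it is not the authors' exact objective, penalty, or solver. The function names (penalized_orthogonal_iteration, soft_threshold), the ℓ1 penalty level lam, and the proximal-gradient (ISTA) inner solver are illustrative choices, not taken from the paper.

# Illustrative sketch only (assumptions: B symmetric positive definite; the
# penalized subproblem, its L1 penalty level `lam`, and the ISTA inner solver
# are not taken from the paper).
import numpy as np

def soft_threshold(X, t):
    """Entrywise soft-thresholding: the proximal operator of t * ||.||_1."""
    return np.sign(X) * np.maximum(np.abs(X) - t, 0.0)

def penalized_orthogonal_iteration(A, B, k, lam=0.1, n_outer=50, n_inner=100):
    """Sparse estimate of the top-k generalized eigenspace of A v = lambda B v.

    Each outer step replaces the usual solve B V = A Q with the penalized
    convex problem  min_V 0.5*tr(V'BV) - tr(V'AQ) + lam*||V||_1 , handled by
    proximal gradient (ISTA); the iterate is then re-orthonormalized by QR.
    """
    p = A.shape[0]
    rng = np.random.default_rng(0)
    Q, _ = np.linalg.qr(rng.standard_normal((p, k)))
    step = 1.0 / np.linalg.norm(B, 2)      # 1 / Lipschitz constant of V -> B V
    for _ in range(n_outer):
        target = A @ Q
        V = Q.copy()
        for _ in range(n_inner):           # ISTA on the penalized quadratic
            V = soft_threshold(V - step * (B @ V - target), step * lam)
        Q, _ = np.linalg.qr(V)             # re-orthonormalize (QR may mix sparsity across columns)
    return Q

# Toy usage: one sparse leading generalized eigenvector (B = I reduces to sparse PCA).
p = 20
B = np.eye(p)
u = np.zeros((p, 1)); u[:3] = 1.0          # true signal supported on 3 coordinates
A = 5.0 * (u @ u.T) + 0.01 * np.eye(p)
Q_hat = penalized_orthogonal_iteration(A, B, k=1, lam=0.5)

In the unpenalized limit (lam = 0) the inner loop converges to the solution of BV = AQ, so the sketch reduces to standard generalized orthogonal iteration; the penalty level controls how aggressively entries of the iterate are shrunk toward zero before re-orthonormalization.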

Original language: English
Pages (from-to): 710-721
Number of pages: 12
Journal: Journal of Computational and Graphical Statistics
Volume: 28
Issue number: 3
DOIs: 10.1080/10618600.2019.1568014
Publication status: Published - 2019 Jul 3

Fingerprint

  • Generalized Eigenvalue Problem
  • Orthogonal Iteration
  • Principal Component Analysis
  • Multiclass Discriminant Analysis
  • Canonical Correlation Analysis
  • Sufficient Dimension Reduction
  • Invariant Coordinate Selection
  • Eigenvector / Eigenvalue / Eigenspace
  • Convex Optimization
  • Sparsity-Inducing Penalty

All Science Journal Classification (ASJC) codes

  • Statistics and Probability
  • Discrete Mathematics and Combinatorics
  • Statistics, Probability and Uncertainty

Cite this

@article{58408a5c7a5441a8b9cab0d890f6f34e,
title = "Penalized Orthogonal Iteration for Sparse Estimation of Generalized Eigenvalue Problem",
abstract = "We propose a new algorithm for sparse estimation of eigenvectors in generalized eigenvalue problems (GEPs). The GEP arises in a number of modern data-analytic situations and statistical methods, including principal component analysis (PCA), multiclass linear discriminant analysis (LDA), canonical correlation analysis (CCA), sufficient dimension reduction (SDR), and invariant co-ordinate selection. We propose to modify the standard generalized orthogonal iteration with a sparsity-inducing penalty for the eigenvectors. To achieve this goal, we generalize the equation-solving step of orthogonal iteration to a penalized convex optimization problem. The resulting algorithm, called penalized orthogonal iteration, provides accurate estimation of the true eigenspace, when it is sparse. Also proposed is a computationally more efficient alternative, which works well for PCA and LDA problems. Numerical studies reveal that the proposed algorithms are competitive, and that our tuning procedure works well. We demonstrate applications of the proposed algorithm to obtain sparse estimates for PCA, multiclass LDA, CCA, and SDR. Supplementary materials for this article are available online.",
author = "Sungkyu Jung and Jeongyoun Ahn and Yongho Jeon",
year = "2019",
month = "7",
day = "3",
doi = "10.1080/10618600.2019.1568014",
language = "English",
volume = "28",
pages = "710--721",
journal = "Journal of Computational and Graphical Statistics",
issn = "1061-8600",
publisher = "American Statistical Association",
number = "3",

}

Penalized Orthogonal Iteration for Sparse Estimation of Generalized Eigenvalue Problem. / Jung, Sungkyu; Ahn, Jeongyoun; Jeon, Yongho.

In: Journal of Computational and Graphical Statistics, Vol. 28, No. 3, 03.07.2019, p. 710-721.

Research output: Contribution to journal › Article

TY - JOUR

T1 - Penalized Orthogonal Iteration for Sparse Estimation of Generalized Eigenvalue Problem

AU - Jung, Sungkyu

AU - Ahn, Jeongyoun

AU - Jeon, Yongho

PY - 2019/7/3

Y1 - 2019/7/3

N2 - We propose a new algorithm for sparse estimation of eigenvectors in generalized eigenvalue problems (GEPs). The GEP arises in a number of modern data-analytic situations and statistical methods, including principal component analysis (PCA), multiclass linear discriminant analysis (LDA), canonical correlation analysis (CCA), sufficient dimension reduction (SDR), and invariant co-ordinate selection. We propose to modify the standard generalized orthogonal iteration with a sparsity-inducing penalty for the eigenvectors. To achieve this goal, we generalize the equation-solving step of orthogonal iteration to a penalized convex optimization problem. The resulting algorithm, called penalized orthogonal iteration, provides accurate estimation of the true eigenspace, when it is sparse. Also proposed is a computationally more efficient alternative, which works well for PCA and LDA problems. Numerical studies reveal that the proposed algorithms are competitive, and that our tuning procedure works well. We demonstrate applications of the proposed algorithm to obtain sparse estimates for PCA, multiclass LDA, CCA, and SDR. Supplementary materials for this article are available online.

AB - We propose a new algorithm for sparse estimation of eigenvectors in generalized eigenvalue problems (GEPs). The GEP arises in a number of modern data-analytic situations and statistical methods, including principal component analysis (PCA), multiclass linear discriminant analysis (LDA), canonical correlation analysis (CCA), sufficient dimension reduction (SDR), and invariant co-ordinate selection. We propose to modify the standard generalized orthogonal iteration with a sparsity-inducing penalty for the eigenvectors. To achieve this goal, we generalize the equation-solving step of orthogonal iteration to a penalized convex optimization problem. The resulting algorithm, called penalized orthogonal iteration, provides accurate estimation of the true eigenspace, when it is sparse. Also proposed is a computationally more efficient alternative, which works well for PCA and LDA problems. Numerical studies reveal that the proposed algorithms are competitive, and that our tuning procedure works well. We demonstrate applications of the proposed algorithm to obtain sparse estimates for PCA, multiclass LDA, CCA, and SDR. Supplementary materials for this article are available online.

UR - http://www.scopus.com/inward/record.url?scp=85063513660&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85063513660&partnerID=8YFLogxK

U2 - 10.1080/10618600.2019.1568014

DO - 10.1080/10618600.2019.1568014

M3 - Article

AN - SCOPUS:85063513660

VL - 28

SP - 710

EP - 721

JO - Journal of Computational and Graphical Statistics

JF - Journal of Computational and Graphical Statistics

SN - 1061-8600

IS - 3

ER -