Compressed principal component analysis of non-Gaussian vectors

Marc Mignolet, Christian Soize

Research output: Contribution to journal › Article › peer-review

Abstract

A novel approximate representation of non-Gaussian random vectors is introduced and validated, which can be viewed as a compressed principal component analysis (CPCA). This representation relies on the eigenvectors of the covariance matrix obtained as in a principal component analysis (PCA) but expresses the random vector as a linear combination of a random sample of N of these eigenvectors. In this model, the indices of these eigenvectors are independent discrete random variables with probabilities proportional to the corresponding eigenvalues. Moreover, the coefficients of the linear combination are zero-mean, unit-variance random variables. Under these conditions, it is first shown that the covariance matrix of the CPCA representation exactly matches its PCA counterpart, independently of the value of N. Next, it is shown that the distribution of the random coefficients can be selected, without loss of generality, to be a symmetric function. Then, to represent the vector of these coefficients, a novel set of symmetric vector-valued multidimensional polynomials of the canonical Gaussian random vector is derived. Notably, the number of such polynomials grows only slowly with the maximum polynomial order, thereby providing a framework for a compact approximation of the target random vector. The identification of the deterministic parameters of the expansion of the random coefficients on these symmetric vector-valued multidimensional polynomials is addressed next. Finally, an application example is provided that demonstrates the close match between the distributions of the elements of the target random vector and those of its approximation, obtained with only a very limited number of parameters.
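The covariance-matching property stated in the abstract can be illustrated numerically. The sketch below draws samples from a CPCA-type representation, with eigenvector indices sampled with probabilities proportional to the eigenvalues and zero-mean, unit-variance coefficients, and compares the empirical covariance with the target. The synthetic covariance matrix, the Gaussian choice for the coefficients, and the sqrt(trace/N) scaling are assumptions made only for this sketch; the paper's exact construction and normalization may differ.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic target covariance matrix (illustrative assumption, not from the paper).
n = 20
A = rng.standard_normal((n, n))
C_target = A @ A.T / n

# PCA step: eigenvalues and eigenvectors of the covariance matrix.
lam, Phi = np.linalg.eigh(C_target)
lam = np.clip(lam, 0.0, None)
trace = lam.sum()
p = lam / trace  # index probabilities proportional to the eigenvalues

def sample_cpca(num_samples, N):
    # CPCA-type draw: X = sqrt(trace/N) * sum_j eta_j * Phi[:, I_j],
    # with I_j drawn independently with probabilities p and eta_j
    # zero-mean, unit-variance (Gaussian here, purely as an assumption).
    idx = rng.choice(n, size=(num_samples, N), p=p)   # random eigenvector indices
    eta = rng.standard_normal((num_samples, N))       # zero-mean, unit-variance coefficients
    scale = np.sqrt(trace / N)
    # Phi[:, idx] has shape (n, num_samples, N); sum the N sampled terms per realization.
    return scale * np.einsum('nsk,sk->sn', Phi[:, idx], eta)

X = sample_cpca(200_000, N=3)
C_cpca = np.cov(X, rowvar=False)
print("max |C_cpca - C_target| =", np.abs(C_cpca - C_target).max())
```

With this scaling the second-order statistics are reproduced for any N (up to Monte Carlo error), which is consistent with the abstract's statement that the CPCA covariance matches its PCA counterpart independently of N; the non-Gaussian higher-order structure is what the symmetric polynomial expansion of the coefficients is meant to capture.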

Original language: English (US)
Pages (from-to): 1261-1286
Number of pages: 26
Journal: SIAM/ASA Journal on Uncertainty Quantification
Volume: 8
Issue number: 4
DOIs
State: Published - 2020

Keywords

  • Compressed principal component analysis
  • Inverse problem
  • Non-Gaussian vector
  • Principal component analysis
  • Random eigenvectors
  • Random fields
  • Reduction method
  • Stochastic model
  • Stochastic modeling
  • Stochastic processes
  • Symmetric polynomials
  • Uncertainty quantification

ASJC Scopus subject areas

  • Statistics and Probability
  • Modeling and Simulation
  • Statistics, Probability and Uncertainty
  • Discrete Mathematics and Combinatorics
  • Applied Mathematics
