Least squares linear discriminant analysis

Jieping Ye

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

222 Scopus citations


Linear Discriminant Analysis (LDA) is a well-known method for dimensionality reduction and classification. In the binary-class case, LDA has been shown to be equivalent to linear regression with the class label as the output, which implies that binary-class LDA can be formulated as a least squares problem. Previous studies have established certain relationships between multivariate linear regression and LDA in the multi-class case; many of them show that multivariate linear regression with a specific class indicator matrix as the output can be applied as a preprocessing step for LDA. However, directly casting LDA as a least squares problem is challenging in the multi-class case. In this paper, a novel formulation for multivariate linear regression is proposed. The equivalence between the proposed least squares formulation and multi-class LDA is rigorously established under a mild condition, which is shown empirically to hold in many applications involving high-dimensional data. Several LDA extensions based on this equivalence are also discussed.
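The binary-class equivalence mentioned in the abstract can be checked numerically: with an appropriately centered class-indicator target, the least squares weight vector is parallel to the classical LDA direction S_w^{-1}(μ1 - μ0). A minimal sketch on hypothetical two-class Gaussian data (the label coding ±n/n_k is one common choice; any two distinct labels give the same direction):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data: two classes sharing a covariance structure
n0, n1, d = 60, 40, 5
X0 = rng.normal(0.0, 1.0, (n0, d))
X1 = rng.normal(1.0, 1.0, (n1, d))
X = np.vstack([X0, X1])

# Class indicator as regression target (coding -n/n0, +n/n1 has zero mean)
n = n0 + n1
y = np.concatenate([np.full(n0, -n / n0), np.full(n1, n / n1)])

# Least squares fit on centered data (equivalent to including an intercept)
Xc = X - X.mean(axis=0)
w_ls, *_ = np.linalg.lstsq(Xc, y, rcond=None)

# Classical binary LDA direction: Sw^{-1} (mu1 - mu0)
mu0, mu1 = X0.mean(axis=0), X1.mean(axis=0)
Sw = (X0 - mu0).T @ (X0 - mu0) + (X1 - mu1).T @ (X1 - mu1)
w_lda = np.linalg.solve(Sw, mu1 - mu0)

# The two directions coincide up to positive scaling
cos = w_ls @ w_lda / (np.linalg.norm(w_ls) * np.linalg.norm(w_lda))
print(round(cos, 6))  # → 1.0
```

The agreement is exact (up to floating point) because the total scatter satisfies S_t = S_w + S_b and S_b w is always a multiple of μ1 - μ0, so the normal equations force w_ls onto the span of S_w^{-1}(μ1 - μ0). The paper's contribution is an indicator-matrix construction that extends this equivalence to the multi-class case.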

Original language: English (US)
Title of host publication: ACM International Conference Proceeding Series
Number of pages: 7
State: Published - 2007
Event: 24th International Conference on Machine Learning, ICML 2007 - Corvalis, OR, United States
Duration: Jun 20 2007 - Jun 24 2007


Other: 24th International Conference on Machine Learning, ICML 2007
Country/Territory: United States
City: Corvalis, OR

ASJC Scopus subject areas

  • Human-Computer Interaction


