SVM versus Least Squares SVM

Jieping Ye, Tao Xiong

Research output: Contribution to journal › Conference article › peer-review

78 Scopus citations

Abstract

We study the relationship between Support Vector Machines (SVM) and Least Squares SVM (LS-SVM). Our main result shows that, under mild conditions, LS-SVM for binary-class classification is equivalent to the hard margin SVM based on the well-known Mahalanobis distance measure. We further study the asymptotics of the hard margin SVM when the data dimensionality tends to infinity with a fixed sample size. Using recently developed theory on the asymptotics of the distribution of the eigenvalues of the covariance matrix, we show that under mild conditions, the equivalence result holds for the traditional Euclidean distance measure. These equivalence results are further extended to the multi-class case. Experimental results confirm the presented theoretical analysis.
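For context, a minimal sketch of the two standard formulations the abstract contrasts; the notation (w, b, C, \gamma) is ours for illustration, not taken from the paper. Given training pairs (x_i, y_i) with y_i \in \{-1, +1\}, the soft margin SVM solves a quadratic program with slack variables,

\min_{w, b, \xi}\; \tfrac{1}{2}\|w\|^2 + C \sum_i \xi_i \quad \text{s.t.} \quad y_i(w^\top x_i + b) \ge 1 - \xi_i,\; \xi_i \ge 0,

where the hard margin SVM is the special case with all \xi_i = 0. LS-SVM replaces the inequality constraints and hinge-type penalty with equality constraints and a squared error,

\min_{w, b, e}\; \tfrac{1}{2}\|w\|^2 + \tfrac{\gamma}{2} \sum_i e_i^2 \quad \text{s.t.} \quad y_i(w^\top x_i + b) = 1 - e_i.

The equality constraints reduce the LS-SVM optimality conditions to a linear system rather than a quadratic program, which is the structural gap the paper's equivalence result bridges.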

Original language: English (US)
Pages (from-to): 644-651
Number of pages: 8
Journal: Journal of Machine Learning Research
Volume: 2
State: Published - Dec 1 2007
Event: 11th International Conference on Artificial Intelligence and Statistics, AISTATS 2007 - San Juan, Puerto Rico
Duration: Mar 21 2007 - Mar 24 2007

ASJC Scopus subject areas

  • Software
  • Control and Systems Engineering
  • Statistics and Probability
  • Artificial Intelligence
