Integrating outlier filtering in large margin training

Xi Chuan Zhou, Hai Bin Shen, Jie Ping Ye

Research output: Contribution to journal › Article › peer-review

6 Scopus citations


Large margin classifiers such as support vector machines (SVM) have been applied successfully in various classification tasks. However, their performance may be significantly degraded in the presence of outliers. In this paper, we propose a robust SVM formulation which is shown to be less sensitive to outliers. The key idea is to employ an adaptively weighted hinge loss that explicitly incorporates outlier filtering into the SVM training, thus performing outlier filtering and classification simultaneously. The resulting robust SVM formulation is non-convex. We first relax it into a semi-definite program, which admits a globally optimal solution. To improve efficiency, we then develop an iterative approach. We have performed experiments using both synthetic and real-world data. Results show that the performance of the standard SVM degrades rapidly as more outliers are included, while the proposed robust SVM training is more stable in the presence of outliers.
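The adaptively weighted hinge loss described above can be illustrated with a minimal sketch. This is not the paper's SDP relaxation; it is an assumed simplification in which a linear SVM is trained by subgradient descent on a per-sample weighted hinge loss, and points whose hinge loss indicates misclassification are downweighted (filtered) before retraining. All function names, thresholds, and data here are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def weighted_hinge_svm(X, y, sample_weights, C=1.0, lr=0.01, epochs=200):
    """Linear SVM trained by subgradient descent on a weighted hinge loss.

    Each sample i contributes sample_weights[i] * max(0, 1 - y_i (w.x_i + b))
    to the loss, plus an L2 penalty on w.
    """
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        margins = y * (X @ w + b)
        # Subgradient is nonzero only for margin-violating points,
        # scaled by their (possibly zeroed-out) sample weight.
        active = (margins < 1).astype(float) * sample_weights
        grad_w = w - C * (active * y) @ X / n
        grad_b = -C * np.sum(active * y) / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

rng = np.random.RandomState(0)
# Two well-separated Gaussian classes in 2D ...
X = np.vstack([rng.randn(50, 2) + 2.0, rng.randn(50, 2) - 2.0])
y = np.hstack([np.ones(50), -np.ones(50)])
# ... plus a few label-flipped outliers (assumed outlier model).
y[:5] = -1.0

# Alternate between training and outlier filtering: samples with
# hinge loss > 1 (i.e., on the wrong side of the hyperplane) are
# treated as outliers and given zero weight on the next round.
weights = np.ones(len(y))
for _ in range(3):
    w, b = weighted_hinge_svm(X, y, weights)
    hinge = np.maximum(0.0, 1.0 - y * (X @ w + b))
    weights = np.where(hinge > 1.0, 0.0, 1.0)
```

The hard 0/1 reweighting used here is a crude stand-in for the paper's adaptive weights; the point of the sketch is only that filtering and training are interleaved, so the label-flipped points stop pulling the hyperplane toward the wrong class.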

Original language: English (US)
Pages (from-to): 362-370
Number of pages: 9
Journal: Journal of Zhejiang University: Science C
Issue number: 5
State: Published - May 2011

Keywords

  • Multi-stage relaxation
  • Outlier filter
  • Semi-definite programming
  • Support vector machines

ASJC Scopus subject areas

  • Engineering (all)


