Improved sparse coding using manifold projections

Karthikeyan Natesan Ramamurthy, Jayaraman J. Thiagarajan, Andreas Spanias

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

9 Scopus citations


Sparse representations using predefined and learned dictionaries have widespread applications in signal and image processing. Sparse approximation techniques can recover data from low-dimensional corrupted observations, based on the knowledge that the data is sparsely representable in a known dictionary. In this paper, we propose a method that improves data recovery by ensuring that the estimate obtained through sparse approximation lies close to the data manifold. This is achieved by regularizing the recovery using examples from the data manifold. The technique is particularly useful when the observations are severely reduced in dimension relative to the data and corrupted by high noise. Using image inpainting as an example application, we demonstrate that the proposed algorithm reduces reconstruction error compared to sparse coding alone with predefined and learned dictionaries when the percentage of missing pixels is high.
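The baseline the abstract builds on — recovering a signal from a subset of its entries by sparse coding over a known dictionary — can be sketched as follows. This is a minimal illustration, not the authors' method: it uses a random dictionary, Orthogonal Matching Pursuit as the sparse solver, and omits the paper's manifold-projection regularization, which would additionally pull the estimate toward nearby examples from the data manifold. All names and parameters here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def omp(D, y, sparsity):
    """Orthogonal Matching Pursuit: greedily pick atoms of D to approximate y."""
    residual = y.astype(float).copy()
    support = []
    for _ in range(sparsity):
        corr = np.abs(D.T @ residual)
        corr[support] = 0.0                      # do not reselect an atom
        support.append(int(np.argmax(corr)))
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ coef
    x = np.zeros(D.shape[1])
    x[support] = coef
    return x

# Hypothetical setup: unit-norm random dictionary, exactly sparse ground truth.
n, k, s = 64, 128, 3
D = rng.standard_normal((n, k))
D /= np.linalg.norm(D, axis=0)
x_true = np.zeros(k)
idx = rng.choice(k, size=s, replace=False)
x_true[idx] = rng.uniform(1.0, 2.0, s) * rng.choice([-1.0, 1.0], s)
y_true = D @ x_true

# "Inpainting": only 48 of the 64 entries are observed; sparse-code on the
# observed rows of D, then synthesize the full signal to fill in the rest.
observed = rng.permutation(n)[:48]
x_hat = omp(D[observed], y_true[observed], s)
y_hat = D @ x_hat

rel_err = np.linalg.norm(y_hat - y_true) / np.linalg.norm(y_true)
```

When many entries are missing or the noise is high, this plain recovery degrades; the paper's contribution is the extra manifold-based regularization applied on top of such a sparse-coding step.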

Original language: English (US)
Title of host publication: ICIP 2011
Subtitle of host publication: 2011 18th IEEE International Conference on Image Processing
Number of pages: 4
State: Published - 2011
Event: 2011 18th IEEE International Conference on Image Processing, ICIP 2011 - Brussels, Belgium
Duration: Sep 11 2011 - Sep 14 2011

Publication series

Name: Proceedings - International Conference on Image Processing, ICIP
ISSN (Print): 1522-4880




Keywords

  • dictionary learning
  • image inpainting
  • manifold projection
  • sparse representation

ASJC Scopus subject areas

  • Software
  • Computer Vision and Pattern Recognition
  • Signal Processing


