Non-Parametric Bayesian Dictionary Learning for Sparse Image Representations

Download Links

  • [www.ece.duke.edu]
  • [people.ee.duke.edu]
  • [www.columbia.edu]
  • [books.nips.cc]
  • [machinelearning.wustl.edu]
  • [www.ima.umn.edu]
  • [ima.umn.edu]
  • [snowbird.djvuzone.org]
  • [papers.nips.cc]
  • [www.researchgate.net]

by Mingyuan Zhou, Haojun Chen, John Paisley, Lu Ren, Guillermo Sapiro, Lawrence Carin
Citations: 91 (33 self)

BibTeX

@MISC{Zhou_non-parametricbayesian,
    author = {Mingyuan Zhou and Haojun Chen and John Paisley and Lu Ren and Guillermo Sapiro and Lawrence Carin},
    title = {Non-Parametric Bayesian Dictionary Learning for Sparse Image Representations},
    year = {}
}


Abstract

Non-parametric Bayesian techniques are considered for learning dictionaries for sparse image representations, with applications in denoising, inpainting and compressive sensing (CS). The beta process is employed as a prior for learning the dictionary, and this non-parametric method naturally infers an appropriate dictionary size. The Dirichlet process and a probit stick-breaking process are also considered to exploit structure within an image. The proposed method can learn a sparse dictionary in situ; training images may be exploited if available, but they are not required. Further, the noise variance need not be known, and can be nonstationary. Another virtue of the proposed method is that sequential inference can be readily employed, thereby allowing scaling to large images. Several example results are presented, using both Gibbs and variational Bayesian inference, with comparisons to other state-of-the-art approaches.
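
The beta-process prior mentioned in the abstract is commonly handled through a finite approximation: each dictionary atom k receives a usage probability pi_k ~ Beta(a/K, b(K-1)/K), each image patch selects atoms through Bernoulli(pi_k) indicators, and the patch is the sparse weighted sum of its selected atoms plus noise. The sketch below is a minimal illustration of that generative construction, not the authors' code; the dimensions, hyperparameters (K, a, b), and noise level are assumed values chosen for demonstration, and the inference step (Gibbs or variational) is omitted.

# Minimal sketch of a finite beta-process approximation for sparse
# dictionary modeling. All names and hyperparameter values here are
# illustrative assumptions, not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)

P, N, K = 64, 500, 256          # patch dimension, number of patches, truncation level
a, b = 1.0, 1.0                 # beta-process hyperparameters (assumed)

# Dictionary atoms d_k and per-atom usage probabilities pi_k.
D = rng.normal(0.0, 1.0 / np.sqrt(P), size=(P, K))
pi = rng.beta(a / K, b * (K - 1) / K, size=K)

# Binary indicators z (which atoms each patch uses) and Gaussian weights w.
Z = rng.random((N, K)) < pi          # sparse because most pi_k are near zero
W = rng.normal(0.0, 1.0, size=(N, K))

# Synthesize noisy patches: x_i = D (z_i * w_i) + eps_i.
noise_std = 0.05
X = (Z * W) @ D.T + rng.normal(0.0, noise_std, size=(N, P))

# The effective dictionary size is the number of atoms actually used,
# which the beta process keeps small even as the truncation K grows.
print("atoms used:", int((Z.sum(axis=0) > 0).sum()), "of", K)

In the paper's setting the indicators, weights, atoms, and noise variance are all inferred from the observed patches rather than sampled forward as above, which is what allows dictionary size and noise level to be learned in situ.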

Keyphrases

sparse image representation, non-parametric bayesian dictionary learning, probit stick-breaking process, non-parametric bayesian technique, variational bayesian inference, sequential inference, noise variance, dirichlet process, sparse dictionary, state-of-the-art approach, appropriate dictionary size, several example result, beta process, compressive sensing, training image, large image, non-parametric method
