Objectives: We present two algorithms for image processing: the first is based on Boltzmann sampling and the second on entropic sampling.

Methods: These algorithms fall within the Bayesian framework, which has three components: (1) a Likelihood, a conditional density giving the probability of a noisy image given a clean image; (2) a Prior; and (3) a Posterior, a conditional density giving the probability of a clean image given a noisy image. The Likelihood models the degradation process. The Prior models what we consider a clean image and provides a means of incorporating whatever data we have about the image. The Posterior combines the Prior and the Likelihood and yields an estimate of the clean counterpart of the given noisy image. The algorithm sets up a competition between (1) the Likelihood, which tries to anchor the result to the given noisy image so that the features present, including perhaps the noisy ones, are retained, and (2) the Prior, which tries to make the image smooth, even at the risk of eliminating some genuine features of the image along with the noise.
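
The Posterior follows from Bayes' rule, P(clean | noisy) proportional to P(noisy | clean) x P(clean), and Boltzmann sampling draws from it by treating the negative logarithm of the Posterior as an energy. The sketch below is a minimal illustration of this idea, not the implementation used in the paper: it assumes a binary (+1/-1) image, a channel-flip Likelihood expressed as a pixel-wise coupling to the noisy data, and an Ising-type smoothness Prior; the function and parameter names (metropolis_denoise, coupling, noise_coupling, beta) are ours.

import numpy as np

def metropolis_denoise(noisy, beta=1.0, coupling=1.0, noise_coupling=2.0,
                       n_sweeps=50, seed=None):
    # Boltzmann (Metropolis) sampling of the Posterior for a +/-1 image.
    # Posterior energy (illustrative):
    #   E(x) = -noise_coupling * sum_i   x_i * noisy_i   (Likelihood: stay close to the data)
    #          -coupling       * sum_<ij> x_i * x_j      (Prior: favour smoothness)
    rng = np.random.default_rng(seed)
    x = noisy.copy()
    H, W = x.shape
    for _ in range(n_sweeps):
        for i in range(H):
            for j in range(W):
                # energy change if pixel (i, j) is flipped (periodic boundaries)
                nb = (x[(i - 1) % H, j] + x[(i + 1) % H, j]
                      + x[i, (j - 1) % W] + x[i, (j + 1) % W])
                dE = 2.0 * x[i, j] * (coupling * nb + noise_coupling * noisy[i, j])
                # Metropolis rule: accept the flip with probability min(1, exp(-beta * dE))
                if dE <= 0 or rng.random() < np.exp(-beta * dE):
                    x[i, j] = -x[i, j]
    return x

Increasing noise_coupling strengthens the pull towards the noisy data, so features (including noise) are retained; increasing coupling strengthens the smoothing Prior. This is the competition described above.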

Findings: A proper choice of the prior and the likelihood functions leads to good image processing. We also need good estimators of the clean image.

Application: The choice of estimators is somewhat straightforward for image processing employing the Boltzmann algorithm. For the non-Boltzmann (entropic) algorithm we need efficient estimators that make full use of the entropic ensemble generated; one possible reweighting estimator is sketched below.
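
As an assumed sketch of one such estimator (the abstract does not fix its form, and the names posterior_mean_from_entropic and log_g are ours): entropic sampling visits configurations with weight roughly proportional to 1/g(E), where g(E) is the density of states, so a Boltzmann-weighted Posterior mean image can be recovered by reweighting each stored configuration by g(E) exp(-beta E).

import numpy as np

def posterior_mean_from_entropic(samples, energies, log_g, beta=1.0):
    # samples  : stored configurations, shape (n_samples, H, W)
    # energies : Posterior energy of each stored configuration
    # log_g    : log density of states at each stored energy
    #            (entropic sampling drew configurations with weight ~ 1/g(E))
    # Reweight by g(E) * exp(-beta * E) to recover Boltzmann (Posterior) averages.
    log_w = np.asarray(log_g) - beta * np.asarray(energies)
    log_w -= log_w.max()                      # stabilise the exponentials
    w = np.exp(log_w)
    w /= w.sum()
    # weighted average over the sample axis gives the Posterior-mean image
    return np.tensordot(w, np.asarray(samples), axes=(0, 0))

Because the same stored ensemble can be reweighted at any beta, a single entropic run supports estimates over a range of temperatures, which is one reason to seek estimators that make full use of it.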


Keywords

Image Processing, Prior, Posterior, Boltzmann Sampling, Entropic Sampling, Bayesian.