Bayesian Image Analysis

  • Donald Geman
  • Stuart Geman
Conference paper
Part of the NATO ASI Series book series (volume 20)

Abstract

In [8] we introduced a class of image models for various tasks in digital image processing. These models are multi-level or “hierarchical” Markov Random Fields (MRFs). Here we pursue this approach to image modelling and analysis along some different lines, involving segmentation, boundary finding, and computed tomography. Similar models and associated optimization algorithms appear regularly in other work involving immense spatial systems; some examples are the studies in these proceedings on statistical mechanical systems (e.g. ferromagnets, spin-glasses and random fields), the work of Hinton and Sejnowski [14], Hopfield [15], and von der Malsburg and Bienenstock [19] in neural modeling and perceptual inference, and other work in image analysis, e.g. Besag [2], Kiiveri and Campbell [17], Cross and Jain [5], Cohen and Cooper [4], Elliott and Derin [7], Devijver [6], Grenander [11], and Marroquin [20]. The use of MRFs and related stochastic processes as models for intensity data has been prevalent in the image processing literature for some time now; we refer the reader to [8] and standard references for a detailed account of the genealogy.
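To make the MRF machinery concrete, the sketch below gives a minimal stochastic-relaxation (Gibbs sampling) loop for a binary Ising-type MRF prior with a Gaussian noise likelihood, in the spirit of [8]. It is an illustration only, not code from the paper: the function name `gibbs_denoise`, the parameters `beta` (prior coupling), `sigma` (noise level), and `sweeps` are all illustrative choices.

```python
import numpy as np

def gibbs_denoise(y, beta=1.0, sigma=1.0, sweeps=20, rng=None):
    """Gibbs sampler for a binary (+/-1) Ising MRF prior with a
    Gaussian likelihood y = x + noise. Illustrative sketch; names
    and parameterization are not from the paper."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.where(y > 0, 1, -1)  # initialize at thresholded data
    H, W = y.shape
    for _ in range(sweeps):
        for i in range(H):
            for j in range(W):
                # Sum of 4-neighbour spins (free boundary conditions).
                s = 0.0
                if i > 0:
                    s += x[i - 1, j]
                if i < H - 1:
                    s += x[i + 1, j]
                if j > 0:
                    s += x[i, j - 1]
                if j < W - 1:
                    s += x[i, j + 1]
                # Local conditional log-odds of x[i,j] = +1 vs -1:
                # prior coupling term plus data term from the
                # Gaussian likelihood.
                log_odds = 2.0 * (beta * s + y[i, j] / sigma**2)
                p_plus = 1.0 / (1.0 + np.exp(-log_odds))
                x[i, j] = 1 if rng.random() < p_plus else -1
    return x
```

Each site is visited in turn and resampled from its local conditional distribution, which depends only on the four neighbours and the datum at that site; this locality is what makes the posterior a Gibbs distribution and the sweep massively parallelizable. Lowering a temperature parameter over sweeps would turn the sampler into simulated annealing for MAP estimation.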

Keywords

Posterior distribution, perceptual inference, Gibbs distribution, edge site, edge variable

References

  1. N. Accomando, “Maximum likelihood reconstruction of a two-dimensional Poisson intensity function from attenuated projections,” Ph.D. thesis, Division of Applied Mathematics, Brown University, 1984.
  2. J. Besag, “On the statistical analysis of dirty pictures,” preprint, Department of Statistics, University of Durham, U.K., 1985.
  3. V. Cerny, “A thermodynamical approach to the travelling salesman problem: an efficient simulation algorithm,” preprint, Institute of Physics and Biophysics, Comenius University, Bratislava, 1982.
  4. F.S. Cohen and D.B. Cooper, “Simple parallel hierarchical and relaxation algorithms for segmenting noncausal Markovian random fields,” preprint, Brown University, 1984.
  5. G.C. Cross and A.K. Jain, “Markov random field texture models,” IEEE Trans. Pattern Anal. Machine Intell., vol. PAMI-5, pp. 25–39, 1983.
  6. P.A. Devijver, “Probabilistic labeling in a hidden second order Markov mesh,” technical report, Philips Research Laboratory, Brussels, Belgium, 1985.
  7. H. Elliott and H. Derin, “Modelling and segmentation of noisy and textured images using Gibbs random fields,” preprint, Department of Electrical and Computer Engineering, University of Massachusetts, 1984.
  8. S. Geman and D. Geman, “Stochastic relaxation, Gibbs distributions, and the Bayesian restoration of images,” IEEE Trans. Pattern Anal. Machine Intell., vol. PAMI-6, pp. 721–741, 1984.
  9. S. Geman and D.E. McClure, “Bayesian image analysis: An application to single photon emission tomography,” Proceedings of the American Statistical Association, Statistical Computing Section, 1985 (to appear).
  10. B. Gidas, “Nonstationary Markov chains and convergence of the annealing algorithm,” Journal of Statistical Physics, vol. 39, pp. 73–131, 1985.
  11. U. Grenander, “Tutorial in Pattern Theory,” Division of Applied Mathematics, Brown University, 1984.
  12. B. Hajek, “Cooling schedules for optimal annealing,” preprint, Department of Electrical Engineering and the Coordinated Science Laboratory, University of Illinois at Champaign-Urbana, 1985.
  13. A.R. Hanson and E.M. Riseman, “Segmentation of natural scenes,” in Computer Vision Systems, New York: Academic Press, 1978.
  14. G.E. Hinton and T.J. Sejnowski, “Optimal perceptual inference,” in Proc. IEEE Conf. Computer Vision and Pattern Recognition, 1983.
  15. J.J. Hopfield, “Neural networks and physical systems with emergent collective computational abilities,” Proc. Natl. Acad. Sci. USA, vol. 79, pp. 2554–2558, 1982.
  16. R.A. Hummel and S.W. Zucker, “On the foundations of relaxation labeling processes,” IEEE Trans. Pattern Anal. Machine Intell., vol. PAMI-5, pp. 267–287, 1983.
  17. H.T. Kiiveri and N.A. Campbell, “Allocation of remotely sensed data using Markov models for spectral variables and pixel labels,” preprint, CSIRO Division of Mathematics and Statistics, Sydney, Australia, 1985.
  18. S. Kirkpatrick, C.D. Gelatt, Jr., and M.P. Vecchi, “Optimization by simulated annealing,” IBM Thomas J. Watson Research Center, Yorktown Heights, N.Y., 1982.
  19. C. von der Malsburg and E. Bienenstock, “Statistical coding and short term synaptic plasticity: A scheme for knowledge representation in the brain,” (this volume).
  20. J.L. Marroquin, “Surface reconstruction preserving discontinuities,” Laboratory for Information and Decision Systems, M.I.T., Cambridge, MA, 1984.

Copyright information

© Springer-Verlag Berlin Heidelberg 1986

Authors and Affiliations

  • Donald Geman (1)
  • Stuart Geman (2)
  1. Department of Mathematics and Statistics, University of Massachusetts, Amherst, USA
  2. Division of Applied Mathematics, Brown University, Providence, USA
