Image analysis, random fields and Markov chain Monte Carlo methods : a mathematical introduction
Author
Bibliographic Information
Image analysis, random fields and Markov chain Monte Carlo methods : a mathematical introduction
(Applications of mathematics, 27)
Springer, c2003
2nd ed
Held by 43 university libraries
Notes
Includes 1 CD-ROM
Bibliography: p. [357]-377
Includes index
Description and Table of Contents
Description
"This book is concerned with a probabilistic approach for image analysis, mostly from the Bayesian point of view, and the important Markov chain Monte Carlo methods commonly used....This book will be useful, especially to researchers with a strong background in probability and an interest in image analysis. The author has presented the theory with rigor...he doesn't neglect applications, providing numerous examples of applications to illustrate the theory." -- MATHEMATICAL REVIEWS
Table of Contents
I. Bayesian Image Analysis: Introduction
1. The Bayesian Paradigm
  1.1 Warming up for Absolute Beginners; 1.2 Images and Observations; 1.3 Prior and Posterior Distributions; 1.4 Bayes Estimators
2. Cleaning Dirty Pictures
  2.1 Boundaries and Their Information Content; 2.2 Towards Piecewise Smoothing; 2.3 Filters, Smoothers, and Bayes Estimators; 2.4 Boundary Extraction; 2.5 Dependence on Hyperparameters
3. Finite Random Fields
  3.1 Markov Random Fields; 3.2 Gibbs Fields and Potentials; 3.3 Potentials Continued
II. The Gibbs Sampler and Simulated Annealing
4. Markov Chains: Limit Theorems
  4.1 Preliminaries; 4.2 The Contraction Coefficient; 4.3 Homogeneous Markov Chains; 4.4 Exact Sampling; 4.5 Inhomogeneous Markov Chains; 4.6 A Law of Large Numbers for Inhomogeneous Chains; 4.7 A Counterexample for the Law of Large Numbers
5. Gibbsian Sampling and Annealing
  5.1 Sampling; 5.2 Simulated Annealing; 5.3 Discussion
6. Cooling Schedules
  6.1 The ICM Algorithm; 6.2 Exact MAP Estimation Versus Fast Cooling; 6.3 Finite Time Annealing
III. Variations of the Gibbs Sampler
7. Gibbsian Sampling and Annealing Revisited
  7.1 A General Gibbs Sampler; 7.2 Sampling and Annealing Under Constraints
8. Partially Parallel Algorithms
  8.1 Synchronous Updating on Independent Sets; 8.2 The Swendsen-Wang Algorithm
9. Synchronous Algorithms
  9.1 Invariant Distributions and Convergence; 9.2 Support of the Limit Distribution; 9.3 Synchronous Algorithms and Reversibility
IV. Metropolis Algorithms and Spectral Methods
10. Metropolis Algorithms
  10.1 Metropolis Sampling and Annealing; 10.2 Convergence Theorems; 10.3 Best Constants; 10.4 About Visiting Schemes; 10.5 Generalizations and Modifications; 10.6 The Metropolis Algorithm in Combinatorial Optimization
11. The Spectral Gap and Convergence of Markov Chains
  11.1 Eigenvalues of Markov Kernels; 11.2 Geometric Convergence Rates
12. Eigenvalues, Sampling, Variance Reduction
  12.1 Samplers and Their Eigenvalues; 12.2 Variance Reduction; 12.3 Importance Sampling
13. Continuous Time Processes
  13.1 Discrete State Space; 13.2 Continuous State Space
V. Texture Analysis
14. Partitioning
  14.1 How to Tell Textures Apart; 14.2 Bayesian Texture Segmentation; 14.3 Segmentation by a Boundary Model; 14.4 Julesz's Conjecture and Two Point Processes
15. Random Fields and Texture Models
  15.1 Neighbourhood Relations; 15.2 Random Field Texture Models; 15.3 Texture Synthesis
16. Bayesian Texture Classification
  16.1 Contextual Classification; 16.2 Marginal Posterior Modes Methods
VI. Parameter Estimation
17. Maximum Likelihood Estimation
  17.1 The Likelihood Function; 17.2 Objective Functions
18. Consistency of Spatial ML Estimators
  18.1 Observation Windows and Specifications; 18.2 Pseudolikelihood Methods; 18.3 Large Deviations and Full Maximum Likelihood; 18.4 Partially Observed Data
19. Computation of Full ML Estimators
  19.1 A Naive Algorithm; 19.2 Stochastic Optimization for the Full Likelihood; 19.3 Main Results; 19.4 Error Decomposition; 19.5 L2-Estimates
VII. Supplement
20. A Glance at Neural Networks
  20.1 Boltzmann Machines; 20.2 A Learning Rule
21. Three Applications
  21.1 Motion Analysis; 21.2 Tomographic Image Reconstruction; 21.3 Biological Shape
VIII. Appendix
A. Simulation of Random Variables
  A.1 Pseudorandom Numbers; A.2 Discrete Random Variables; A.3 Special Distributions
B. Analytical Tools
  B.1 Concave Functions; B.2 Convergence of Descent Algorithms; B.3 A Discrete Gronwall Lemma; B.4 A Gradient System
C. Physical Imaging Systems
D. The Software Package AntsInFields
References
Symbols
From "Nielsen BookData"
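As a purely illustrative sketch of the kind of method the book treats (Bayesian image restoration by single-site Gibbs sampling, Parts I-II of the contents above), the following minimal Python example restores a binary image under an Ising-type prior with Gaussian noise. It is not code from the book or from its accompanying AntsInFields package; the function name gibbs_restore and the parameters beta, sigma and n_sweeps are assumptions chosen for this toy example.

import numpy as np


def gibbs_restore(y, beta=0.8, sigma=1.0, n_sweeps=30, seed=0):
    """Sample labels x in {-1, +1} from the posterior
    p(x | y) ~ exp( beta * sum_{s~t} x_s x_t - sum_s (y_s - x_s)^2 / (2 sigma^2) )
    by raster-scan single-site Gibbs updates (illustrative toy, not the book's code)."""
    rng = np.random.default_rng(seed)
    x = np.where(y > 0.0, 1, -1)              # start from a thresholded image
    h, w = y.shape
    for _ in range(n_sweeps):
        for i in range(h):
            for j in range(w):
                # Sum over the 4 nearest neighbours (free boundary conditions).
                s = 0
                if i > 0:
                    s += x[i - 1, j]
                if i < h - 1:
                    s += x[i + 1, j]
                if j > 0:
                    s += x[i, j - 1]
                if j < w - 1:
                    s += x[i, j + 1]
                # Log-odds of x_ij = +1 versus -1 under the local conditional.
                log_odds = 2.0 * beta * s + 2.0 * y[i, j] / sigma**2
                p_plus = 1.0 / (1.0 + np.exp(-log_odds))
                x[i, j] = 1 if rng.random() < p_plus else -1
    return x


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    truth = np.ones((32, 32), dtype=int)
    truth[:, 16:] = -1                         # simple two-region test image
    noisy = truth + rng.normal(0.0, 1.0, truth.shape)
    restored = gibbs_restore(noisy)
    print("pixel agreement with truth:", np.mean(restored == truth))

Lowering a temperature parameter over the sweeps would turn the same local update into a simulated annealing scheme for MAP estimation, the topic of the book's Chapters 5 and 6.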