[Machine Learning] 24 Markov chain Monte Carlo (MCMC) inference
This chapter introduces Markov chain Monte Carlo (MCMC) methods, focusing on two core algorithms, Gibbs sampling and Metropolis-Hastings, and examining their applications, convergence behaviour, and optimization strategies. Topics covered include Gibbs sampling for the Ising model and for Gaussian mixture models, the Metropolis-Hastings algorithm and its variants, convergence diagnostics, auxiliary-variable MCMC, annealing methods, and marginal-likelihood estimation. The chapter also touches on advanced topics such as parallel computation and adaptive MCMC, providing practical tools for Bayesian inference.
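As a minimal illustration of the Metropolis-Hastings idea covered in Section 24.3, here is a random-walk Metropolis sampler (a special case of MH with a symmetric Gaussian proposal, so the Hastings correction cancels) for a one-dimensional standard normal target. The function name, step size, and target are illustrative choices, not taken from the book:

```python
import numpy as np

def metropolis_hastings(log_prob, x0, n_samples, step=1.0, seed=0):
    """Random-walk Metropolis: propose x' = x + step * N(0, 1),
    accept with probability min(1, p(x') / p(x))."""
    rng = np.random.default_rng(seed)
    x = x0
    lp = log_prob(x)
    samples = []
    for _ in range(n_samples):
        x_prop = x + step * rng.standard_normal()
        lp_prop = log_prob(x_prop)
        # Compare in log space: accept if log u < log p(x') - log p(x)
        if np.log(rng.random()) < lp_prop - lp:
            x, lp = x_prop, lp_prop
        samples.append(x)
    return np.array(samples)

# Target: standard normal, log p(x) = -x^2 / 2 up to a constant
samples = metropolis_hastings(lambda x: -0.5 * x**2, x0=0.0, n_samples=20000)
```

Note that only the unnormalized log density is needed, which is what makes MH applicable when the normalizing constant is intractable; discarding an initial burn-in segment (Section 24.4.1) before computing summary statistics is standard practice.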
Chapter contents
24 Markov chain Monte Carlo (MCMC) inference
24.1 Introduction
24.2 Gibbs sampling
24.2.1 Basic idea
24.2.2 Example: Gibbs sampling for the Ising model
24.2.3 Example: Gibbs sampling for inferring the parameters of a GMM
24.2.4 Collapsed Gibbs sampling *
24.2.5 Gibbs sampling for hierarchical GLMs
24.2.6 BUGS and JAGS
24.2.7 The Imputation Posterior (IP) algorithm
24.2.8 Blocking Gibbs sampling
24.3 Metropolis Hastings algorithm
24.3.1 Basic idea
24.3.2 Gibbs sampling is a special case of MH
24.3.3 Proposal distributions
24.3.4 Adaptive MCMC
24.3.5 Initialization and mode hopping
24.3.6 Why MH works *
24.3.7 Reversible jump (trans-dimensional) MCMC *
24.4 Speed and accuracy of MCMC
24.4.1 The burn-in phase
24.4.2 Mixing rates of Markov chains *
24.4.3 Practical convergence diagnostics
24.4.4 Accuracy of MCMC
24.4.5 How many chains?
24.5 Auxiliary variable MCMC *
24.5.1 Auxiliary variable sampling for logistic regression
24.5.2 Slice sampling
24.5.3 Swendsen Wang
24.5.4 Hybrid/Hamiltonian MCMC *
24.6 Annealing methods
24.6.1 Simulated annealing
24.6.2 Annealed importance sampling
24.6.3 Parallel tempering
24.7 Approximating the marginal likelihood
24.7.1 The candidate method
24.7.2 Harmonic mean estimate
24.7.3 Annealed importance sampling
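As a taste of the Gibbs sampling material in Section 24.2.2, the following is a minimal sketch of Gibbs sampling for a 2D Ising model: each spin is resampled in turn from its full conditional given its four grid neighbours, where p(s_ij = +1 | rest) = sigmoid(2 J · sum of neighbouring spins). The coupling strength, grid size, and sweep count are illustrative defaults, not values from the book:

```python
import numpy as np

def gibbs_ising(J=0.5, n=16, n_sweeps=200, seed=0):
    """Gibbs sampling for an n x n Ising model with coupling J
    and free boundary conditions; spins take values in {-1, +1}."""
    rng = np.random.default_rng(seed)
    s = rng.choice([-1, 1], size=(n, n))
    for _ in range(n_sweeps):
        for i in range(n):
            for j in range(n):
                # Sum of the (up to 4) neighbouring spins
                nb = 0
                if i > 0:
                    nb += s[i - 1, j]
                if i < n - 1:
                    nb += s[i + 1, j]
                if j > 0:
                    nb += s[i, j - 1]
                if j < n - 1:
                    nb += s[i, j + 1]
                # Full conditional: p(s_ij = +1 | rest) = sigmoid(2 * J * nb)
                p_plus = 1.0 / (1.0 + np.exp(-2.0 * J * nb))
                s[i, j] = 1 if rng.random() < p_plus else -1
    return s

grid = gibbs_ising()
```

For positive J this single-site sampler mixes slowly near the critical temperature, which is the motivation for the cluster-flipping Swendsen-Wang method of Section 24.5.3.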
GitHub download link: https://github.com/916718212/Machine-Learning-A-Probabilistic-Perspective-.git