Hamiltonian Monte Carlo: Efficient Posterior Sampling in High Dimensions
Abstract —
Hamiltonian Monte Carlo (HMC) uses gradient information to explore posterior distributions efficiently, making it a practical default for high-dimensional Bayesian inference.
Keywords: bayesian statistics, markov chain monte carlo, hamiltonian monte carlo, posterior inference
Motivation
Random-walk Metropolis methods can struggle in high dimensions because they take small, inefficient steps. HMC improves efficiency by simulating a Hamiltonian system that proposes distant, high-acceptance moves.
Core Idea
HMC introduces an auxiliary momentum variable p alongside the position (parameter) variable q and uses the Hamiltonian

H(q, p) = U(q) + K(p),

where U(q) is the negative log posterior and K(p) = (1/2) p^T M^{-1} p is typically a quadratic kinetic term with mass matrix M. Leapfrog integration of Hamilton's equations produces proposals that preserve volume and approximately conserve energy, so distant proposals can still be accepted with high probability after a Metropolis correction.
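The mechanics above can be sketched in a few dozen lines. The following is a minimal illustration, not a production sampler: it assumes an identity mass matrix (M = I), fixed step size and trajectory length, and hypothetical helper names (`leapfrog`, `hmc_sample`) chosen for this example.

```python
import numpy as np

def leapfrog(q, p, grad_U, step_size, n_steps):
    """Leapfrog integration: volume-preserving and approximately
    energy-conserving, which is what makes HMC proposals acceptable."""
    q, p = q.copy(), p.copy()
    p -= 0.5 * step_size * grad_U(q)          # initial half step for momentum
    for _ in range(n_steps - 1):
        q += step_size * p                    # full step for position (M = I)
        p -= step_size * grad_U(q)            # full step for momentum
    q += step_size * p
    p -= 0.5 * step_size * grad_U(q)          # final half step for momentum
    return q, -p                              # negate momentum for reversibility

def hmc_sample(U, grad_U, q0, n_samples, step_size=0.1, n_steps=20, rng=None):
    """Basic HMC: U is the negative log posterior, grad_U its gradient."""
    rng = rng or np.random.default_rng(0)
    q = np.asarray(q0, dtype=float)
    samples = []
    for _ in range(n_samples):
        p = rng.standard_normal(q.shape)      # resample momentum each iteration
        H_old = U(q) + 0.5 * p @ p            # Hamiltonian at current state
        q_new, p_new = leapfrog(q, p, grad_U, step_size, n_steps)
        H_new = U(q_new) + 0.5 * p_new @ p_new
        if rng.random() < np.exp(H_old - H_new):  # Metropolis accept/reject
            q = q_new
        samples.append(q.copy())
    return np.array(samples)
```

For instance, with U(q) = (1/2) q^T q (a standard Gaussian posterior), `grad_U` is simply the identity map, and the resulting draws should have mean near 0 and unit variance in each dimension.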
Practical Considerations
- Step size and trajectory length strongly affect performance.
- Adaptive variants such as the No-U-Turn Sampler (NUTS) select the trajectory length automatically.
- Diagnostics such as effective sample size and R-hat are essential.
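To make the diagnostics point concrete, here is a minimal sketch of the split-R-hat statistic (Gelman et al.'s convergence diagnostic): each chain is split in half and between-chain variance is compared with within-chain variance, with values near 1.0 suggesting the chains are mixing. The function name `split_rhat` and the toy data are assumptions for this illustration; in practice a library such as ArviZ provides these diagnostics.

```python
import numpy as np

def split_rhat(chains):
    """Split-R-hat for one scalar quantity.
    `chains` has shape (n_chains, n_draws)."""
    chains = np.asarray(chains, dtype=float)
    half = chains.shape[1] // 2
    # Split each chain in two so within-chain trends inflate the statistic.
    halves = np.concatenate([chains[:, :half], chains[:, half:2 * half]], axis=0)
    m, n = halves.shape
    chain_means = halves.mean(axis=1)
    B = n * chain_means.var(ddof=1)           # between-chain variance
    W = halves.var(axis=1, ddof=1).mean()     # within-chain variance
    var_plus = (n - 1) / n * W + B / n        # pooled variance estimate
    return np.sqrt(var_plus / W)
```

Chains that have converged to the same distribution yield R-hat close to 1, while chains stuck near different modes (here simulated by shifting each chain's mean) produce values well above 1.
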
Reference
@article{hoffman2014nuts,
title={The No-U-Turn Sampler: Adaptively Setting Path Lengths in Hamiltonian Monte Carlo},
author={Hoffman, Matthew D. and Gelman, Andrew},
journal={Journal of Machine Learning Research},
year={2014},
volume={15},
number={47},
pages={1593--1623}
}