Papers

Improved Marginal Unbiased Score Expansion (MUSE) via Implicit Differentiation

M Millea - arXiv preprint arXiv:2209.10512, 2022 - arxiv.org
Statistics paper (stat.ML)

… Owing to its reliance on prior samples, MUSE can be considered a form of simulation-based inference, extended to use readily available joint posterior gradients, similar to the proposal …

Link to paper: http://arxiv.org/abs/2209.10512v1

BibTeX

@article{2209.10512v1,
Author = {Marius Millea},
Title = {Improved Marginal Unbiased Score Expansion (MUSE) via Implicit
Differentiation},
Eprint = {2209.10512v1},
ArchivePrefix = {arXiv},
PrimaryClass = {stat.ML},
Abstract = {We apply the technique of implicit differentiation to boost performance,
reduce numerical error, and remove required user-tuning in the Marginal
Unbiased Score Expansion (MUSE) algorithm for hierarchical Bayesian inference.
We demonstrate these improvements on three representative inference problems:
1) an extended Neal's funnel, 2) Bayesian neural networks, and 3) probabilistic
principal component analysis. On our particular test cases, MUSE with implicit
differentiation is faster than Hamiltonian Monte Carlo by factors of 155, 397,
and 5, respectively, or factors of 65, 278, and 1 without implicit
differentiation, and yields good approximate marginal posteriors. The Julia and
Python MUSE packages have been updated to use implicit differentiation, and can
solve problems defined by hand or with any of a number of popular probabilistic
programming languages and automatic differentiation backends.},
Year = {2022},
Month = {Sep},
Url = {http://arxiv.org/abs/2209.10512v1},
File = {2209.10512v1.pdf}
}
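The abstract's core idea is that gradients of a quantity defined as an inner optimum can be computed via the implicit function theorem rather than by differentiating through the optimizer's iterates, which reduces numerical error and removes tuning of the inner solver. Below is a minimal, hypothetical sketch of that general technique in JAX; it is not the MUSE package API, and the toy objective `f` is an assumption chosen so the answer is checkable by hand.

```python
# Sketch of implicit differentiation (not the muse_inference API):
# for theta*(x) = argmin_theta f(theta, x), the implicit function
# theorem gives  dtheta*/dx = -(d2f/dtheta2)^-1 * d2f/dtheta dx,
# evaluated at theta*, with no backprop through solver iterates.
import jax
import jax.numpy as jnp

def f(theta, x):
    # toy inner objective with closed-form minimizer theta* = x**2
    return (theta - x ** 2) ** 2

def solve_inner(x, theta0=0.0, steps=100, lr=0.1):
    # plain gradient descent on theta; the iterates are never
    # differentiated, so the solver can be any black box
    g = jax.grad(f, argnums=0)
    theta = theta0
    for _ in range(steps):
        theta = theta - lr * g(theta, x)
    return theta

def dtheta_dx(x):
    # implicit function theorem at the (approximate) optimum
    theta_star = jax.lax.stop_gradient(solve_inner(x))
    H = jax.grad(jax.grad(f, 0), 0)(theta_star, x)  # d2f/dtheta2
    C = jax.grad(jax.grad(f, 0), 1)(theta_star, x)  # d2f/dtheta dx
    return -C / H

print(dtheta_dx(1.5))  # analytic answer is 2*x = 3.0
```

Since `theta* = x**2` here, the analytic derivative is `2*x`, so the implicit-differentiation result can be verified directly; in MUSE the same trick is applied to the solver steps inside the algorithm.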
