
Maximum Likelihood Learning of Unnormalized Models for Simulation-Based Inference

P Glaser, M Arbel, A Doucet, A Gretton - arXiv preprint arXiv:2210.14756, 2022 - arxiv.org
Computer Science paper (cs.LG)

We introduce two synthetic likelihood methods for Simulation-Based Inference (SBI), to conduct either amortized or targeted inference from experimental observations when a high-fidelity simulator is available.


BibTeX

@article{2210.14756v2,
Author = {Pierre Glaser and Michael Arbel and Samo Hromadka and Arnaud Doucet and Arthur Gretton},
Title = {Maximum Likelihood Learning of Unnormalized Models for Simulation-Based Inference},
Eprint = {2210.14756v2},
ArchivePrefix = {arXiv},
PrimaryClass = {cs.LG},
Abstract = {We introduce two synthetic likelihood methods for Simulation-Based Inference
(SBI), to conduct either amortized or targeted inference from experimental
observations when a high-fidelity simulator is available. Both methods learn a
conditional energy-based model (EBM) of the likelihood using synthetic data
generated by the simulator, conditioned on parameters drawn from a proposal
distribution. The learned likelihood can then be combined with any prior to
obtain a posterior estimate, from which samples can be drawn using MCMC. Our
methods uniquely combine a flexible Energy-Based Model and the minimization of
a KL loss: this is in contrast to other synthetic likelihood methods, which
either rely on normalizing flows, or minimize score-based objectives; choices
that come with known pitfalls. We demonstrate the properties of both methods on
a range of synthetic datasets, and apply them to a neuroscience model of the
pyloric network in the crab, where our method outperforms prior art for a
fraction of the simulation budget.},
Year = {2022},
Month = {Oct},
Url = {http://arxiv.org/abs/2210.14756v2},
File = {2210.14756v2.pdf}
}
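The abstract's pipeline (learn an unnormalized likelihood, multiply by a prior, sample the posterior with MCMC) can be illustrated with a toy sketch. This is not the paper's method: the `energy` function below is a hand-written quadratic stand-in for the learned conditional EBM, the standard-normal prior and the Metropolis-Hastings sampler are illustrative assumptions, and all names are hypothetical. The one property it does demonstrate faithfully is that the EBM's unknown normalizer cancels in the MH acceptance ratio, which is why an unnormalized likelihood suffices for posterior sampling.

```python
import numpy as np

def energy(theta, x):
    # Hypothetical stand-in for the *learned* conditional EBM: a quadratic
    # energy, so exp(-E) is an unnormalized Gaussian likelihood. In the
    # paper this would be a neural network trained on simulator output.
    return 0.5 * (x - theta) ** 2

def log_prior(theta):
    # Standard normal prior over the simulator parameter (an assumption).
    return -0.5 * theta ** 2

def log_post_unnorm(theta, x_obs):
    # Unnormalized log-posterior: learned log-likelihood (-energy) + log-prior.
    # The EBM's normalizing constant is unknown but constant in theta.
    return -energy(theta, x_obs) + log_prior(theta)

def mh_sample(x_obs, n_steps=5000, step=0.5, seed=0):
    # Random-walk Metropolis-Hastings over theta; the unknown normalizer
    # cancels in the acceptance ratio, so only log_post_unnorm is needed.
    rng = np.random.default_rng(seed)
    theta = 0.0
    samples = []
    for _ in range(n_steps):
        prop = theta + step * rng.standard_normal()
        log_a = log_post_unnorm(prop, x_obs) - log_post_unnorm(theta, x_obs)
        if np.log(rng.uniform()) < log_a:
            theta = prop
        samples.append(theta)
    return np.array(samples)

samples = mh_sample(x_obs=2.0)
# In this conjugate toy case the exact posterior is N(1.0, 0.5), so the
# post-burn-in sample mean should be close to 1.0.
print(samples[1000:].mean())
```

In the paper the proposal distribution over parameters, the KL training objective for the EBM, and the neural energy all replace the fixed pieces above; only the MCMC step over the unnormalized posterior carries over directly.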
