Papers

Multifidelity Simulation-based Inference for Computationally Expensive Simulators

AN Krouglova, HR Johnson, B Confavreux, M Deistler, PJ Gonçalves - arXiv preprint arXiv:2502.08416, 2025 - arxiv.org
Physics paper, stat.ML

… We evaluate the performance of our multifidelity approach to simulation-based inference on three tasks. We start with the Ornstein-Uhlenbeck process, for which the …
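
The first benchmark is the Ornstein-Uhlenbeck process, dX_t = theta * (mu - X_t) dt + sigma * dW_t. As a rough illustration only (this is a generic Euler-Maruyama sketch, not the paper's simulator; the step size, horizon, and parameter names are assumptions made for the example), such a process can be simulated as follows:

import numpy as np

def simulate_ou(theta, mu, sigma, x0=0.0, dt=0.01, n_steps=1000, rng=None):
    """Euler-Maruyama simulation of dX_t = theta*(mu - X_t) dt + sigma dW_t."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.empty(n_steps + 1)
    x[0] = x0
    dw = rng.normal(scale=np.sqrt(dt), size=n_steps)  # Brownian increments
    for t in range(n_steps):
        x[t + 1] = x[t] + theta * (mu - x[t]) * dt + sigma * dw[t]
    return x

# Example: one trajectory with mean-reversion rate 1.0 toward mu = 0.5.
# path = simulate_ou(theta=1.0, mu=0.5, sigma=0.2)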


BibTeX

@article{2502.08416v2,
Author = {Anastasia N. Krouglova and Hayden R. Johnson and Basile Confavreux and Michael Deistler and Pedro J. Gonçalves},
Title = {Multifidelity Simulation-based Inference for Computationally Expensive
Simulators},
Eprint = {2502.08416v2},
ArchivePrefix = {arXiv},
PrimaryClass = {stat.ML},
Abstract = {Across many domains of science, stochastic models are an essential tool to
understand the mechanisms underlying empirically observed data. Models can be
of different levels of detail and accuracy, with models of high-fidelity (i.e.,
high accuracy) to the phenomena under study being often preferable. However,
inferring parameters of high-fidelity models via simulation-based inference is
challenging, especially when the simulator is computationally expensive. We
introduce MF-NPE, a multifidelity approach to neural posterior estimation that
leverages inexpensive low-fidelity simulations to infer parameters of
high-fidelity simulators within a limited simulation budget. MF-NPE performs
neural posterior estimation with limited high-fidelity resources by virtue of
transfer learning, with the ability to prioritize individual observations using
active learning. On one statistical task with analytical ground-truth and two
real-world tasks, MF-NPE shows comparable performance to current approaches
while requiring up to two orders of magnitude fewer high-fidelity simulations.
Overall, MF-NPE opens new opportunities to perform efficient Bayesian inference
on computationally expensive simulators.},
Year = {2025},
Month = {Feb},
Url = {http://arxiv.org/abs/2502.08416v2},
File = {2502.08416v2.pdf}
}
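
The recipe described in the abstract is two-stage: pretrain a neural posterior estimator on many cheap low-fidelity simulations, then fine-tune the same network on a small budget of high-fidelity simulations (transfer learning), optionally choosing where to spend that budget via active learning. The sketch below illustrates only the two-stage transfer idea, using a deliberately simple diagonal-Gaussian conditional density estimator in PyTorch; it is not the authors' MF-NPE implementation, and the network sizes, training schedule, and toy simulators are assumptions made for the example.

import torch
import torch.nn as nn

class GaussianPosteriorNet(nn.Module):
    """Amortized q(theta | x): a diagonal Gaussian whose mean and log-std
    are predicted from the observed summary statistics x."""
    def __init__(self, x_dim, theta_dim, hidden=64):
        super().__init__()
        self.theta_dim = theta_dim
        self.net = nn.Sequential(
            nn.Linear(x_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * theta_dim),
        )

    def log_prob(self, theta, x):
        mean, log_std = self.net(x).split(self.theta_dim, dim=-1)
        return torch.distributions.Normal(mean, log_std.exp()).log_prob(theta).sum(-1)

def fit(model, theta, x, epochs=200, lr=1e-3):
    """Maximize the conditional log-likelihood of parameters given simulations."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(epochs):
        opt.zero_grad()
        loss = -model.log_prob(theta, x).mean()
        loss.backward()
        opt.step()
    return model

if __name__ == "__main__":
    torch.manual_seed(0)
    theta_dim, x_dim = 2, 3

    def simulate(theta, noise):
        # Toy stand-in for a simulator mapping parameters to noisy summaries.
        x = torch.cat([theta, theta.sum(-1, keepdim=True)], dim=-1)
        return x + noise * torch.randn_like(x)

    theta_lo = torch.randn(5000, theta_dim)   # many cheap low-fidelity runs
    x_lo = simulate(theta_lo, noise=0.5)
    theta_hi = torch.randn(200, theta_dim)    # small high-fidelity budget
    x_hi = simulate(theta_hi, noise=0.1)

    model = fit(GaussianPosteriorNet(x_dim, theta_dim), theta_lo, x_lo)  # pretrain
    model = fit(model, theta_hi, x_hi, epochs=100, lr=1e-4)              # fine-tune

The fine-tuning stage reuses the weights learned from low-fidelity data, which is what allows the estimator to get by with far fewer expensive high-fidelity simulations; in the paper this role is played by a neural posterior estimator rather than the simple Gaussian head used here.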
