Papers

Consistency Models for Scalable and Fast Simulation-Based Inference

Marvin Schmitt, Valentin Pratz, Ullrich Köthe, Paul-Christian Bürkner, Stefan T. Radev - arXiv preprint arXiv:2312.05440, 2023 - arxiv.org

… In this paper, we port consistency models to simulation-based inference (see Figure 1) to achieve an unprecedented combination of both scalable and fast neural posterior …


BibTeX

@article{2312.05440v3,
Author = {Marvin Schmitt and Valentin Pratz and Ullrich Köthe and Paul-Christian Bürkner and Stefan T. Radev},
Title = {Consistency Models for Scalable and Fast Simulation-Based Inference},
Eprint = {2312.05440v3},
ArchivePrefix = {arXiv},
PrimaryClass = {cs.LG},
Abstract = {Simulation-based inference (SBI) is constantly in search of more expressive and efficient algorithms to accurately infer the parameters of complex simulation models. In line with this goal, we present consistency models for posterior estimation (CMPE), a new conditional sampler for SBI that inherits the advantages of recent unconstrained architectures and overcomes their sampling inefficiency at inference time. CMPE essentially distills a continuous probability flow and enables rapid few-shot inference with an unconstrained architecture that can be flexibly tailored to the structure of the estimation problem. We provide hyperparameters and default architectures that support consistency training over a wide range of different dimensions, including low-dimensional ones which are important in SBI workflows but were previously difficult to tackle even with unconditional consistency models. Our empirical evaluation demonstrates that CMPE not only outperforms current state-of-the-art algorithms on hard low-dimensional benchmarks, but also achieves competitive performance with much faster sampling speed on two realistic estimation problems with high data and/or parameter dimensions.},
Year = {2023},
Month = {Dec},
Note = {Neural Information Processing Systems (NeurIPS 2024)},
Url = {http://arxiv.org/abs/2312.05440v3},
File = {2312.05440v3.pdf}
}
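
The abstract describes CMPE as distilling a continuous probability flow into a consistency model that maps noisy parameter draws back to the posterior in a few steps. As an illustrative aside, and not the authors' reference implementation, the PyTorch sketch below shows the general shape of few-step consistency sampling conditioned on an observation, following the generic multistep sampling procedure for consistency models (Song et al., 2023). The network `net`, the noise schedule `sigmas`, and `sigma_min` are placeholder assumptions.

```python
# Hedged sketch: few-step consistency sampling for posterior estimation.
# `net(theta_noisy, sigma, x_obs)` is a hypothetical consistency network
# that maps a noisy parameter vector at noise level sigma, conditioned on
# the observation x_obs, to an estimate of the clean parameters.
import torch


@torch.no_grad()
def sample_posterior(net, x_obs, theta_dim, n_samples=1000,
                     sigmas=(80.0, 10.0, 1.0), sigma_min=0.002):
    """Draw approximate posterior samples theta ~ p(theta | x_obs)."""
    # Broadcast the single observation to one copy per posterior draw.
    x_obs = x_obs.expand(n_samples, -1)

    # One-shot step: start from pure Gaussian noise at the largest noise
    # level and map it directly to a posterior estimate.
    sigma_max = sigmas[0]
    theta = sigma_max * torch.randn(n_samples, theta_dim)
    theta = net(theta, torch.full((n_samples, 1), sigma_max), x_obs)

    # Optional refinement steps: re-noise to a smaller level, then map back.
    for sigma in sigmas[1:]:
        noise = torch.randn_like(theta)
        theta_noisy = theta + (sigma ** 2 - sigma_min ** 2) ** 0.5 * noise
        theta = net(theta_noisy, torch.full((n_samples, 1), sigma), x_obs)
    return theta
```

With a single entry in `sigmas` this reduces to one-step sampling; adding a couple of refinement steps trades a little speed for sample quality, which is the few-shot regime the abstract highlights.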
