Papers

Marginal Flow: a flexible and efficient framework for density estimation

MM Negri, J Aellen, M Jahn… - arXiv preprint arXiv …, 2025 - arxiv.org

… Figure 14: Simulation-based inference benchmark: we show average and standard deviation over 10 different test observations. We compare our method against Free-…

Link to paper: http://arxiv.org/abs/2509.26221v1

BibTeX

@article{2509.26221v1,
Author = {Marcello Massimo Negri and Jonathan Aellen and Manuel Jahn and AmirEhsan Khorashadizadeh and Volker Roth},
Title = {Marginal Flow: a flexible and efficient framework for density estimation},
Eprint = {2509.26221v1},
ArchivePrefix = {arXiv},
PrimaryClass = {cs.LG},
Abstract = {Current density modeling approaches suffer from at least one of the following
shortcomings: expensive training, slow inference, approximate likelihood, mode
collapse or architectural constraints like bijective mappings. We propose a
simple yet powerful framework that overcomes these limitations altogether. We
define our model $q_\theta(x)$ through a parametric distribution $q(x|w)$ with
latent parameters $w$. Instead of directly optimizing the latent variables $w$,
our idea is to marginalize them out by sampling $w$ from a learnable
distribution $q_\theta(w)$, hence the name Marginal Flow. In order to evaluate
the learned density $q_\theta(x)$ or to sample from it, we only need to draw
samples from $q_\theta(w)$, which makes both operations efficient. The proposed
model allows for exact density evaluation and is orders of magnitude faster
than competing models both at training and inference. Furthermore, Marginal
Flow is a flexible framework: it does not impose any restrictions on the neural
network architecture, it enables learning distributions on lower-dimensional
manifolds (either known or to be learned), it can be trained efficiently with
any objective (e.g. forward and reverse KL divergence), and it easily handles
multi-modal targets. We evaluate Marginal Flow extensively on various tasks
including synthetic datasets, simulation-based inference, distributions on
positive definite matrices and manifold learning in latent spaces of images.},
Year = {2025},
Month = {Sep},
Url = {http://arxiv.org/abs/2509.26221v1},
File = {2509.26221v1.pdf}
}
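
Reading the abstract, the model density is the marginal $q_\theta(x) = \int q(x|w)\, q_\theta(w)\, dw$, which can be estimated by Monte Carlo as $q_\theta(x) \approx \frac{1}{K} \sum_{k=1}^{K} q(x|w_k)$ with $w_k \sim q_\theta(w)$, so both density evaluation and sampling reduce to drawing samples of $w$. The sketch below is a hypothetical numpy illustration of that estimator, not the authors' implementation: the kernel $q(x|w)$ is chosen here as an isotropic Gaussian centred at $w$, sample_w stands in for a learned $q_\theta(w)$, and the names gaussian_kernel, marginal_density, sample_w and the bandwidth sigma are placeholders of my own.

import numpy as np

# Illustrative sketch of the marginalization idea from the abstract (not the
# paper's code): q_theta(x) is defined by averaging a simple kernel q(x|w)
# over latent parameters w drawn from a (here fixed, normally learnable)
# distribution q_theta(w).

def gaussian_kernel(x, w, sigma=0.1):
    """q(x|w): isotropic Gaussian density of x centred at w (assumed kernel)."""
    d = x.shape[-1]
    sq_dist = np.sum((x - w) ** 2, axis=-1)
    norm = (2.0 * np.pi * sigma ** 2) ** (d / 2)
    return np.exp(-sq_dist / (2.0 * sigma ** 2)) / norm

def marginal_density(x, sample_w, n_samples=1024):
    """Monte Carlo estimate of q_theta(x) = E_{w ~ q_theta(w)}[q(x|w)]."""
    w = sample_w(n_samples)                   # draw w_k from q_theta(w)
    kernels = gaussian_kernel(x[None, :], w)  # evaluate q(x|w_k) for each sample
    return kernels.mean()                     # average over the w_k

# Toy stand-in for a learned q_theta(w): a two-component Gaussian mixture in 2D,
# which already yields a bimodal q_theta(x) with no bijectivity constraint.
def sample_w(n):
    modes = np.array([[-2.0, 0.0], [2.0, 0.0]])
    idx = np.random.randint(0, 2, size=n)
    return modes[idx] + 0.3 * np.random.randn(n, 2)

x = np.array([2.0, 0.1])
print(marginal_density(x, sample_w))  # density estimate at x

In this toy version the per-sample kernel evaluations are exact densities, so the estimate converges to the true marginal as the number of $w$ samples grows; in the paper the sampler over $w$ is the learnable component trained with the chosen objective.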
