Papers

Adaptive Symmetrization of the KL Divergence

O Ben-Dov, LFO Chamon - arXiv preprint arXiv:2511.11159, 2025 - arxiv.org

… how this framework can be used to combine the advantages of NFs and EBMs in tasks such as density estimation, image generation, and simulation-based inference. …

Link to paper: http://arxiv.org/abs/2511.11159v1

BibTeX

@article{2511.11159v1,
Author = {Omri Ben-Dov and Luiz F. O. Chamon},
Title = {Adaptive Symmetrization of the KL Divergence},
Eprint = {2511.11159v1},
ArchivePrefix = {arXiv},
PrimaryClass = {cs.LG},
Abstract = {Many tasks in machine learning can be described as or reduced to learning a probability distribution given a finite set of samples. A common approach is to minimize a statistical divergence between the (empirical) data distribution and a parameterized distribution, e.g., a normalizing flow (NF) or an energy-based model (EBM). In this context, the forward KL divergence is ubiquitous due to its tractability, though its asymmetry may prevent capturing some properties of the target distribution. Symmetric alternatives involve either brittle min-max formulations and adversarial training (e.g., generative adversarial networks) or evaluating the reverse KL divergence, as is the case for the symmetric Jeffreys divergence, which is challenging to compute from samples. This work sets out to develop a new approach to minimize the Jeffreys divergence. To do so, it uses a proxy model whose goal is not only to fit the data, but also to assist in optimizing the Jeffreys divergence of the main model. This joint training task is formulated as a constrained optimization problem to obtain a practical algorithm that adapts the models' priorities throughout training. We illustrate how this framework can be used to combine the advantages of NFs and EBMs in tasks such as density estimation, image generation, and simulation-based inference.},
Year = {2025},
Month = {Nov},
Url = {http://arxiv.org/abs/2511.11159v1},
File = {2511.11159v1.pdf}
}
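
For context, the Jeffreys divergence discussed in the abstract is the symmetrization of the KL divergence, i.e. the sum of the forward and reverse KL terms (standard notation, not taken from the paper):

\[
D_{\mathrm{KL}}(p \,\|\, q) = \mathbb{E}_{x \sim p}\!\left[\log \frac{p(x)}{q(x)}\right],
\qquad
D_{\mathrm{J}}(p, q) = D_{\mathrm{KL}}(p \,\|\, q) + D_{\mathrm{KL}}(q \,\|\, p).
\]

The forward term needs only samples from the data distribution p and densities under the model q, whereas the reverse term requires samples and densities from the model itself, which is why the Jeffreys divergence is hard to estimate from data samples alone. The sketch below is a generic Monte Carlo estimate of the Jeffreys divergence between two densities exposing sample/log_prob interfaces; it is an illustration only, not the constrained proxy-model algorithm proposed in the paper.

import torch

def jeffreys_divergence_mc(p, q, n=10_000):
    """Monte Carlo estimate of J(p, q) = KL(p || q) + KL(q || p).

    `p` and `q` are assumed to expose .sample() and .log_prob(), e.g.
    torch.distributions objects or normalizing flows with that interface.
    Illustrative sketch only; not the estimator used in the paper.
    """
    xp = p.sample((n,))  # samples from p for the forward KL term
    xq = q.sample((n,))  # samples from q for the reverse KL term
    kl_pq = (p.log_prob(xp) - q.log_prob(xp)).mean()  # KL(p || q)
    kl_qp = (q.log_prob(xq) - p.log_prob(xq)).mean()  # KL(q || p)
    return kl_pq + kl_qp

# Example with two Gaussians (hypothetical choice; any pair of densities with the same interface works):
p = torch.distributions.Normal(0.0, 1.0)
q = torch.distributions.Normal(0.5, 1.5)
print(jeffreys_divergence_mc(p, q))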
