Papers

E-QRGMM: Efficient Generative Metamodeling for Covariate-Dependent Uncertainty Quantification

Z Liang, Q Zhang - arXiv preprint arXiv:2601.19256, 2026 - arxiv.org
Computer Science paper (cs.LG)

Link to paper: http://arxiv.org/abs/2601.19256v1

BibTeX

@article{2601.19256v1,
  Author = {Zhiyang Liang and Qingkai Zhang},
  Title = {E-QRGMM: Efficient Generative Metamodeling for Covariate-Dependent Uncertainty Quantification},
  Eprint = {2601.19256v1},
  ArchivePrefix = {arXiv},
  PrimaryClass = {cs.LG},
  Abstract = {Covariate-dependent uncertainty quantification in simulation-based inference is crucial for high-stakes decision-making but remains challenging due to the limitations of existing methods such as conformal prediction and classical bootstrap, which struggle with covariate-specific conditioning. We propose Efficient Quantile-Regression-Based Generative Metamodeling (E-QRGMM), a novel framework that accelerates the quantile-regression-based generative metamodeling (QRGMM) approach by integrating cubic Hermite interpolation with gradient estimation. Theoretically, we show that E-QRGMM preserves the convergence rate of the original QRGMM while reducing grid complexity from $O(n^{1/2})$ to $O(n^{1/5})$ for the majority of quantile levels, thereby substantially improving computational efficiency. Empirically, E-QRGMM achieves a superior trade-off between distributional accuracy and training speed compared to both QRGMM and other advanced deep generative models on synthetic and practical datasets. Moreover, by enabling bootstrap-based construction of confidence intervals for arbitrary estimands of interest, E-QRGMM provides a practical solution for covariate-dependent uncertainty quantification.},
  Year = {2026},
  Month = {Jan},
  Url = {http://arxiv.org/abs/2601.19256v1},
  File = {2601.19256v1.pdf}
}
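
The abstract sketches the mechanics of E-QRGMM: fit quantile regressions on a coarse grid of quantile levels, recover the full conditional quantile function by cubic Hermite interpolation with estimated gradients, and generate samples by inverse-transform sampling. The Python sketch below illustrates that general idea only; it is not the authors' implementation, and the grid of levels, the linear quantile-regression model, and the finite-difference gradient estimate are placeholder assumptions.

# Illustrative sketch only (not the paper's code): QRGMM-style generative
# metamodeling with cubic Hermite interpolation of the conditional
# quantile function over a coarse grid of quantile levels.
import numpy as np
from scipy.interpolate import CubicHermiteSpline
from sklearn.linear_model import QuantileRegressor

rng = np.random.default_rng(0)

# Toy "simulation" data: output y depends on covariate x with
# covariate-dependent noise.
n = 2000
X = rng.uniform(0.0, 1.0, size=(n, 1))
y = np.sin(2 * np.pi * X[:, 0]) + (0.2 + 0.5 * X[:, 0]) * rng.standard_normal(n)

# Coarse grid of quantile levels (the paper's point is that a much coarser
# grid suffices; the specific levels here are an assumption).
taus = np.linspace(0.05, 0.95, 9)

# One linear quantile regression per grid level.
models = [
    QuantileRegressor(quantile=t, alpha=0.0, solver="highs").fit(X, y)
    for t in taus
]

def sample_y_given_x(x_new, n_samples=1000):
    """Draw y-samples at covariate x_new by inverse-transform sampling from
    a cubic Hermite interpolant of the estimated conditional quantile
    function. Gradients at the grid points are finite-difference estimates,
    standing in for the paper's gradient-estimation step."""
    x_new = np.atleast_2d(x_new)
    q = np.array([m.predict(x_new)[0] for m in models])  # quantiles at grid levels
    dq = np.gradient(q, taus)                            # finite-difference slope estimate
    dq = np.maximum(dq, 1e-8)                            # keep slopes positive at the knots
    spline = CubicHermiteSpline(taus, q, dq)
    u = rng.uniform(taus[0], taus[-1], size=n_samples)   # uniform levels within the grid range
    return spline(u)

samples = sample_y_given_x(0.3)
print(samples.mean(), np.quantile(samples, [0.1, 0.9]))

Once such a metamodel is fitted, resampling its generated outputs is cheap, which is what makes the bootstrap-based confidence intervals for arbitrary estimands mentioned in the abstract practical.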
