We present a framework that uses variational inference with normalizing flows (VI-NFs) to construct proposals for reversible jump Markov chain Monte Carlo (RJMCMC), enabling efficient trans-dimensional Bayesian inference. Unlike transport reversible jump (TRJ) methods that rely on forward KL minimization with pilot MCMC samples, our approach minimizes the reverse KL divergence, which requires only samples from a base distribution and thus eliminates costly target sampling. The method employs RealNVP-based flows to learn model-specific transport maps, enabling the construction of both between-model and within-model proposals. The framework also yields accurate marginal likelihood estimates from the variational approximation, which facilitates efficient model comparison and proposal adaptation within RJMCMC. Experiments on an illustrative example, factor analysis, and variable selection in linear regression show that the TRJ sampler designed with VI-NFs achieves faster mixing and more efficient exploration of the model space than existing baselines. The proposed algorithm can also be extended to conditional flows for amortized variational inference across models. Code is available at https://github.com/YinPingping111/TRJ_VINFs.
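The sketch below illustrates the core idea of reverse-KL training of a RealNVP-style flow, as described above: the flow is fit using only samples from the base distribution, and the negative training loss estimates the ELBO (a lower bound on the log marginal likelihood). It is a minimal illustration, not the authors' implementation; the unnormalized log-target `log_target`, the network sizes, and the training hyperparameters are placeholder assumptions.

```python
# Minimal sketch of reverse-KL training for a RealNVP-style transport map.
# Assumptions: even-dimensional target, placeholder log_target, illustrative hyperparameters.
import torch
import torch.nn as nn

class AffineCoupling(nn.Module):
    """RealNVP affine coupling: transforms one half of the input conditioned on the other."""
    def __init__(self, dim, hidden=64, flip=False):
        super().__init__()
        self.flip = flip
        half = dim // 2
        self.net = nn.Sequential(
            nn.Linear(dim - half, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * half),           # outputs log-scale and shift
        )

    def forward(self, z):
        z1, z2 = z.chunk(2, dim=-1)
        if self.flip:                              # alternate which half is transformed
            z1, z2 = z2, z1
        s, t = self.net(z1).chunk(2, dim=-1)
        s = torch.tanh(s)                          # bounded log-scales for stability
        x2 = z2 * torch.exp(s) + t
        out = torch.cat([x2, z1] if self.flip else [z1, x2], dim=-1)
        return out, s.sum(dim=-1)                  # log|det Jacobian| of this layer

class RealNVP(nn.Module):
    def __init__(self, dim, n_layers=6):
        super().__init__()
        self.layers = nn.ModuleList(
            [AffineCoupling(dim, flip=(i % 2 == 1)) for i in range(n_layers)])

    def forward(self, z):
        log_det = torch.zeros(z.shape[0])
        for layer in self.layers:
            z, ld = layer(z)
            log_det = log_det + ld
        return z, log_det

def log_target(x):
    # Placeholder unnormalized log-posterior of one model; replace with the real target.
    return -0.5 * ((x - 2.0) ** 2).sum(dim=-1)

dim = 2
flow = RealNVP(dim)
base = torch.distributions.MultivariateNormal(torch.zeros(dim), torch.eye(dim))
opt = torch.optim.Adam(flow.parameters(), lr=1e-3)

for step in range(2000):
    z = base.sample((256,))                        # only base-distribution samples are needed
    x, log_det = flow(z)
    # Reverse KL: E_q[log q(x) - log p~(x)], with log q(x) = log q0(z) - log|det J|.
    loss = (base.log_prob(z) - log_det - log_target(x)).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()

# -loss is an ELBO estimate, usable as an approximate log marginal likelihood.
```

The same recipe would be repeated per model to obtain model-specific transport maps; how those maps are combined into between-model RJMCMC proposals follows the construction described in the paper.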