Bayesian inference typically relies on specifying a parametric model that approximates the data-generating process. However, misspecified models can yield poor convergence rates and unreliable posterior calibration. Bayesian empirical likelihood offers a semi-parametric alternative: it replaces the parametric likelihood with a profile empirical likelihood defined through moment constraints, thereby avoiding explicit distributional assumptions. Despite these advantages, Bayesian empirical likelihood faces substantial computational challenges, including the need to solve a constrained optimization problem at every likelihood evaluation and difficulties with non-convex posterior support, particularly in small-sample settings. This paper introduces a variational approach based on expectation propagation to approximate the Bayesian empirical-likelihood posterior, balancing computational cost against accuracy without altering the target posterior through adjustments such as pseudo-observations. Empirically, we show that our approach can achieve a superior cost-accuracy trade-off relative to existing methods, including Hamiltonian Monte Carlo and variational Bayes. Theoretically, we show that the approximation and the Bayesian empirical-likelihood posterior are asymptotically equivalent.
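For context, a minimal sketch of the profile empirical likelihood in standard Owen-style notation; the moment function $g$, parameter $\theta$, and sample $X_1,\dots,X_n$ are generic placeholders, not notation fixed by the abstract. The inner maximization below is the constrained optimization that must be solved at every likelihood evaluation:

```latex
% Standard profile empirical likelihood (Owen-style background sketch;
% notation assumed, not taken from the paper). Weights w_i on the
% observed sample are profiled out subject to the moment constraints.
\[
  L_{\mathrm{EL}}(\theta)
    \;=\; \max_{w_1,\dots,w_n}\;\prod_{i=1}^{n} w_i
  \quad \text{s.t.} \quad
  w_i \ge 0, \qquad
  \sum_{i=1}^{n} w_i = 1, \qquad
  \sum_{i=1}^{n} w_i\, g(X_i,\theta) = 0,
\]
% The Bayesian empirical-likelihood posterior then combines this
% profile likelihood with a prior \pi(\theta):
\[
  \pi(\theta \mid X_{1:n}) \;\propto\; \pi(\theta)\, L_{\mathrm{EL}}(\theta).
\]
```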