Mixtures of Gaussian Bayesian networks have previously been studied under full-covariance assumptions, where each mixture component has its own covariance matrix. We propose a mixture model with tied-covariance, in which all components share a common covariance matrix. Our main contribution is the derivation of its marginal likelihood, which remains analytic. Unlike in the full-covariance case, however, the marginal likelihood no longer factorizes into component-specific terms. We refer to the new likelihood as the BGe scoring metric for tied-covariance mixtures of Gaussian Bayesian networks. For model inference, we implement MCMC schemes combining structure MCMC with a fast Gibbs sampler for mixtures, and we empirically compare the tied- and full-covariance mixtures of Gaussian Bayesian networks on simulated and benchmark data.
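To illustrate the tied-covariance constraint described above, the following sketch fits a two-component Gaussian mixture in which both components share a single covariance matrix, pooled across components in the M-step. This is a minimal EM illustration of the tied-covariance idea, not the paper's method: the paper works with mixtures of Gaussian Bayesian networks, derives an analytic marginal likelihood (the BGe score for tied-covariance mixtures), and performs inference with structure MCMC plus a Gibbs sampler. All variable names here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate data: two components that genuinely share one covariance matrix.
Sigma = np.array([[1.0, 0.3], [0.3, 0.5]])
X = np.vstack([
    rng.multivariate_normal([0.0, 0.0], Sigma, 100),
    rng.multivariate_normal([3.0, 3.0], Sigma, 100),
])
K = 2
n, d = X.shape

# Initialise: uniform weights, means at one data point per simulated group,
# and the overall sample covariance as the shared covariance.
pi = np.full(K, 1.0 / K)
mu = X[[0, 100]].copy()
S = np.cov(X, rowvar=False)

for _ in range(50):
    # E-step: responsibilities under the single shared covariance S.
    Sinv = np.linalg.inv(S)
    logdet = np.linalg.slogdet(S)[1]
    logp = np.empty((n, K))
    for k in range(K):
        Xc = X - mu[k]
        maha = np.einsum('ij,jk,ik->i', Xc, Sinv, Xc)
        logp[:, k] = np.log(pi[k]) - 0.5 * (logdet + maha)
    logp -= logp.max(axis=1, keepdims=True)
    R = np.exp(logp)
    R /= R.sum(axis=1, keepdims=True)

    # M-step: component weights and means as usual, but the covariance is
    # tied, i.e. the scatter is pooled over all components before dividing.
    Nk = R.sum(axis=0)
    pi = Nk / n
    mu = (R.T @ X) / Nk[:, None]
    S = np.zeros((d, d))
    for k in range(K):
        Xc = X - mu[k]
        S += (R[:, k, None] * Xc).T @ Xc
    S /= n  # one covariance matrix for every component
```

The full-covariance variant would instead divide each component's own scatter by its own effective count Nk, yielding K separate covariance matrices; the tied model replaces those K estimates with the single pooled matrix S.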