We study the problem of learning a directed acyclic graph from data generated by an additive, non-linear structural equation model with Gaussian noise. We express each non-linear function through a basis expansion and derive a maximum likelihood estimator with a group l0 regularization that penalizes the number of edges in the graph. The resulting estimator is formulated as a convex mixed-integer program, enabling the use of branch-and-bound methods to obtain a solution that is guaranteed to be accurate up to a pre-specified optimality gap. Our formulation naturally encodes background knowledge, such as the presence or absence of specific edges and partial order constraints among the variables. We establish consistency guarantees for our estimator in terms of graph recovery, even when the number of variables grows with the sample size. Moreover, by connecting the optimality guarantees with our statistical error bounds, we derive an early stopping criterion that allows the branch-and-bound procedure to be terminated before convergence while preserving consistency. Compared with existing approaches that assume equal error variances, restrict attention to linear structural equation models, or rely on heuristic procedures, our method enjoys both optimization and statistical guarantees. Extensive simulations and a real-data analysis show that the proposed method achieves markedly better graph recovery performance than competing approaches.
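To make the setup concrete, the display below sketches one standard way such a model and estimator can be written; the symbols $f_{jk}$, $\psi_m$, $\beta_{jkm}$, $M$, and $\lambda$ are illustrative notation rather than the paper's own, and the penalty shown is a generic group l0 edge penalty of the kind described above.
\[
X_j \;=\; \sum_{k \in \mathrm{pa}(j)} f_{jk}(X_k) + \varepsilon_j,
\qquad \varepsilon_j \sim \mathcal{N}(0,\sigma_j^2),
\qquad f_{jk}(x) \;=\; \sum_{m=1}^{M} \beta_{jkm}\,\psi_m(x),
\]
with the estimator minimizing a penalized negative log-likelihood of the form
\[
\min_{\beta,\,\sigma}\; -\frac{1}{n}\sum_{i=1}^{n} \log p_{\beta,\sigma}\bigl(x^{(i)}\bigr)
\;+\; \lambda \sum_{j \neq k} \mathbf{1}\!\left\{\beta_{jk\cdot} \neq 0\right\}
\quad \text{s.t. } \{(k,j) : \beta_{jk\cdot} \neq 0\} \text{ is acyclic,}
\]
where each indicator acts on the whole group of basis coefficients for one candidate edge, so the penalty counts edges in the induced graph.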