Recent advances in score-based models incorporate stochastic differential equations (SDEs), which bring state-of-the-art performance on image generation tasks. This paper improves such score-based models by analyzing their behavior at zero perturbation noise. On real datasets, the score function diverges as the perturbation noise ($\sigma$) decreases to zero, and this observation leads to the argument that score estimation fails at $\sigma=0$ for any neural network architecture. We therefore introduce the Unbounded Noise Conditional Score Network (UNCSN), which resolves the score divergence problem through an easily applicable modification to any noise conditional score-based model. Additionally, we introduce a new type of SDE from which the exact log-likelihood can be computed. The associated loss function mitigates the loss imbalance issue within a mini-batch, and we present a theoretical analysis of the proposed loss to uncover the underlying mechanism by which score-based models capture the data distribution.
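To make the divergence claim concrete, a minimal worked example is sketched below using the standard Gaussian perturbation kernel common to noise conditional score models; the specific kernel and notation are assumptions for illustration, not details taken from this abstract.

```latex
% Score of the Gaussian perturbation kernel
% p_\sigma(\tilde{x} \mid x) = \mathcal{N}(\tilde{x};\, x,\, \sigma^2 I).
% Its magnitude scales as 1/\sigma^2, so for any fixed \tilde{x} \neq x it blows up
% as \sigma \to 0, which a bounded network output cannot match at \sigma = 0.
\nabla_{\tilde{x}} \log p_\sigma(\tilde{x} \mid x)
  = -\frac{\tilde{x} - x}{\sigma^{2}},
\qquad
\bigl\| \nabla_{\tilde{x}} \log p_\sigma(\tilde{x} \mid x) \bigr\|
  \;\xrightarrow[\sigma \to 0]{}\; \infty
  \quad \text{for } \tilde{x} \neq x .
```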