Computing the discrepancy between time series of variable sizes is notoriously challenging. While dynamic time warping (DTW) is popularly used for this purpose, it is not differentiable everywhere and is known to lead to bad local optima when used as a "loss". Soft-DTW addresses these issues, but it is not a positive definite divergence: due to the bias introduced by entropic regularization, it can be negative and it is not minimized when the time series are equal. We propose in this paper a new divergence, dubbed soft-DTW divergence, which aims to correct these issues. We study its properties; in particular, under conditions on the ground cost, we show that it is non-negative and minimized when the time series are equal. We also propose a new "sharp" variant by further removing entropic bias. We showcase our divergences on time series averaging and demonstrate significant accuracy improvements compared to both DTW and soft-DTW on 84 time series classification datasets.
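To make the debiasing idea concrete, here is a sketch in the style of Sinkhorn-divergence corrections, using illustrative notation that is not taken verbatim from the paper ($\mathrm{SDTW}_\gamma$ for soft-DTW with smoothing parameter $\gamma > 0$, $C(x, y)$ for the pairwise ground-cost matrix, $\mathcal{A}(n, m)$ for the set of monotonic alignment matrices); the precise definition and the conditions on the ground cost are those stated in the paper. Soft-DTW replaces the minimum over alignments with a soft minimum,
\[
  \mathrm{SDTW}_\gamma(x, y)
    \;=\; -\gamma \log \sum_{A \in \mathcal{A}(n, m)} \exp\!\big(-\langle A, C(x, y)\rangle / \gamma\big),
\]
and a debiased divergence can be obtained by subtracting the self-comparison terms,
\[
  D_\gamma(x, y)
    \;=\; \mathrm{SDTW}_\gamma(x, y)
    \;-\; \tfrac{1}{2}\,\mathrm{SDTW}_\gamma(x, x)
    \;-\; \tfrac{1}{2}\,\mathrm{SDTW}_\gamma(y, y),
\]
so that the entropic bias cancels and the quantity vanishes when $x = y$.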