Modifications of triplet loss that rescale the back-propagated gradients of specific pairs have made significant progress in local descriptor learning. However, current gradient modulation strategies are mostly static, so they suffer when training phases or datasets change. In this paper, we propose a dynamic gradient modulation, named SDGMNet, to improve triplet loss for local descriptor learning. The core of our method is to formulate modulation functions with statistical characteristics that are estimated dynamically. First, we perform a deep analysis of the back propagation of general triplet-based losses and introduce the included angle as the distance measure. On this basis, auto-focus modulation is employed to moderate the impact of statistically uncommon individual pairs in stochastic gradient descent optimization; a probabilistic margin cuts off the gradients of the proportion of Siamese pairs that are believed to have reached the optimum; and power adjustment balances the total weights of negative pairs against positive pairs. Extensive experiments demonstrate that our novel descriptor surpasses previous state-of-the-art methods on standard benchmarks, including patch verification, matching, and retrieval tasks.
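The abstract names three modulation components (auto-focus modulation, probabilistic margin, power adjustment) without giving their formulas. The sketch below is only an illustration of the general idea in PyTorch, not the paper's method: it uses the included angle as the distance measure and rescales per-pair gradients with weights derived from running statistics. The class name, the Gaussian auto-focus weight, the 2-sigma cutoff standing in for the probabilistic margin, and all constants are assumptions introduced for illustration.

```python
import torch
import torch.nn.functional as F

def angular_distance(a, b):
    # Included angle (radians) between L2-normalized descriptors.
    cos = (F.normalize(a, dim=1) * F.normalize(b, dim=1)).sum(dim=1)
    return torch.acos(cos.clamp(-1 + 1e-7, 1 - 1e-7))

class DynamicModulatedTripletLoss(torch.nn.Module):
    """Hypothetical sketch: a triplet-style loss whose per-pair gradient
    weights are modulated by running statistics of the angular distances.
    Not the paper's exact formulation."""

    def __init__(self, momentum=0.9):
        super().__init__()
        self.momentum = momentum
        # Running statistics of the positive/negative angle gap, updated per batch.
        self.register_buffer("mean", torch.tensor(0.0))
        self.register_buffer("std", torch.tensor(1.0))

    def forward(self, anchor, positive, negative):
        d_pos = angular_distance(anchor, positive)
        d_neg = angular_distance(anchor, negative)
        residual = d_pos - d_neg  # quantity whose statistics we track

        with torch.no_grad():
            # Dynamically estimated statistics (exponential moving averages).
            self.mean.mul_(self.momentum).add_((1 - self.momentum) * residual.mean())
            self.std.mul_(self.momentum).add_((1 - self.momentum) * residual.std())
            z = (residual - self.mean) / (self.std + 1e-7)
            # "Auto-focus"-like weight: down-weight statistically uncommon pairs.
            w = torch.exp(-0.5 * z ** 2)
            # Statistic-based cutoff standing in for a probabilistic margin:
            # zero the gradient of pairs assumed to have reached the optimum.
            w = torch.where(residual < self.mean - 2 * self.std,
                            torch.zeros_like(w), w)

        # Weights are detached, so they only rescale back-propagated gradients.
        return (w * (d_pos - d_neg)).mean()
```

A minimal usage example on random descriptors, to show the expected tensor shapes:

```python
loss_fn = DynamicModulatedTripletLoss()
a, p, n = (torch.randn(32, 128, requires_grad=True) for _ in range(3))
loss = loss_fn(a, p, n)
loss.backward()  # gradients of uncommon/optimal pairs are damped or cut off
```

The third component from the abstract, power adjustment (balancing the total weights of negative pairs against positive pairs), is omitted here because the abstract does not specify how the balance is computed.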