Kernel-based learning methods can dramatically increase the storage capacity of Hopfield networks, yet the dynamical mechanism behind this enhancement remains poorly understood. We address this gap by conducting a geometric analysis of the network's energy landscape. We introduce a novel metric, "Pinnacle Sharpness," to quantify the local stability of attractors. By systematically varying the kernel width and storage load, we uncover a rich phase diagram of attractor shapes. Our central finding is the emergence of a "ridge of optimization," where the network maximizes attractor stability under challenging high-load and global-kernel conditions. Through a theoretical decomposition of the landscape gradient into a direct "driving" force and an indirect "feedback" force, we reveal the origin of this phenomenon. The optimization ridge corresponds to a regime of strong anti-correlation between the two forces, where the direct force, amplified by the high storage load, dominates the opposing collective feedback force. This demonstrates a sophisticated self-organization mechanism: the network adaptively harnesses inter-pattern interactions as a cooperative feedback control system to sculpt a robust energy landscape. Our findings provide a new physical picture for the stability of high-capacity associative memories and offer principles for their design.
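To make the setting concrete, the following sketch (Python/NumPy) implements a generic RBF-kernel Hopfield-style memory together with a simple single-flip energy-barrier proxy for local attractor stability. The kernel form, the greedy asynchronous update, and the names `sigma`, `recall`, and `sharpness_proxy` are illustrative assumptions, not the paper's actual learning rule or its "Pinnacle Sharpness" definition.

```python
# Minimal sketch (assumptions, not the paper's code): an RBF-kernel Hopfield-style
# memory with a single-flip energy-barrier proxy for attractor sharpness.
import numpy as np

rng = np.random.default_rng(0)

N, P = 100, 20          # neurons, stored patterns (storage load alpha = P / N)
sigma = 5.0             # kernel width: small = local kernel, large = global kernel
patterns = rng.choice([-1, 1], size=(P, N)).astype(float)

def energy(state):
    """Kernel energy: negative sum of RBF similarities to the stored patterns."""
    sq_dist = np.sum((patterns - state) ** 2, axis=1)
    return -np.sum(np.exp(-sq_dist / (2.0 * sigma ** 2)))

def recall(state, sweeps=10):
    """Asynchronous single-spin dynamics that greedily descends the kernel energy."""
    state = state.copy()
    for _ in range(sweeps):
        for i in rng.permutation(N):
            flipped = state.copy()
            flipped[i] = -flipped[i]
            if energy(flipped) < energy(state):   # accept flip only if energy drops
                state = flipped
    return state

def sharpness_proxy(pattern):
    """Mean energy barrier for single-spin flips at a stored pattern.
    A rough stand-in for attractor sharpness: larger = steeper local pinnacle."""
    e0 = energy(pattern)
    barriers = [energy(np.where(np.arange(N) == i, -pattern, pattern)) - e0
                for i in range(N)]
    return float(np.mean(barriers))

# Example: corrupt a stored pattern, recall it, and measure local sharpness.
probe = patterns[0].copy()
probe[rng.choice(N, size=10, replace=False)] *= -1
recovered = recall(probe)
print("overlap after recall:", np.dot(recovered, patterns[0]) / N)
print("sharpness proxy at pattern 0:", sharpness_proxy(patterns[0]))
```

Sweeping `sigma` (kernel width) and `P/N` (storage load) over a grid of such runs would give a sharpness phase diagram of the kind the abstract describes, though the actual study's metric and decomposition into driving and feedback forces are defined analytically rather than by this finite-difference proxy.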