Kernel-based learning methods can dramatically increase the storage capacity of Hopfield networks, yet the dynamical mechanism behind this enhancement remains poorly understood. We address this gap by conducting a geometric analysis of the network's energy landscape. We introduce a novel metric, ``Pinnacle Sharpness,'' to quantify the local stability of attractors. By systematically varying the kernel width and storage load, we uncover a rich phase diagram of attractor shapes. Our central finding is the emergence of a ``ridge of optimization,'' where the network maximizes attractor stability under challenging high-load and global-kernel conditions. Through a theoretical decomposition of the landscape gradient into a direct ``driving'' force and an indirect ``feedback'' force, we reveal the origin of this phenomenon. The optimization ridge corresponds to a regime of strong anti-correlation between the two forces, where the direct force, amplified by the high storage load, dominates the opposing collective feedback force. This demonstrates a sophisticated self-organization mechanism: the network adaptively harnesses inter-pattern interactions as a cooperative feedback control system to sculpt a robust energy landscape. Our findings provide a new physical picture for the stability of high-capacity associative memories and offer principles for their design.
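As a minimal illustration of the ``driving''/``feedback'' decomposition (a sketch under assumed forms; the energy, learning rule, and Pinnacle Sharpness metric used in this work may differ), consider a kernelized Hopfield energy built from $P$ stored patterns $\{\xi^{\mu}\}$ with an RBF kernel of width $\sigma$, $E(\mathbf{s}) = -\sum_{\mu=1}^{P} K_{\sigma}(\xi^{\mu}, \mathbf{s})$. Near a target pattern $\xi^{\nu}$, the landscape gradient then separates into a direct term sourced by $\xi^{\nu}$ and a collective term contributed by the remaining $P-1$ patterns:
\[
-\nabla_{\mathbf{s}} E(\mathbf{s})
= \underbrace{\nabla_{\mathbf{s}} K_{\sigma}(\xi^{\nu}, \mathbf{s})}_{\text{direct ``driving'' term}}
\;+\; \underbrace{\sum_{\mu \neq \nu} \nabla_{\mathbf{s}} K_{\sigma}(\xi^{\mu}, \mathbf{s})}_{\text{collective ``feedback'' term}},
\qquad
K_{\sigma}(\xi, \mathbf{s}) = \exp\!\left(-\frac{\lVert \xi - \mathbf{s} \rVert^{2}}{2\sigma^{2}}\right).
\]
In this toy form, the balance between the two terms in the neighborhood of $\xi^{\nu}$ is set jointly by the kernel width $\sigma$ and the storage load $P$, which is the trade-off the phase diagram above explores.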