Kernel-based learning methods can dramatically increase the storage capacity of Hopfield networks, yet the dynamical mechanism behind this enhancement remains poorly understood. We address this gap by unifying the geometric analysis of the attractor landscape with the spectral theory of kernel machines. Using a novel metric, "Pinnacle Sharpness," we first uncover a rich phase diagram of attractor stability, identifying a "Ridge of Optimization" where the network achieves maximal robustness under high-load conditions. Phenomenologically, this ridge is characterized by a "Force Antagonism," where a strong driving force is balanced by a collective feedback force. Theoretically, we reveal that this phenomenon arises from a specific reorganization of the weight spectrum, which we term \textit{Spectral Concentration}. Unlike a simple rank-1 collapse, our analysis shows that the network on the ridge self-organizes into a critical state: the leading eigenvalue is amplified to maximize global stability (Direct Force), while the trailing eigenvalues are preserved to maintain high memory capacity (Indirect Force). These findings provide a complete physical picture of how high-capacity associative memories are formed, demonstrating that optimal performance is achieved by tuning the system to a spectral "Goldilocks zone" between rank collapse and diffusion.
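The distinction drawn above between spectral concentration and a rank-1 collapse can be illustrated with a toy computation. The sketch below is purely illustrative and makes assumptions not present in the abstract: it uses the classical Hebbian weight matrix as a stand-in (the paper's kernel-learned weights are not specified here), and the sizes `N` and `P` are arbitrary. It shows a spectrum with a band of large eigenvalues, roughly one per stored pattern, sitting above a near-zero bulk, rather than a single surviving eigenvalue.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 200, 20  # neurons, stored patterns (toy sizes, not from the paper)
xi = rng.choice([-1.0, 1.0], size=(P, N))  # random binary patterns

# Classical Hebbian weight matrix with zero diagonal -- an illustrative
# stand-in, not the kernel-based learning rule discussed in the abstract.
W = (xi.T @ xi) / N
np.fill_diagonal(W, 0.0)

# Eigenvalues sorted in descending order; W is symmetric.
eig = np.sort(np.linalg.eigvalsh(W))[::-1]

# Spectral-concentration intuition: a handful of large eigenvalues
# (roughly one per stored pattern) above a bulk near zero -- as opposed
# to a rank-1 collapse, where only one eigenvalue would dominate.
leading, bulk = eig[:P], eig[P:]
print(f"leading eigenvalues: mean {leading.mean():.2f}")
print(f"bulk eigenvalues:    mean {bulk.mean():.2f}")
```

In this Hebbian toy model the leading band sits near 1 while the bulk sits near zero; the abstract's claim is that on the "Ridge of Optimization" the learned weights amplify the leading part of the spectrum without erasing the trailing part.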