Spiking neural networks (SNNs) offer event-driven computation, sparse activation, and hardware efficiency, yet their training often converges slowly and lacks stability. We present Adaptive Homeostatic Spiking Activity Regulation (AHSAR), an extremely simple, plug-in, training-paradigm-agnostic method that stabilizes optimization and accelerates convergence without changing the model architecture, loss, or gradients. AHSAR introduces no trainable parameters. It maintains a per-layer homeostatic state during the forward pass, maps centered firing-rate deviations to threshold scales through a bounded nonlinearity, uses lightweight cross-layer diffusion to avoid sharp imbalance between layers, and applies a slow, across-epoch global gain that combines validation progress with activity energy to tune the operating point. The computational cost is negligible. Across diverse training methods, SNN architectures of different depths, widths, and temporal steps, and both RGB and DVS datasets, AHSAR consistently improves strong baselines and enhances out-of-distribution robustness. These results indicate that keeping layer activity within a moderate band is a simple and effective principle for scalable and efficient SNN training.
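To make the regulation loop concrete, the following is a minimal sketch of the mechanism as described above: a per-layer firing-rate state updated in the forward pass, a bounded nonlinearity (tanh is assumed here) mapping centered deviations to threshold scales, neighbor-averaging as the cross-layer diffusion, and a slow across-epoch gain. All hyperparameter names (`r_target`, `alpha`, `beta`, `lam`) and the exact update rules are illustrative assumptions, not the paper's definitive implementation.

```python
import numpy as np

class AHSARSketch:
    """Hypothetical sketch of AHSAR-style homeostatic threshold regulation."""

    def __init__(self, num_layers, r_target=0.1, ema=0.9,
                 alpha=4.0, beta=0.2, lam=0.25):
        # Per-layer homeostatic state: an EMA of observed firing rates (assumed form).
        self.rate = np.full(num_layers, r_target)
        self.r_target = r_target  # desired moderate activity band center
        self.ema = ema            # smoothing factor for the per-layer state
        self.alpha = alpha        # sensitivity of the bounded nonlinearity
        self.beta = beta          # maximum relative threshold change
        self.lam = lam            # cross-layer diffusion strength
        self.gain = 1.0           # slow, across-epoch global gain

    def forward_update(self, spikes_per_layer):
        """Update state from one forward pass; return per-layer threshold scales."""
        # 1) Update the per-layer homeostatic state from observed firing rates.
        obs = np.array([s.mean() for s in spikes_per_layer])
        self.rate = self.ema * self.rate + (1.0 - self.ema) * obs
        # 2) Map centered deviations to threshold scales via a bounded nonlinearity:
        #    firing above target -> scale > 1 (higher threshold damps activity).
        dev = self.rate - self.r_target
        scale = 1.0 + self.beta * np.tanh(self.alpha * dev)
        # 3) Lightweight cross-layer diffusion: average with neighboring layers
        #    to avoid sharp imbalance across depth.
        smoothed = scale.copy()
        smoothed[1:-1] = ((1.0 - self.lam) * scale[1:-1]
                          + 0.5 * self.lam * (scale[:-2] + scale[2:]))
        # 4) Apply the slow global gain tuned across epochs.
        return self.gain * smoothed

    def epoch_update(self, val_improved, activity_energy, step=0.02):
        """Across-epoch gain combining validation progress with activity energy.
        The sign convention here is an assumption: damp activity when energy is
        high and validation has stalled, relax otherwise."""
        if activity_energy > self.r_target and not val_improved:
            self.gain += step
        else:
            self.gain -= step
        self.gain = float(np.clip(self.gain, 0.5, 2.0))
```

In use, the returned scales would multiply each layer's spiking threshold during the forward pass, leaving the architecture, loss, and gradients untouched; since the state is a single scalar per layer, the overhead is negligible, consistent with the cost claim above.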