Fully parallel neural network accelerators on field-programmable gate arrays (FPGAs) offer high throughput for latency-critical applications but face tight hardware resource constraints. Weightless neural networks (WNNs) efficiently replace arithmetic with logic-based inference. Differentiable weightless neural networks (DWNs) further optimize resource usage by learning the connections between encoders and LUT layers via gradient-based training. However, DWNs rely on thermometer encoding, and the associated hardware cost has not been fully evaluated. We present a DWN hardware generator that explicitly includes thermometer encoding. Experiments on the Jet Substructure Classification (JSC) task show that encoding can increase LUT usage by up to 3.20$\times$, dominating costs in small networks and highlighting the need for encoding-aware hardware design in DWN accelerators.
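To make the encoding step concrete: thermometer encoding maps a scalar input to a unary bit vector by comparing it against a sorted list of thresholds, which is why its bit width directly inflates the downstream LUT fabric. A minimal sketch follows; the threshold values and bit width here are illustrative, not the configuration used in the experiments.

```python
def thermometer_encode(x, thresholds):
    """Unary (thermometer) code: bit i is 1 iff x exceeds thresholds[i].

    With sorted thresholds the output has the form 1...10...0,
    like mercury rising in a thermometer. In hardware, each bit
    costs one comparator, so wider encodings consume more LUTs.
    """
    return [1 if x > t else 0 for t in thresholds]

# Example: a 3-bit encoding with evenly spaced thresholds over [0, 1)
thresholds = [0.25, 0.50, 0.75]
bits = thermometer_encode(0.6, thresholds)
print(bits)  # [1, 1, 0]
```

Each input feature thus expands into as many wires as there are thresholds, which is the source of the encoding overhead the abstract quantifies.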