In this paper, we demonstrate how the physics of entropy production, when combined with symmetry constraints, can be harnessed to implement high-performance and energy-efficient analog computing systems. At the core of the proposed framework is a generalized maximum-entropy principle that describes the evolution of a mesoscopic physical system formed by an interconnected ensemble of analog elements, including devices that can be readily fabricated in standard integrated circuit technology. We show that the maximum-entropy state of this ensemble corresponds to a margin-propagation (MP) distribution and can be used to compute correlations and inner products as the ensemble's macroscopic properties. Furthermore, the limits of computational throughput and energy efficiency can be pushed by extending the framework to non-equilibrium or transient operating conditions, which we demonstrate using a proof-of-concept radio-frequency (RF) correlator integrated circuit fabricated in a 22 nm SOI CMOS process. The measured results show a compute efficiency greater than 2 Peta ($10^{15}$) Bit Operations per second per Watt (PetaOPS/W) at 8-bit precision and greater than 0.8 Exa ($10^{18}$) Bit Operations per second per Watt (ExaOPS/W) at 3-bit precision for RF data sampled at rates greater than 4 GS/s. Using the fabricated prototypes, we also showcase several real-world RF applications at the edge, including spectrum sensing and code-domain communications.
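For readers unfamiliar with the margin-propagation operation mentioned above, the following sketch illustrates the standard MP primitive as described in the margin-propagation literature: given inputs $L_i$ and a hyper-parameter $\gamma$, MP solves $\sum_i \max(L_i - z, 0) = \gamma$ for $z$, a piecewise-linear approximation to log-sum-exp. This sketch is our own illustration, not the paper's circuit-level implementation; the function name and the sort-and-scan solver are illustrative choices.

```python
import numpy as np

def margin_propagation(L, gamma):
    """Solve sum_i max(L_i - z, 0) = gamma for z.

    The left-hand side is piecewise linear and strictly decreasing in z,
    so the solution is unique. MP(L, gamma) approximates log-sum-exp(L)
    up to an additive offset that depends on gamma.
    """
    L = np.sort(np.asarray(L, dtype=float))[::-1]  # descending order
    csum = np.cumsum(L)
    # If the top-k inputs are the "active" set (L_i > z), then
    # z = (sum of top-k inputs - gamma) / k. Scan k upward until
    # z is consistent with the next (excluded) input.
    for k in range(1, len(L) + 1):
        z = (csum[k - 1] - gamma) / k
        if k == len(L) or z >= L[k]:
            return z

# Example: L = [2, 1], gamma = 0.5 gives z = 1.5,
# since max(2 - 1.5, 0) + max(1 - 1.5, 0) = 0.5.
```

Because the constraint involves only thresholding and summation, the same fixed point can be realized by conservation laws in an analog circuit (e.g., currents summing at a node), which is what makes MP attractive as a macroscopic ensemble property.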