The detailed functioning of the human brain is still poorly understood. Brain simulations are a well-established way to complement experimental research, but they must contend with the computational demands of approximately $10^{11}$ neurons and the $10^{14}$ synapses connecting them, the network of the latter being referred to as the connectome. Studies suggest that changes in the connectome (i.e., the formation and deletion of synapses, also known as structural plasticity) are essential for critical tasks such as memory formation and learning. The connectivity update can be computed efficiently using a Barnes-Hut-inspired approximation that lowers the computational complexity from $O(n^2)$ to $O(n \log n)$, where $n$ is the number of neurons. However, updating the synapses, which relies heavily on remote memory access (RMA), and the spike exchange between neurons, which requires all-to-all communication at every time step, still hinder scalability. We present a new algorithm that significantly reduces the communication overhead by moving computation instead of data. It shrinks the time needed to update the connectivity by a factor of six and the time needed to exchange spikes by more than two orders of magnitude.
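To illustrate the Barnes-Hut-inspired idea mentioned above, the following is a minimal, self-contained sketch (not the paper's actual algorithm): distant groups of neurons are summarized by the centroid of a spatial-tree cell, so evaluating an attraction score for a connectivity update visits $O(\log n)$ cells per neuron instead of all $n$ others. The quadtree layout, the Gaussian kernel, and all names here are illustrative assumptions.

```python
import math


class Node:
    """Quadtree cell summarizing the neurons it contains (illustrative sketch)."""

    def __init__(self, cx, cy, half):
        self.cx, self.cy, self.half = cx, cy, half  # square cell: center, half-width
        self.children = None
        self.count = 0            # neurons (or vacant synaptic elements) in the cell
        self.mx = self.my = 0.0   # centroid accumulators

    def insert(self, x, y, w=1):
        self.count += w
        self.mx += w * x
        self.my += w * y
        if self.half < 1e-3:      # leaf resolution cutoff
            return
        if self.children is None:
            h = self.half / 2
            self.children = [Node(self.cx + dx * h, self.cy + dy * h, h)
                             for dx in (-1, 1) for dy in (-1, 1)]
        for c in self.children:
            if abs(x - c.cx) <= c.half and abs(y - c.cy) <= c.half:
                c.insert(x, y, w)
                return


def attraction(node, x, y, theta):
    """Approximate the summed attraction of all neurons in `node` toward (x, y).

    A cell whose width/distance ratio is below `theta` is treated as a single
    point at its centroid (the Barnes-Hut acceptance criterion); theta = 0
    falls through to the leaves and recovers an essentially exact sum.
    """
    if node.count == 0:
        return 0.0
    px, py = node.mx / node.count, node.my / node.count
    d = math.hypot(x - px, y - py)
    if node.children is None or (d > 0 and 2 * node.half / d < theta):
        return node.count * math.exp(-d * d / 2.0)  # Gaussian kernel (assumed)
    return sum(attraction(c, x, y, theta) for c in node.children if c.count)
```

With a moderate opening angle such as `theta = 0.5`, the approximation typically stays within a few percent of the brute-force $O(n)$ sum while touching far fewer nodes, which is the source of the $O(n \log n)$ overall cost.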