The "White Whale" has been published! The paper is on novel spiking neural networks that convert between value, temporal and spike train encodings
Published 9/7/2025
A paper entitled "Alleviating the Communication Bottleneck in Neuromorphic Computing with Custom-Designed Spiking Neural Networks" has just been published in the Journal of Low Power Electronics and Applications. Our nickname for the paper has been "The White Whale," as its abbreviated form was rejected by both IJCNN and ICONS. That's a pity, because the main result of this paper is really cool: you can use SNNs to convert between value, time, and spike-train encodings of data.
Now, why is this cool? Because spike trains often work better than the other encodings when you train an SNN for an application; however, spike trains are inefficient for communicating to and from a host in an embedded neuromorphic application. With these networks, you can train the network with spike trains, but then, when you deploy it, communicate with a more efficient encoding and perform the conversion to spike trains neuromorphically, on the device itself.
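To make the three encodings concrete, here is a minimal, hypothetical sketch in plain Python with NumPy. It is not the paper's SNN construction (the paper performs these conversions neuromorphically); it only illustrates what value, temporal, and spike-train encodings of the same number look like, and why the spike train is the expensive one to communicate:

```python
import numpy as np

# Hypothetical illustration of the three encodings; the conventions below
# (earlier spike = larger value, rate-coded trains) are common choices,
# not necessarily the ones used in the paper.

def value_to_time(value, window=100):
    """Temporal encoding: one spike whose arrival time encodes the value.
    Here, larger values spike earlier."""
    assert 0.0 <= value <= 1.0
    return int(round((1.0 - value) * (window - 1)))

def value_to_spike_train(value, window=100, rng=None):
    """Spike-train (rate) encoding: the value is the per-timestep spike
    probability, so the spike count over the window encodes the value."""
    if rng is None:
        rng = np.random.default_rng(0)
    return (rng.random(window) < value).astype(int)

def spike_train_to_value(train):
    """Decode a rate-coded train: the fraction of timesteps with a spike."""
    return train.mean()

v = 0.8
print("value encoding:", v)                                  # one number on the wire
print("temporal encoding: spike at t =", value_to_time(v))   # one spike on the wire
train = value_to_spike_train(v)
print("spike-train encoding:", train.sum(), "spikes in", len(train), "timesteps")
print("decoded value:", spike_train_to_value(train))
```

The spike train needs on the order of value × window spikes to say what the other two encodings say with a single number or a single spike, and that gap is exactly the communication cost that the paper's conversion networks let you avoid.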
Similarly, applications often use SNNs to make decisions by counting the spikes on output neurons and comparing the counts. Again, this is inefficient from a communication standpoint, as sending every spike to a host is expensive. With these networks, you can perform the counting with an SNN and then communicate the result to the host with a single spike.
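Here is the same scenario as a hypothetical sketch (again plain Python, not the paper's networks): the host-side version must ship every output spike across the link before it can count and compare, while the chip-side version, which the paper realizes as an SNN, sends a single spike naming the winner:

```python
import numpy as np

# Hypothetical sketch of the decision scenario above. Both versions make
# the same decision; they differ only in how many spikes cross the
# chip-to-host link.

def host_side_decision(output_spikes):
    """output_spikes: (neurons, timesteps) 0/1 array sent to the host.
    The host counts spikes per neuron and picks the largest count.
    Returns (winning class, spikes sent over the link)."""
    counts = output_spikes.sum(axis=1)
    return int(np.argmax(counts)), int(output_spikes.sum())

def chip_side_decision(output_spikes):
    """Counting and comparison happen on the neuromorphic device, so only
    one spike, on the winning output line, reaches the host.
    Returns (winning class, spikes sent over the link)."""
    counts = output_spikes.sum(axis=1)
    return int(np.argmax(counts)), 1

rng = np.random.default_rng(1)
spikes = (rng.random((10, 100)) < 0.3).astype(int)  # 10 outputs, 100 timesteps
print(host_side_decision(spikes))   # same class, ~300 spikes cross the link
print(chip_side_decision(spikes))   # same class, 1 spike crosses the link
```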
In this paper, we describe the SNNs and then evaluate two applications that suffer from the scenarios above. Their performance improves drastically when the SNNs in this paper are applied. On FPGA, the speedups are substantial, in one case a factor of 23.
The paper is not an easy read, and when it was abbreviated for IJCNN and ICONS, it was even harder; that is why it was rejected from those conferences. As such, we have created an open-source repo that illustrates all of the networks and applications in the paper, including videos that explain each network and show how it works in the TENNLab open-source framework.
We will also create a video to go over the paper as a whole, and if you want your own personal explanation, please contact us.
The paper is here.
Plank | Rizzo | Gullett | Dent | Schuman