Reducing the Size of Spiking Convolutional Neural Networks by Trading Time for Space

James S. Plank, Jiajia Zhao and Brent Hurst

December, 2020

IEEE International Conference on Rebooting Computing (ICRC 2020)

https://icrc.ieee.org/past-editions/icrc-2020/program

Abstract

Spiking neural networks are attractive alternatives to conventional neural networks because of their ability to implement complex algorithms with low power and network complexity. On the flip side, they are difficult to train to solve specific problems. One approach to training is to train conventional neural networks with binary threshold activation functions, which may then be implemented with spikes. This is a powerful approach. However, when applied to neural networks with convolutional kernels, the spiking networks explode in size. In this work, we design multiple spiking computational modules, which reduce the size of the networks back to the size of the conventional networks. They do so by taking advantage of the temporal nature of spiking neural networks. We evaluate the size reduction analytically and on classification examples. Finally, we compare and confirm the classification accuracy of the modules' implementation on a discrete threshold neuroprocessor. There is a 12-minute video of this paper at: https://youtu.be/Q-7FJOS7dhI.
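The abstract's starting point, training a conventional network with binary threshold activations that can later be realized with spikes, is commonly made trainable by pairing a hard threshold in the forward pass with a straight-through gradient in the backward pass. The sketch below is a minimal PyTorch illustration of that general idea only; it is an assumption for exposition, not the implementation or the size-reduction modules described in the paper.

import torch

class BinaryThreshold(torch.autograd.Function):
    # Heaviside step in the forward pass; identity ("straight-through")
    # gradient in the backward pass so the thresholded network can still
    # be trained with ordinary backpropagation (assumed training trick,
    # not taken from the paper).
    @staticmethod
    def forward(ctx, x):
        return (x >= 0).to(x.dtype)

    @staticmethod
    def backward(ctx, grad_output):
        return grad_output  # straight-through estimator

def binary_threshold(x):
    # Hypothetical helper: outputs 1 where the input is non-negative and
    # 0 elsewhere -- an activation that maps directly onto spike/no-spike.
    return BinaryThreshold.apply(x)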

Citation Information

Text


author     J. S. Plank and J. Zhao and B. Hurst
title      Reducing the Size of Spiking Convolutional Neural Networks by Trading Time for Space
booktitle  IEEE International Conference on Rebooting Computing (ICRC)
month      December
year       2020
publisher  IEEE
pages      116-125

Bibtex


@INPROCEEDINGS{pzh:20:rss,
    author = "J. S. Plank and J. Zhao and B. Hurst",
    title = "Reducing the Size of Spiking Convolutional Neural Networks by Trading Time for Space",
    booktitle = "IEEE International Conference on Rebooting Computing (ICRC)",
    month = "December",
    year = "2020",
    pages = "115-126",
    publisher = "IEEE"
}