An Evolutionary Optimization Framework for Neural Networks and Neuromorphic Architectures
Catherine D. Schuman, James S. Plank, Adam Disney, and John Reynolds.
July, 2016
International Joint Conference on Neural Networks, part of the IEEE World Congress on Computational Intelligence, Vancouver, Canada.
Abstract
As new neural network and neuromorphic architectures are being developed, new training methods that operate within the constraints of the new architectures are required. Evolutionary optimization (EO) is a convenient training method for new architectures. In this work, we review a spiking neural network architecture and a neuromorphic architecture, and we describe an EO training framework for these architectures. We present the results of this training framework on four classification data sets and compare those results to other neural network and neuromorphic implementations. We also discuss how this EO framework may be extended to other architectures.
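For readers unfamiliar with the idea the abstract refers to, the sketch below illustrates evolutionary optimization of a network's parameters in the most generic sense: a population of candidate weight vectors is evaluated on a classification task and improved through selection, crossover, and mutation. This is only a minimal illustrative sketch, not the paper's framework; the toy XOR data set, network size, population size, selection scheme, and mutation parameters are all assumptions chosen for brevity, and the paper's framework targets spiking and neuromorphic architectures rather than a fixed-topology feedforward network.

import math
import random

random.seed(0)

# Toy two-input XOR data set, used here only to make the sketch runnable.
DATA = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

N_HIDDEN = 4
# Flat weight vector layout: input->hidden weights, hidden biases,
# hidden->output weights, output bias.
N_WEIGHTS = 2 * N_HIDDEN + N_HIDDEN + N_HIDDEN + 1

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def forward(weights, x):
    # Evaluate a 2/N_HIDDEN/1 feedforward network encoded as a flat vector.
    hidden = []
    for h in range(N_HIDDEN):
        s = weights[2 * h] * x[0] + weights[2 * h + 1] * x[1]
        s += weights[2 * N_HIDDEN + h]          # hidden bias
        hidden.append(sigmoid(s))
    base = 3 * N_HIDDEN
    out = sum(weights[base + h] * hidden[h] for h in range(N_HIDDEN))
    return sigmoid(out + weights[-1])           # output bias

def fitness(weights):
    # Fitness is classification accuracy on the toy data set.
    correct = sum(1 for x, y in DATA if round(forward(weights, x)) == y)
    return correct / len(DATA)

def mutate(weights, rate=0.2, scale=0.5):
    # Gaussian perturbation of a random subset of the weights.
    return [w + random.gauss(0, scale) if random.random() < rate else w
            for w in weights]

def crossover(a, b):
    # Uniform crossover of two parent weight vectors.
    return [a[i] if random.random() < 0.5 else b[i] for i in range(len(a))]

# Generational loop: truncation selection, elitism, crossover, mutation.
POP_SIZE = 50
population = [[random.uniform(-1, 1) for _ in range(N_WEIGHTS)]
              for _ in range(POP_SIZE)]
for generation in range(200):
    ranked = sorted(population, key=fitness, reverse=True)
    if fitness(ranked[0]) == 1.0:
        break
    parents = ranked[:10]                       # truncation selection
    population = ranked[:5]                     # elitism: keep the best few
    while len(population) < POP_SIZE:
        a, b = random.sample(parents, 2)
        population.append(mutate(crossover(a, b)))

best = max(population, key=fitness)
print("stopped at generation", generation, "with accuracy", fitness(best))

Because the only interface to the network is the fitness function, the same loop structure applies when the "genome" encodes the structure and parameters of a spiking or neuromorphic network instead of a fixed weight vector, which is what makes EO convenient for new architectures.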
Citation Information
Text
author    C. D. Schuman and J. S. Plank and A. Disney and J. Reynolds
title     An Evolutionary Optimization Framework for Neural Networks and Neuromorphic Architectures
booktitle International Joint Conference on Neural Networks
where     http://www.wcci2016.org/
year      2016
month     July
address   Vancouver
Bibtex
@INPROCEEDINGS{spdr:16:eo,
  author    = "C. D. Schuman and J. S. Plank and A. Disney and J. Reynolds",
  title     = "An Evolutionary Optimization Framework for Neural Networks and Neuromorphic Architectures",
  booktitle = "International Joint Conference on Neural Networks",
  where     = "http://www.wcci2016.org/",
  year      = "2016",
  month     = "July",
  address   = "Vancouver"
}