
Bayesian Multi-Objective Hyperparameter Optimization for Accurate, Fast, and Efficient Neural Network Accelerator Design

Maryam Parsa, John P. Mitchell, Catherine D. Schuman, Robert M. Patton, Thomas E. Potok, and Kaushik Roy.

July 2020

Frontiers in Neuroscience

https://doi.org/10.3389/fnins.2020.00667

The PDF is not yet available, or is available only from the journal publisher.

Abstract

In resource-constrained environments, such as low-power edge devices and smart sensors, deploying a fast, compact, and accurate intelligent system with minimum energy is indispensable. Embedding intelligence can be achieved using neural networks on neuromorphic hardware. Designing such networks requires determining several inherent hyperparameters. A key challenge is to find the optimum set of hyperparameters, which might belong to the input/output encoding modules, the neural network itself, the application, or the underlying hardware. In this work, we present a hierarchical pseudo agent-based multi-objective Bayesian hyperparameter optimization framework (spanning both software and hardware) that not only maximizes the performance of the network, but also minimizes the energy and area requirements of the corresponding neuromorphic hardware. We validate the performance of our approach (in terms of accuracy and computation speed) on several control and classification applications on digital and mixed-signal (memristor-based) neural accelerators. We show that the optimum set of hyperparameters might drastically improve the performance of one application (i.e., 52–71% for Pole-Balance), while having minimal effect on another (i.e., 50–53% for RoboNav). In addition, we demonstrate the resiliency of the different input/output encoding, neural network training, and underlying accelerator modules in a neuromorphic system to changes in the hyperparameters.
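The abstract describes trading off network performance against hardware energy and area, which is the classic multi-objective selection problem. As a minimal illustration (not the paper's hierarchical Bayesian framework), the sketch below computes the Pareto-optimal set of hypothetical hyperparameter evaluations, where accuracy is negated so all objectives are minimized; the objective values are made up for illustration.

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b.
    All objectives are to be minimized."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))


def pareto_front(points):
    """Return the non-dominated subset of objective vectors."""
    return [p for p in points if not any(dominates(q, p) for q in points if q != p)]


# Hypothetical evaluations: (negated accuracy, energy, area), all minimized.
evals = [
    (-0.90, 5.0, 3.0),  # accurate but costly in energy
    (-0.85, 2.0, 2.0),  # good balance
    (-0.70, 1.0, 1.0),  # cheap but less accurate
    (-0.80, 4.0, 4.0),  # dominated by the balanced point
]
front = pareto_front(evals)
```

A Bayesian optimizer such as the paper's framework would use a surrogate model to propose new hyperparameter sets, but the final trade-off among accuracy, energy, and area is still read off a front like this one.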

Citation Information

Text


author    M. Parsa and J. P. Mitchell and C. D. Schuman and R. M. Patton
          and T. E. Potok and K. Roy
title     Bayesian Multi-Objective Hyperparameter Optimization for Accurate, Fast,
          and Efficient Neural Network Accelerator Design
journal   Frontiers in Neuroscience
volume    14
year      2020
url       https://doi.org/10.3389/fnins.2020.00667
doi       10.3389/fnins.2020.00667
issn      1662-453X
pages     667

BibTeX


@ARTICLE{pms:20:bmo,
    author = "M. Parsa and J. P. Mitchell and C. D. Schuman and R. M. Patton
               and T. E. Potok and K. Roy",
    title = "Bayesian Multi-Objective Hyperparameter Optimization for Accurate, Fast,
               and Efficient Neural Network Accelerator Design",
    journal = "Frontiers in Neuroscience",
    volume = "14",
    year = "2020",
    url = "https://doi.org/10.3389/fnins.2020.00667",
    doi = "10.3389/fnins.2020.00667",
    issn = "1662-453X",
    pages = "667"
}