arXiv:2008.08476 [cs.LG]

NASCaps: A Framework for Neural Architecture Search to Optimize the Accuracy and Hardware Efficiency of Convolutional Capsule Networks

Alberto Marchisio, Andrea Massa, Vojtech Mrazek, Beatrice Bussolino, Maurizio Martina, Muhammad Shafique

Published 2020-08-19 (Version 1)

Deep Neural Networks (DNNs) have achieved the accuracy required for a wide variety of Machine Learning (ML) applications. Recently, the Google Brain team demonstrated the ability of Capsule Networks (CapsNets) to encode and learn spatial correlations between different input features, thereby obtaining superior learning capabilities compared to traditional (i.e., non-capsule-based) DNNs. However, designing CapsNets with conventional methods is tedious and incurs significant training effort. Recent studies have shown that Neural Architecture Search (NAS) algorithms are a powerful method for automatically selecting the best/optimal DNN model configuration for a given set of applications and a training dataset. Moreover, due to their extreme computational and memory requirements, DNNs are deployed on specialized hardware accelerators in IoT-Edge/CPS devices. In this paper, we propose NASCaps, an automated framework for the hardware-aware NAS of different types of DNNs, covering both traditional convolutional DNNs and CapsNets. We study the efficacy of deploying a multi-objective Genetic Algorithm (e.g., based on the NSGA-II algorithm). The proposed framework jointly optimizes the network accuracy and the corresponding hardware efficiency, expressed in terms of the energy, memory, and latency of a given hardware accelerator executing the DNN inference. Besides supporting the traditional DNN layers, our framework is the first to model and support the specialized capsule layers and dynamic routing in the NAS flow. We evaluate our framework on different datasets, generating different network configurations, and demonstrate the tradeoffs between the different output metrics. We will open-source the complete framework and the configurations of the Pareto-optimal architectures at https://github.com/ehw-fit/nascaps.
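The multi-objective selection at the core of such a search can be illustrated with Pareto dominance, the criterion NSGA-II uses to rank candidates. The following is a minimal sketch, not the NASCaps implementation; the `Candidate` class and its objective fields are hypothetical stand-ins for an architecture's measured accuracy and hardware costs.

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    """Hypothetical candidate architecture with its evaluated objectives."""
    genome: tuple          # encoded layer configuration (placeholder)
    accuracy: float        # maximized
    energy: float          # minimized (accelerator energy per inference)
    memory: float          # minimized
    latency: float         # minimized

def dominates(a: Candidate, b: Candidate) -> bool:
    """Pareto dominance: a is no worse than b in every objective and
    strictly better in at least one (accuracy up, hardware costs down)."""
    no_worse = (a.accuracy >= b.accuracy and a.energy <= b.energy
                and a.memory <= b.memory and a.latency <= b.latency)
    strictly_better = (a.accuracy > b.accuracy or a.energy < b.energy
                       or a.memory < b.memory or a.latency < b.latency)
    return no_worse and strictly_better

def pareto_front(population: list) -> list:
    """Rank-0 front of NSGA-II's non-dominated sorting: the candidates
    that no other candidate in the population dominates."""
    return [p for p in population
            if not any(dominates(q, p) for q in population if q is not p)]
```

In full NSGA-II, the remaining candidates are sorted into successive fronts and ties within a front are broken by crowding distance; the sketch only shows the dominance test that yields the reported Pareto-optimal architectures.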
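The dynamic routing that the framework models in capsule layers is the routing-by-agreement procedure of Sabour et al. (2017). Below is a minimal NumPy sketch under assumed shapes, not the NASCaps code: `u_hat` holds the prediction vectors from input to output capsules, and the iteration count is a hyperparameter the search could expose.

```python
import numpy as np

def squash(s, axis=-1, eps=1e-8):
    """Capsule non-linearity: preserves orientation, maps the norm into [0, 1)."""
    norm2 = np.sum(s * s, axis=axis, keepdims=True)
    return (norm2 / (1.0 + norm2)) * s / np.sqrt(norm2 + eps)

def dynamic_routing(u_hat, num_iterations=3):
    """Routing-by-agreement over prediction vectors.
    u_hat: array of shape (num_in_capsules, num_out_capsules, out_dim)."""
    num_in, num_out, _ = u_hat.shape
    b = np.zeros((num_in, num_out))                   # routing logits
    for _ in range(num_iterations):
        e = np.exp(b - b.max(axis=1, keepdims=True))  # numerically stable softmax
        c = e / e.sum(axis=1, keepdims=True)          # coupling coefficients
        s = (c[..., None] * u_hat).sum(axis=0)        # weighted sum per output capsule
        v = squash(s)                                 # squashed output vectors
        b += (u_hat * v[None, :, :]).sum(axis=-1)     # agreement updates the logits
    return v
```

The repeated iterations over all capsule pairs are what make these layers costly in energy, memory, and latency on an accelerator, which is why the search optimizes accuracy and hardware efficiency jointly.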

Comments: To appear at the IEEE/ACM International Conference on Computer-Aided Design (ICCAD '20), November 2-5, 2020, Virtual Event, USA
Categories: cs.LG, stat.ML
Related articles:
arXiv:1909.02453 [cs.LG] (Published 2019-09-05)
Best Practices for Scientific Research on Neural Architecture Search
arXiv:2006.02903 [cs.LG] (Published 2020-06-01)
A Comprehensive Survey of Neural Architecture Search: Challenges and Solutions
arXiv:1902.08142 [cs.LG] (Published 2019-02-21)
Evaluating the Search Phase of Neural Architecture Search