Network with Sub-networks: Layer-wise Detachable Neural Network

Authors
Ninnart Fuengfusin*, Hakaru Tamukoh
Graduate School of Life Science and Systems Engineering, Kyushu Institute of Technology, 2-4 Hibikino, Wakamatsu-ku, Kitakyushu, Fukuoka 808-0196, Japan
*Corresponding author. Email: [email protected]
Received 11 November 2019, Accepted 19 June 2020, Available Online 21 December 2020.
DOI
https://doi.org/10.2991/jrnal.k.201215.006
Keywords
Model compression; neural networks; multilayer perceptron; supervised learning
Abstract
In this paper, we introduce a network with sub-networks: a neural network whose layers can be detached into sub-neural networks during the inference phase. To obtain trainable parameters that can be shared between the base-model and the sub-models, the parameters of the sub-models are first duplicated in the base-model. Each model is forward-propagated separately, and all models are grouped into pairs. Gradients from selected pairs of networks are averaged and used to update both networks in the pair. On the Modified National Institute of Standards and Technology (MNIST) and Fashion-MNIST datasets, our base-model achieves test accuracy identical to that of regularly trained models, while the sub-models reach lower test accuracy. Nevertheless, the sub-models serve as alternatives that require fewer parameters than regular models.
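The parameter-sharing and gradient-averaging idea sketched in the abstract can be illustrated with a toy example. The following is a minimal numpy sketch, not the authors' exact architecture or training procedure: it assumes a two-layer base-model whose hidden layer is shared with a hypothetical one-head sub-model, and averages the two models' gradients with respect to the shared layer before updating it, echoing the pair-wise averaging described above. The dimensions, learning rate, and regression targets are illustrative.

```python
import numpy as np

# Toy sketch (NOT the paper's exact setup): a base-model h -> W2 and a
# hypothetical sub-model head Ws both consume the shared hidden layer W1.
rng = np.random.default_rng(0)
B, d_in, d_h, d_out = 32, 20, 16, 5
x = rng.normal(size=(B, d_in))
t = rng.normal(size=(B, d_out))                 # toy regression targets

W1 = rng.normal(scale=0.1, size=(d_in, d_h))    # shared hidden layer
W2 = rng.normal(scale=0.1, size=(d_h, d_out))   # base-model head
Ws = rng.normal(scale=0.1, size=(d_h, d_out))   # sub-model head (hypothetical)

def step(W1, W2, Ws, lr=0.05):
    h_pre = x @ W1
    h = np.maximum(h_pre, 0.0)                  # ReLU
    y_base, y_sub = h @ W2, h @ Ws
    # MSE gradients for each model on the same batch
    dy_base = 2.0 * (y_base - t) / t.size
    dy_sub = 2.0 * (y_sub - t) / t.size
    gW2 = h.T @ dy_base                         # each head keeps its own gradient
    gWs = h.T @ dy_sub
    # Average the two models' gradients w.r.t. the shared layer
    dh = 0.5 * (dy_base @ W2.T + dy_sub @ Ws.T)
    gW1 = x.T @ (dh * (h_pre > 0))
    return W1 - lr * gW1, W2 - lr * gW2, Ws - lr * gWs

def joint_loss(W1, W2, Ws):
    h = np.maximum(x @ W1, 0.0)
    return float(np.mean((h @ W2 - t) ** 2) + np.mean((h @ Ws - t) ** 2))

start = joint_loss(W1, W2, Ws)
for _ in range(200):
    W1, W2, Ws = step(W1, W2, Ws)
```

After training, the sub-model head `Ws` can be used on its own with the shared layer `W1`, which is the sense in which a smaller network can be "detached" for inference.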
Copyright
© 2020 The Authors. Published by ALife Robotics Corp. Ltd.
Open Access
This is an open access article distributed under the CC BY-NC 4.0 license (http://creativecommons.org/licenses/by-nc/4.0/).