Adjoined Networks: A Training Paradigm With Applications to Network Compression

Utkarsh Nath, Shrinu Kushagra, Yingzhen Yang

Research output: Contribution to journal › Conference article › peer-review

Abstract

Compressing deep neural networks while maintaining accuracy is important when we want to deploy large, powerful models in production and/or on edge devices. One common technique used to achieve this goal is knowledge distillation: typically, the output of a static, pre-defined teacher (a large base network) is used as soft labels to train and transfer information to a student (smaller) network. In this paper, we introduce Adjoined Networks (AN), a learning paradigm that trains both the original base network and the smaller compressed network together. In our training approach, the parameters of the smaller network are shared across both the base and the compressed networks. Using our training paradigm, we can simultaneously compress (via the student network) and regularize (via the teacher network) any architecture. We focus on popular CNN-based architectures used for computer vision tasks and conduct an extensive experimental evaluation of our training paradigm on several large-scale datasets. Using ResNet-50 as the base network, AN achieves 71.8% top-1 accuracy with only 1.8M parameters and 1.6 GFLOPs on the ImageNet dataset. We further propose Differentiable Adjoined Networks (DAN), a training paradigm that augments AN by using neural architecture search to jointly learn both the width and the weights of each layer of the smaller network. DAN achieves ResNet-50-level accuracy on ImageNet with 3.8× fewer parameters and 2.2× fewer FLOPs.
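
The following is a minimal PyTorch sketch of the weight-sharing idea described in the abstract, not the paper's actual implementation: the compressed network's weights are a slice of the base network's weight tensor, and one combined loss trains both at once. The class name `AdjoinedConv`, the loss function `adjoined_loss`, and the hyperparameters `T` and `alpha` are all illustrative assumptions, as is the particular distillation term.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class AdjoinedConv(nn.Module):
    """One convolution whose first `small_out` output filters (and first
    `small_in` input channels) are shared with the compressed network.
    The base forward pass uses the full weight tensor; the small forward
    pass uses only the shared slice, so the student adds no parameters."""

    def __init__(self, in_ch, out_ch, small_in, small_out, k=3):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(out_ch, in_ch, k, k) * 0.01)
        self.small_in, self.small_out = small_in, small_out

    def forward(self, x, small=False):
        if small:
            # Student path: use only the shared slice of the weights.
            w = self.weight[: self.small_out, : self.small_in]
            return F.conv2d(x, w, padding=1)
        # Teacher path: use the full weight tensor.
        return F.conv2d(x, self.weight, padding=1)


def adjoined_loss(base_logits, small_logits, labels, T=4.0, alpha=0.5):
    """Joint training objective (assumed form): cross-entropy on both
    outputs plus a distillation term pulling the small network toward
    the base network's (detached) soft labels."""
    ce = F.cross_entropy(base_logits, labels) + F.cross_entropy(small_logits, labels)
    kd = F.kl_div(
        F.log_softmax(small_logits / T, dim=1),
        F.softmax(base_logits.detach() / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    return ce + alpha * kd
```

Because the student's weights are literally a slice of the teacher's, a single backward pass through `adjoined_loss` updates both networks together; after training, the shared slice can be extracted as a standalone compressed model.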

Original language: English (US)
Journal: CEUR Workshop Proceedings
Volume: 3121
State: Published - 2022
Event: AAAI 2022 Spring Symposium on Machine Learning and Knowledge Engineering for Hybrid Intelligence, AAAI-MAKE 2022 - Palo Alto, United States
Duration: Mar 21, 2022 - Mar 23, 2022

Keywords

  • Differentiable Adjoined Networks
  • Knowledge Distillation
  • Neural Architecture Search

ASJC Scopus subject areas

  • General Computer Science
