Putting An End to End-to-End: Gradient-Isolated Learning of Representations

Open Access
Authors
  • Sindy Löwe
  • Peter O'Connor
  • Bastiaan Veeling
Publication date 2020
Host editors
  • H. Wallach
  • H. Larochelle
  • A. Beygelzimer
  • F. d'Alché-Buc
  • E. Fox
  • R. Garnett
Book title 33rd Conference on Neural Information Processing Systems (NeurIPS 2019)
Book subtitle Vancouver, Canada, 8-14 December 2019
ISBN
  • 9781713807933
Series Advances in Neural Information Processing Systems
Event 33rd Annual Conference on Neural Information Processing Systems, NeurIPS 2019
Volume | Issue number 4
Pages (from-to) 3016-3028
Publisher San Diego, CA: Neural Information Processing Systems Foundation
Organisations
  • Faculty of Science (FNWI) - Informatics Institute (IVI)
Abstract
We propose a novel deep learning method for local self-supervised representation learning that requires neither labels nor end-to-end backpropagation, but instead exploits the natural order in data. Inspired by the observation that biological neural networks appear to learn without backpropagating a global error signal, we split a deep neural network into a stack of gradient-isolated modules. Each module is trained to maximally preserve the information of its inputs using the InfoNCE bound from Oord et al. [2018]. Despite this greedy training, we demonstrate that each module improves upon the output of its predecessor, and that the representations created by the top module yield highly competitive results on downstream classification tasks in the audio and visual domains. The proposal enables optimizing modules asynchronously, allowing large-scale distributed training of very deep neural networks on unlabelled datasets.
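A minimal sketch (PyTorch, hypothetical names throughout) of the training scheme the abstract describes: the network is split into a stack of modules, each optimised greedily with its own InfoNCE-style contrastive loss that predicts temporally nearby codes, and activations are detached between modules so that no gradient crosses a module boundary. The linear encoders and the InfoNCEModule/predictor names below are illustrative assumptions, not the authors' implementation, which uses convolutional encoders on audio and image data.

import torch
import torch.nn as nn
import torch.nn.functional as F

class InfoNCEModule(nn.Module):
    # One gradient-isolated module: a small encoder plus a local InfoNCE loss
    # that tells a code's true k-step future apart from other codes in the batch.
    def __init__(self, in_dim, out_dim, k=1):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, out_dim), nn.ReLU())
        self.predictor = nn.Linear(out_dim, out_dim)  # predicts the k-step-ahead code
        self.k = k

    def forward(self, x):
        # x: (batch, time, in_dim) -- sequential data; its natural order
        # supplies the positive pairs for the contrastive loss
        return self.encoder(x)

    def infonce_loss(self, z):
        pred = self.predictor(z[:, :-self.k])  # (B, T-k, D) predicted futures
        target = z[:, self.k:]                 # (B, T-k, D) true futures
        B, T, D = pred.shape
        pred, target = pred.reshape(B * T, D), target.reshape(B * T, D)
        logits = pred @ target.t()             # positives lie on the diagonal
        labels = torch.arange(B * T, device=z.device)
        return F.cross_entropy(logits, labels)

# Each module gets its own optimiser, so modules could in principle be
# optimised asynchronously, as the abstract points out.
modules = [InfoNCEModule(32, 64), InfoNCEModule(64, 64), InfoNCEModule(64, 64)]
optims = [torch.optim.Adam(m.parameters(), lr=1e-3) for m in modules]

x = torch.randn(8, 20, 32)  # toy batch: 8 sequences of 20 steps, 32 features
for module, opt in zip(modules, optims):
    z = module(x)
    loss = module.infonce_loss(z)
    opt.zero_grad()
    loss.backward()
    opt.step()
    x = z.detach()  # gradient isolation: the next module sees values only

A linear classifier trained on the detached outputs of the final module would then correspond to the downstream classification evaluation the abstract refers to.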
Document type Conference contribution
Note Running title: 33rd Conference on Neural Information Processing Systems (NeurIPS 2019). - With supplemental file.
Language English
Published at https://papers.nips.cc/paper/2019/hash/851300ee84c2b80ed40f51ed26d866fc-Abstract.html
Other links http://www.proceedings.com/53719.html