MetaModulation: Learning Variational Feature Hierarchies for Few-Shot Learning with Fewer Tasks

Open Access
Authors
Publication date 2023
Journal Proceedings of Machine Learning Research
Event 40th International Conference on Machine Learning, ICML 2023
Volume 202
Pages (from-to) 32847-32858
Number of pages 12
Organisations
  • Faculty of Science (FNWI) - Informatics Institute (IVI)
Abstract
Meta-learning algorithms can learn a new task from previously acquired knowledge, but they typically require a large number of meta-training tasks, which may not be readily available. To address this issue, we propose MetaModulation, a method for few-shot learning with fewer tasks. The key idea is to use a neural network to increase the density of the meta-training tasks by modulating batch normalization parameters during meta-training. We modulate parameters at various levels of the network, rather than at a single layer, to increase task diversity. To account for the uncertainty caused by the limited number of training tasks, we further propose a variational MetaModulation in which the modulation parameters are treated as latent variables. Building on this, we introduce variational feature hierarchies, which modulate features at all layers, account for task uncertainty, and generate more diverse tasks. Ablation studies illustrate the advantage of learnable task modulation at different levels and the benefit of probabilistic variants in few-task meta-learning. MetaModulation and its variational variants consistently outperform state-of-the-art alternatives on four few-task meta-learning benchmarks.
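To make the core idea concrete, here is a minimal sketch (not the authors' code; all function names and parameter values are illustrative assumptions) of variational task modulation: a mixing weight is sampled from a learned Gaussian via the reparameterization trick and used to interpolate hidden features of two meta-training tasks, which is how such modulation can densify the task distribution while capturing task uncertainty.

```python
import math
import random

def sample_modulation(mu, log_var):
    # Reparameterization trick: lam = mu + sigma * eps, eps ~ N(0, 1).
    # mu and log_var would be learned per layer; fixed here for illustration.
    eps = random.gauss(0.0, 1.0)
    lam = mu + math.exp(0.5 * log_var) * eps
    # Sigmoid squashes the sample so the mixing weight stays in (0, 1).
    return 1.0 / (1.0 + math.exp(-lam))

def modulate(feat_a, feat_b, mu=0.0, log_var=-2.0):
    # Interpolate element-wise between the hidden features of two tasks,
    # producing a new "task" between them.
    lam = sample_modulation(mu, log_var)
    return [lam * a + (1.0 - lam) * b for a, b in zip(feat_a, feat_b)]

random.seed(0)
mixed = modulate([1.0, 2.0, 3.0], [3.0, 2.0, 1.0])
```

In the paper this modulation is applied at every layer of the hierarchy (and to batch normalization parameters) rather than to a single feature vector, but the sampling-and-interpolation step above is the essential mechanism.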
Document type Article
Note Proceedings of the 40th International Conference on Machine Learning, 23-29 July 2023, Honolulu, Hawaii, USA
Language English
Published at https://proceedings.mlr.press/v202/sun23b.html