How to Train Neural Field Representations: A Comprehensive Study and Benchmark

Open Access
Publication date 2024
Book title 2024 IEEE/CVF Conference on Computer Vision and Pattern Recognition
Book subtitle CVPR 2024: Seattle, Washington, USA, 16-22 June 2024: proceedings
ISBN
  • 9798350353013
ISBN (electronic)
  • 9798350353006
Event 2024 IEEE/CVF Conference on Computer Vision and Pattern Recognition
Pages (from-to) 22616-22625
Publisher Los Alamitos, California: IEEE Computer Society
Organisations
  • Faculty of Science (FNWI) - Informatics Institute (IVI)
Abstract
Neural fields (NeFs) have recently emerged as a versatile method for modeling signals of various modalities, including images, shapes, and scenes. Subsequently, a number of works have explored the use of NeFs as representations for downstream tasks, e.g. classifying an image based on the parameters of a NeF that has been fit to it. However, the impact of the NeF hyperparameters on their quality as downstream representations is scarcely understood and remains largely unexplored. This is in part caused by the large amount of time required to fit datasets of neural fields. In this work, we propose a JAX-based library that leverages parallelization to enable fast optimization of large-scale NeF datasets, resulting in a significant speed-up. With this library, we perform a comprehensive study that investigates the effects of different hyperparameters on fitting NeFs for downstream tasks. In particular, we explore the use of a shared initialization, the effects of overtraining, and the expressiveness of the network architectures used. Our study provides valuable insights into how to train NeFs and offers guidance for optimizing their effectiveness in downstream applications. Finally, based on the proposed library and our analysis, we propose Neural Field Arena, a benchmark consisting of neural field variants of popular vision datasets, including MNIST, CIFAR, variants of ImageNet, and ShapeNetv2. Our library and the Neural Field Arena will be open-sourced to introduce standardized benchmarking and promote further research on neural fields.
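The parallelization strategy described in the abstract can be illustrated with a minimal sketch. This is not the authors' library; it is a hypothetical example, assuming a simple two-layer coordinate MLP per image, that shows how `jax.vmap` batches the fitting of many independent neural fields at once by mapping a single gradient step over a stack of parameter sets (one per image) while sharing the coordinate grid:

```python
import jax
import jax.numpy as jnp

def init_params(key, hidden=32):
    # One small MLP per signal: (x, y) coordinates -> pixel value.
    k1, k2 = jax.random.split(key)
    return {
        "w1": jax.random.normal(k1, (2, hidden)) * 0.5,
        "b1": jnp.zeros(hidden),
        "w2": jax.random.normal(k2, (hidden, 1)) * 0.5,
        "b2": jnp.zeros(1),
    }

def field(params, coords):
    # Sine activation, in the spirit of SIREN-style neural fields.
    h = jnp.sin(coords @ params["w1"] + params["b1"])
    return h @ params["w2"] + params["b2"]

def loss(params, coords, pixels):
    return jnp.mean((field(params, coords) - pixels) ** 2)

def step(params, coords, pixels, lr=1e-2):
    # Plain gradient descent on one field's reconstruction loss.
    g = jax.grad(loss)(params, coords, pixels)
    return jax.tree_util.tree_map(lambda p, gp: p - lr * gp, params, g)

# The key idea: vmap the per-field update over the batch axis of the
# parameters and targets, sharing the same coordinate grid (in_axes=None).
batched_step = jax.jit(jax.vmap(step, in_axes=(0, None, 0)))
batched_loss = jax.vmap(loss, in_axes=(0, None, 0))

key = jax.random.PRNGKey(0)
n_images, side = 8, 8
xs = jnp.linspace(-1.0, 1.0, side)
coords = jnp.stack(jnp.meshgrid(xs, xs, indexing="ij"), -1).reshape(-1, 2)
pixels = jax.random.uniform(key, (n_images, side * side, 1))

# One parameter set per image; all fitted simultaneously on one device.
params = jax.vmap(init_params)(jax.random.split(key, n_images))
init_mse = batched_loss(params, coords, pixels).mean()
for _ in range(200):
    params = batched_step(params, coords, pixels)
final_mse = batched_loss(params, coords, pixels).mean()
```

Because the fields are independent, the vmapped step fuses the whole batch into one XLA program, which is the source of the speed-up over fitting each field in a separate Python loop. A shared initialization, as studied in the paper, would replace the per-image `init_params` keys with a single broadcast parameter set.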
Document type Conference contribution
Note With supplemental materials
Language English
Published at https://doi.org/10.48550/arXiv.2312.10531 https://doi.org/10.1109/CVPR52733.2024.02134
Published at https://openaccess.thecvf.com/content/CVPR2024/html/Papa_How_to_Train_Neural_Field_Representations_A_Comprehensive_Study_and_CVPR_2024_paper.html
Other links https://www.proceedings.com/76082.html