The Drunkard’s Odometry: Estimating Camera Motion in Deforming Scenes

Open Access
Authors
  • David Recasens
  • M.R. Oswald
  • Marc Pollefeys
  • Javier Civera
Publication date 2023
Host editors
  • A. Oh
  • T. Naumann
  • A. Globerson
  • K. Saenko
  • M. Hardt
  • S. Levine
Book title 37th Conference on Neural Information Processing Systems (NeurIPS 2023)
Book subtitle 10-16 December 2023, New Orleans, Louisiana, USA
ISBN (electronic)
  • 9781713899921
Series Advances in Neural Information Processing Systems
Event 37th Conference on Neural Information Processing Systems (NeurIPS 2023)
Number of pages 13
Publisher Neural Information Processing Systems Foundation
Organisations
  • Faculty of Science (FNWI) - Informatics Institute (IVI)
Abstract
Estimating camera motion in deformable scenes poses a complex and open research challenge. Most existing non-rigid structure-from-motion techniques assume that, besides the deforming parts, static scene parts are also observed in order to establish an anchoring reference. However, this assumption does not hold in certain relevant applications such as endoscopy. Deformable odometry and SLAM pipelines, which tackle the most challenging scenario of exploratory trajectories, suffer from a lack of robustness and of proper quantitative evaluation methodologies. To address this with a common benchmark, we introduce the Drunkard's Dataset, a challenging collection of synthetic data targeting visual navigation and reconstruction in deformable environments. This dataset is the first large set of exploratory camera trajectories with ground truth inside 3D scenes in which every surface exhibits non-rigid deformation over time. Simulations in realistic 3D buildings let us obtain a vast amount of data and ground-truth labels, including camera poses, RGB images, depth, optical flow and normal maps at high resolution and quality. We further present a novel deformable odometry method, dubbed the Drunkard's Odometry, which decomposes optical flow estimates into rigid-body camera motion and non-rigid scene deformations. To validate our data, our work contains an evaluation of several baselines as well as a novel tracking error metric which does not require ground-truth data. Dataset and code: https://davidrecasens.github.io/TheDrunkard'sOdometry/
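The decomposition mentioned in the abstract can be illustrated with a minimal NumPy sketch: given per-pixel depth, camera intrinsics, and a candidate rigid camera motion, one can compute the flow that the rigid motion alone would induce, and attribute the residual of the observed optical flow to scene deformation. This is only a conceptual sketch under a standard pinhole-camera model; the function and variable names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def rigid_flow(depth, K, R, t):
    """Optical flow induced purely by a rigid camera motion (R, t),
    given per-pixel depth and intrinsics K (pinhole model).
    Minimal sketch: no handling of occlusions or invalid depth."""
    H, W = depth.shape
    # Pixel grid in homogeneous coordinates, shape 3 x (H*W).
    u, v = np.meshgrid(np.arange(W), np.arange(H))
    pix = np.stack([u, v, np.ones_like(u)], axis=-1).reshape(-1, 3).T
    # Back-project to 3D, apply the rigid motion, re-project.
    pts = np.linalg.inv(K) @ pix * depth.reshape(1, -1)
    pts_moved = R @ pts + t.reshape(3, 1)
    proj = K @ pts_moved
    proj = proj[:2] / proj[2:3]
    # Rigid flow = displacement of each pixel under (R, t).
    return (proj - pix[:2]).T.reshape(H, W, 2)

# The non-rigid deformation component is then the residual:
#   deformation_flow = total_flow - rigid_flow(depth, K, R, t)
```

With an identity rotation and zero translation the rigid flow is zero everywhere, so the full observed flow would be attributed to deformation; in the paper's method the two components are estimated jointly rather than by simple subtraction.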
Document type Conference contribution
Note With supplementary ZIP-file
Language English
Published at https://doi.org/10.48550/arXiv.2306.16917
Published at https://papers.nips.cc/paper_files/paper/2023/hash/98c9b79e9c686aadd4d81e34a7773dd1-Abstract-Datasets_and_Benchmarks.html
Other links https://doi.org/10.52202/075280
Downloads
2306.16917v1 (Submitted manuscript)
Supplementary materials