Distortion-aware Depth Estimation with Gradient Priors from Panoramas of Indoor Scenes

Authors
Publication date 2022
Book title Proceedings : 2022 International Conference on 3D Vision
Book subtitle 3DV 2022 : Prague, Czechia, 12-15 September 2022
ISBN
  • 9781665456715
ISBN (electronic)
  • 9781665456708
Event 2022 International Conference on 3D Vision
Pages (from-to) 134-143
Publisher Los Alamitos, CA: IEEE Computer Society, Conference Publishing Services
Organisations
  • Faculty of Science (FNWI) - Informatics Institute (IVI)
Abstract
Compared to 2D perspective images, panoramic images capture a larger field-of-view (FOV). Therefore, depth estimation from panoramas is an important task for 3D scene understanding and has made significant progress with the development of CNNs. However, existing CNN-based methods still struggle with the distortions introduced by the Equirectangular Projection (ERP), such as fixed receptive fields that cover the same image area near the equator and at the two poles, and have difficulty generating accurate depth boundaries.
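The ERP distortion mentioned above can be illustrated with a short sketch (assumed, not from the paper): every pixel row in an equirectangular panorama spans the full image width, but the solid angle a row covers on the sphere shrinks with the cosine of its latitude, so a fixed square receptive field covers very different spherical regions at the equator versus at the poles.

```python
import numpy as np

# Illustrative sketch of ERP distortion (toy values, not the paper's code):
# the relative horizontal extent on the sphere covered by each pixel row
# is proportional to cos(latitude of the row centre).

H = 8  # panorama height in pixels (toy value)
# Latitude of each row centre, from near +pi/2 (north pole) down to -pi/2.
lat = (0.5 - (np.arange(H) + 0.5) / H) * np.pi
area_weight = np.cos(lat)  # relative spherical extent per row

print(np.round(area_weight, 3))
# Rows near the middle (equator) have weight close to 1; rows near the
# top and bottom (poles) have much smaller weight, so a fixed receptive
# field is effectively stretched there.
```

This is why distortion-aware designs enlarge or adapt receptive fields by image latitude rather than using one fixed window everywhere.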
In contrast to existing CNN-based methods, in this paper, a novel Transformer-based method is proposed which is able to cope with panoramic distortions and to generate accurate depth boundaries. A Distortion-aware Transformer is designed using a yaw-invariant cycle shift and a distortion-guided partitioning. The aim is to alleviate the distortion effect by enlarging the receptive fields in both horizontal and vertical directions. Then, a Gradient Transformer is proposed to enhance the features around the boundaries. Gradient information is adopted as a boundary prior.
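The yaw-invariant cycle shift can be sketched as follows (an assumed, minimal illustration, not the authors' implementation): an equirectangular panorama wraps around horizontally, so cyclically shifting features along the width axis corresponds to rotating the camera's yaw and changes no scene content, while letting window-based attention see across the left/right image seam.

```python
import numpy as np

def yaw_cycle_shift(feat, shift):
    """Cyclically shift an (H, W, C) feature map along the width (yaw) axis.

    Because an ERP panorama is periodic in the horizontal direction, this
    shift is equivalent to a camera yaw rotation and loses no information.
    """
    return np.roll(feat, shift, axis=1)

# Toy (H=2, W=6, C=1) feature map.
feat = np.arange(2 * 6 * 1).reshape(2, 6, 1)
shifted = yaw_cycle_shift(feat, 2)
# The shift is invertible: rolling back by the same amount restores the input.
restored = yaw_cycle_shift(shifted, -2)
assert np.array_equal(restored, feat)
```

In a Swin-style design, attention windows would be partitioned on the shifted features and the shift undone afterwards, so windows straddling the panorama seam still attend to contiguous scene content.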
Large-scale experiments show improvements over state-of-the-art methods, and the method demonstrates strong generalization capabilities. Finally, it is extended to panoramic semantic segmentation.
Document type Conference contribution
Language English
Published at https://doi.org/10.1109/3DV57658.2022.00026
Other links https://www.proceedings.com/68009.html