
Given an (a) RGB image and an existing depth estimation/completion model, our PnP module propagates information from (b) sparse points and generates an (c) improved depth prediction without any re-training. The (d) improvement is denoted by the red regions.

Abstract

We propose a novel plug-and-play (PnP) module that improves depth prediction by taking arbitrary patterns of sparse depths as input. Given any pre-trained depth prediction model, our PnP module updates the intermediate feature map such that the model outputs new depths consistent with the given sparse depths. Our method requires no additional training and can be applied to practical scenarios such as leveraging both RGB images and sparse LiDAR points to robustly estimate a dense depth map. Our approach achieves consistent improvements over various state-of-the-art methods on indoor (i.e., NYU-v2) and outdoor (i.e., KITTI) datasets. We also synthesize various types of LiDARs in our experiments to verify the general applicability of our PnP module in practice.

Video Overview

Method

Illustration of the PnP module via backward-forward propagation. Given a depth estimation model f and sparse ground-truth depths Ds, our PnP module iteratively updates the intermediate representation z using the gradient computed from the sparse points, then re-runs inference to obtain a better depth prediction. Here we illustrate an example with two iterations.
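The backward-forward update above can be sketched in a few lines. The snippet below is a minimal, self-contained illustration (not the paper's released code): it assumes the model splits as f = g(h(x)), takes the intermediate representation z = h(x), and performs gradient descent on z against an L2 loss over the observed sparse points before decoding again. For simplicity, the "decoder" g is a toy linear map with a hand-derived gradient; in a real network, z would be an intermediate feature map and the gradient would come from autograd. The names `pnp_refine`, `grad_g`, the learning rate, and the iteration count are all illustrative assumptions.

```python
import numpy as np

def pnp_refine(g, grad_g, z, sparse_depth, mask, lr=0.01, iters=100):
    """Iteratively update the intermediate representation z so the decoded
    depth g(z) agrees with the sparse ground-truth points (mask == 1).
    grad_g(z, residual) returns dL/dz for L = 0.5 * ||mask * (g(z) - Ds)||^2.
    """
    for _ in range(iters):
        residual = mask * (g(z) - sparse_depth)  # error only at sparse points
        z = z - lr * grad_g(z, residual)         # backward: update z
    return g(z)                                  # forward: re-decode depth

# Toy setup: linear "decoder" g(z) = W @ z standing in for the network tail.
rng = np.random.default_rng(0)
W = rng.normal(size=(8, 4))
z0 = rng.normal(size=4)             # initial intermediate representation
Ds = rng.normal(size=8)             # stand-in sparse ground-truth depths
mask = np.zeros(8)
mask[:3] = 1.0                      # only 3 of 8 depth values are observed

g = lambda z: W @ z
grad_g = lambda z, r: W.T @ r       # chain rule for the linear decoder

before = np.sum((mask * (g(z0) - Ds)) ** 2)
after = np.sum((mask * (pnp_refine(g, grad_g, z0, Ds, mask) - Ds)) ** 2)
```

Note that only z is updated; the model weights stay frozen, which is why no re-training is needed and the module can wrap any pre-trained depth network.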

Results

Qualitative results on NYU and KITTI. Rows (a)-(d) show the RGB image, the sparse depth, the final prediction with our PnP module, and the corresponding improvement map. In (d), red regions indicate improvement and green regions indicate degradation.

Citation

Plug-and-Play: Improve Depth Estimation via Sparse Data Propagation

Tsun-Hsuan Wang, Fu-En Wang, Juan-Ting Lin, Yi-Hsuan Tsai, Wei-Chen Chiu, Min Sun
Paper (arXiv)  Source Code
@article{wang2018pnp,
  title={Plug-and-Play: Improve Depth Estimation via Sparse Data Propagation},
  author={Wang, Tsun-Hsuan and Wang, Fu-En and Lin, Juan-Ting and Tsai, Yi-Hsuan and Chiu, Wei-Chen and Sun, Min},
  journal={arXiv preprint arXiv:1812.08350},
  year={2018}
}

Acknowledgement