NeRSP: Neural 3D Reconstruction for Reflective Objects with Sparse Polarized Images

CVPR 2024

1 Beijing University of Posts and Telecommunications
2 Graduate School of Information Science and Technology, Osaka University
3 National Key Laboratory for Multimedia Information Processing, School of Computer Science, Peking University
4 National Engineering Research Center of Visual Technology, School of Computer Science, Peking University

Video

Abstract

We present NeRSP, a Neural 3D reconstruction technique for Reflective surfaces with Sparse Polarized images. Reflective surface reconstruction is extremely challenging because specular reflections are view-dependent and thus violate the multiview consistency assumed by multiview stereo. Meanwhile, sparse image inputs, though practical for capture, commonly lead to incomplete or distorted reconstructions due to the lack of correspondences for matching.

This paper jointly handles the challenges of sparse inputs and reflective surfaces by leveraging polarized images. We derive photometric and geometric cues from the polarimetric image formation model and multiview azimuth consistency, which jointly optimize the surface geometry modeled via an implicit neural representation. Experiments on our synthetic and real-world datasets show that we achieve state-of-the-art surface reconstruction results with only 6 views as input.

Real-world dataset RMVP3D

Reconstructions on RMVP3D

Input 6 views

Reconstructions on the PANDORA dataset

Input 6 views

Related Links

Our work builds on several prior works.

  • Our implementation is built upon IDR and NeuS.
  • MVAS: The geometric cue formation is inspired by their work.
  • PANDORA: The photometric cue formation is inspired by their work. We also used their data for comparison.
BibTeX

    @inproceedings{nersp2024yufei,
      title     = {NeRSP: Neural 3D Reconstruction for Reflective Objects with Sparse Polarized Images},
      author    = {Han, Yufei and Guo, Heng and Fukai, Koki and Santo, Hiroaki and Shi, Boxin and Okura, Fumio and Ma, Zhanyu and Jia, Yunpeng},
      booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
      year      = {2024},
    }