HDR-NeRF: High Dynamic Range Neural Radiance Fields

CVPR 2022


Xin Huang1, Qi Zhang2, Ying Feng2, Hongdong Li3, Xuan Wang2, Qing Wang1

1Northwestern Polytechnical University    2Tencent AI Lab     3Australian National University

Abstract



We present High Dynamic Range Neural Radiance Fields (HDR-NeRF) to recover an HDR radiance field from a set of low dynamic range (LDR) views with different exposures. Using HDR-NeRF, we are able to generate both novel HDR views and novel LDR views under different exposures. The key to our method is to model the physical imaging process, which dictates that the radiance of a scene point is transformed into a pixel value in the LDR image by two implicit functions: a radiance field and a tone mapper. The radiance field encodes the scene radiance (with values ranging from 0 to +∞) and outputs the density and radiance of a ray given its origin and direction. The tone mapper models the process by which a ray hitting the camera sensor becomes a pixel value; the color of the ray is predicted by feeding its radiance and the corresponding exposure time into the tone mapper. We use the classic volume rendering technique to project the output radiance, colors, and densities into HDR and LDR images, while only the input LDR images are used as supervision. We collect a new forward-facing HDR dataset to evaluate the proposed method. Experimental results on synthetic and real-world scenes validate that our method can not only accurately control the exposures of synthesized views but also render views with a high dynamic range.
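The physical imaging process referenced above follows the classic camera-response formulation from HDR radiance-map recovery: an LDR pixel value is the camera response function applied to the sensor exposure, i.e. scene radiance times exposure time. A sketch of this relation (notation assumed for illustration, with g denoting the log-inverse of the response function f):

```latex
% Imaging model assumed by the tone mapper (sketch).
% Z: LDR pixel value, f: camera response function (CRF),
% E: scene radiance, \Delta t: exposure time, g = \ln f^{-1}.
Z = f(E \,\Delta t)
\quad\Longleftrightarrow\quad
g(Z) = \ln E + \ln \Delta t
```

This is why the tone mapper can take the (log) radiance and the exposure time as inputs and output a color: varying Δt at a fixed radiance traverses the response curve, which is exactly how exposure control of synthesized views is achieved.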


Novel LDR views


HDR-NeRF produces high-fidelity LDR views with varying exposures.


Novel HDR views (Tone-mapped)


Our HDR-NeRF is able to render novel HDR views. The tone-mapped HDR views reveal the details of over-exposure and under-exposure areas.


Comparisons of novel LDR/HDR views


Mildenhall et al., NeRF: Representing scenes as neural radiance fields for view synthesis. ECCV, pages 405–421, 2020.

Martin-Brualla et al., NeRF in the wild: Neural radiance fields for unconstrained photo collections. CVPR, pages 7210–7219, 2021.


Pipeline Overview


Architecture

The pipeline of HDR-NeRF, modeling the physical imaging process. Our method consists of two modules: an HDR radiance field that models the target scene's radiance and densities, and a tone mapper that models the CRF to produce colors.
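The two-module pipeline can be sketched as follows. This is a minimal, hypothetical illustration rather than the paper's implementation: `radiance_field` and `tone_mapper` are analytic stand-ins for the learned MLPs, and the rendering loop is the classic volume rendering composite described in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal((3, 3))  # fixed random weights for the stand-in field

def radiance_field(pts):
    # Hypothetical stand-in for the HDR radiance field MLP: maps 3D sample
    # points to a density and a per-channel log radiance. The real model
    # uses positional encoding and an MLP as in NeRF.
    sigma = np.abs(np.sin(pts).sum(-1))   # (N,) non-negative densities
    log_e = np.tanh(pts @ W)              # (N, 3) log radiance
    return sigma, log_e

def tone_mapper(log_e, delta_t):
    # Hypothetical CRF: maps log(radiance * exposure time) to an LDR color
    # in [0, 1]. HDR-NeRF learns this mapping with a small MLP; a sigmoid
    # is a common analytic stand-in for a camera response curve.
    return 1.0 / (1.0 + np.exp(-(log_e + np.log(delta_t))))

def render_ray(origin, direction, delta_t, n_samples=64, near=2.0, far=6.0):
    # Classic volume rendering: alpha-composite per-sample colors (LDR)
    # and radiance (HDR) along the ray, supervised only by LDR images.
    t = np.linspace(near, far, n_samples)
    pts = origin + t[:, None] * direction
    sigma, log_e = radiance_field(pts)
    c = tone_mapper(log_e, delta_t)                    # per-sample LDR color

    dists = np.diff(t, append=t[-1] + (t[1] - t[0]))
    alpha = 1.0 - np.exp(-sigma * dists)
    trans = np.cumprod(np.concatenate([[1.0], 1.0 - alpha[:-1] + 1e-10]))
    weights = trans * alpha                            # (N,) compositing weights
    ldr = (weights[:, None] * c).sum(0)                # LDR pixel, in [0, 1]
    hdr = (weights[:, None] * np.exp(log_e)).sum(0)    # HDR radiance, >= 0
    return ldr, hdr

ldr, hdr = render_ray(np.zeros(3), np.array([0.0, 0.0, 1.0]), delta_t=1 / 30)
```

Note how the exposure time enters only through the tone mapper: rendering the same ray with a different `delta_t` changes the LDR output while the composited HDR radiance is unchanged, mirroring the exposure control described above.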


Results and Comparisons



Citation


@article{huang2021hdrnerf,
    title = {HDR-NeRF: High Dynamic Range Neural Radiance Fields},
    author = {Huang, Xin and Zhang, Qi and Feng, Ying and Li, Hongdong and Wang, Xuan and Wang, Qing},
    journal = {arXiv preprint arXiv:2111.14451},
    month = {November},
    year = {2021}
}

Acknowledgements


We thank Li Ma and Xiaoyu Li for their instructive and useful advice.