Accepted to the Proceedings of SIGGRAPH Asia 2023
Figure: Given a set of input images of an outdoor scene, our SOL-NeRF pipeline decomposes them into geometry and material properties, which enables rendering the input scene from a novel viewpoint and relighting it with a different illumination.
Abstract
Outdoor scenes often involve large-scale geometry and complex unknown lighting conditions, making it difficult to decompose them into geometry, reflectance, and illumination. Recently, researchers have attempted to decompose outdoor scenes using Neural Radiance Fields (NeRF) together with learning-based lighting and shadow representations. However, the diverse lighting conditions and shadows in outdoor scenes are challenging for learning-based models. Moreover, existing methods may produce rough geometry and normal reconstructions and introduce notable shading artifacts when the scene is rendered under a novel illumination. To solve the above problems, we propose SOL-NeRF, which decomposes outdoor scenes with the help of a hybrid lighting representation and a signed distance field geometry reconstruction. We use a single Spherical Gaussian (SG) lobe to approximate the sun lighting and a first-order Spherical Harmonic (SH) mixture to represent the sky lighting. This hybrid representation is specifically designed for outdoor settings and compactly models outdoor lighting, ensuring robustness and efficiency. The shadow of the direct sun lighting is obtained by casting a ray against the mesh extracted from the signed distance field, and the remaining shadow is approximated by Ambient Occlusion (AO). Additionally, a sunlight color prior and a relaxed Manhattan-world assumption can be applied to further boost decomposition and relighting performance. When the lighting condition changes, our method produces consistent relighting results with correct shadow effects. Experiments on our hybrid lighting scheme and the entire decomposition pipeline show that our method achieves better reconstruction, decomposition, and relighting performance than previous methods, both quantitatively and qualitatively.
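As a rough illustration of how the two lighting components, their shadows, and the albedo could combine at a surface point, one possible diffuse shading model is sketched below. The notation (L_o, V_sun, G, AO, c_lm) is ours for illustration only; the exact formulation is given in the paper.

```latex
% A minimal sketch (our notation): diffuse radiance at surface point x with
% albedo a(x) and normal n, lit by one SG sun lobe and a first-order SH sky.
L_o(\mathbf{x}) \approx
  \frac{a(\mathbf{x})}{\pi}\Big[
    \underbrace{V_{\mathrm{sun}}(\mathbf{x})\,
      \max(\mathbf{n}\!\cdot\!\boldsymbol{\omega}_{\mathrm{sun}},0)\,
      G(\boldsymbol{\omega}_{\mathrm{sun}};\boldsymbol{\xi},\lambda,\boldsymbol{\mu})}_{\text{sun: single SG lobe, mesh-traced visibility}}
    \;+\;
    \underbrace{\mathrm{AO}(\mathbf{x})
      \sum_{l\le 1}\sum_{m=-l}^{l} c_{lm}\, Y_{lm}(\mathbf{n})}_{\text{sky: first-order SH, attenuated by ambient occlusion}}
  \Big]
```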
Paper
SOL-NeRF: Sunlight Modeling for Outdoor Scene Decomposition and Relighting
Code
Coming Soon
Methodology
Figure: Overview of the SOL-NeRF pipeline. Given a set of images captured under multiple different lighting conditions, we model the scene's geometry with a signed distance field (SDF) and apply an adaptive sampling strategy in the neural volume rendering process. To decompose geometry, material, shadow, and lighting, we predict the diffuse albedo a with an MLP network, and the normal is derived from the gradient of the SDF network. Our lighting is composed of a Spherical Gaussian (SG) function and first-order Spherical Harmonic (SH) functions: the SG function is responsible for high-intensity lighting such as the sun, while the SH functions represent relatively low-intensity lighting such as the sky light. We consider both the shadow cast by the directional SG light and the ambient occlusion. To enhance decomposition quality, we introduce priors for the sunlight color and the geometry. Overall, SOL-NeRF enables realistic reconstruction and relighting of the input scene under novel lighting conditions.
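For concreteness, a minimal sketch of how such a hybrid SG + first-order SH environment could be evaluated for a query direction is shown below. The function names, the single-lobe SG parameterization (axis, sharpness, amplitude), and the SH coefficient layout are our assumptions for illustration, not the paper's implementation.

```python
import numpy as np

# First-order (l <= 1) real spherical harmonics: 4 basis functions.
def sh_basis_l1(d):
    """Real SH basis (Y_00, Y_1-1, Y_10, Y_11) for a unit direction d = (x, y, z)."""
    x, y, z = d
    return np.array([
        0.2820947918,      # Y_00
        0.4886025119 * y,  # Y_1-1
        0.4886025119 * z,  # Y_10
        0.4886025119 * x,  # Y_11
    ])

def eval_hybrid_lighting(d, sg_axis, sg_sharpness, sg_amplitude, sh_coeffs):
    """Radiance arriving from direction d under a single SG lobe (sun)
    plus a first-order SH mixture (sky).

    d            : (3,) unit query direction
    sg_axis      : (3,) unit lobe axis (sun direction)
    sg_sharpness : scalar lobe concentration
    sg_amplitude : (3,) RGB peak intensity of the sun lobe
    sh_coeffs    : (4, 3) RGB coefficients of the first-order SH sky
    """
    # Spherical Gaussian lobe: amplitude * exp(sharpness * (dot(d, axis) - 1)).
    sun = sg_amplitude * np.exp(sg_sharpness * (np.dot(d, sg_axis) - 1.0))
    # Low-frequency sky: project the SH coefficients onto the basis at d.
    sky = sh_basis_l1(d) @ sh_coeffs
    return sun + np.maximum(sky, 0.0)
```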
Figure: Shadow calculation under the proposed hybrid lighting. The final shadow consists of both the SG shadow and the shadow caused by ambient occlusion. Specifically, we compute the SG shadow by casting a ray along the SG lighting direction against the extracted mesh. The ambient occlusion value of a surface point p_r is determined by the ratio of sample points lying outside the surface to the total number of sample points p'_{r,k} in a hemisphere centered at p_r. Note that the ambient occlusion is calculated by directly querying the SDF, while the traced shadow is based on the extracted mesh.
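The caption above suggests one straightforward way to realize the two shadow terms. The sketch below follows that description with our own helper names; query_sdf, mesh_intersector, and the sample counts are illustrative assumptions rather than the paper's code.

```python
import numpy as np

def ambient_occlusion(p, n, query_sdf, num_samples=64, radius=0.5):
    """Approximate AO at surface point p (normal n) by sampling points in the
    hemisphere around p and taking the ratio of samples outside the surface (SDF > 0)."""
    # Uniformly sample directions on the sphere, then flip them into the hemisphere of n.
    dirs = np.random.normal(size=(num_samples, 3))
    dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
    dirs[np.sum(dirs * n, axis=1) < 0] *= -1.0
    # Place sample points at random distances along those directions.
    dists = np.random.uniform(0.0, radius, size=(num_samples, 1))
    samples = p + dirs * dists
    outside = query_sdf(samples) > 0.0   # SDF is queried directly, no mesh needed
    return outside.mean()                # ratio of unoccluded samples

def sg_shadow(p, sun_dir, mesh_intersector, eps=1e-3):
    """Binary sun visibility: cast a ray from p along the SG (sun) direction against
    the extracted mesh; any hit means the point is shadowed.

    mesh_intersector is assumed to be any ray/mesh backend exposing
    intersects_any(origins, directions), e.g. a trimesh RayMeshIntersector.
    """
    origin = p + eps * sun_dir           # small offset to avoid self-intersection
    hit = mesh_intersector.intersects_any(origin[None, :], sun_dir[None, :])
    return 0.0 if hit[0] else 1.0
```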
Figure: The sunlight color is related to the sun elevation \theta. From the learned sun elevation \theta, the atmosphere thickness, and the air density, the optical depth is calculated. The color of the sunlight that reaches the surface is then obtained using the wavelength-dependent Rayleigh and Mie scattering laws [Nishita et al. 1993; Tyndall 1869]. This procedure associates every \theta with a color, and the resulting function is approximated by a polynomial that can be evaluated efficiently at training time.
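As the caption notes, the elevation-to-color mapping can be precomputed once from the scattering model and then fit with a low-order polynomial so it is cheap to evaluate during training. Below is a minimal sketch of that fit-and-evaluate step, assuming a precomputed table of (elevation, RGB) pairs is available; the function names and polynomial degree are our assumptions.

```python
import numpy as np

def fit_sun_color_polynomial(elevations, colors, degree=5):
    """Fit one polynomial per RGB channel mapping sun elevation (radians) to the
    sunlight color predicted by an atmospheric scattering model.

    elevations : (N,) sampled sun elevation angles
    colors     : (N, 3) corresponding RGB sunlight colors, precomputed offline
                 (e.g. from a Rayleigh + Mie scattering simulation)
    """
    return [np.polyfit(elevations, colors[:, c], degree) for c in range(3)]

def eval_sun_color(theta, poly_coeffs):
    """Cheap per-iteration lookup: evaluate the fitted polynomials at elevation theta."""
    rgb = np.array([np.polyval(p, theta) for p in poly_coeffs])
    return np.clip(rgb, 0.0, None)
```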
Geometry Reconstruction and Decomposition
Figure: Decomposition results of NeRF-OSR and our method. For each scene, we show different decomposed components (normal, albedo and shadow) and the reconstructed image.
Relighting
Figure: Relighting results of NeRF-OSR and our method. For each input scene, we relight it with two different lighting conditions and show rendered images and shadows.
BibTeX
Last updated in June 2023.