Explore the Full Program of SIGGRAPH Asia 2025!

Presentation

NeLiF: Neural Lighting Function Generation for Real-Time Indoor Rendering
Description

Recent advances in neural rendering have extensively explored modeling radiance fields with neural representations while overlooking the underlying mechanisms that produce various lighting effects, which limits their adaptability to dynamic scenes. These lighting effects, including highlights, shadows, and indirect lighting, are usually computed with physically based rendering methods such as path tracing, which can be computationally prohibitive for complex indoor luminaires. Although several recent studies have attempted to model global illumination effects with neural representations, they commonly suffer from long training times or poor generalizability to novel scenes. In light of these challenges, this work presents a novel neural lighting function generation model that synthesizes diverse lighting effects in real time for unseen dynamic scenes and complex indoor luminaires, with results comparable to state-of-the-art rendering pipelines.

Our model consists of two stages. In the first stage, multi-view observation images of the luminaire are captured and used to encode a compact, scene-independent 3D neural lighting field. In the second stage, light information is sampled from the neural lighting field and combined with the G-buffers and shadow cues to produce the shading results. In parallel, we leverage a state-of-the-art generative model together with our HDR Lift module to generate an HDR 3D Gaussian representation of the luminaire.

In our experiments, the model trained on a dataset of 10,000 modern indoor scenes demonstrates strong generalizability, high efficiency, and visually convincing results across a wide range of test scenes, highlighting its potential as a practical and flexible solution for high-fidelity, real-time neural indoor rendering.