PI-Light: Physics-Inspired Diffusion for Full-Image Relighting
Overview
Taxonomy
Research Landscape Overview
Claimed Contributions
The authors propose a two-stage diffusion-based framework for full-image relighting. It incorporates batch-aware attention for consistent intrinsic predictions, a physics-guided neural rendering module that enforces physically plausible light transport, and physics-inspired losses that regularize training toward physically meaningful solutions, improving generalization to real-world scenes.
The authors introduce physics-inspired losses (a diffuse shading loss and a physically based shading loss) that regularize the neural forward-rendering module toward physically plausible light transport, enabling the model to learn correct light transport from less data and computation while improving generalization.
The authors construct a new dataset featuring diverse objects from Objaverse and curated scenes from BlenderKit, all rendered under controlled lighting conditions with ground-truth intrinsic properties, addressing data scarcity in full-image relighting research and enabling comprehensive downstream benchmarking.
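The two-stage design in the first contribution can be sketched minimally as follows, with plain callables standing in for the diffusion-based intrinsic predictor and the physics-guided renderer. All names, the Lambertian stand-in renderer, and the toy 1-D "image" are illustrative assumptions, not the authors' API:

```python
def two_stage_relight(image, target_lighting, intrinsic_net, render_net):
    """Sketch of the two-stage pipeline: predict intrinsics, then re-render.

    Stage 1 (diffusion-based in the paper, a plain callable here) predicts
    intrinsic properties from the input image; stage 2 re-renders those
    intrinsics under the target lighting via a physics-guided renderer.
    """
    albedo, normals = intrinsic_net(image)               # stage 1: intrinsics
    return render_net(albedo, normals, target_lighting)  # stage 2: relight

def toy_intrinsic_net(image):
    # Stand-in: treat the image itself as albedo, with fixed upward normals.
    return image, [(0.0, 0.0, 1.0)] * len(image)

def toy_render_net(albedo, normals, light_dir):
    # Stand-in Lambertian renderer: pixel = albedo * max(0, n . l).
    shade = [max(0.0, sum(n_i * l_i for n_i, l_i in zip(n, light_dir)))
             for n in normals]
    return [a * s for a, s in zip(albedo, shade)]

relit = two_stage_relight([0.5, 1.0], (0.0, 0.0, 1.0),
                          toy_intrinsic_net, toy_render_net)
```

With light arriving along the surface normal, the toy renderer simply returns the albedo, which makes the factorization easy to inspect.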
Core Task Comparisons
Comparisons with papers in the same taxonomy category
[1] LightLab: Controlling Light Sources in Images with Diffusion Models
[17] A Diffusion Approach to Radiance Field Relighting Using Multi-Illumination Synthesis
[31] LumiNet: Latent Intrinsics Meets Diffusion Models for Indoor Scene Relighting
[46] Comprehensive Relighting: Generalizable and Consistent Monocular Human Relighting and Harmonization
Contribution Analysis
Detailed comparisons for each claimed contribution
PI-Light: Physics-Inspired Diffusion Framework for Full-Image Relighting
The authors propose a two-stage diffusion-based framework for full-image relighting. It incorporates batch-aware attention for consistent intrinsic predictions, a physics-guided neural rendering module that enforces physically plausible light transport, and physics-inspired losses that regularize training toward physically meaningful solutions, improving generalization to real-world scenes.
[1] LightLab: Controlling Light Sources in Images with Diffusion Models
[2] LightIt: Illumination Modeling and Control for Diffusion Models
[11] Learning a Physical-Aware Diffusion Model Based on Transformer for Underwater Image Enhancement
[17] A Diffusion Approach to Radiance Field Relighting Using Multi-Illumination Synthesis
[66] DifFRelight: Diffusion-Based Facial Performance Relighting
[68] Zero-Reference Low-Light Enhancement via Physical Quadruple Priors
[69] Generative Multiview Relighting for 3D Reconstruction Under Extreme Illumination Variation
[70] Underwater Sequential Images Enhancement via Diffusion and Physics Priors Fusion
[71] DiFaReli: Diffusion Face Relighting
[72] Relightify: Relightable 3D Faces from a Single Image via Diffusion Models
Physics-Inspired Light Transport Prior for Neural Forward Rendering
The authors introduce physics-inspired losses (a diffuse shading loss and a physically based shading loss) that regularize the neural forward-rendering module toward physically plausible light transport, enabling the model to learn correct light transport from less data and computation while improving generalization.
[51] ScatterNeRF: Seeing Through Fog with Physically-Based Inverse Neural Rendering
[52] Neural Inverse Rendering from Propagating Light
[53] URHand: Universal Relightable Hands
[54] RelightableHands: Efficient Neural Relighting of Articulated Hand Models
[55] Soft Shadow Diffusion (SSD): Physics-Inspired Learning for 3D Computational Periscopy
[56] Relighting4D: Neural Relightable Human from Videos
[57] Neural Light Transport for Relighting and View Synthesis
[58] Path Space Regularization for Holistic and Robust Light Transport
[59] PBR-NeRF: Inverse Rendering with Physics-Based Neural Fields
[60] PNRNet: Physically-Inspired Neural Rendering for Any-to-Any Relighting
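Under a Lambertian image model, the two physics-inspired losses named in this contribution might look like the sketch below. The exact formulations are not given here, so the function names, the L1 penalty, and the image = albedo × shading factorization are all assumptions, not the paper's definitions:

```python
def _dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def diffuse_shading_loss(pred_shading, normals, light_dir):
    # Hypothetical diffuse shading loss: penalize predicted per-pixel
    # shading that deviates from Lambertian shading max(0, n . l).
    target = [max(0.0, _dot(n, light_dir)) for n in normals]
    return sum(abs(p - t) for p, t in zip(pred_shading, target)) / len(target)

def physical_shading_loss(pred_image, albedo, pred_shading):
    # Hypothetical physically based shading loss: the rendered image
    # should factor as albedo * shading per pixel, the Lambertian model.
    recon = [a * s for a, s in zip(albedo, pred_shading)]
    return sum(abs(i - r) for i, r in zip(pred_image, recon)) / len(recon)

# Toy check: a prediction that exactly obeys the physics has zero loss.
normals = [(0.0, 0.0, 1.0), (0.0, 0.0, 1.0)]
l_diff = diffuse_shading_loss([1.0, 1.0], normals, (0.0, 0.0, 1.0))
l_phys = physical_shading_loss([0.5, 0.8], [0.5, 0.8], [1.0, 1.0])
```

Regularizers of this shape pull the neural renderer toward physically consistent solutions without requiring extra supervision beyond the intrinsics already in the training data.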
Curated Dataset of Objects and Scenes Under Controlled Lighting
The authors construct a new dataset featuring diverse objects from Objaverse and curated scenes from BlenderKit, all rendered under controlled lighting conditions with ground-truth intrinsic properties, addressing data scarcity in full-image relighting research and enabling comprehensive downstream benchmarking.
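One plausible record layout for a sample in such a dataset, pairing each rendered image with its ground-truth intrinsics and the lighting condition it was rendered under. The field names and types are hypothetical, not the authors' schema:

```python
from dataclasses import dataclass

@dataclass
class RelightingSample:
    # Illustrative record for one rendered object or scene view.
    source: str        # asset origin, e.g. "objaverse" or "blenderkit"
    image: list        # rendered RGB under one controlled lighting condition
    albedo: list       # ground-truth intrinsic albedo
    normals: list      # ground-truth surface normals
    lighting_id: int   # index of the controlled lighting condition

sample = RelightingSample(
    source="objaverse",
    image=[0.5, 1.0],
    albedo=[0.5, 1.0],
    normals=[(0.0, 0.0, 1.0), (0.0, 0.0, 1.0)],
    lighting_id=0,
)
```

Keeping the intrinsics alongside each rendering is what makes the dataset usable both for supervising intrinsic prediction and for benchmarking downstream relighting.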