Dynamic Novel View Synthesis in High Dynamic Range
Overview
Overall Novelty Assessment
The paper introduces HDR Dynamic Novel View Synthesis (HDR DNVS), combining high dynamic range reconstruction with temporal scene modeling through HDR-4DGS, a Gaussian Splatting architecture featuring dynamic tone-mapping. It resides in the Neural HDR Radiance Field Methods leaf, which contains six papers including the original work. This leaf sits within the HDR Reconstruction and Tone Mapping for Dynamic Scenes branch, indicating a moderately populated research direction focused on neural volumetric representations with explicit tone mapping for multi-exposure LDR-to-HDR conversion.
The taxonomy reveals neighboring leaves addressing related challenges: Deblurring and Alternating-Exposure HDR Reconstruction handles motion blur in monocular videos, while Image-Based HDR Fusion focuses on alignment-based merging without volumetric representations. The Dynamic Scene Representation branch contains 4D Gaussian Splatting methods that model temporal variations but typically assume standard dynamic range. The paper bridges these areas by jointly addressing HDR reconstruction and dynamic scene modeling, diverging from siblings such as Fast HDR Radiance Fields or GaussHDR that primarily target static scenes or simpler dynamic scenarios.
Of the thirty candidates examined (ten per contribution), the problem-formulation contribution yielded one potentially refuting candidate, suggesting some prior work addresses dynamic HDR synthesis. For the HDR-4DGS framework contribution, none of the ten candidates clearly refuted it, indicating potential architectural novelty in the dynamic tone-mapping module design. The benchmark-dataset contribution likewise yielded one potentially refuting candidate among ten, implying comparable HDR dynamic datasets may already exist. Because the search was limited to top semantic matches rather than exhaustive field coverage, these statistics should be read as indicative; most candidates for each contribution were judged non-refuting.
Based on the top-thirty semantic search results, the work appears to occupy a niche intersection between HDR reconstruction and dynamic scene modeling. The taxonomy structure shows this combination is less densely populated than either HDR or dynamic synthesis alone. However, the analysis acknowledges limited coverage: the search examined thirty candidates across three contributions, leaving open questions about broader prior work in HDR video synthesis or related multi-exposure dynamic capture methods not surfaced by semantic similarity.
Taxonomy
Research Landscape Overview
Claimed Contributions
The authors formalize a new problem called High Dynamic Range Dynamic Novel View Synthesis (HDR DNVS), which extends existing HDR novel view synthesis methods to handle dynamic scenes with time-varying geometry and illumination, rather than being restricted to static scenes.
The authors introduce HDR-4DGS, a Gaussian Splatting-based framework that incorporates a biologically inspired dynamic tone-mapping module. This module uses a dynamic radiance context learner and per-channel tone-mapping functions to maintain temporal radiance coherence while translating between LDR and HDR domains.
The authors create two new benchmark datasets for evaluating HDR DNVS methods: HDR-4D-Syn with 8 synthetic dynamic scenes and HDR-4D-Real with 4 real-world captured sequences. Each dataset includes ground-truth HDR images, time-varying 3D geometry, and synchronized multi-view LDR observations.
Core Task Comparisons
Comparisons with papers in the same taxonomy category
[1] Fast High Dynamic Range Radiance Fields for Dynamic Scenes
[14] Casual3DHDR: High Dynamic Range 3D Gaussian Splatting from Casually Captured Videos
[17] GaussHDR: High Dynamic Range Gaussian Splatting via Learning Unified 3D and 2D Local Tone Mapping
[19] Enhancing Neural Radiance Fields with Adaptive Multi-Exposure Fusion: A Bilevel Optimization Approach for Novel View Synthesis
[27] Dynamic HDR Radiance Fields via Neural Scene Flow
Contribution Analysis
Detailed comparisons for each claimed contribution
HDR Dynamic Novel View Synthesis problem formulation
The authors formalize a new problem called High Dynamic Range Dynamic Novel View Synthesis (HDR DNVS), which extends existing HDR novel view synthesis methods to handle dynamic scenes with time-varying geometry and illumination, rather than being restricted to static scenes.
[1] Fast High Dynamic Range Radiance Fields for Dynamic Scenes
[5] GsNeRF: Fast Novel View Synthesis of Dynamic Radiance Fields
[36] EventSplat: 3D Gaussian Splatting from Moving Event Cameras for Real-Time Rendering
[51] HDR-Plenoxels: Self-Calibrating High Dynamic Range Radiance Fields
[52] Robust Dynamic Radiance Fields
[53] High Dynamic Range Novel View Synthesis with Single Exposure
[54] Non-Rigid Neural Radiance Fields: Reconstruction and Novel View Synthesis of a Dynamic Scene from Monocular Video
[55] Pano-NeRF: Synthesizing High Dynamic Range Novel Views with Geometry from Sparse Low Dynamic Range Panoramic Images
[56] LTM-NeRF: Embedding 3D Local Tone Mapping in HDR Neural Radiance Field
[57] Dynamic Mesh-Aware Radiance Fields
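The HDR DNVS setting assessed above can be written compactly. The notation below is an illustrative reconstruction of the task, not the paper's own formalization:

\[
\{(I_{i,k}, \Delta t_k, \mathbf{P}_i, \tau_i)\}_{i,k}
\;\xrightarrow{\ \text{fit}\ }\;
E_\theta(\mathbf{x}, \tau),
\qquad
I_{i,k} \approx \phi\big(E_\theta \cdot \Delta t_k\big),
\]

where \(I_{i,k}\) are LDR observations captured at exposure \(\Delta t_k\) from camera pose \(\mathbf{P}_i\) at time \(\tau_i\), \(E_\theta\) is the time-varying HDR radiance field to be recovered, and \(\phi\) is a camera response / tone-mapping function. Novel-view synthesis then renders \(E_\theta\) from unseen pose–time pairs \((\mathbf{P}, \tau)\); the dynamic setting differs from static HDR NVS precisely in the extra time argument \(\tau\).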
HDR-4DGS framework with dynamic tone-mapping module
The authors introduce HDR-4DGS, a Gaussian Splatting-based framework that incorporates a biologically inspired dynamic tone-mapping module. This module uses a dynamic radiance context learner and per-channel tone-mapping functions to maintain temporal radiance coherence while translating between LDR and HDR domains.
[17] GaussHDR: High Dynamic Range Gaussian Splatting via Learning Unified 3D and 2D Local Tone Mapping
[62] HDR-GS: Efficient High Dynamic Range Novel View Synthesis at 1000x Speed via Gaussian Splatting
[63] HDRSplat: Gaussian Splatting for High Dynamic Range 3D Scene Reconstruction from Raw Images
[64] HDRGS: High Dynamic Range Gaussian Splatting
[65] GaSLight: Gaussian Splats for Spatially-Varying Lighting in HDR
[66] EvHDR-GS: Event-guided HDR Video Reconstruction with 3D Gaussian Splatting
[67] Cinematic Gaussians: Real-Time HDR Radiance Fields with Depth of Field
[68] Generating an HDR Gaussian Splatting Representation from LDR Single-Exposure Images
[69] Lighting Every Darkness with 3DGS: Fast Training and Real-Time Rendering for HDR View Synthesis
[70] Reconstructing 3D Scenes in Native High Dynamic Range
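To make the claimed module concrete, the sketch below shows what a time-conditioned, per-channel tone-mapping stage of this kind could look like. Every name, the random-projection "context learner", and the gamma-style curve parameterisation are assumptions for illustration only; the paper's actual dynamic radiance context learner and tone-mapping functions are learned and may be parameterised very differently.

```python
import numpy as np

def radiance_context(t, n_ctx=8):
    """Hypothetical dynamic radiance context learner: a fixed random
    projection of a sinusoidal time embedding stands in for a learned MLP."""
    rng = np.random.default_rng(0)
    freqs = 2.0 ** np.arange(4)
    emb = np.concatenate([np.sin(freqs * t), np.cos(freqs * t)])
    W = rng.standard_normal((n_ctx, emb.size)) / np.sqrt(emb.size)
    return W @ emb  # time-dependent context vector

def tone_map(hdr_rgb, ctx):
    """Per-channel tone mapping: each colour channel gets its own
    context-modulated exposure gain and gamma (an illustrative curve,
    not the paper's parameterisation)."""
    gains = 1.0 + 0.1 * np.tanh(ctx[:3])             # per-channel gain
    gammas = 1.0 / (2.2 + 0.2 * np.tanh(ctx[3:6]))   # per-channel gamma
    ldr = np.clip(gains * hdr_rgb, 0.0, None) ** gammas
    return np.clip(ldr, 0.0, 1.0)

# One pixel of rendered HDR radiance, tone-mapped at scene time t = 0.3.
hdr = np.array([2.5, 0.8, 0.1])
ctx = radiance_context(t=0.3)
ldr = tone_map(hdr, ctx)
```

Because the context vector is a function of time, the same HDR radiance maps to different LDR values at different timestamps, which is the mechanism that would let such a module track time-varying exposure and illumination while keeping the underlying HDR field temporally coherent.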
HDR-4D-Syn and HDR-4D-Real benchmark datasets
The authors create two new benchmark datasets for evaluating HDR DNVS methods: HDR-4D-Syn with 8 synthetic dynamic scenes and HDR-4D-Real with 4 real-world captured sequences. Each dataset includes ground-truth HDR images, time-varying 3D geometry, and synchronized multi-view LDR observations.