Unveiling the Mechanism of Continuous Representation Full-Waveform Inversion: A Wave-Based Neural Tangent Kernel Framework

ICLR 2026 Conference Submission
Anonymous Authors
Keywords: Full-waveform inversion; Continuous representation; Implicit neural representation; Neural tangent kernel
Abstract:

Full-waveform inversion (FWI) estimates physical parameters in the wave equation from limited measurements and has been widely applied in geophysical exploration, medical imaging, and non-destructive testing. Conventional FWI methods are notoriously sensitive to the accuracy of the initial model. Recent progress in continuous representation FWI (CR-FWI) demonstrates that representing parameter models with a coordinate-based neural network, such as an implicit neural representation (INR), can mitigate this dependence on the initial model. However, the underlying mechanism remains unclear, and INR-based FWI converges slowly on high-frequency components. In this work, we investigate the general CR-FWI framework and develop a unified theoretical understanding by extending the neural tangent kernel (NTK) to FWI, establishing a wave-based NTK framework. Unlike the standard NTK, the wave-based NTK is not constant, either at initialization or during training, due to the inherent nonlinearity of FWI. We further show that the eigenvalue decay behavior of the wave-based NTK explains both why CR-FWI alleviates the dependence on the initial model and why it converges slowly at high frequencies. Building on these insights, we propose several CR-FWI methods with eigenvalue decay properties tailored to FWI, including a novel hybrid representation that combines an INR with a multi-resolution grid (termed IG-FWI) and achieves a more balanced trade-off between robustness and high-frequency convergence rate. Applications in geophysical exploration on the Marmousi, 2D SEG/EAGE Salt and Overthrust, 2004 BP, and more realistic 2014 Chevron models show that the proposed methods outperform conventional FWI and existing INR-based FWI methods.

Disclaimer
This report is AI-generated using large language models and WisPaper (a scholarly search engine). It analyzes a paper's tasks and contributions against retrieved prior work. While the system identifies potential overlaps and novel directions, its coverage is not exhaustive and its judgments are approximate. The results are intended to assist human reviewers and should not be relied upon as a definitive verdict on novelty.
NOTE that some papers exist in multiple, slightly different versions (e.g., with different titles or URLs). The system may retrieve several versions of the same underlying work. The current automated pipeline does not reliably align or distinguish these cases, so human reviewers will need to disambiguate them manually.
If you have any questions, please contact: mingzhang23@m.fudan.edu.cn

Overview

Overall Novelty Assessment

The paper develops a wave-based neural tangent kernel (NTK) framework to analyze continuous representation full-waveform inversion (CR-FWI), where subsurface parameters are represented via coordinate-based neural networks. It resides in the Neural Network Parameterization for FWI leaf, which contains only three papers including this one. This is a relatively sparse research direction within the broader taxonomy of 50 papers across approximately 36 topics, suggesting that neural network parameterization for FWI remains an emerging area compared to more established branches like Conventional FWI Algorithms or Rock Physics Integration.

The taxonomy tree reveals that this work sits within the Full-Waveform Inversion Methods branch, which also includes Conventional FWI Algorithms (four papers on traditional grid-based optimization), Stochastic and Bayesian Inversion (three papers on probabilistic frameworks), and Multiparameter and Elastic FWI (two papers on multi-parameter estimation). Neighboring branches include Supervised Deep Learning for Inversion, which focuses on direct data-to-model mapping rather than physics-informed parameterization, and Semi-Supervised and Unsupervised Learning Approaches. The scope notes clarify that this leaf excludes traditional grid-based FWI and supervised learning, positioning the work at the intersection of physics-based inversion and neural network theory.

Among the 26 candidates examined across the three contributions, no clearly refutable prior work was identified: six candidates were examined for the wave-based NTK framework, ten for the eigenvalue decay analysis, and ten for the hybrid INR-multigrid representation, with zero refutations in each case. This suggests that, within the limited search scope (primarily top-K semantic matches and citation expansion), the specific combination of NTK theory applied to FWI and the proposed hybrid representation appears relatively unexplored. The theoretical analysis of optimization behavior through eigenvalue decay also lacks direct precedent among the examined candidates.

Based on the limited literature search of 26 candidates, the work appears to occupy a relatively novel position by bridging neural tangent kernel theory with full-waveform inversion. However, the sparse population of the Neural Network Parameterization for FWI leaf and the absence of refutable candidates should be interpreted cautiously, as the search scope does not guarantee exhaustive coverage of all relevant theoretical or applied work in neural network-based geophysical inversion.

Taxonomy

Core-task Taxonomy Papers: 50
Claimed Contributions: 3
Contribution Candidate Papers Compared: 26
Refutable Papers: 0

Research Landscape Overview

Core task: Estimating subsurface physical parameters from seismic waveform measurements. The field encompasses a diverse set of approaches organized around several major branches. Full-Waveform Inversion Methods form a central pillar, focusing on iterative optimization techniques that match observed and synthetic waveforms to recover velocity models and other elastic properties. Seismic Amplitude Inversion and Rock Physics Integration emphasizes the link between seismic amplitudes and petrophysical attributes such as porosity and fluid content, often combining statistical rock-physics models with inversion workflows. Supervised Deep Learning for Inversion leverages labeled training data to build end-to-end mappings from waveforms to subsurface parameters, while Semi-Supervised and Unsupervised Learning Approaches explore ways to reduce reliance on extensive labels by incorporating physical constraints or cycle-consistency ideas. Surface-Wave and Site Characterization Methods target near-surface imaging using surface waves and ambient noise, and Specialized Inversion Applications address domain-specific challenges such as anisotropy, fracture detection, and time-lapse monitoring. Finally, Methodological Foundations and Auxiliary Techniques provide the algorithmic and computational underpinnings, ranging from optimization strategies to uncertainty quantification, that support these inversion paradigms.

Recent work highlights a growing interest in hybrid strategies that blend classical physics-based inversion with modern neural network parameterizations. For instance, Physics Guided FWI[17] and Deep Reparameterization[21] illustrate how neural architectures can regularize or reparameterize the subsurface model space, balancing data fit with geological realism. Wave Neural Tangent[0] sits within the Neural Network Parameterization for FWI branch, exploring how neural tangent kernel theory can inform the design of network-based velocity representations.
This approach contrasts with purely data-driven supervised methods like Supervised Deep Learning[29], which rely heavily on labeled examples, and with semi-supervised frameworks such as Semisupervised Subsurface[3] that incorporate unlabeled data. Compared to neighboring works like Gabor Wavelet FWI[13], which uses wavelet-domain representations to improve convergence, Wave Neural Tangent[0] emphasizes theoretical insights from neural network training dynamics to guide parameterization choices. These developments reflect ongoing efforts to marry the interpretability and physical consistency of classical inversion with the flexibility and efficiency of deep learning.

Claimed Contributions

Wave-based neural tangent kernel framework for FWI

The authors extend the neural tangent kernel theory to full-waveform inversion by introducing a wave kernel for conventional FWI and a wave-based NTK for continuous representation FWI. This framework provides a unified theoretical foundation for analyzing both conventional and CR-FWI methods through eigenvalue decay properties.

6 retrieved papers
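The paper's wave-based NTK additionally composes the network Jacobian with the sensitivity of the simulated waveform to the velocity model, which is not reproduced here. As a minimal sketch of the network-side ingredient only, the empirical NTK of a coordinate network can be formed from its parameter Jacobian, Theta(x, x') = J(x) J(x')^T. The tiny MLP, its sizes, and the sample coordinates below are illustrative assumptions, not the authors' configuration.

```python
import numpy as np

# Illustration only: the network part of an NTK for a coordinate MLP m(x)
# with one hidden tanh layer. The paper's wave-based NTK further chains this
# Jacobian through the wave-equation forward operator, omitted here.

rng = np.random.default_rng(0)
d_in, d_hid = 2, 64                      # coordinate dim, hidden width
W1 = rng.normal(0, 1.0 / np.sqrt(d_in), (d_hid, d_in))
b1 = np.zeros(d_hid)
w2 = rng.normal(0, 1.0 / np.sqrt(d_hid), d_hid)

def param_jacobian(x):
    """Gradient of the scalar network output w.r.t. all parameters, flattened."""
    h = np.tanh(W1 @ x + b1)             # hidden activations
    dh = 1.0 - h ** 2                    # tanh derivative
    dW1 = np.outer(w2 * dh, x)           # dm/dW1
    db1 = w2 * dh                        # dm/db1
    dw2 = h                              # dm/dw2
    return np.concatenate([dW1.ravel(), db1, dw2])

# Empirical NTK Gram matrix on a few spatial coordinates.
xs = [np.array([0.1, 0.2]), np.array([0.5, 0.5]), np.array([0.9, 0.3])]
J = np.stack([param_jacobian(x) for x in xs])    # (n_points, n_params)
ntk = J @ J.T                                    # (n_points, n_points)

# By construction the kernel matrix is symmetric positive semidefinite.
assert np.allclose(ntk, ntk.T)
assert np.all(np.linalg.eigvalsh(ntk) >= -1e-10)
```

Because `ntk = J @ J.T`, its eigenvalues are nonnegative; it is this spectrum whose decay rate the paper's analysis ties to inversion behavior.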

Theoretical analysis of eigenvalue decay and optimization behavior

The authors prove that the wave-based NTK is non-stationary during training and that its eigenvalue decay is faster than the wave kernel. This theoretical result explains the robustness-convergence trade-off observed in CR-FWI methods, where rapid eigenvalue decay enables multiscale inversion but slows high-frequency convergence.

10 retrieved papers
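The convergence argument underlying this contribution is the standard linearized-training picture: projecting the residual onto the kernel's eigenvectors, mode i contracts at a rate set by eigenvalue lambda_i, so a fast-decaying spectrum leaves the small-eigenvalue (typically high-frequency) modes converging slowly. A minimal numerical check of that picture, using a hypothetical toy spectrum rather than an actual wave-based NTK:

```python
import numpy as np

# Linearized gradient-descent dynamics r_{t+1} = (I - eta * Theta) r_t.
# In the eigenbasis Theta = Q diag(lam) Q^T, mode i of the residual shrinks
# by (1 - eta*lam_i) per step, i.e. exp(-eta*lam_i*t) in continuous time.
# The rapidly decaying spectrum below is an illustrative assumption.

rng = np.random.default_rng(1)
n = 6
lam = 2.0 ** -np.arange(n)               # spectrum 1, 1/2, 1/4, ...
Q, _ = np.linalg.qr(rng.normal(size=(n, n)))
theta = Q @ np.diag(lam) @ Q.T           # SPD toy kernel matrix

eta, steps = 0.1, 200
r = rng.normal(size=n)                   # initial residual
r0_modes = Q.T @ r                       # initial per-mode coefficients
for _ in range(steps):
    r = r - eta * theta @ r              # linearized training dynamics

# Per-mode prediction: coefficient i is scaled by (1 - eta*lam_i)^steps, so
# the smallest-eigenvalue modes have barely decayed after 200 steps.
predicted = (1.0 - eta * lam) ** steps * r0_modes
assert np.allclose(Q.T @ r, predicted)
```

With this spectrum the leading mode shrinks by a factor of 0.9 per step while the last shrinks by only about 0.997, which is exactly the robustness-versus-high-frequency trade-off the analysis attributes to fast eigenvalue decay.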

Hybrid INR-multigrid representation for FWI

The authors introduce IG-FWI, a new continuous representation method that integrates implicit neural representation with multi-resolution grid encoding. This hybrid approach is designed to achieve tailored eigenvalue decay properties that balance the robustness of INR-based methods with the faster convergence of grid-based methods.

10 retrieved papers
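As a sketch of what such a hybrid representation can look like, Fourier-feature (INR-style) encodings can be concatenated with features bilinearly interpolated from learnable multi-resolution grids before being fed to a small MLP. The resolutions, feature widths, and frequency scales below are illustrative assumptions, not the authors' exact IG-FWI architecture.

```python
import numpy as np

# Hybrid input encoding in the spirit of IG-FWI: smooth global Fourier
# features plus localized multi-resolution grid features. All sizes here
# are hypothetical choices for illustration.

rng = np.random.default_rng(2)
freqs = 2.0 ** np.arange(4)                       # Fourier frequencies 1, 2, 4, 8
resolutions = [4, 8, 16]                          # grid levels (coarse to fine)
n_feat = 2                                        # learnable features per grid node
grids = [rng.normal(size=(r + 1, r + 1, n_feat)) for r in resolutions]

def fourier_features(x):
    """gamma(x) = [sin(2*pi*f*x_d), cos(2*pi*f*x_d)] over all freqs f and dims d."""
    ang = 2.0 * np.pi * np.outer(freqs, x).ravel()
    return np.concatenate([np.sin(ang), np.cos(ang)])

def grid_features(x):
    """Bilinearly interpolate each level's grid at coordinate x in [0, 1]^2."""
    feats = []
    for r, g in zip(resolutions, grids):
        u, v = x[0] * r, x[1] * r
        i, j = min(int(u), r - 1), min(int(v), r - 1)
        du, dv = u - i, v - j
        f = ((1 - du) * (1 - dv) * g[i, j] + du * (1 - dv) * g[i + 1, j]
             + (1 - du) * dv * g[i, j + 1] + du * dv * g[i + 1, j + 1])
        feats.append(f)
    return np.concatenate(feats)

def encode(x):
    """Hybrid encoding that a small MLP would map to velocity at x."""
    return np.concatenate([fourier_features(x), grid_features(x)])

z = encode(np.array([0.3, 0.7]))
# 2 dims * 4 freqs * 2 (sin, cos) = 16 Fourier features; 3 levels * 2 = 6 grid features.
assert z.shape == (22,)
```

The intended division of labor is that the Fourier branch supplies the smooth, robust low-frequency behavior of an INR while the fine grid levels give the localized degrees of freedom that speed up high-frequency convergence.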

Core Task Comparisons

Comparisons with papers in the same taxonomy category

Contribution Analysis

Detailed comparisons for each claimed contribution

Contribution 1: Wave-based neural tangent kernel framework for FWI

Contribution 2: Theoretical analysis of eigenvalue decay and optimization behavior

Contribution 3: Hybrid INR-multigrid representation for FWI
