∂∞-Grid: Differentiable Grid Representations for Fast and Accurate Solutions to Differential Equations

ICLR 2026 Conference Submission. Anonymous Authors.
Keywords: Differential Equations; Neural Fields and Representations; Feature Grids; RBF Interpolation
Abstract:

We present a novel differentiable grid-based representation for efficiently solving differential equations (DEs). Widely used architectures for neural solvers, such as sinusoidal neural networks, are coordinate-based MLPs that are both computationally intensive and slow to train. Although grid-based alternatives for implicit representations (e.g., Instant-NGP and K-Planes) train faster by exploiting signal structure, their reliance on linear interpolation restricts their ability to compute higher-order derivatives, rendering them unsuitable for solving DEs. In contrast, our approach overcomes these limitations by combining the efficiency of feature grids with radial basis function interpolation, which is infinitely differentiable. To effectively capture high-frequency solutions and enable stable, faster computation of global gradients, we introduce a multi-resolution decomposition with co-located grids. Our proposed representation, ∂∞-Grid, is trained implicitly using the differential equations as loss functions, enabling accurate modeling of physical fields. We validate ∂∞-Grid on a variety of tasks, including the Poisson equation for image reconstruction, the Helmholtz equation for wave fields, and the Kirchhoff-Love boundary value problem for cloth simulation. Our results demonstrate a 5–20× speed-up over coordinate-based MLP methods, solving differential equations in seconds or minutes while maintaining comparable accuracy and compactness.

Disclaimer
This report is AI-generated using large language models and WisPaper (a scholar search engine). It analyzes an academic paper's tasks and contributions against retrieved prior work. While the system identifies potential overlaps and novel directions, its coverage is not exhaustive and its judgments are approximate. These results are intended to assist human reviewers and should not be relied upon as a definitive verdict on novelty.
Note that some papers exist in multiple, slightly different versions (e.g., with different titles or URLs). The system may retrieve several versions of the same underlying work. The current automated pipeline does not reliably align or distinguish these cases, so human reviewers will need to disambiguate them manually.
If you have any questions, please contact: mingzhang23@m.fudan.edu.cn

Overview

Overall Novelty Assessment

The paper proposes ∂∞-Grid, a differentiable grid-based representation that combines feature grids with radial basis function (RBF) interpolation for solving differential equations. This work resides in the 'Advanced Representation Methods' leaf of the taxonomy, which contains only two papers total. This leaf sits within the broader 'Neural Network Representations and Approximation Theory' branch, indicating a focus on representation design rather than specific solver architectures. The sparse population of this leaf suggests the paper addresses a relatively underexplored niche: bridging efficient grid-based implicit representations with the smoothness requirements of DE solving.

The taxonomy reveals that most neural DE solving work concentrates in three major branches: Neural ODEs (17 papers across four leaves), PINNs (11 papers across three leaves), and Neural Operator Learning (5 papers across two leaves). The 'Advanced Representation Methods' leaf neighbors 'General Neural Approximation Methods for DEs' (7 papers), which covers foundational feedforward and trial solution approaches. The sibling paper in this leaf (Differentiable Grid) also explores structured spatial discretization. The paper's focus on multi-resolution grids and RBF interpolation distinguishes it from coordinate-based MLPs prevalent in PINNs and from operator learning methods that map between function spaces.

Among 29 candidates examined, the analysis identified varying novelty across contributions. The core ∂∞-Grid representation (10 candidates examined, 0 refutable) and multi-resolution decomposition (9 candidates, 0 refutable) appear to have limited direct prior work within this search scope. However, the implicit training framework using DEs as loss functions (10 candidates examined, 4 refutable) shows substantial overlap with existing PINN methodologies. This suggests the representation architecture itself may be more novel than the training paradigm, which builds on established physics-informed learning principles widely adopted in the field.

Based on this limited search of 29 semantically similar papers, the work appears to occupy a relatively sparse research direction within neural DE solving. The representation design shows fewer overlaps than the training methodology, though the small candidate pool and focused taxonomy leaf prevent definitive claims about absolute novelty. The analysis captures top-K semantic matches and does not constitute an exhaustive literature review across all grid-based or RBF-based neural methods.

Taxonomy

Core-task Taxonomy Papers: 50
Claimed Contributions: 3
Contribution Candidate Papers Compared: 29
Refutable Papers: 4

Research Landscape Overview

Core task: Solving differential equations with neural representations. The field has evolved into several major branches that reflect different modeling philosophies and application domains. Neural Ordinary Differential Equations (Neural ODEs[7]) treat the hidden state dynamics of deep networks as continuous-time flows, enabling memory-efficient training and adaptive computation. Physics-Informed Neural Networks (PINNs[46]) embed known physical laws directly into the loss function, allowing networks to approximate solutions to partial differential equations without large labeled datasets. Neural Operator Learning (e.g., Fourier Neural Operator[37]) shifts focus from point-wise approximation to learning mappings between entire function spaces, offering a data-driven route to surrogate modeling for complex PDEs. Meanwhile, Neural Network Representations and Approximation Theory investigates the expressive power and convergence guarantees of these architectures, and Specialized Topics cover extensions such as stochastic, delay, and inverse problems. Together, these branches span the spectrum from theoretical foundations to practical solvers for scientific computing.

Recent work has explored trade-offs between expressiveness, computational efficiency, and adherence to physical constraints. Many studies in the Neural ODE line (Stiff Neural ODEs[1], Neural Event Functions[2]) address numerical stability and event-driven dynamics, while operator learning methods (Neural Operator[20], In-context Operator Learning[14]) emphasize generalization across parameter regimes.

Within the Advanced Representation Methods cluster, Differentiable Grid[0] introduces a structured spatial discretization that bridges classical finite-difference schemes and implicit neural representations, offering a middle ground between mesh-based and mesh-free approaches. This contrasts with purely coordinate-based methods like Signal Processing INR[16], which encode signals as continuous functions without explicit grid structure. By combining differentiable grids with neural parameterizations, Differentiable Grid[0] aims to retain interpretability and computational tractability while leveraging the flexibility of learned representations, positioning itself at the intersection of classical numerical analysis and modern deep learning for PDEs.

Claimed Contributions

∂∞-Grid: differentiable grid-based representation combining feature grids with RBF interpolation

The authors introduce ∂∞-Grid, a new representation that combines the computational efficiency of feature grids with radial basis function (RBF) interpolation to enable infinite differentiability, overcoming limitations of existing grid-based methods that rely on linear interpolation and cannot compute higher-order derivatives needed for solving differential equations.

10 retrieved papers
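As a concrete picture of the mechanism this contribution claims, here is a minimal 1-D sketch of RBF interpolation over a feature grid. Gaussian kernels and normalized weights are assumed for illustration only; the names (`rbf_interpolate`, `centers`, `features`, `eps`) are hypothetical and not taken from the paper. Because the Gaussian is C-infinity, the interpolant admits exact derivatives of any order, which is the property linear interpolation lacks.

```python
import numpy as np

# Hypothetical 1-D sketch: interpolate grid features with Gaussian RBFs.
# The Gaussian kernel is smooth (C-infinity), so the resulting interpolant
# has well-defined derivatives of every order.
def rbf_interpolate(x, centers, features, eps=2.0):
    w = np.exp(-(eps * (x - centers)) ** 2)  # Gaussian kernel weights
    w = w / w.sum()                          # normalize to a partition of unity
    return float((w * features).sum())       # weighted blend of grid features

centers = np.linspace(0.0, 1.0, 8)   # grid node positions
features = np.ones(8)                # constant field as a sanity check
value = rbf_interpolate(0.37, centers, features)
```

With normalized weights, a constant feature field is reproduced exactly at any query point, a basic consistency check for any interpolation scheme.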
Multi-resolution decomposition with co-located grids

The authors propose a multi-resolution decomposition approach using co-located grids to effectively capture high-frequency solutions and enable stable and faster computation of global gradients in their grid-based representation.

9 retrieved papers
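The decomposition can be pictured as a sum over co-located grids at increasing resolution: coarse levels carry low frequencies, finer levels add detail. The sketch below assumes Gaussian RBF kernels per level; `level_eval`, `multires_eval`, and the level parameters are illustrative, not the authors' implementation.

```python
import numpy as np

# Hypothetical sketch of a multi-resolution decomposition with co-located
# grids: each level interpolates its own feature grid with Gaussian RBFs,
# and the per-level outputs are summed.
def level_eval(x, centers, features, eps):
    w = np.exp(-(eps * (x - centers)) ** 2)
    return float((w / w.sum() * features).sum())

def multires_eval(x, levels):
    # coarse levels capture low frequencies; finer levels add detail
    return sum(level_eval(x, c, f, e) for c, f, e in levels)

coarse = (np.linspace(0, 1, 4),  np.full(4, 0.5),   2.0)   # wide kernels
fine   = (np.linspace(0, 1, 16), np.full(16, 0.25), 8.0)   # narrow kernels
```

Since each level is a smooth function of the query point, the sum remains infinitely differentiable, and gradients flow to every level's features during training.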
Implicit training framework using differential equations as loss functions

The authors develop a training approach where the differential equations themselves serve as loss functions for implicit optimization, enabling accurate modeling of physical fields governed by these equations.

10 retrieved papers, 4 of which can refute this contribution
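The physics-informed training loop this contribution builds on fits in a few lines: the DE residual itself is minimized, with no labeled solution data. The toy problem below (u''(x) = 2 with the ansatz u(x) = a·x² and a hand-coded gradient) is purely illustrative of the paradigm, not the authors' setup.

```python
# Toy sketch of training with a DE residual as the loss (not the authors'
# implementation). Problem: solve u''(x) = 2 with the ansatz u(x) = a * x^2,
# whose second derivative is 2a, so the loss is the squared residual (2a - 2)^2.
a = 0.0    # trainable parameter
lr = 0.1
for _ in range(200):
    residual = 2.0 * a - 2.0      # u'' - f, with f = 2
    grad = 2.0 * residual * 2.0   # d/da of residual^2
    a -= lr * grad
# gradient descent drives a toward 1, i.e. u(x) = x^2 solves u'' = 2
```

This residual-as-loss recipe is exactly the established PINN training paradigm, which is why the report flags this contribution as the one with refutable overlap.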

Core Task Comparisons

Comparisons with papers in the same taxonomy category

Contribution Analysis

Detailed comparisons for each claimed contribution

Contribution 1

∂∞-Grid: differentiable grid-based representation combining feature grids with RBF interpolation

The authors introduce ∂∞-Grid, a new representation that combines the computational efficiency of feature grids with radial basis function (RBF) interpolation to enable infinite differentiability, overcoming limitations of existing grid-based methods that rely on linear interpolation and cannot compute higher-order derivatives needed for solving differential equations.

Contribution 2

Multi-resolution decomposition with co-located grids

The authors propose a multi-resolution decomposition approach using co-located grids to effectively capture high-frequency solutions and enable stable and faster computation of global gradients in their grid-based representation.

Contribution 3

Implicit training framework using differential equations as loss functions

The authors develop a training approach where the differential equations themselves serve as loss functions for implicit optimization, enabling accurate modeling of physical fields governed by these equations.
