Adaptive Canonicalization with Application to Invariant Anisotropic Geometric Networks

ICLR 2026 Conference Submission
Anonymous Authors
Keywords: Equivariant machine learning, Canonicalization, Universal approximation, Classification, Graph neural networks, Spectral methods, Point cloud networks, Anisotropic networks
Abstract:

Canonicalization is a widely used strategy in equivariant machine learning, enforcing symmetry in neural networks by mapping each input to a standard form. Yet it often introduces discontinuities that can affect stability during training, limit generalization, and complicate universal approximation theorems. In this paper, we address this by introducing adaptive canonicalization, a general framework in which the canonicalization depends both on the input and on the network. Specifically, we present an instantiation of adaptive canonicalization based on prior maximization, where the standard form of the input is chosen to maximize the predictive confidence of the network. We prove that this construction yields continuous, symmetry-respecting models that admit universal approximation properties.

We propose two applications of our setting: (i) resolving eigenbasis ambiguities in spectral graph neural networks, and (ii) handling rotational symmetries in point clouds. We empirically validate our methods on molecular and protein classification as well as point cloud classification tasks. Our adaptive canonicalization outperforms three other common approaches to equivariant machine learning: data augmentation, standard canonicalization, and equivariant architectures.

Disclaimer
This report is AI-GENERATED using Large Language Models and WisPaper (a scholar search engine). It analyzes academic papers' tasks and contributions against retrieved prior work. While this system identifies POTENTIAL overlaps and novel directions, ITS COVERAGE IS NOT EXHAUSTIVE AND JUDGMENTS ARE APPROXIMATE. These results are intended to assist human reviewers and SHOULD NOT be relied upon as a definitive verdict on novelty.
NOTE that some papers exist in multiple, slightly different versions (e.g., with different titles or URLs). The system may retrieve several versions of the same underlying work. The current automated pipeline does not reliably align or distinguish these cases, so human reviewers will need to disambiguate them manually.
If you have any questions, please contact: mingzhang23@m.fudan.edu.cn

Overview

Overall Novelty Assessment

The paper introduces an adaptive canonicalization framework where the canonical form depends jointly on the input and the network's predictions, specifically via prior maximization. Within the taxonomy, it resides in the 'General Adaptive Canonicalization Theory' leaf alongside two sibling papers. This leaf is part of a broader 'Adaptive and Learned Canonicalization Frameworks' branch, indicating a relatively focused but not overcrowded research direction. The taxonomy contains 35 papers across multiple branches, suggesting the paper occupies a specialized niche within the larger equivariant learning landscape.

The taxonomy reveals three main strategies for equivariance: canonicalization-based methods, architecturally constrained networks, and symmetry-breaking approaches. The paper's leaf sits within the canonicalization branch, which also includes domain-specific applications (robotics, molecular modeling, 3D vision) and group-specific methods (SE(3), Lorentz groups). Neighboring leaves address pretrained model adaptation and specialized group canonicalization, while sibling branches explore frame-based architectures and message-passing networks. The scope notes clarify that this work focuses on foundational theory with continuity and universal approximation guarantees, distinguishing it from application-driven or group-specific canonicalization methods.

Among the 13 candidates examined across the three contributions, no clearly refutable prior work was identified: 2 candidates were examined for the adaptive canonicalization framework, 10 for prior maximization, and 1 for the anisotropic geometric network applications, with no refutations in any case. Within this limited search scope, the specific combination of adaptive canonicalization with prior maximization and theoretical guarantees appears relatively unexplored. However, the small candidate pool means the analysis captures only a narrow slice of potentially relevant work, particularly given the paper's position in a specialized but active research area.

Based on the top-13 semantic matches examined, the work appears to occupy a distinct position within adaptive canonicalization theory, though the limited search scope prevents definitive claims about broader novelty. The taxonomy structure indicates this is a growing subfield with established foundations but room for theoretical contributions. A more exhaustive search across the 35-paper taxonomy and beyond would be needed to fully assess overlap with related canonicalization and equivariance methods.

Taxonomy

Core-task Taxonomy Papers: 35
Claimed Contributions: 3
Contribution Candidate Papers Compared: 13
Refutable Papers: 0

Research Landscape Overview

The core task is adaptive canonicalization for equivariant machine learning. The field addresses how to build neural networks that respect or exploit symmetries in data, and the taxonomy reveals several complementary strategies. Canonicalization-Based Equivariance Methods transform inputs into standard reference frames before processing, enabling ordinary networks to handle symmetric data; this branch includes both hand-crafted and learned canonicalization approaches such as Learned Canonicalization[1] and Canonicalization Perspective[22]. Architecturally Equivariant Network Designs instead bake symmetry constraints directly into the layer operations, producing models that are equivariant by construction. Symmetry-Breaking and Generative Modeling explores settings where exact symmetry is undesirable or where generation requires controlled symmetry violation, as seen in Probabilistic Symmetry Breaking[7] and Frame-based Diffusion[12]. The remaining branches address canonical transformations in physics-inspired dynamical systems, domain-specific applications preserving particular symmetries, and broader machine learning contexts where equivariance plays a supporting role.

Recent work has intensified around the trade-off between flexibility and interpretability: architecturally equivariant designs guarantee exact symmetry but can be rigid, while canonicalization methods offer modularity at the cost of potential frame ambiguities. Adaptive Canonicalization[0] sits squarely within the learned canonicalization cluster, proposing a general theoretical framework in which the canonical frame adapts to the input and the network rather than being fixed a priori. This contrasts with earlier fixed-frame approaches and aligns closely with Learned Canonicalization[1], which similarly trains canonicalization mappings end-to-end. Compared to Canonicalization Perspective[22], which surveys the conceptual landscape, Adaptive Canonicalization[0] emphasizes the algorithmic and optimization aspects of making canonicalization adaptive. Meanwhile, works such as Equivariant Adaptation[3] and Local Canonicalization Equivariance[33] explore related ideas of local or task-specific frame selection, highlighting open questions about when and how to balance global and local canonicalization strategies.

Claimed Contributions

Adaptive canonicalization framework

The authors introduce adaptive canonicalization, a general framework in which the standard form of an input depends on both the input itself and the neural network. This approach resolves the discontinuities inherent in standard canonicalization while preserving symmetry-respecting behavior and universal approximation guarantees (a hedged formalization is sketched below).

2 retrieved papers
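To make the framework concrete, one plausible formalization (hypothetical notation, not taken from the submission) contrasts the two regimes. Standard canonicalization applies a fixed, network-independent map c before the classifier Psi, whereas adaptive canonicalization via prior maximization lets the chosen group element depend on the classifier itself:

\[
f_{\mathrm{std}}(x) = \Psi\bigl(c(x)\bigr), \qquad
f_{\mathrm{adapt}}(x) = \Psi\bigl(g^{\star} \cdot x\bigr), \quad
g^{\star} \in \operatorname*{arg\,max}_{g \in G} \operatorname{conf}\bigl(\Psi(g \cdot x)\bigr),
\]

where G is the symmetry group and conf is a confidence score such as the maximal softmax probability. Because the maximization ranges over the whole orbit of x, f_adapt is invariant to the group action by construction.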
Prior maximization adaptive canonicalization

The authors present a specific instantiation of adaptive canonicalization, called prior maximization, in which the canonical form is selected by maximizing the network's predictive confidence. They prove that this construction yields continuous, symmetry-respecting models with universal approximation properties (a minimal code sketch of the selection rule follows below).

10 retrieved papers
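A minimal sketch of the prior-maximization selection rule, assuming a finite set of candidate group actions and a softmax classifier; all names below (prior_max_canonicalize, candidate_transforms, the toy model) are illustrative and are not the submission's API or architecture:

```python
# Hedged sketch: evaluate the classifier on every candidate pose of the input
# and keep the prediction made with the highest confidence (max softmax prob).
import torch

def prior_max_canonicalize(model, x, candidate_transforms):
    """model: callable mapping an input tensor to class logits.
    x: a single input, e.g. a point cloud of shape [N, 3].
    candidate_transforms: callables implementing candidate group actions g(x)."""
    best_conf, best_probs = -1.0, None
    for g in candidate_transforms:
        probs = torch.softmax(model(g(x)), dim=-1)
        conf = probs.max().item()            # predictive confidence under this pose
        if conf > best_conf:
            best_conf, best_probs = conf, probs
    return best_probs                        # prediction for the most confident pose

if __name__ == "__main__":
    torch.manual_seed(0)
    n_points, n_classes = 32, 5
    toy_model = torch.nn.Sequential(         # stand-in classifier, not anisotropic
        torch.nn.Flatten(start_dim=0), torch.nn.Linear(3 * n_points, n_classes))
    cloud = torch.randn(n_points, 3)
    rotations = []
    for _ in range(8):                        # coarse sample of candidate rotations
        q, _ = torch.linalg.qr(torch.randn(3, 3))
        if torch.linalg.det(q) < 0:           # flip one column to stay in SO(3)
            q[:, 0] = -q[:, 0]
        rotations.append(lambda p, R=q: p @ R.T)
    print(prior_max_canonicalize(toy_model, cloud, rotations))
```

In practice one would search or optimize over the symmetry group rather than enumerate a coarse sample; the sketch only illustrates the confidence-based selection rule.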
Anisotropic geometric network applications

The authors develop two concrete applications of their framework: anisotropic nonlinear spectral filters that resolve eigenbasis ambiguities in spectral graph neural networks, and anisotropic point cloud networks that handle rotational symmetries. These methods are shown to outperform standard canonicalization, data augmentation, and equivariant architectures (an illustrative sketch of the spectral case follows below).

1 retrieved paper
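For the spectral graph application, the same selection rule can in principle be applied to the eigenbasis ambiguity. The sketch below handles only the sign ambiguity of Laplacian eigenvectors (repeated eigenvalues would require searching over full basis rotations); spectral_model and its (eigenvalues, eigenvectors) interface are assumptions for illustration, not the submission's anisotropic nonlinear spectral filter. The point cloud application follows the pattern of the previous sketch, with sampled rotations in place of sign flips.

```python
# Illustrative only: choose Laplacian eigenvector signs by prior maximization.
import itertools
import torch

def eigenbasis_by_prior_max(spectral_model, adjacency, k=4):
    """spectral_model: assumed callable (eigenvalues, eigenvectors) -> class logits.
    adjacency: dense [n, n] adjacency matrix; k: number of low-frequency modes kept."""
    degree = adjacency.sum(dim=1)
    laplacian = torch.diag(degree) - adjacency
    evals, evecs = torch.linalg.eigh(laplacian)               # ascending eigenvalues
    evals, evecs = evals[:k], evecs[:, :k]
    best_conf, best_probs = -1.0, None
    for signs in itertools.product((1.0, -1.0), repeat=k):    # 2**k candidate bases
        candidate = evecs * torch.tensor(signs)               # flip the chosen columns
        probs = torch.softmax(spectral_model(evals, candidate), dim=-1)
        conf = probs.max().item()
        if conf > best_conf:
            best_conf, best_probs = conf, probs
    return best_probs                                         # most confident prediction
```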

Core Task Comparisons

Comparisons with papers in the same taxonomy category

Contribution Analysis

Detailed comparisons for each claimed contribution

Contribution

Adaptive canonicalization framework

The authors introduce adaptive canonicalization, a general framework where the standard form of an input depends on both the input itself and the neural network. This approach resolves discontinuities inherent in standard canonicalization methods while maintaining symmetry-respecting properties and universal approximation guarantees.

Contribution

Prior maximization adaptive canonicalization

The authors present a specific instantiation of adaptive canonicalization called prior maximization, where the canonical form is selected by maximizing the network's predictive confidence. They prove this construction yields continuous and symmetry-respecting models with universal approximation properties.

Contribution

Anisotropic geometric network applications

The authors develop two concrete applications of their framework: anisotropic nonlinear spectral filters for resolving eigenbasis ambiguities in spectral graph neural networks, and anisotropic point cloud networks for handling rotational symmetries. These methods are shown to outperform standard canonicalization, data augmentation, and equivariant architectures.