AdS-GNN - a Conformally Equivariant Graph Neural Network

ICLR 2026 Conference Submission
Anonymous Authors
Keywords: equivariance; conformal group; scale equivariance; Ising model
Abstract:

Conformal symmetries, i.e., coordinate transformations that preserve angles, play a key role in many fields, including physics, mathematics, computer vision, and (geometric) machine learning. Here we build a neural network that is equivariant under general conformal transformations. To achieve this, we lift data from flat Euclidean space to Anti-de Sitter (AdS) space, which allows us to exploit a known correspondence between conformal transformations of flat space and isometric transformations of AdS space. We then build on the fact that such isometric transformations have been extensively studied on general geometries in the geometric deep learning literature. In particular, we employ message-passing layers conditioned on the proper distance, yielding a computationally efficient framework. We validate our model on tasks from computer vision and statistical physics, demonstrating strong performance, improved generalization, and the ability to extract conformal data such as scaling dimensions from the trained network.

Disclaimer
This report is AI-GENERATED using Large Language Models and WisPaper (A scholar search engine). It analyzes academic papers' tasks and contributions against retrieved prior work. While this system identifies POTENTIAL overlaps and novel directions, ITS COVERAGE IS NOT EXHAUSTIVE AND JUDGMENTS ARE APPROXIMATE. These results are intended to assist human reviewers and SHOULD NOT be relied upon as a definitive verdict on novelty.
NOTE that some papers exist in multiple, slightly different versions (e.g., with different titles or URLs). The system may retrieve several versions of the same underlying work. The current automated pipeline does not reliably align or distinguish these cases, so human reviewers will need to disambiguate them manually.
If you have any questions, please contact: mingzhang23@m.fudan.edu.cn

Overview

Overall Novelty Assessment

The paper introduces a neural network that achieves conformal equivariance by embedding point clouds into Anti-de Sitter (AdS) space and then applying message-passing layers conditioned on proper distance. It resides in the 'Conformal Transformation Equivariance via Geometric Embeddings' leaf, which contains only three papers. This is a notably sparse research direction within the broader taxonomy of 30 papers spanning multiple equivariance paradigms, suggesting that conformal equivariance via geometric embeddings remains an emerging and relatively unexplored area compared to rotation-only or SE(3) methods.

The taxonomy reveals that the paper's immediate neighbors—'Scale-Inclusive Equivariant Registration and Alignment'—focus on 9DoF alignment with scaling but not full conformal transformations. Broader sibling branches include 'Rotation and Rigid Transformation Equivariance' (covering SO(3) and SE(3) methods without scaling) and 'General Equivariant Frameworks' (extending to Lie groups and continuous representations). The scope note for the paper's leaf explicitly excludes spherical embeddings for rotation-only tasks, clarifying that the AdS embedding targets conformal symmetries beyond rigid motions, distinguishing it from rotation-equivariant architectures using spherical harmonics or quaternions.

Among 27 candidates examined, the contribution-level analysis shows varied novelty profiles. The AdS-GNN architecture itself (7 candidates, 0 refutable) and the AdS embedding procedure (10 candidates, 0 refutable) appear to have no clear prior work within the limited search scope. However, the extraction of conformal dimensions from trained networks (10 candidates, 1 refutable) encounters at least one overlapping prior method among the examined papers. This suggests that while the core architectural and embedding ideas may be relatively fresh, the interpretability component has some precedent in the literature surveyed.

Based on the top-27 semantic matches and citation expansion, the work appears to occupy a sparsely populated niche within conformal equivariance research. The limited search scope means that additional relevant work outside the examined candidates could exist, particularly in physics-oriented conformal field theory applications or classical conformal geometry literature not captured by the semantic search. The analysis covers the immediate neighborhood but does not claim exhaustive coverage of all possible prior art.

Taxonomy

- Core-task taxonomy papers: 30
- Claimed contributions: 3
- Contribution candidate papers compared: 27
- Refutable papers: 1

Research Landscape Overview

Core task: Building conformally equivariant neural networks for point cloud data.

The field organizes itself around several complementary perspectives on geometric invariance and equivariance for point cloud processing. At the broadest level, one finds branches dedicated to conformal and scale equivariance, rotation and rigid transformation equivariance, and general equivariant frameworks that unify or extend these symmetries. Parallel to these symmetry-focused directions, other branches explore conformal geometry for point cloud analysis, often drawing on classical parameterization techniques such as Spherical conformal parameterization of [9], as well as geometric or topological understanding that leverages manifold structure and persistent homology. Finally, specialized applications and domain-specific methods address concrete tasks like object detection, registration, and shape correspondence.

Works such as E2PN [3] and Steerable 3D spherical neurons [18] illustrate how rotation equivariance can be achieved through spherical harmonics or geometric embeddings, while approaches like Embed me if you [21] demonstrate the use of higher-dimensional geometric spaces to encode conformal transformations. Within this landscape, a particularly active line of research focuses on embedding point clouds into spaces that naturally respect conformal or projective symmetries, contrasting with methods that enforce equivariance via group-theoretic constraints on network weights. AdS-GNN - a Conformally [0] sits squarely in this embedding-based cluster, using anti-de Sitter geometry to achieve conformal equivariance in a manner closely related to Embed me if you [21], which similarly exploits geometric embeddings for scale and inversion invariance. This approach differs from purely algebraic frameworks like E2PN [3], which builds equivariance through steerable filters, and from methods such as Steerable 3D spherical neurons [18] that rely on spherical harmonic decompositions.
The central trade-off revolves around expressiveness versus computational overhead: geometric embeddings can elegantly capture continuous symmetries but may introduce higher-dimensional representations, while filter-based designs offer modularity at the cost of more complex parameterizations. Understanding where each method excels—whether in handling arbitrary scales, preserving local geometry, or scaling to large point clouds—remains an open question driving ongoing work.

Claimed Contributions

AdS-GNN: A conformally equivariant graph neural network

The authors introduce AdS-GNN, a graph neural network architecture that achieves equivariance under the full conformal group by lifting point cloud data from Euclidean space to Anti-de Sitter (AdS) space, exploiting the correspondence between conformal transformations in flat space and isometric transformations on AdS space.

7 retrieved papers
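The correspondence this contribution relies on can be illustrated with the standard geodesic distance on Euclidean AdS, i.e. the hyperbolic upper half-space: a dilation of the boundary coordinates extends to an isometry of the lifted space, so proper distances are unchanged. The sketch below is a minimal check of that fact, not the authors' implementation; the function name `ads_proper_distance` and the `(z, x...)` point layout are conventions chosen here.

```python
import numpy as np

def ads_proper_distance(p, q):
    """Geodesic distance on Euclidean AdS, the hyperbolic upper half-space
    with metric ds^2 = (dz^2 + dx^2) / z^2.
    A point is an array (z, x_1, ..., x_d) with depth z > 0."""
    z1, x1 = p[0], p[1:]
    z2, x2 = q[0], q[1:]
    sq = (z1 - z2) ** 2 + np.sum((x1 - x2) ** 2)
    return np.arccosh(1.0 + sq / (2.0 * z1 * z2))

# A boundary dilation x -> lam * x lifts to (z, x) -> (lam * z, lam * x),
# which is an isometry: the proper distance is unchanged by the scaling.
p = np.array([0.5, 1.0, 2.0])    # (z, x1, x2)
q = np.array([1.5, -1.0, 0.3])
lam = 3.7
d_before = ads_proper_distance(p, q)
d_after = ads_proper_distance(lam * p, lam * q)
```

Message-passing layers that condition only on such proper distances therefore automatically respect the corresponding conformal transformations of the flat boundary data.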
AdS embedding procedure for conformal equivariance

The authors develop a computationally efficient message-passing framework that conditions on the proper distance in AdS space. This includes an embedding algorithm that lifts points into AdS using a center-of-mass procedure, preserving equivariance under translations, rotations, and scalings while mildly breaking special conformal transformations.

10 retrieved papers
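The description above does not spell out the paper's exact lifting rule. Purely as an illustration, here is one hypothetical center-of-mass lift in which every point receives a depth coordinate z set to the cloud's RMS distance from its center of mass: under this choice, translations and rotations leave z fixed, while a scaling x -> lam*x sends z -> lam*z, matching the claimed equivariances. `lift_to_ads` is an assumed name, not the authors' algorithm.

```python
import numpy as np

def lift_to_ads(points):
    """Hypothetical lift: prepend a shared depth coordinate z equal to the
    RMS distance of the cloud from its center of mass. Rigid motions leave
    z unchanged; a global scaling scales z by the same factor."""
    pts = np.asarray(points, dtype=float)
    com = pts.mean(axis=0)
    z = np.sqrt(np.mean(np.sum((pts - com) ** 2, axis=1)))
    return np.hstack([np.full((len(pts), 1), z), pts])

cloud = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 2.0]])
lam = 2.5
lifted = lift_to_ads(cloud)
lifted_scaled = lift_to_ads(lam * cloud)   # equals lam * lifted
```

Because the lifted cloud transforms by an AdS isometry under translations, rotations, and scalings, the proper distances fed into the message-passing layers are invariant under those transformations; special conformal transformations move the center of mass nontrivially, which is consistent with the mild breaking the authors report.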
Extraction of conformal dimensions from trained networks

The authors demonstrate that their model can extract physically meaningful conformal dimensions (scaling dimensions) as trainable parameters from data, providing interpretability by recovering universal quantities that characterize conformally invariant systems, as validated on Ising model tasks.

10 retrieved papers (1 can refute)
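For intuition on what "extracting a scaling dimension" means: in a conformally invariant system, the two-point function of an operator decays as C(r) ~ r^(-2*Delta), so Delta can be read off from a log-log slope. The snippet below mimics this on synthetic data with the known 2D Ising spin dimension Delta = 1/8; the paper instead learns Delta as a trainable network parameter, which this closed-form fit only approximates.

```python
import numpy as np

# Two-point function of a conformal primary: C(r) ~ r^(-2*Delta).
delta_true = 0.125            # 2D Ising spin operator sigma
r = np.linspace(1.0, 50.0, 200)
corr = r ** (-2.0 * delta_true)

# In log-log coordinates the decay is linear with slope -2*Delta,
# so a straight-line fit recovers the scaling dimension.
slope, intercept = np.polyfit(np.log(r), np.log(corr), 1)
delta_fit = -slope / 2.0      # recovers 0.125 up to floating point
```

Recovering such universal exponents is what makes the trained network interpretable: the fitted Delta characterizes the conformally invariant system independently of microscopic details.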

Core Task Comparisons

Comparisons with papers in the same taxonomy category

Contribution Analysis

Detailed comparisons for each claimed contribution

Contribution

AdS-GNN: A conformally equivariant graph neural network

The authors introduce AdS-GNN, a graph neural network architecture that achieves equivariance under the full conformal group by lifting point cloud data from Euclidean space to Anti-de Sitter (AdS) space, exploiting the correspondence between conformal transformations in flat space and isometric transformations on AdS space.

Contribution

AdS embedding procedure for conformal equivariance

The authors develop a computationally efficient message-passing framework that conditions on the proper distance in AdS space. This includes an embedding algorithm that lifts points into AdS using a center-of-mass procedure, preserving equivariance under translations, rotations, and scalings while mildly breaking special conformal transformations.

Contribution

Extraction of conformal dimensions from trained networks

The authors demonstrate that their model can extract physically meaningful conformal dimensions (scaling dimensions) as trainable parameters from data, providing interpretability by recovering universal quantities that characterize conformally invariant systems, as validated on Ising model tasks.