Tensor learning with orthogonal, Lorentz, and symplectic symmetries
Overview
Taxonomy
Research Landscape Overview
Claimed Contributions
The authors introduce machine learning architectures that can represent tensor-to-tensor functions while respecting equivariance under classical Lie groups (orthogonal, Lorentz, and symplectic groups). These architectures are universally expressive in the sense that they can approximate arbitrary equivariant functions.
The paper provides the first comprehensive mathematical framework for designing equivariant machine learning models on tensors. This framework gives explicit parameterizations for polynomial and analytic functions mapping tensor inputs to tensor outputs that are equivariant with respect to classical Lie groups.
The authors derive explicit mathematical parameterizations for equivariant functions by leveraging tensor invariant theory. These parameterizations cover polynomial functions and analytic functions, providing practical recipes for implementing equivariant models across multiple classical Lie groups.
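The invariant-theory recipe described above can be sketched numerically for the simplest case, the orthogonal group acting on vectors: scalar invariants are built from pairwise inner products, and an equivariant vector output is an invariant-weighted combination of the inputs. This is a minimal illustration, not the paper's construction; the coefficient functions below are arbitrary choices.

```python
import numpy as np

def equivariant_map(v1, v2):
    """O(d)-equivariant map built from scalar invariants.

    Classical invariant theory says every O(d) invariant of vectors is a
    function of pairwise inner products, and polynomial equivariant
    vector outputs are invariant-weighted sums of the input vectors.
    The coefficient functions here are illustrative, not from the paper.
    """
    s11, s12, s22 = v1 @ v1, v1 @ v2, v2 @ v2
    return np.sin(s12) * v1 + (s11 - s22) * v2

rng = np.random.default_rng(0)
d = 5
v1, v2 = rng.normal(size=d), rng.normal(size=d)

# Random orthogonal matrix via QR decomposition of a Gaussian matrix.
Q, _ = np.linalg.qr(rng.normal(size=(d, d)))

# Equivariance check: f(Q v1, Q v2) should equal Q f(v1, v2).
error = np.linalg.norm(
    equivariant_map(Q @ v1, Q @ v2) - Q @ equivariant_map(v1, v2))
```

Because the inner products are unchanged by any orthogonal `Q`, the coefficients are unchanged too, so equivariance holds exactly up to floating-point error.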
Core Task Comparisons
Comparisons with papers in the same taxonomy category
[33] Lorentz group equivariant neural network for particle physics
Contribution Analysis
Detailed comparisons for each claimed contribution
Universally expressive equivariant architectures for tensor functions
The authors introduce machine learning architectures that can represent tensor-to-tensor functions while respecting equivariance under classical Lie groups (orthogonal, Lorentz, and symplectic groups). These architectures are universally expressive in the sense that they can approximate arbitrary equivariant functions.
[26] A general framework for equivariant neural networks on reductive Lie groups
[62] Lorentz-equivariance without limitations
[63] Lorentz Local Canonicalization: How to Make Any Network Lorentz-Equivariant
[64] Diagrammatic algebra for equivariant neural network architectures
[65] A Diagrammatic Approach to Improve Computational Efficiency in Group Equivariant Neural Networks
[66] Symplectic convolutional neural networks
[67] arXiv: Lorentz-Equivariance without Limitations
[68] arXiv: A Lorentz-Equivariant Transformer for All of the LHC
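The Lorentz-equivariant models compared above all rest on the same basic fact: the Minkowski inner product is the fundamental Lorentz scalar from which equivariant maps on four-vectors are assembled. A minimal sketch (with a boost along the x-axis and signature (+, -, -, -) as illustrative conventions):

```python
import numpy as np

# Minkowski metric with signature (+, -, -, -).
eta = np.diag([1.0, -1.0, -1.0, -1.0])

def boost_x(rapidity):
    """Lorentz boost along the x-axis with the given rapidity."""
    c, s = np.cosh(rapidity), np.sinh(rapidity)
    L = np.eye(4)
    L[0, 0] = L[1, 1] = c
    L[0, 1] = L[1, 0] = s
    return L

L = boost_x(0.7)

# Defining property of the Lorentz group: L^T eta L = eta.
metric_error = np.linalg.norm(L.T @ eta @ L - eta)

# Minkowski inner products of four-vectors are Lorentz invariants,
# the basic scalars from which equivariant maps are built.
rng = np.random.default_rng(1)
p, q = rng.normal(size=4), rng.normal(size=4)
invariant_error = abs((L @ p) @ eta @ (L @ q) - p @ eta @ q)
```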
General mathematical framework for equivariant tensor learning
The paper provides the first comprehensive mathematical framework for designing equivariant machine learning models on tensors. This framework gives explicit parameterizations for polynomial and analytic functions mapping tensor inputs to tensor outputs that are equivariant with respect to classical Lie groups.
[10] Equivariant geometric convolutions for dynamical systems on vector and tensor images
[12] Accurate piezoelectric tensor prediction with equivariant attention tensor graph neural network
[17] Tensor field networks: Rotation- and translation-equivariant neural networks for 3d point clouds
[32] Predicting tensorial molecular properties with equivariant machine learning models
[50] Enabling Efficient Equivariant Operations in the Fourier Basis via Gaunt Tensor Products
[57] e3nn: Euclidean neural networks
[58] Representing spherical tensors with scalar-based machine-learning models
[60] Implicit modeling of equivariant tensor basis with Euclidean turbulence closure neural network
[61] Reducing SO(3) Convolutions to SO(2) for Efficient Equivariant GNNs
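For tensor-to-tensor maps of the kind these works target, the classical representation theorems give an especially concrete form: an O(d)-equivariant analytic map on symmetric matrices can be written as a matrix polynomial in the input with coefficients depending only on trace invariants. The coefficient functions below are arbitrary illustrative choices, not the paper's parameterization.

```python
import numpy as np

def isotropic_tensor_map(T):
    """O(d)-equivariant map on symmetric matrices.

    Classical representation theorems express such maps as
    f(T) = a0*I + a1*T + a2*T@T, with coefficients depending only on
    scalar invariants such as tr T, tr T^2, tr T^3. The coefficient
    functions here are illustrative analytic choices.
    """
    i1, i2, i3 = np.trace(T), np.trace(T @ T), np.trace(T @ T @ T)
    a0, a1, a2 = np.tanh(i1), np.exp(-i2 / 10.0), i3
    return a0 * np.eye(len(T)) + a1 * T + a2 * (T @ T)

rng = np.random.default_rng(2)
d = 4
A = rng.normal(size=(d, d))
T = (A + A.T) / 2                              # symmetric input tensor
Q, _ = np.linalg.qr(rng.normal(size=(d, d)))   # random orthogonal matrix

# Equivariance check: f(Q T Q^T) should equal Q f(T) Q^T.
error = np.linalg.norm(
    isotropic_tensor_map(Q @ T @ Q.T) - Q @ isotropic_tensor_map(T) @ Q.T)
```

Traces are invariant under conjugation and (Q T Qᵀ)² = Q T² Qᵀ, so the map commutes with the group action by construction.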
Explicit parameterizations using tensor invariant theory
The authors derive explicit mathematical parameterizations for equivariant functions by leveraging tensor invariant theory. These parameterizations cover polynomial functions and analytic functions, providing practical recipes for implementing equivariant models across multiple classical Lie groups.
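For the symplectic group, the analogous basic invariant is the skew pairing ω(x, y) = xᵀJy. A minimal sketch, using the block-diagonal family diag(A, A⁻ᵀ) of symplectic matrices (one convenient subgroup of Sp(2n), chosen here only for ease of construction):

```python
import numpy as np

n = 2
# Standard symplectic form J on R^(2n).
J = np.block([[np.zeros((n, n)), np.eye(n)],
              [-np.eye(n), np.zeros((n, n))]])

# For any invertible A, M = diag(A, A^-T) satisfies M^T J M = J,
# so M lies in Sp(2n). This is one easy-to-build family, not all of Sp(2n).
rng = np.random.default_rng(3)
A = rng.normal(size=(n, n)) + 2.0 * np.eye(n)  # invertible w.h.p.
M = np.block([[A, np.zeros((n, n))],
              [np.zeros((n, n)), np.linalg.inv(A).T]])

# Defining property of the symplectic group.
group_error = np.linalg.norm(M.T @ J @ M - J)

# The skew pairing omega(x, y) = x^T J y is the basic Sp(2n) invariant
# from which equivariant polynomial maps are assembled.
x, y = rng.normal(size=2 * n), rng.normal(size=2 * n)
pairing_error = abs((M @ x) @ J @ (M @ y) - x @ J @ y)
```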