Gauge-invariant representation holonomy
Overview
Overall Novelty Assessment
The paper introduces representation holonomy, a gauge-invariant statistic measuring path-dependent curvature in neural network feature spaces. It occupies the 'Gauge-Invariant Path-Dependent Curvature Measures' leaf, and it is the only paper in that leaf within a taxonomy of 23 works. This isolation suggests the paper pioneers a distinct methodological direction within the broader field of geometric representation analysis rather than extending a crowded research thread.
The taxonomy reveals neighboring approaches in sibling leaves: 'Manifold Geometry and Topology Preservation' (3 papers) focuses on preserving intrinsic structure during embedding, 'Spectral and Rank-Based Geometry Characterization' (1 paper) uses eigenspectrum properties, and 'Multi-View and Cross-Modal Diffusion Geometry' (1 paper) constructs geometries across data views. The paper diverges by quantifying curvature through parallel transport around loops rather than through static manifold properties or spectral signatures, addressing a gap between pointwise similarity metrics and dynamic geometric behavior under perturbations.
Among 28 candidates examined across three contributions, zero refutable pairs emerged. The core holonomy statistic examined 10 candidates, none of which provided overlapping methodology; the estimator with theoretical guarantees similarly examined 8 candidates; the empirical validation examined 10. This limited search scope (top-K semantic matches plus citations) suggests that the specific combination of gauge invariance, parallel transport, and loop-based curvature measurement has not been directly addressed in the retrieved literature, though the analysis cannot claim exhaustive coverage of all geometric representation work.
Given the constrained search and the paper's unique position as the sole occupant of its taxonomy leaf, the holonomy framework appears methodologically distinct within the examined scope. However, the analysis covers roughly 28 papers drawn from semantic neighborhoods, not the entire geometric deep learning literature. The novelty assessment reflects what was retrieved, acknowledging that broader or differently targeted searches might surface related curvature-based diagnostics not captured here.
Taxonomy
Research Landscape Overview
Claimed Contributions
The authors propose representation holonomy, a new gauge-invariant measure that quantifies path-dependent changes in learned representations by measuring the accumulated twist when features are parallel-transported around closed loops in input space, revealing hidden curvature beyond pointwise similarity metrics.
The authors develop a computationally practical estimator that fixes gauge through global whitening, aligns neighborhoods using shared subspaces and rotation-only Procrustes, and embeds results back to full feature space. They prove invariance to orthogonal and affine transformations, establish a linear null for affine layers, and show holonomy vanishes at small radii.
The authors demonstrate empirically that holonomy increases with loop radius and depth, separates models appearing similar under CKA, tracks training dynamics, and correlates with adversarial and corruption robustness across multiple training regimes including ERM, label smoothing, mixup, and adversarial training.
Core Task Comparisons
Comparisons with papers in the same taxonomy category
Contribution Analysis
Detailed comparisons for each claimed contribution
Representation holonomy as a gauge-invariant statistic
The authors propose representation holonomy, a new gauge-invariant measure that quantifies path-dependent changes in learned representations by measuring the accumulated twist when features are parallel-transported around closed loops in input space, revealing hidden curvature beyond pointwise similarity metrics.
[32] Parallel transport in rotating frames and projective holonomic quantum computation
[33] Adjusted parallel transport for higher gauge theories
[34] Gauge theory is about the geometry of internal spaces
[35] Fluctuations, uncertainty relations, and the geometry of quantum state manifolds
[36] A global geometric approach to parallel transport of strings in gauge theory
[37] Nonmetricity theories and aspects of gauge symmetry
[38] Random Surfaces and Higher Algebra
[39] Time Evolution in the external field problem of Quantum Electrodynamics
[40] Quantum Dynamics with the Parallel Transport Gauge
[41] Higher gauge theory and a non-Abelian generalization of 2-form electrodynamics
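The holonomy statistic described in this contribution can be illustrated with a minimal toy computation. The sketch below is illustrative only, not the authors' estimator: it uses a hypothetical nonlinear feature map, builds a local feature frame at each point of a circular loop in input space, transports the frame step by step via the polar (rotation) part of consecutive frame overlaps, and reports the net rotation accumulated around the closed loop. Consistent with the paper's claims, the angle grows with loop radius and vanishes as the radius shrinks.

```python
import numpy as np

def feature_map(x):
    """Toy nonlinear feature map R^2 -> R^3 (a hypothetical stand-in for a layer)."""
    return np.array([x[0], x[1], 0.5 * (x[0] ** 2 + x[1] ** 2)])

def local_basis(center, eps=1e-3):
    """Orthonormal basis of the local feature subspace via a finite-difference Jacobian."""
    cols = [(feature_map(center + eps * e) - feature_map(center - eps * e)) / (2 * eps)
            for e in np.eye(2)]
    Q, _ = np.linalg.qr(np.stack(cols, axis=1))  # 3x2 orthonormal frame
    return Q

def holonomy_angle(radius, n_steps=128):
    """Net rotation after transporting the frame once around a circle of given radius."""
    thetas = np.linspace(0.0, 2 * np.pi, n_steps, endpoint=False)
    bases = [local_basis(radius * np.array([np.cos(t), np.sin(t)])) for t in thetas]
    bases.append(bases[0])  # close the loop with the identical starting frame
    net = np.eye(2)
    for B0, B1 in zip(bases[:-1], bases[1:]):
        U, _, Vt = np.linalg.svd(B0.T @ B1)  # polar part = discrete transport step
        net = (U @ Vt) @ net
    return abs(np.arctan2(net[1, 0], net[0, 0]))  # in-plane rotation angle

# Holonomy grows with loop radius and vanishes at small radii:
print(holonomy_angle(0.1), holonomy_angle(1.0))
```

Because the frame field is single-valued around the closed loop, any sign or gauge choices made by the QR factorization cancel in the composed product, leaving a gauge-invariant angle.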
Practical estimator with theoretical guarantees
The authors develop a computationally practical estimator that fixes gauge through global whitening, aligns neighborhoods using shared subspaces and rotation-only Procrustes, and embeds results back to full feature space. They prove invariance to orthogonal and affine transformations, establish a linear null for affine layers, and show holonomy vanishes at small radii.
[24] Bridging Critical Gaps in Convergent Learning: How Representational Alignment Evolves Across Layers, Training, and Distribution Shifts
[25] Similarity of neural network models: A survey of functional and representational measures
[26] What Representational Similarity Measures Imply about Decodable Information
[27] Deep networks as paths on the manifold of neural representations
[28] The interpretation of generalized procrustes analysis and allied methods
[29] Cue-Invariant Geometric Structure of the Population Codes in Macaque V1 and V2
[30] Neural Network Adaptive Coding Efficiency and Stochastic Representational Geometry
[31] Information Geometry for Landmark Shape Analysis: Unifying Shape Representation and Deformation
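The gauge-fixing and alignment steps in this contribution can be sketched in a few lines. The following is a minimal illustration under stated assumptions, not the authors' implementation: ZCA whitening plays the role of global gauge fixing, and rotation-only Procrustes performs the alignment. The demo at the end checks the claimed affine invariance on synthetic data: any orientation-preserving invertible affine re-parameterization of the features is absorbed by whitening up to a rotation, which Procrustes then recovers.

```python
import numpy as np

def whiten(F, eps=1e-10):
    """Gauge fixing: center and ZCA-whiten so the sample covariance becomes identity."""
    F = F - F.mean(axis=0)
    evals, evecs = np.linalg.eigh(F.T @ F / len(F))
    return F @ (evecs @ np.diag(1.0 / np.sqrt(evals + eps)) @ evecs.T)

def rotation_procrustes(A, B):
    """Rotation-only Procrustes: the rotation R minimizing ||A R - B||_F."""
    U, _, Vt = np.linalg.svd(A.T @ B)
    if np.linalg.det(U @ Vt) < 0:  # exclude reflections
        U[:, -1] *= -1
    return U @ Vt

rng = np.random.default_rng(0)
F = rng.standard_normal((200, 5))
M = rng.standard_normal((5, 5))
if np.linalg.det(M) < 0:            # keep the re-parameterization orientation-preserving
    M[:, 0] *= -1
G = F @ M + rng.standard_normal(5)  # affine re-parameterization of the same features

W1, W2 = whiten(F), whiten(G)
R = rotation_procrustes(W1, W2)
print(np.max(np.abs(W1 @ R - W2)))  # near zero: whitening + rotation absorbs the affine map
```

The residual printed at the end is at machine-precision scale because two whitened copies of the same centered data differ by an exactly orthogonal transform, and here that transform is a proper rotation.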
Empirical validation on vision tasks
The authors demonstrate empirically that holonomy increases with loop radius and depth, separates models appearing similar under CKA, tracks training dynamics, and correlates with adversarial and corruption robustness across multiple training regimes including ERM, label smoothing, mixup, and adversarial training.
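As context for the CKA comparison above, linear CKA (the standard pointwise similarity baseline, in the Kornblith et al. formulation) can be computed in a few lines. The sketch below is a generic reference implementation, not the paper's code. It also shows one reason two models can look near-identical under CKA while differing elsewhere: linear CKA is invariant to any orthogonal change of basis, so it scores a rotated copy of a representation as a perfect match.

```python
import numpy as np

def linear_cka(X, Y):
    """Linear CKA between representation matrices X, Y (n samples x features each)."""
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    num = np.linalg.norm(X.T @ Y, 'fro') ** 2
    return num / (np.linalg.norm(X.T @ X, 'fro') * np.linalg.norm(Y.T @ Y, 'fro'))

rng = np.random.default_rng(1)
X = rng.standard_normal((100, 8))
Q, _ = np.linalg.qr(rng.standard_normal((8, 8)))  # random orthogonal transform
print(linear_cka(X, X @ Q))  # ~1.0: CKA cannot distinguish a rotated copy
```

A loop-based statistic like holonomy is sensitive to exactly this kind of geometric structure that a pointwise, rotation-invariant score collapses.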