Deep Learning for Subspace Regression
Research Landscape Overview
Claimed Contributions
The authors formulate subspace regression as a statistical learning problem in which a parametric model maps parameters to points on the Grassmann manifold, i.e., to linear subspaces of fixed dimension. They demonstrate applications across eigenproblems, reduced-order modeling, iterative methods, and optimal control.
The authors introduce several loss functions that satisfy the invariance requirements of Grassmannian data, including a stochastic least-squares variant whose computational cost scales more favorably as the subspace dimension grows.
The authors propose predicting larger-than-needed subspaces to simplify the learning problem. They provide theoretical justification showing that this embedding reduces the complexity of the parameter-to-subspace mapping for elliptic eigenproblems and makes the mapping smoother on the Grassmann manifold, which aligns with the inductive biases of neural networks.
Core Task Comparisons
Comparisons with papers in the same taxonomy category
[17] Gaussian process subspace prediction for model reduction
[27] Non-intrusive parametric reduced order models with high-dimensional inputs via gradient-free active subspace
[36] Gaussian process subspace regression for model reduction
[47] Non-intrusive model order reduction for parametric radiation transport simulations
[50] Non-intrusive reduced order modeling of parametric electromagnetic scattering problems through Gaussian process regression
Contribution Analysis
Detailed comparisons for each claimed contribution
Mathematical formulation of subspace regression problem
The authors formulate subspace regression as a statistical learning problem in which a parametric model maps parameters to points on the Grassmann manifold, i.e., to linear subspaces of fixed dimension. They demonstrate applications across eigenproblems, reduced-order modeling, iterative methods, and optimal control.
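A minimal sketch of this setup, assuming subspaces are represented by orthonormal basis matrices, an off-the-shelf MLP as the parametric model, and a QR retraction onto the Grassmann manifold (all names, sizes, and architecture choices below are illustrative, not the authors' implementation):

```python
import torch

# Illustrative sizes: ambient dimension n, subspace dimension k, parameter dimension p.
n, k, p = 500, 4, 3

# A plain MLP maps a parameter vector to an n-by-k matrix; QR then retracts the
# output onto orthonormal bases, so the column span is a well-defined point on
# the Grassmann manifold Gr(k, n).
net = torch.nn.Sequential(
    torch.nn.Linear(p, 128),
    torch.nn.Tanh(),
    torch.nn.Linear(128, n * k),
)

def predict_subspace(theta: torch.Tensor) -> torch.Tensor:
    """Return an orthonormal n-by-k basis whose span is the predicted subspace."""
    A = net(theta).reshape(n, k)
    Q, _ = torch.linalg.qr(A)  # reduced QR: Q has orthonormal columns
    return Q

def grassmann_distance(U: torch.Tensor, V: torch.Tensor) -> torch.Tensor:
    """Geodesic distance on Gr(k, n): the 2-norm of the principal angles,
    recovered from the singular values of U^T V."""
    s = torch.linalg.svdvals(U.T @ V).clamp(max=1.0)  # guard rounding above 1
    return torch.linalg.norm(torch.acos(s))
```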
[71] Parametric reduced-order modeling and mode sensitivity of actuated cylinder flow from a matrix manifold perspective
[72] Multi-view spectral clustering on the Grassmannian manifold with hypergraph representation
[73] Grassmannian dimensionality reduction using triplet margin loss for UME classification of 3D point clouds
[74] Grassmannian diffusion maps based dimension reduction and classification for high-dimensional data
[75] Adapting projection-based reduced-order models using projected Gaussian process
[76] Intrinsic Grassmann averages for online linear, robust and nonlinear subspace learning
[77] Interpolation-based parametric reduced-order models via Galerkin projection and dynamic mode decomposition
[78] Joint normalization and dimensionality reduction on Grassmannian: a generalized perspective
[79] Sparse Grassmannian embeddings for hyperspectral data representation and classification
Loss functions for subspace data suitable for neural network training
The authors introduce several loss functions that satisfy the invariance requirements of Grassmannian data, including a stochastic least-squares variant whose computational cost scales more favorably as the subspace dimension grows.
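For concreteness, the sketch below shows one standard basis-invariant loss (the squared projector distance) alongside a stochastic least-squares surrogate that never forms an n-by-n projector; these are assumed stand-ins for the general idea, not necessarily the authors' exact losses:

```python
import torch

def projection_loss(U: torch.Tensor, V: torch.Tensor) -> torch.Tensor:
    """Squared projector (chordal) distance ||UU^T - VV^T||_F^2, invariant to
    the choice of orthonormal basis for either subspace. The identity
    ||UU^T - VV^T||_F^2 = 2k - 2||U^T V||_F^2 keeps the cost at O(n k^2)."""
    k = U.shape[1]
    return 2.0 * k - 2.0 * (U.T @ V).pow(2).sum()

def stochastic_ls_loss(U: torch.Tensor, V: torch.Tensor, num_probes: int = 16) -> torch.Tensor:
    """Least-squares surrogate: average squared mismatch of the projections of
    random probe vectors, without ever forming an n-by-n projector."""
    x = torch.randn(U.shape[0], num_probes)
    diff = U @ (U.T @ x) - V @ (V.T @ x)  # (UU^T - VV^T) x at O(nk) per probe
    return diff.pow(2).sum(dim=0).mean()
```

For standard Gaussian probes, the surrogate is an unbiased estimator of the projector distance, so the two losses agree in expectation while the stochastic version replaces the O(nk^2) inner product with O(nk) work per probe, which is cheaper once the subspace dimension exceeds the probe count.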
[51] Discriminant locality preserving projection on Grassmann manifold for image-set classification
[52] Projection metric learning on Grassmann manifold with application to video-based face recognition
[53] Spatio-temporal tensor analysis on product Grassmann manifolds and its application to action recognition
[54] Grassmann pooling as compact homogeneous bilinear pooling for fine-grained visual classification
[55] Cross-view approximation on Grassmann manifold for multiview clustering
[56] Accurate 3D action recognition using learning on the Grassmann manifold
[57] A Riemannian gossip approach to subspace learning on Grassmann manifold
[58] Domain adaptation as optimal transport on Grassmann manifolds
[59] Microsecond federated SVD on Grassmann manifold for real-time IoT intrusion detection
[60] Grassmann neighborhood preserving autoencoder for image set classification
Subspace embedding technique with theoretical justification
The authors propose predicting larger-than-needed subspaces to simplify the learning problem. They provide theoretical justification showing that this embedding reduces the complexity of the parameter-to-subspace mapping for elliptic eigenproblems and makes the mapping smoother on the Grassmann manifold, which aligns with the inductive biases of neural networks.
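One way to make the larger-than-needed prediction concrete is a containment objective: the model predicts an l-dimensional subspace with l > k, and training only requires that the true k-dimensional subspace lie inside it. The following is a hypothetical loss in that spirit, not the authors' exact formulation:

```python
import torch

def containment_loss(Q_pred: torch.Tensor, V_true: torch.Tensor) -> torch.Tensor:
    """Penalize the part of the true basis that falls outside the predicted span.
    Q_pred: n-by-l orthonormal (l > k); V_true: n-by-k orthonormal. The loss is
    ||(I - Q Q^T) V||_F^2, which is zero iff span(V_true) lies in span(Q_pred)."""
    residual = V_true - Q_pred @ (Q_pred.T @ V_true)
    return residual.pow(2).sum()
```

Since every l-dimensional superspace of the target attains zero loss, the set of acceptable predictions is enlarged, which gives some intuition for why the resulting mapping can be smoother and easier for a network to learn.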