Learning a distance measure from the information-estimation geometry of data
Research Landscape Overview
Claimed Contributions
The authors propose a new distance function that is induced by the geometry of a probability density. The information-estimation metric (IEM) compares score vector fields of a blurred density around two signals over a range of noise amplitudes, adapting both locally and globally to the distribution's geometry.
The authors derive a second-order expansion of the IEM that yields a Riemannian metric. This local metric is most sensitive in regions of high log-density curvature and to perturbations that induce large changes in signal probability, behaving like a locally adaptive Mahalanobis distance.
The authors introduce a generalized version of the IEM that incorporates a scalar function f to measure deviations of the log-probability ratio process from zero. This generalization allows the distance to adapt to different types of data by selecting an appropriate function f.
Core Task Comparisons
Comparisons with papers in the same taxonomy category
Contribution Analysis
Detailed comparisons for each claimed contribution
Information-Estimation Metric (IEM)
The authors propose a new distance function that is induced by the geometry of a probability density. The IEM compares score vector fields of a blurred density around two signals over a range of noise amplitudes, adapting both locally and globally to the distribution's geometry.
[70] Convergence of score-based generative modeling for general data distributions PDF
[71] Learning gradient fields for shape generation PDF
[72] Identity-Preserving-yet-Diversified Diffusion Models for Synthetic Face Recognition PDF
[73] Fighting uncertainty with gradients: Offline reinforcement learning via diffusion score matching PDF
[74] Probability density prediction of wind farm power generation: Benchmarking natural gradient boosting approach using ensemble weather forecast PDF
[75] Estimation of non-normalized statistical models by score matching PDF
[76] Deep learning approaches for imaging inverse problems with structured noise PDF
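The IEM idea described in this section — comparing score vector fields of a blurred density around two signals across noise amplitudes — can be sketched numerically. This is a minimal illustration, not the authors' exact definition: the standard-normal toy density, the uniform averaging over noise levels, and the `score` function are all assumptions made for the example.

```python
import numpy as np

# Toy score function of the Gaussian-blurred density:
# score(x, sigma) ~ grad_x log p_sigma(x), where p_sigma = p * N(0, sigma^2 I).
# For a standard-normal base density this is available in closed form.
def score(x, sigma):
    return -x / (1.0 + sigma**2)

def iem_sketch(x, y, sigmas, n_samples=256, rng=None):
    """Monte-Carlo sketch of an IEM-style distance: accumulate squared
    differences of the blurred-density score fields around x and y
    over a range of noise amplitudes."""
    rng = np.random.default_rng(rng)
    total = 0.0
    for sigma in sigmas:
        eps = rng.standard_normal((n_samples, x.size))
        # score fields around x and y, compared on a shared noise realization
        dx = score(x + sigma * eps, sigma) - score(y + sigma * eps, sigma)
        total += np.mean(np.sum(dx**2, axis=1))
    return np.sqrt(total / len(sigmas))

x = np.array([0.0, 0.0])
y = np.array([1.0, 0.0])
sigmas = np.geomspace(0.1, 3.0, 8)
print(iem_sketch(x, y, sigmas, rng=0))
```

With a learned score network in place of the closed-form `score`, the same loop adapts the distance to the data distribution's geometry.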
Closed-form local Riemannian metric
The authors derive a second-order expansion of the IEM that yields a Riemannian metric. This local metric is most sensitive in regions of high log-density curvature and to perturbations that induce large changes in signal probability, behaving like a locally adaptive Mahalanobis distance.
[60] Wasserstein Riemannian geometry of Gaussian densities PDF
[61] The model of the local Universe in the framework of the second-order perturbation theory PDF
[62] Symplectic Stiefel manifold: tractable metrics, second-order geometry and Newton's methods PDF
[63] Estimating riemannian metric with noise-contaminated intrinsic distance PDF
[64] Generalized second approximation Matsumoto metric PDF
[65] Metric Learning Encoding Models: A Multivariate Framework for Interpreting Neural Representations PDF
[66] Riemannian optimization on the symplectic Stiefel manifold using second-order information PDF
[67] Radial basis approximation of tensor fields on manifolds: From operator estimation to manifold learning PDF
[68] A second-order in time, BGN-based parametric finite element method for geometric flows of curves PDF
[69] Post-Newtonian approximation up to second order to the Rastall equations PDF
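The locally adaptive Mahalanobis behavior described in this section can be illustrated with a toy quadratic form. This is a sketch, not the paper's derived metric: the metric tensor M(x) = J(x)ᵀJ(x) built from a finite-difference Jacobian of a score field, and the anisotropic-Gaussian toy density, are assumptions chosen so that directions of rapid log-density change are weighted heavily.

```python
import numpy as np

def score(x, sigma=0.5):
    # Closed-form blurred score for an anisotropic Gaussian toy density.
    cov = np.diag([1.0, 0.1]) + sigma**2 * np.eye(2)
    return -np.linalg.solve(cov, x)

def score_jacobian(x, h=1e-4):
    """Central finite-difference Jacobian of the score field at x
    (related to the curvature of the blurred log-density)."""
    d = x.size
    J = np.zeros((d, d))
    for i in range(d):
        e = np.zeros(d)
        e[i] = h
        J[:, i] = (score(x + e) - score(x - e)) / (2 * h)
    return J

def local_metric_distance(x, delta):
    """Mahalanobis-type quadratic form sqrt(delta^T M(x) delta), M = J^T J."""
    J = score_jacobian(x)
    M = J.T @ J
    return float(np.sqrt(delta @ M @ delta))

x = np.zeros(2)
# A step along the low-variance axis changes signal probability more, so it
# should incur a larger local distance than an equal-size step along the
# high-variance axis.
d_wide = local_metric_distance(x, np.array([0.1, 0.0]))
d_narrow = local_metric_distance(x, np.array([0.0, 0.1]))
print(d_wide, d_narrow)
```

The asymmetry between the two perturbations shows the "locally adaptive Mahalanobis" behavior: the same Euclidean step size is penalized more along directions of high log-density curvature.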
Generalized Information-Estimation Metric
The authors introduce a generalized version of the IEM that incorporates a scalar function f to measure deviations of the log-probability ratio process from zero. This generalization allows the distance to adapt to different types of data by selecting an appropriate function f.
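The role of the scalar function f can be sketched as follows. This is an illustration under assumptions, not the authors' exact construction: the standard-normal toy density, the particular stand-in for the log-probability-ratio process, and the averaging over noise levels are all choices made for the example.

```python
import numpy as np

def log_p_sigma(z, sigma):
    # Log-density of N(0, I) blurred with N(0, sigma^2 I), up to a constant.
    return -np.sum(z**2, axis=-1) / (2.0 * (1.0 + sigma**2))

def generalized_iem(x, y, f, sigmas, n_samples=512, rng=None):
    """Sketch of a generalized IEM: a scalar function f penalizes deviations
    of a log-probability-ratio process from zero, averaged over noise levels."""
    rng = np.random.default_rng(rng)
    total = 0.0
    for sigma in sigmas:
        eps = rng.standard_normal((n_samples, x.size))
        # Log-probability ratio between the blurred density around x and
        # around y, evaluated on a shared noise realization.
        r = log_p_sigma(x + sigma * eps, sigma) - log_p_sigma(y + sigma * eps, sigma)
        total += np.mean(f(r))
    return total / len(sigmas)

x, y = np.zeros(2), np.array([1.0, 0.0])
sigmas = np.geomspace(0.1, 3.0, 8)
d_quad = generalized_iem(x, y, lambda r: r**2, sigmas, rng=0)  # quadratic f
d_robust = generalized_iem(x, y, np.abs, sigmas, rng=0)        # robust f
print(d_quad, d_robust)
```

Swapping f (quadratic, absolute value, Huber, ...) changes how strongly large log-probability deviations are penalized, which is how the generalized distance can be tuned to different types of data.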