A Block Coordinate Descent Method for Nonsmooth Composite Optimization under Orthogonality Constraints
Overview
Overall Novelty Assessment
The paper proposes OBCD, a block coordinate descent algorithm for nonsmooth composite optimization under orthogonality constraints, updating k rows per iteration by solving small subproblems. According to the taxonomy, it resides in the 'Block Coordinate and Specialized Descent Methods' leaf, which contains only three papers total. This places the work in a relatively sparse research direction compared to the more crowded 'ADMM-Based Approaches' (four papers) and 'Proximal Gradient Methods' (two papers) within the broader 'Proximal and Splitting Methods' branch, suggesting the coordinate descent angle is less explored in this field.
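Concretely, the problem class in question is usually written in composite form (a standard formulation for this literature; the paper's exact notation and assumptions may differ):

```latex
\min_{X \in \mathbb{R}^{n \times r}} \; F(X) \triangleq f(X) + h(X)
\qquad \text{s.t.} \qquad X^{\top} X = I_r,
```

where $f$ is smooth and $h$ is a possibly nonsmooth term (e.g., an $\ell_1$ penalty promoting sparsity). OBCD's distinguishing choice is to update only $k$ rows of $X$ per iteration while keeping the constraint $X^{\top} X = I_r$ satisfied exactly.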
The taxonomy tree shows that neighboring leaves include ADMM-based and proximal gradient methods under the 'Proximal and Splitting Methods' branch, as well as smoothing techniques in a sibling branch. OBCD diverges from ADMM frameworks by avoiding alternating direction splitting and instead partitions variables row-wise, maintaining feasibility throughout. It also differs from smoothing approaches, which replace nonsmooth terms with differentiable surrogates, by directly handling nonsmoothness in each subproblem. The scope notes clarify that coordinate descent methods are excluded from proximal and ADMM categories, reinforcing OBCD's distinct methodological niche within the taxonomy structure.
Among the three contributions analyzed, the first two (the OBCD algorithm and block-k stationarity) yield no refutable candidates among the ten papers examined for each, suggesting these aspects are relatively novel within the limited search scope. The third contribution, convergence analysis with iteration complexity and non-ergodic rates, yields five refutable pairs among its ten candidates, indicating more substantial prior work on convergence guarantees in this setting. This pattern suggests that the algorithmic framework and the stationarity concept are largely unanticipated by the thirty candidates examined, while the convergence analysis builds on established techniques.
Based on the top-30 semantic matches and taxonomy structure, the work appears to introduce a distinct algorithmic approach in a sparse research direction, with the core method and stationarity concept showing limited overlap. The convergence analysis, however, aligns with existing theoretical frameworks. The limited search scope means these observations reflect the examined candidates rather than an exhaustive field survey, and the sparse taxonomy leaf suggests fewer direct competitors in the coordinate descent paradigm for this problem class.
Taxonomy
Research Landscape Overview
Claimed Contributions
The authors introduce OBCD, a block coordinate descent algorithm that updates k rows of the solution matrix per iteration by solving a small nonsmooth optimization problem under orthogonality constraints. The method maintains feasibility and has low per-iteration computational cost.
The authors define a new optimality condition called block-k stationary points and prove that these points provide stronger optimality guarantees than standard critical points used in existing methods for optimization on the Stiefel manifold.
The authors establish that OBCD achieves ε-block-k stationary points with O(1/ε) iteration complexity and derive non-ergodic (last-iterate) convergence rates under the Kurdyka-Lojasiewicz inequality, including finite-step, linear, and sublinear convergence depending on the KL exponent.
Core Task Comparisons
Comparisons with papers in the same taxonomy category
[1] ADMM for Nonsmooth Composite Optimization under Orthogonality Constraints
[7] A Riemannian alternating direction method of multipliers
[25] Nonlinear Eigen-approach ADMM for Sparse Optimization on Stiefel Manifold
Contribution Analysis
Detailed comparisons for each claimed contribution
OBCD algorithm for nonsmooth composite optimization under orthogonality constraints
The authors introduce OBCD, a block coordinate descent algorithm that updates k rows of the solution matrix per iteration by solving a small nonsmooth optimization problem under orthogonality constraints. The method maintains feasibility and has low per-iteration computational cost.
[16] Orthogonal Constrained Minimization with Tensor Regularization for HSI Denoising and Destriping
[31] An Augmented Lagrangian Method for ℓ1-Regularized Optimization Problems with Orthogonality Constraints
[32] Coordinate projected gradient descent minimization and its application to orthogonal nonnegative matrix factorization
[33] A coordinate gradient descent method for ℓ1-regularized convex minimization
[34] Exploiting Sensing Signal in ISAC: A NOMA Inspired Scheme
[35] Large scale composite optimization problems with coupled objective functions: theory, algorithms and applications
[36] Block-coordinate gradient descent method for linearly constrained nonsmooth separable optimization
[37] Provably Convergent Plug-and-play Proximal Block Coordinate Descent Method for Hyperspectral Anomaly Detection
[38] Orthogonal Constrained Minimization with Tensor ℓ2,p Regularization for HSI Denoising and Destriping
[39] A Block Coordinate Descent-based Projected Gradient Algorithm for Orthogonal Non-negative Matrix Factorization
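The row-block update described in this contribution admits a compact illustration. The sketch below (hypothetical names and setup, not the authors' code) shows why mixing k rows of a feasible X by a k x k orthogonal matrix V preserves X^T X = I exactly, which is the mechanism that keeps every iterate on the constraint set; choosing V is where the small nonsmooth subproblem enters.

```python
import numpy as np

def random_stiefel(n, r, rng):
    # Draw a random n x r matrix with orthonormal columns via QR.
    Q, _ = np.linalg.qr(rng.standard_normal((n, r)))
    return Q

def row_block_update(X, B, V):
    """Mix the rows of X indexed by B with a k x k orthogonal matrix V.

    Feasibility is preserved: X^T X splits as
    X_B^T X_B + X_{B^c}^T X_{B^c}, and (V X_B)^T (V X_B) = X_B^T X_B
    whenever V^T V = I_k, so X_new^T X_new = X^T X = I_r.
    """
    X_new = X.copy()
    X_new[B, :] = V @ X[B, :]
    return X_new

rng = np.random.default_rng(0)
n, r, k = 8, 3, 2
X = random_stiefel(n, r, rng)

# Pick a random row block; an actual OBCD iteration would instead choose V
# by (approximately) solving the small k x k nonsmooth subproblem so that
# the objective decreases, rather than drawing V at random.
B = rng.choice(n, size=k, replace=False)
V, _ = np.linalg.qr(rng.standard_normal((k, k)))

X_new = row_block_update(X, B, V)
print(np.allclose(X_new.T @ X_new, np.eye(r)))  # True: iterate stays feasible
```

The per-iteration cost is dominated by the k x k subproblem and a k x r row update, which is what makes the method cheap when k is much smaller than n.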
Block-k stationary points with stronger optimality guarantees
The authors define a new optimality condition called block-k stationary points and prove that these points provide stronger optimality guarantees than standard critical points used in existing methods for optimization on the Stiefel manifold.
[2] A constraint dissolving approach for nonsmooth optimization over the Stiefel manifold
[5] Smoothing algorithms for nonsmooth optimization over the Stiefel manifold with applications to the graph Fourier basis problem
[6] Proximal gradient method for nonsmooth optimization over the Stiefel manifold
[17] A Penalty-Free Infeasible Approach for a Class of Nonsmooth Optimization Problems Over the Stiefel Manifold
[40] Riemannian stochastic proximal gradient methods for nonsmooth optimization over the Stiefel manifold
[41] A Riemannian conjugate gradient method for optimization on the Stiefel manifold
[42] Exact Penalty Function for ℓ2,1 Norm Minimization over the Stiefel Manifold
[43] A semidefinite relaxation for sums of heterogeneous quadratics on the Stiefel manifold
[44] Proximal point algorithm on the Stiefel manifold
[45] An adaptive regularized proximal Newton-type method for composite optimization over the Stiefel manifold
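The comparison above concerns the optimality concept itself. Under one natural reading of the description (an assumption; the paper's precise definition may differ), a feasible point $\bar X$ is block-$k$ stationary when no single $k$-row update can improve the objective:

```latex
\bar X \ \text{is block-}k\ \text{stationary}
\quad \Longleftrightarrow \quad
I_k \in \operatorname*{arg\,min}_{V \in \mathcal{O}(k)}
F\bigl(U_B(V)\,\bar X\bigr)
\quad \text{for every } B \subseteq \{1,\dots,n\},\ |B| = k,
```

where $U_B(V)$ applies $V$ to the rows indexed by $B$ and leaves the remaining rows unchanged, and $\mathcal{O}(k)$ denotes the $k \times k$ orthogonal matrices. Because the condition requires global optimality of every small subproblem, it is at least as restrictive as first-order criticality, consistent with the claim of stronger guarantees.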
Convergence analysis with iteration complexity and non-ergodic rates
The authors establish that OBCD achieves ε-block-k stationary points with O(1/ε) iteration complexity and derive non-ergodic (last-iterate) convergence rates under the Kurdyka-Lojasiewicz inequality, including finite-step, linear, and sublinear convergence depending on the KL exponent.
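The trichotomy of rates described here follows the standard Kurdyka-Lojasiewicz template (a generic sketch; the paper's exact constants and exponent ranges may differ). Writing $\theta \in [0,1)$ for the KL exponent at a limit point $\bar X$, last-iterate behavior in such analyses is typically:

```latex
\|X^t - \bar X\| \;\le\;
\begin{cases}
0 \ \text{for all sufficiently large } t, & \theta = 0 \quad (\text{finite-step}),\\[2pt]
c\, q^{t}, \ q \in (0,1), & \theta \in \bigl(0, \tfrac{1}{2}\bigr] \quad (\text{linear}),\\[2pt]
c\, t^{-(1-\theta)/(2\theta-1)}, & \theta \in \bigl(\tfrac{1}{2}, 1\bigr) \quad (\text{sublinear}),
\end{cases}
```

with constants $c > 0$ and $q$ depending on the objective. "Non-ergodic" means these bounds hold for the last iterate $X^t$ itself rather than for an averaged sequence of iterates.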