Symmetry-Aware Bayesian Optimization via Max Kernels

ICLR 2026 Conference Submission, Anonymous Authors
Keywords: Bayesian Optimization, Invariance
Abstract:

Bayesian Optimization (BO) is a powerful framework for optimizing noisy, expensive-to-evaluate black-box functions. When the objective exhibits invariances under a group action, exploiting these symmetries can substantially improve BO efficiency. While using maximum similarity across group orbits has long been considered in other domains, the fact that the max kernel is not positive semidefinite (PSD) has prevented its use in BO. In this work, we revisit this idea by considering a PSD projection of the max kernel. Compared to existing invariant (and non-invariant) kernels, we show it achieves significantly lower regret on both synthetic and real-world BO benchmarks, without increasing computational complexity.

Disclaimer
This report is AI-generated using large language models and WisPaper (a scholar search engine). It analyzes academic papers' tasks and contributions against retrieved prior work. While this system identifies potential overlaps and novel directions, its coverage is not exhaustive and its judgments are approximate. These results are intended to assist human reviewers and should not be relied upon as a definitive verdict on novelty.
Note that some papers exist in multiple, slightly different versions (e.g., with different titles or URLs). The system may retrieve several versions of the same underlying work. The current automated pipeline does not reliably align or distinguish these cases, so human reviewers will need to disambiguate them manually.
If you have any questions, please contact: mingzhang23@m.fudan.edu.cn

Overview

Overall Novelty Assessment

The paper proposes a positive semidefinite projection of the max kernel to exploit group symmetries in Bayesian optimization. It resides in the 'Invariant Kernel Construction' leaf under 'Symmetry-Aware Kernel Design and Theory', which contains only two papers total (including this one). This places the work in a relatively sparse research direction within a taxonomy of nine papers across seven leaf nodes. The sibling paper focuses on leveraging known invariances through explicit group averaging, suggesting that kernel construction methods for symmetry-aware BO remain an emerging area with limited prior exploration.

The taxonomy reveals that symmetry-aware kernel design sits alongside geometric manifold optimization (Riemannian kernels for curved spaces) and structured discrete optimization (tree ensemble methods). The original paper's approach differs from neighboring geometric methods by targeting Euclidean spaces with group invariances rather than non-Euclidean manifolds. It also diverges from set-valued optimization techniques, which handle unordered collections rather than orbit-based symmetries. The scope notes clarify that manifold-specific kernels and application-specific implementations belong elsewhere, positioning this work as a foundational kernel design contribution rather than a domain-specific extension.

Among four candidates examined across three contributions, none were found to clearly refute the proposed methods. The PSD projection of the max kernel examined one candidate with no refutable overlap. The empirical performance analysis examined three candidates, again with no clear prior work providing the same insights. The demonstration of gains over orbit averaging examined zero candidates. Given the limited search scope—only four papers reviewed—these statistics suggest the specific combination of max kernel projection and PSD constraints has not been extensively studied, though the small candidate pool prevents strong conclusions about absolute novelty.

Based on the limited literature search of four candidates, the work appears to occupy a sparsely populated research direction within symmetry-aware Bayesian optimization. The taxonomy structure and sibling paper count reinforce this impression, though the restricted search scope means potentially relevant work outside the top semantic matches may exist. The analysis covers kernel construction methods but does not exhaustively survey all symmetry-handling techniques in optimization or related fields.

Taxonomy

Core-task Taxonomy Papers: 9
Claimed Contributions: 3
Contribution Candidate Papers Compared: 4
Refutable Papers: 0

Research Landscape Overview

Core task: Bayesian optimization with symmetry-aware kernels for invariant black-box functions. The field addresses how to efficiently optimize expensive-to-evaluate functions that exhibit known or suspected symmetries, such as invariance under rotations, permutations, or other group actions.

The taxonomy organizes research into several main branches. Symmetry-Aware Kernel Design and Theory focuses on constructing kernels that respect invariances, often by averaging over group orbits or embedding symmetry constraints directly into covariance structures (e.g., Symmetry Max Kernels[0], Known Invariances[1]). Geometric and Manifold Optimization extends these ideas to curved spaces, where standard Euclidean kernels fail and Riemannian or extrinsic embeddings become necessary (Extrinsic Manifolds[7], Riemannian Matern[9]). Structured and Discrete Spaces tackles combinatorial or tree-structured domains (Tree Ensemble Kernels[3]), while Set-Valued Optimization handles inputs that are unordered collections (Approximate Set Kernels[5]). Application Domains illustrate how these methods deploy in chemistry, materials science, and reinforcement learning (AUGUR Adsorption[2], Reward Functions IRL[4], Structured Environments[6]).

A particularly active line of work explores how to encode known symmetries into Gaussian process priors without prohibitive computational overhead, balancing exact invariance guarantees against scalability. Symmetry Max Kernels[0] sits squarely within the Invariant Kernel Construction cluster, proposing a max-pooling strategy over group orbits to achieve invariance while maintaining tractable inference. This contrasts with Known Invariances[1], which emphasizes leveraging user-specified symmetry groups to construct invariant kernels through explicit averaging, and with broader geometric approaches like Extrinsic Manifolds[7] that embed manifold constraints into kernel design. The original paper's emphasis on max-based aggregation offers a middle ground between exact orbit averaging and approximate symmetry handling, positioning it as a methodological contribution to the kernel design branch rather than a domain-specific application or a purely geometric extension.

Claimed Contributions

PSD projection of max kernel for symmetry-aware Bayesian Optimization

The authors propose a positive semidefinite (PSD) version of the max-alignment kernel (kmax) for Bayesian Optimization. They construct k(D)+ via PSD projection and Nyström extension, ensuring it is G-invariant, equals kmax on the design set when kmax is PSD, and matches the asymptotic cost of orbit-averaged kernels.

1 retrieved paper
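The construction described above can be made concrete with a minimal sketch. This is not the paper's exact algorithm; it assumes an RBF base kernel, represents the group as a list of callable transformations, clips negative eigenvalues of the k_max Gram matrix on the design set (the PSD projection), and extends off the design set with a Nyström-style formula through the positive spectral part. All function names here are illustrative.

```python
import numpy as np

def rbf(x, y, ls=1.0):
    """Squared-exponential base kernel (illustrative choice)."""
    d = np.asarray(x) - np.asarray(y)
    return np.exp(-0.5 * np.dot(d, d) / ls**2)

def k_max(x, y, group):
    """Max-alignment kernel: best base-kernel similarity over the orbit of y."""
    return max(rbf(x, g(y)) for g in group)

def psd_nystrom_kernel(design, group):
    """Sketch of a PSD, G-invariant kernel built from k_max on a design set.

    The k_max Gram matrix on `design` is projected onto the PSD cone by
    clipping negative eigenvalues, then extended off the design set with a
    Nystrom-style formula. On the design set the result reproduces k_max
    whenever the k_max Gram matrix is already PSD.
    """
    K = np.array([[k_max(a, b, group) for b in design] for a in design])
    K = 0.5 * (K + K.T)                       # symmetrize against round-off
    lam, V = np.linalg.eigh(K)
    pos = lam > 1e-10                         # keep the positive spectral part
    lam_pos, V_pos = lam[pos], V[:, pos]

    def k_plus(x, y):
        kx = np.array([k_max(x, d, group) for d in design])
        ky = np.array([k_max(y, d, group) for d in design])
        # Nystrom extension through the positive eigenpairs only
        return (kx @ V_pos) @ np.diag(1.0 / lam_pos) @ (V_pos.T @ ky)

    return k_plus
```

Because each entry of the cross-covariance vector is itself a k_max evaluation, the extended kernel inherits G-invariance from k_max, and the per-query cost stays at one orbit maximization per design point, the same asymptotic order as orbit averaging.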
Demonstration of consistent BO performance gains over orbit averaging

The authors empirically demonstrate that their proposed kernel k(D)+ consistently achieves lower cumulative and simple regret compared to both the base kernel and the orbit-averaged alternative (kavg) across multiple synthetic and real-world benchmarks, with gains increasing as the group size grows.

0 retrieved papers
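The two regret metrics referenced above can be computed from an optimization trace as follows. This is a standard textbook definition under a maximization convention, not code from the paper:

```python
import numpy as np

def regret_curves(f_values, f_opt):
    """Cumulative and simple regret from a sequence of observed objective values.

    Instantaneous regret at step t is f_opt - f(x_t); cumulative regret sums
    these gaps, while simple regret is the gap to the best point found so far.
    """
    f_values = np.asarray(f_values, dtype=float)
    inst = f_opt - f_values                        # per-step gap to the optimum
    cumulative = np.cumsum(inst)                   # running sum of gaps
    simple = f_opt - np.maximum.accumulate(f_values)  # gap of the incumbent
    return cumulative, simple
```

Simple regret is non-increasing by construction, so "lower simple regret" compares how quickly each kernel's incumbent approaches the optimum.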
Analysis revealing mismatch between eigendecay and empirical performance

The authors analyze the spectral properties of their kernel and show that despite kavg often exhibiting faster empirical eigendecay than k(D)+, the latter consistently achieves better regret. This reveals a gap between standard spectral-based BO theory and empirical performance, suggesting that geometric considerations and approximation hardness play essential roles beyond pure spectral rates.

3 retrieved papers
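The empirical eigendecay referred to above is typically measured as the sorted spectrum of a kernel's Gram matrix on sampled points; spectral BO regret bounds are driven by how fast this spectrum decays. A minimal sketch, with an illustrative RBF kernel standing in for kavg or k(D)+:

```python
import numpy as np

def empirical_eigendecay(kernel, points):
    """Descending eigenvalues of the kernel's Gram matrix on `points`.

    The decay rate of this spectrum is the quantity that standard
    spectral-based BO theory ties to regret rates.
    """
    K = np.array([[kernel(a, b) for b in points] for a in points])
    K = 0.5 * (K + K.T)                 # symmetrize against round-off
    lam = np.linalg.eigvalsh(K)[::-1]   # eigvalsh returns ascending order
    return np.clip(lam, 0.0, None)      # discard numerically negative values
```

Plotting these spectra for kavg and k(D)+ on the same sample is the comparison behind the reported mismatch: the kernel with the faster decay is not the one with the lower regret.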

Core Task Comparisons

Comparisons with papers in the same taxonomy category

Contribution Analysis

Detailed comparisons for each claimed contribution

Contribution

PSD projection of max kernel for symmetry-aware Bayesian Optimization

The authors propose a positive semidefinite (PSD) version of the max-alignment kernel (kmax) for Bayesian Optimization. They construct k(D)+ via PSD projection and Nyström extension, ensuring it is G-invariant, equals kmax on the design set when kmax is PSD, and matches the asymptotic cost of orbit-averaged kernels.

Contribution

Demonstration of consistent BO performance gains over orbit averaging

The authors empirically demonstrate that their proposed kernel k(D)+ consistently achieves lower cumulative and simple regret compared to both the base kernel and the orbit-averaged alternative (kavg) across multiple synthetic and real-world benchmarks, with gains increasing as the group size grows.

Contribution

Analysis revealing mismatch between eigendecay and empirical performance

The authors analyze the spectral properties of their kernel and show that despite kavg often exhibiting faster empirical eigendecay than k(D)+, the latter consistently achieves better regret. This reveals a gap between standard spectral-based BO theory and empirical performance, suggesting that geometric considerations and approximation hardness play essential roles beyond pure spectral rates.