Parameterized Hardness of Zonotope Containment and Neural Network Verification
Overview
Overall Novelty Assessment
The paper establishes W[1]-hardness results for positivity and surjectivity of 2-layer ReLU networks, zonotope containment, and Lipschitz constant computation. It resides in the 'Parameterized Complexity of Network Properties' leaf, which contains only two papers total. This leaf sits within the broader 'Computational Complexity and Hardness Results' branch, a relatively sparse area compared to the representation-methods and reachability branches that dominate the taxonomy with over ten papers. The work thus occupies a niche focused on fundamental intractability rather than algorithmic development.
The taxonomy reveals that most neighboring research concentrates on set-based verification methods—zonotope variants, hybrid zonotopes, and polynomial zonotopes—designed to make verification tractable in practice. The sibling paper in the same leaf examines ReLU injectivity complexity, while the adjacent 'Geometric Containment and Intersection Complexity' leaf addresses zonotope containment from a geometric perspective. The original paper bridges these areas by proving that zonotope containment inherits hardness from network verification, connecting theoretical limits across both domains and contrasting sharply with the algorithmic optimism in neighboring branches.
Among the 21 candidates examined, two contributions show potential prior overlap. The zonotope containment hardness claim (10 candidates examined, 1 refutable) and the Lipschitz constant hardness results (10 candidates examined, 1 refutable) each face one candidate that may constitute overlapping prior work. The positivity/surjectivity hardness for 2-layer networks (1 candidate examined, 0 refutable) appears more novel within this limited search scope. These statistics suggest that while some hardness results may have precedent, the specific parameterized-complexity angle and the connection to zonotope containment represent less-explored territory.
Given the limited search scope of 21 candidates from top-K semantic matches, the analysis captures immediate neighbors but cannot claim exhaustive coverage. The sparse population of the complexity-hardness branch and the paper's bridging role between network verification and geometric containment suggest meaningful contributions, though the refutable pairs indicate that certain hardness results may build incrementally on known lower bounds. A broader literature search would clarify whether the parameterized perspective and the zonotope connection constitute substantial novelty or refinements of established themes.
Taxonomy
Research Landscape Overview
Claimed Contributions
The authors establish that determining whether a 2-layer ReLU network attains a positive value at some input (positivity), or whether it is surjective, is W[1]-hard with respect to the input dimension d, resolving an open problem from COLT '25. This result implies that simple enumeration algorithms are essentially optimal under the Exponential Time Hypothesis.
The authors prove that deciding whether one zonotope is contained in another is W[1]-hard when parameterized by dimension d. This follows from the duality between 2-layer ReLU networks and zonotopes, and has implications for computational geometry and robotics applications.
The authors establish W[1]-hardness (parameterized by input dimension d) for computing and approximating Lipschitz constants in ReLU networks. They extend previous NP-hardness results to all p-norms and prove that approximation within any multiplicative factor remains hard, showing that enumeration-based methods are essentially optimal.
Core Task Comparisons
Comparisons with papers in the same taxonomy category
[10] Complexity of Injectivity and Verification of ReLU Neural Networks
Contribution Analysis
Detailed comparisons for each claimed contribution
W[1]-hardness of 2-layer ReLU network positivity and surjectivity
The authors establish that determining whether a 2-layer ReLU network attains a positive value at some input (positivity), or whether it is surjective, is W[1]-hard with respect to the input dimension d, resolving an open problem from COLT '25. This result implies that simple enumeration algorithms are essentially optimal under the Exponential Time Hypothesis.
[16] Training Fully Connected Neural Networks is ∃ℝ-Complete
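The enumeration baseline that this hardness result calibrates against can be made concrete. Below is a hedged sketch (the function name and the use of `scipy.optimize.linprog` are mine, not the paper's): f(x) = v · relu(Wx + b) is linear on each of the 2^n activation-pattern regions, so positivity — does f attain a positive value anywhere? — reduces to one LP per pattern. The W[1]-hardness in the input dimension d indicates that no fixed-parameter algorithm can fundamentally improve on this kind of region enumeration unless FPT = W[1].

```python
# Illustrative brute-force positivity check for a 2-layer ReLU network
# f(x) = v . relu(Wx + b). A hedged sketch, not the paper's algorithm:
# this variant enumerates all 2^n activation patterns.
from itertools import product

import numpy as np
from scipy.optimize import linprog

def is_positive_somewhere(W, b, v):
    """Return True iff f(x) = v . relu(Wx + b) > 0 for some x."""
    n, d = W.shape
    for pattern in product([0, 1], repeat=n):
        s = np.array(pattern, dtype=float)
        # Region constraints (A_ub @ x <= b_ub for linprog):
        #   active   (s_i = 1): w_i.x + b_i >= 0  ->  -w_i.x <= b_i
        #   inactive (s_i = 0): w_i.x + b_i <= 0  ->   w_i.x <= -b_i
        signs = np.where(s == 1, -1.0, 1.0)
        A_ub = signs[:, None] * W
        b_ub = -signs * b
        # On this region f(x) = (v*s).(Wx + b); maximize f = minimize -f.
        c = -(v * s) @ W
        const = float((v * s) @ b)
        res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                      bounds=[(None, None)] * d, method="highs")
        if res.status == 3:          # objective unbounded above on the region
            return True
        if res.status == 0 and -res.fun + const > 1e-9:
            return True
    return False
```

Infeasible patterns (status 2) are skipped implicitly; only feasible regions where the affine piece exceeds zero trigger a positive answer.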
W[1]-hardness of zonotope containment
The authors prove that deciding whether one zonotope is contained in another is W[1]-hard when parameterized by dimension d. This follows from the duality between 2-layer ReLU networks and zonotopes, and has implications for computational geometry and robotics applications.
[9] On the difficulty of intersection checking with polynomial zonotopes
[3] Provably-Safe Neural Network Training Using Hybrid Zonotope Reachability Analysis
[10] Complexity of Injectivity and Verification of ReLU Neural Networks
[27] Open Problem: Fixed-Parameter Tractability of Zonotope Problems
[28] Zonotope-Based Elastic Tube Model Predictive Control
[29] Combinatorial Optimization
[30] Robust explicit model predictive control for hybrid linear systems with parameter uncertainties
[31] On the co-NP-completeness of the zonotope containment problem
[32] Resilient set-based state estimation for linear time-invariant systems using zonotopes
[33] Efficient computation of reachable sets of linear time-invariant systems with inputs
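To ground the problem the hardness result addresses, here is a hedged sketch (function names `support` and `contained_2d` are mine) of an exact containment test in the easy fixed-dimension case d = 2. For a zonotope Z = {c + Gt : ||t||_∞ ≤ 1}, the support function is h_Z(a) = a·c + Σ_i |a·g_i|, and in the plane the facet normals of a full-dimensional zonotope are just the perpendiculars of its generators, so containment reduces to finitely many support comparisons. The paper's W[1]-hardness concerns what happens as d grows.

```python
# Hedged sketch: exact containment test for planar zonotopes
# Z = {c + G t : ||t||_inf <= 1}, with Z2 full-dimensional.
import numpy as np

def support(c, G, a):
    """Support function h_Z(a) = a.c + sum_i |a.g_i| of a zonotope."""
    return float(a @ c + np.abs(a @ G).sum())

def contained_2d(c1, G1, c2, G2):
    """True iff Z1 is a subset of Z2 (2-D, Z2 full-dimensional).

    Z1 <= Z2 iff h_{Z1}(n) <= h_{Z2}(n) for every outer facet normal n
    of Z2; in 2-D these are +/- the perpendiculars of Z2's generators.
    """
    for g in G2.T:
        n = np.array([-g[1], g[0]])      # perpendicular of generator column
        for a in (n, -n):
            if support(c1, G1, a) > support(c2, G2, a) + 1e-9:
                return False
    return True
```

In fixed low dimension the test is polynomial; the parameterized hardness says the dependence on d cannot be pulled out into an f(d) factor unless FPT = W[1].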
W[1]-hardness results for Lipschitz constant computation and approximation
The authors establish W[1]-hardness (parameterized by input dimension d) for computing and approximating Lipschitz constants in ReLU networks. They extend previous NP-hardness results to all p-norms and prove that approximation within any multiplicative factor remains hard, showing that enumeration-based methods are essentially optimal.
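The enumeration approach these results calibrate can be illustrated for a scalar-output 2-layer network f(x) = v · relu(Wx + b): on each activation region f is affine, so the exact ℓ_p Lipschitz constant is the largest ℓ_q norm (1/p + 1/q = 1) of a per-region gradient. The sketch below (name and interface mine, not the paper's) maximizes over all 2^n patterns, which yields an upper bound that is exact whenever every pattern is realized on a full-dimensional region.

```python
# Hedged sketch: upper-bound the l_p Lipschitz constant of a scalar-output
# 2-layer ReLU network f(x) = v . relu(Wx + b) by pattern enumeration.
from itertools import product

import numpy as np

def lipschitz_upper_bound(W, v, p=2):
    """Max dual-norm gradient over all 2^n activation patterns.

    On a region with activation pattern s, grad f = (v * s) @ W. The
    bias b shifts the regions but never the per-region gradients, so
    it does not enter the bound.
    """
    n, _ = W.shape
    q = np.inf if p == 1 else (1.0 if p == np.inf else p / (p - 1.0))
    best = 0.0
    for pattern in product([0.0, 1.0], repeat=n):
        g = (v * np.array(pattern)) @ W
        best = max(best, float(np.linalg.norm(g, ord=q)))
    return best
```

This runs in time exponential in the number of neurons n; the W[1]-hardness in d (and the inapproximability within any multiplicative factor) indicates that such enumeration cannot be replaced by an f(d)·poly-time algorithm unless FPT = W[1].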