Characterizing the Discrete Geometry of ReLU Networks

ICLR 2026 Conference Submission · Anonymous Authors
Keywords: Polyhedrons, Geometry, ReLU, Activations
Abstract:

It is well established that ReLU networks define continuous piecewise-linear functions, and that their linear regions are polyhedra in the input space. These regions form a complex that fully partitions the input space. The way these regions fit together is fundamental to the behavior of the network, as nonlinearities occur only at the boundaries where these regions connect. However, relatively little is known about the geometry of these complexes beyond bounds on the total number of regions, and calculating the complex exactly is intractable for most networks. In this work, we prove new theoretical results about these complexes that hold for all fully-connected ReLU networks, specifically about their connectivity graphs in which nodes correspond to regions and edges exist between each pair of regions connected by a face. We find that the average degree of this graph is upper bounded by twice the input dimension regardless of the width and depth of the network, and that the diameter of this graph has an upper bound that does not depend on input dimension, despite the number of regions increasing exponentially with input dimension. We corroborate our findings through experiments with networks trained on both synthetic and real-world data, which provide additional insight into the geometry of ReLU networks.

Disclaimer
This report is AI-generated using large language models and WisPaper (a scholar search engine). It analyzes academic papers' tasks and contributions against retrieved prior work. While this system identifies potential overlaps and novel directions, its coverage is not exhaustive and its judgments are approximate. These results are intended to assist human reviewers and should not be relied upon as a definitive verdict on novelty.
Note that some papers exist in multiple, slightly different versions (e.g., with different titles or URLs). The system may retrieve several versions of the same underlying work. The current automated pipeline does not reliably align or distinguish these cases, so human reviewers will need to disambiguate them manually.
If you have any questions, please contact: mingzhang23@m.fudan.edu.cn

Overview

Overall Novelty Assessment

The paper contributes new theoretical results on connectivity graphs of linear region complexes in fully-connected ReLU networks, specifically proving that average degree is bounded by twice the input dimension and that diameter has an upper bound independent of input dimension. Within the taxonomy, it resides in the 'Connectivity and Topological Properties' leaf under 'Geometric Structure and Representation Theory', alongside three sibling papers. This leaf represents a moderately populated research direction within a 50-paper taxonomy spanning 20 leaf nodes, indicating focused but not overcrowded attention to connectivity and topological characterization of ReLU region complexes.

The taxonomy reveals that neighboring research directions include 'Algebraic and Tropical Geometry Frameworks' (examining zonotopes and polyhedral theory) and 'Piecewise Linear Function Representation' (studying mathematical properties of continuous piecewise linear functions). The paper's focus on connectivity graphs distinguishes it from these adjacent areas: while algebraic approaches characterize regions through varieties and tropical geometry, this work analyzes how regions connect via shared faces. The scope note for its leaf explicitly excludes local complexity measures and training dynamics, positioning the work as a study of global structural properties rather than optimization-related phenomena or computational enumeration methods covered in other branches.

Among the 16 candidates examined across three contributions, the average degree bound has one refutable candidate (from the single paper examined for it), indicating identifiable prior coverage of this specific result. For the diameter bound, five candidates were examined and none clearly refutes it, indicating potentially greater novelty in this direction. For the asymptotic characterization, ten candidates were examined with no refutations, suggesting this aspect may be less directly addressed in prior literature. However, the limited search scope (16 total candidates from top-K semantic search) means these statistics reflect only a narrow slice of potentially relevant work, not an exhaustive survey of connectivity graph analysis in ReLU networks.

Based on the limited literature search covering 16 candidates, the diameter bound and asymptotic characterization appear to offer more novel contributions than the average degree result, which has identifiable prior overlap. The taxonomy structure suggests this work addresses a recognized but not saturated research direction, though the restricted search scope prevents definitive claims about overall novelty across the broader field of ReLU network geometry.

Taxonomy

Core-task Taxonomy Papers: 50
Claimed Contributions: 3
Contribution Candidate Papers Compared: 16
Refutable Papers: 1

Research Landscape Overview

Core task: discrete geometry of piecewise-linear regions in ReLU networks. The field examines how ReLU networks partition input space into polyhedral regions where the function is affine, and how this geometric structure relates to expressivity, complexity, and optimization. The taxonomy organizes work into several main branches.

Geometric Structure and Representation Theory investigates the intrinsic properties of these polytopes, including connectivity, topological invariants, and algebraic characterizations; works like Topological Complexity Measures[27] and Bent Hyperplane Transversality[38] explore how activation patterns induce topological features. Computational Methods and Complexity focuses on counting and enumerating linear regions, with foundational contributions such as Response Regions Count[35] and more recent efforts like Counting Linear Regions[45]. Expressivity and Capacity Bounds studies how many regions a network can produce and what functions it can represent, exemplified by VC Dimension Bounds[3] and Neural Complexity Bounds[15]. Approximation Theory connects ReLU networks to classical finite element methods, as seen in ReLU Finite Elements[6] and Deep ReLU Elements[20]. Training Dynamics examines how optimization navigates the piecewise-linear landscape, while Applications extends these ideas to control theory, verification, and interpretability.

Particularly active lines of work contrast theoretical upper bounds on region counts with empirical observations of how networks actually partition space during training. Some studies emphasize the role of depth and width in determining expressivity, while others investigate the connectivity and traversal properties of adjacent regions; Traversing Local Polytopes[12] and Polytope Lens Interpretation[13] illustrate how local geometry informs optimization and interpretability.
The original paper, Discrete Geometry ReLU[0], sits within the Geometric Structure branch, specifically addressing connectivity and topological properties of linear regions. Its emphasis on topological invariants and structural characterization aligns closely with Topological Properties Access[36], which examines how to probe these features computationally, and contrasts with Bent Hyperplane Transversality[38], which focuses on transversality conditions governing region boundaries. This positioning highlights ongoing interest in understanding not just how many regions exist, but how they are organized and interconnected in the input space.

Claimed Contributions

Upper bound on average degree of connectivity graph

The authors prove that for any fully-connected ReLU network, the average number of neighbors (faces) of polyhedral regions in the network's complex is at most 2d, where d is the input dimension. This bound holds independently of network width and depth.
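This bound can be probed empirically. The sketch below is a hypothetical illustration (not the paper's code): for a one-hidden-layer network with d = 2, linear regions are identified by their activation patterns on a fine grid, and two regions share a face exactly when neighboring grid points differ in a single neuron's sign. The specific weights, offsets, and grid resolution here are arbitrary choices for the example.

```python
# Hypothetical sketch (not the paper's code): empirically check the
# average-degree bound 2d on a one-hidden-layer ReLU net with d = 2.
import numpy as np
from itertools import product

d, m = 2, 6                                  # input dimension, hidden width
theta = np.pi * np.arange(m) / m             # six hyperplanes with distinct angles
W = np.stack([np.cos(theta), np.sin(theta)], axis=1)
b = np.array([0.3, -0.5, 0.2, -0.1, 0.4, -0.3])   # offsets keep lines near the origin

xs = np.linspace(-3, 3, 300)
grid = np.array(list(product(xs, xs)))       # row-major: index i = row * n + col
pats = [tuple(p) for p in (grid @ W.T + b > 0)]   # activation pattern per point

n = len(xs)
edges, regions = set(), set(pats)
for i in range(n * n):
    for j in (i + 1, i + n):                 # grid neighbors along each axis
        if (j == i + 1 and j % n == 0) or j >= n * n:
            continue                         # skip row wrap-around / last row
        # Patterns differing in exactly one neuron => regions share a face.
        if sum(a != c for a, c in zip(pats[i], pats[j])) == 1:
            edges.add(frozenset((pats[i], pats[j])))

avg_degree = 2 * len(edges) / len(regions)
print(f"{len(regions)} regions, average degree {avg_degree:.2f} <= 2d = {2 * d}")
assert avg_degree <= 2 * d
```

The Hamming-distance-1 filter makes the detected graph a subgraph of the true connectivity graph, so the printed average is a (slight) underestimate; it still respects the claimed bound of 2d = 4.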

Retrieved papers compared: 1 (can refute)

Upper bound on connectivity graph diameter independent of input dimension

The authors establish that the diameter of the connectivity graph (longest shortest-path distance between any pair of regions) is bounded above by (m+1)·l, where m is the maximum layer width and l is the depth. This bound does not depend on the input dimension d, even though the number of regions grows exponentially with d.
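As a sanity check on this bound, the following hypothetical sketch (not from the paper) uses a depth-1 network (l = 1) with m = 4 axis-aligned hyperplanes, whose region complex is a known 3×3 grid of rectangles; BFS over the region adjacency graph then gives diameter 4 ≤ (m+1)·l = 5. The axis-aligned construction is chosen so the true connectivity graph is recovered exactly.

```python
# Hypothetical sketch (not the paper's code): BFS diameter of the region
# connectivity graph for a depth-1 net (l = 1), checking diam <= (m+1)*l.
import numpy as np
from collections import deque
from itertools import product

d, m, l = 2, 4, 1
# Four axis-aligned hyperplanes: x = -1, x = 1, y = -1, y = 1,
# so the regions form a known 3x3 grid (diameter 4, bound 5).
W = np.array([[1., 0.], [1., 0.], [0., 1.], [0., 1.]])
b = np.array([1., -1., 1., -1.])

xs = np.linspace(-3, 3, 200)
pts = np.array(list(product(xs, xs)))
pat = [tuple(p) for p in (pts @ W.T + b > 0)]

n = len(xs)
adj = {}
for i in range(n * n):
    for j in (i + 1, i + n):                 # grid neighbors along each axis
        if (j == i + 1 and j % n == 0) or j >= n * n:
            continue
        if sum(a != c for a, c in zip(pat[i], pat[j])) == 1:
            adj.setdefault(pat[i], set()).add(pat[j])
            adj.setdefault(pat[j], set()).add(pat[i])

def bfs_ecc(src):
    """Shortest-path distances from src over the region adjacency graph."""
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

diam = max(max(bfs_ecc(s).values()) for s in adj)
print(f"{len(adj)} regions, diameter {diam}, bound (m+1)*l = {(m + 1) * l}")
assert diam <= (m + 1) * l
```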

Retrieved papers compared: 5

Characterization of asymptotic behavior and tightness of bounds

The authors prove that the average degree increases monotonically as network size grows and that the upper bound of 2d is tight, with the average converging exactly to 2d for shallow networks as the number of neurons approaches infinity. They also establish lower bounds showing every region has at least d neighbors when the network has at least d neurons.
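The convergence behavior can be illustrated on a family where the connectivity graph is known in closed form. The construction below is hypothetical (not from the paper): an axis-aligned arrangement of m = 2k hyperplanes in d = 2 yields a (k+1)×(k+1) grid of regions with average degree 4k/(k+1), which increases monotonically toward 2d = 4 as k grows, and in which every region has at least d = 2 neighbors.

```python
# Hypothetical sketch (not the paper's code): average degree increases
# monotonically toward 2d = 4 on an axis-aligned family of arrangements,
# and every region has at least d neighbors.
import numpy as np
from itertools import product

d = 2

def region_degrees(k, n=200):
    """Degrees of regions for k vertical + k horizontal hyperplanes (m = 2k)."""
    cuts = np.linspace(-2, 2, k)
    W = np.vstack([np.eye(2)[[0]].repeat(k, 0), np.eye(2)[[1]].repeat(k, 0)])
    b = -np.concatenate([cuts, cuts])        # neuron i fires iff coord > cut
    xs = np.linspace(-3, 3, n)
    pats = [tuple(p) for p in (np.array(list(product(xs, xs))) @ W.T + b > 0)]
    nbrs = {}
    for i in range(n * n):
        for j in (i + 1, i + n):             # grid neighbors along each axis
            if (j == i + 1 and j % n == 0) or j >= n * n:
                continue
            if sum(a != c for a, c in zip(pats[i], pats[j])) == 1:
                nbrs.setdefault(pats[i], set()).add(pats[j])
                nbrs.setdefault(pats[j], set()).add(pats[i])
    return {r: len(s) for r, s in nbrs.items()}

avgs = []
for k in (1, 2, 4, 8):
    degs = region_degrees(k)
    avg = sum(degs.values()) / len(degs)
    avgs.append(avg)
    assert min(degs.values()) >= d           # lower bound: >= d neighbors per region
    assert avg <= 2 * d                      # upper bound: average degree <= 2d
assert all(a < b_ for a, b_ in zip(avgs, avgs[1:]))   # monotone increase toward 2d
print([round(a, 3) for a in avgs])
```

Because the hyperplanes are axis-aligned and well separated relative to the sampling grid, the detected adjacency graph equals the true one, so the printed averages match 4k/(k+1) exactly (2.0, 2.667, 3.2, 3.556 for k = 1, 2, 4, 8).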

Retrieved papers compared: 10

Core Task Comparisons

Comparisons with papers in the same taxonomy category

Contribution Analysis

Detailed comparisons for each claimed contribution

Contribution

Upper bound on average degree of connectivity graph

The authors prove that for any fully-connected ReLU network, the average number of neighbors (faces) of polyhedral regions in the network's complex is at most 2d, where d is the input dimension. This bound holds independently of network width and depth.

Contribution

Upper bound on connectivity graph diameter independent of input dimension

The authors establish that the diameter of the connectivity graph (longest shortest-path distance between any pair of regions) is bounded above by (m+1)·l, where m is the maximum layer width and l is the depth. This bound does not depend on the input dimension d, even though the number of regions grows exponentially with d.

Contribution

Characterization of asymptotic behavior and tightness of bounds

The authors prove that the average degree increases monotonically as network size grows and that the upper bound of 2d is tight, with the average converging exactly to 2d for shallow networks as the number of neurons approaches infinity. They also establish lower bounds showing every region has at least d neighbors when the network has at least d neurons.