Convex Efficient Coding

ICLR 2026 Conference Submission · Anonymous Authors
Keywords: Neuroscience · Representation · Identifiability
Abstract:

Why do neurons encode information the way they do? Normative answers to this question model neural activity as the solution to an optimisation problem; for example, the celebrated efficient coding hypothesis frames neural activity as the optimal encoding of information under efficiency constraints. Successful normative theories have varied dramatically in complexity, from simple linear models (Atick & Redlich, 1990) to complex deep neural networks (Lindsay, 2021). What complex models gain in flexibility, they lose in tractability and, often, interpretability. Here, we split the difference by constructing a set of tractable but flexible normative representational theories. Instead of optimising the neural activities directly, we follow Sengupta et al. (2018) and optimise the representational similarity: a matrix formed from the dot products of each pair of neural responses. Using this, we show that a large family of interesting optimisation problems is convex. This family includes problems corresponding to linear and some non-linear neural networks, and problems from the literature not previously recognised as convex, such as modified versions of semi-nonnegative matrix factorisation or nonnegative sparse coding. We put these findings to work in two ways. First, we extend previous results on modularity and mixed selectivity in neural activity; in doing so we provide the first necessary and sufficient identifiability result for a form of semi-nonnegative matrix factorisation. Second, we seek to understand how meaningful single neural tuning curves are, as compared to whole neural representations. In particular, we derive an identifiability result stating that, for an optimal representational similarity matrix, if neural tunings are 'different enough' then they are uniquely linked to the optimal representational similarity, partially justifying the use of single-neuron tuning analysis in neuroscience. In sum, we identify an interesting space of convex problems and use it to derive neural coding results.
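For concreteness, the central object in the abstract is the representational similarity matrix. A minimal statement, in our own notation rather than necessarily the paper's, is:

```latex
% Responses y_i to stimuli i = 1, ..., N form the columns of Y;
% entries of the similarity matrix are pairwise dot products:
\[
  G_{ij} = y_i^{\top} y_j , \qquad G = Y^{\top} Y \succeq 0 .
\]
% Optimising over the positive-semidefinite matrix G, rather than
% over the activities Y directly, is what renders the abstract's
% family of problems convex.
```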

Disclaimer
This report is AI-GENERATED using Large Language Models and WisPaper (A scholar search engine). It analyzes academic papers' tasks and contributions against retrieved prior work. While this system identifies POTENTIAL overlaps and novel directions, ITS COVERAGE IS NOT EXHAUSTIVE AND JUDGMENTS ARE APPROXIMATE. These results are intended to assist human reviewers and SHOULD NOT be relied upon as a definitive verdict on novelty.
NOTE that some papers exist in multiple, slightly different versions (e.g., with different titles or URLs). The system may retrieve several versions of the same underlying work. The current automated pipeline does not reliably align or distinguish these cases, so human reviewers will need to disambiguate them manually.
If you have any questions, please contact: mingzhang23@m.fudan.edu.cn

Overview

Overall Novelty Assessment

The paper proposes a convex optimization framework for efficient coding by optimizing representational similarity matrices rather than neural activities directly. It resides in the 'Biologically-Inspired and Normative Coding' leaf, which contains four papers total (including this one). This leaf sits within the broader 'Representation Learning and Encoding Efficiency' branch, making it a relatively sparse research direction compared to more crowded areas like 'Image and Video Compression via INRs' (seven papers) or 'Model Compression and Quantization' (five papers). The work thus occupies a niche intersection of normative neuroscience and tractable optimization.

The taxonomy reveals neighboring leaves focused on 'Unsupervised and Self-Supervised Representation Learning' and 'Graph and Dynamic Representations', which emphasize computational learning objectives rather than normative principles. The sibling papers in the same leaf, covering metabolic constraints, grid cell coding, and perceptual illusions, share the biological motivation but differ in scope: Metabolic Neural Codes addresses spiking network budgets, Actionable Grid Cells targets spatial navigation, and Tilt Illusion Coding examines perceptual phenomena. This paper's contribution lies in providing a general mathematical framework applicable across sensory modalities, diverging from the domain-specific focus of its siblings.

Among the seven candidates examined across the three contributions, none clearly refuted the work. Four candidates were examined for the convex reformulation contribution, two for the identifiability conditions for semi-nonnegative matrix factorisation, and one for the link between representational similarity and tuning curves; none produced a refutation. Given the small search scope (seven candidates in total), these statistics indicate that, within the limited literature surveyed, the specific combination of convexity, representational similarity optimization, and identifiability appears underexplored, though the analysis does not claim exhaustive coverage.

Based on the top-seven semantic matches and taxonomy structure, the work appears to introduce a novel mathematical perspective within a sparse research direction. The convex reformulation and identifiability results seem less directly addressed in the examined candidates, though the limited scope means potentially relevant work in broader optimization or matrix factorization communities may not have been captured. The analysis reflects what was found among closely related normative coding papers, not a comprehensive survey of all convex optimization or efficient coding literature.

Taxonomy

Core-task Taxonomy Papers: 50
Claimed Contributions: 3
Contribution Candidate Papers Compared: 7
Refutable Papers: 0

Research Landscape Overview

Core task: Optimizing neural representations under efficiency constraints. The field encompasses diverse strategies for balancing representational power with computational and memory budgets. At the highest level, the taxonomy reveals several major branches:

- Implicit Neural Representations for Data Compression: compact encodings of signals and geometry (e.g., NeRV[2], Implicit Neural Compression[1]).
- Efficient Neural Rendering and Reconstruction: real-time graphics and 3D scene modeling (e.g., Instant Neural Graphics[19], Tri-MipRF[17]).
- Neural Network Architecture Optimization Under Resource Constraints: lightweight models for deployment (e.g., Resource Constrained Optimization[6], Tiny Neural Networks[32]).
- Efficient Training and Optimization of Neural Representations: convergence speed and sample efficiency (e.g., Learned Initializations[34]).
- Application-Driven Neural Representations: methods tailored to domains like medical imaging (Medical Implicit Survey[3]) or climate data (Climate Data Reduction[37]).
- Representation Learning and Encoding Efficiency: principled coding schemes, including biologically-inspired approaches.
- Verification and Constraint Satisfaction in Neural Networks: correctness under formal specifications (e.g., Beta-CROWN[44]).

Within Representation Learning and Encoding Efficiency, a particularly active line of work draws on normative principles from neuroscience and information theory to derive efficient codes. Metabolic Neural Codes[14] and Actionable Grid Cells[12] exemplify biologically-inspired frameworks that balance representational fidelity with metabolic or computational cost, while Tilt Illusion Coding[39] explores perceptual constraints. Convex Efficient Coding[0] sits squarely in this biologically-inspired cluster, proposing a convex optimization framework for learning efficient neural codes under resource constraints. Compared to Metabolic Neural Codes[14], which emphasizes metabolic budgets in spiking networks, Convex Efficient Coding[0] offers a more general mathematical treatment applicable to broader neural architectures; Actionable Grid Cells[12] focuses on spatial navigation tasks, whereas Convex Efficient Coding[0] addresses general sensory coding problems. This work thus bridges normative coding theory and practical optimization, contributing a principled yet flexible approach to the broader challenge of designing resource-aware neural representations.

Claimed Contributions

Convex reformulation of efficient coding problems via representational similarity matrices

The authors demonstrate that a broad class of neural coding optimisation problems can be reformulated as convex optimisations over representational dot-product similarity matrices. This framework encompasses linear networks, certain non-linear networks, and problems not previously recognised as convex, such as a modified semi-nonnegative matrix factorisation (an illustrative sketch follows this entry).

Retrieved candidate papers: 4
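As an illustration of the kind of reformulation this contribution describes, here is a minimal, hypothetical sketch in Python using cvxpy. The objective (matching a stimulus similarity matrix under an activity-energy budget) and all names are ours, chosen for illustration; the paper's actual objectives and constraints will differ.

```python
import cvxpy as cp
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stimulus features and their similarity structure.
n_stimuli = 8
X = rng.standard_normal((4, n_stimuli))
S = X.T @ X  # stimulus Gram matrix (n_stimuli x n_stimuli)

# Optimise the representational similarity G directly. The objective is
# a norm and the constraints are conic, so the problem is convex in G,
# even though recovering activities Y with Y^T Y = G is a separate step
# that the similarity-level view sidesteps.
G = cp.Variable((n_stimuli, n_stimuli), PSD=True)
problem = cp.Problem(
    cp.Minimize(cp.norm(G - S, "fro")),
    [cp.trace(G) <= 10.0],  # illustrative energy budget
)
problem.solve()

# Activities can be read off an eigendecomposition of the optimal G;
# any rotation Q @ Y yields the same similarity matrix.
w, V = np.linalg.eigh(G.value)
Y = np.sqrt(np.clip(w, 0.0, None))[:, None] * V.T
print("reconstruction error:", np.abs(Y.T @ Y - G.value).max())
```

The design point is that the feasible set of similarity matrices (positive semidefinite, linearly constrained) is convex even when the corresponding set of activity matrices is not.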
Necessary and sufficient identifiability conditions for semi-nonnegative matrix factorisation

The authors derive the first tight (both necessary and sufficient) identifiability criterion for semi-nonnegative matrix factorisation via nonnegative-affine autoencoders. They extend previous results on modularity and mixed selectivity to the case of linearly (rather than orthogonally) mixed sources; the assumed factorisation setup is sketched after this entry.

Retrieved candidate papers: 2
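To fix notation for this result, here is a minimal statement of the factorisation problem, assuming the standard semi-nonnegative matrix factorisation convention of an unconstrained mixing matrix with nonnegative sources; the paper's nonnegative-affine autoencoder formulation may add an offset and further constraints.

```latex
% Semi-NMF (notation ours): factor data X into an unconstrained
% mixing matrix A and nonnegative sources S.
\[
  \min_{A \in \mathbb{R}^{d \times k},\; S \in \mathbb{R}^{k \times n},\; S \ge 0}
    \; \lVert X - A S \rVert_F^{2} .
\]
% Identifiability asks when the factors are unique up to the trivial
% symmetries: whenever X = A S = A' S' with S, S' \ge 0, we have
\[
  A' = A P D , \qquad S' = D^{-1} P^{\top} S ,
\]
% for some permutation matrix P and positive diagonal matrix D. The
% paper's claimed contribution is a condition on the factors that is
% both necessary and sufficient for this; the condition itself is not
% reproduced here.
```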
Identifiability conditions linking optimal representational similarity to unique neural tuning curves

The authors establish sufficient conditions under which an optimal representational similarity matrix uniquely determines single-neuron tuning curves. This result provides theoretical justification for studying individual neuron responses by showing when nonnegativity constraints break rotational symmetry and make tuning curves meaningful; a small numerical illustration follows this entry.

Retrieved candidate papers: 1
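A small numerical illustration of the symmetry-breaking intuition, with made-up data (the dimensions and random seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical nonnegative tuning curves: 3 neurons x 5 stimuli.
Y = rng.random((3, 5))
G = Y.T @ Y  # representational similarity over stimuli

# Any orthogonal re-mixing Q of the neurons leaves G unchanged, so
# without further constraints Y is identified only up to rotation.
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
Y_rot = Q @ Y
print(np.allclose(Y_rot.T @ Y_rot, G))  # True

# A generic rotation, however, destroys nonnegativity. Requiring
# Y >= 0 therefore removes the rotational degeneracy; this is the
# intuition behind the claim that sufficiently different tuning
# curves are pinned down by the optimal similarity matrix.
print(np.all(Y_rot >= 0))  # typically False
```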

Core Task Comparisons

Comparisons with papers in the same taxonomy category

Contribution Analysis

Detailed comparisons for each claimed contribution

Contribution 1: Convex reformulation of efficient coding problems via representational similarity matrices
Candidates examined: 4. Refutations: 0.

Contribution 2: Necessary and sufficient identifiability conditions for semi-nonnegative matrix factorisation
Candidates examined: 2. Refutations: 0.

Contribution 3: Identifiability conditions linking optimal representational similarity to unique neural tuning curves
Candidates examined: 1. Refutations: 0.

(The full description of each contribution appears under Claimed Contributions above.)