MambaSL: Exploring Single-Layer Mamba for Time Series Classification

ICLR 2026 Conference Submission · Anonymous Authors
Keywords: modular selective SSM, multi-head adaptive pooling, skip connection, single-layer Mamba, time series classification
Abstract:

Despite recent advances in state space models (SSMs) such as Mamba across various sequence domains, research on their standalone capacity for time series classification (TSC) has remained limited. We propose MambaSL, a framework that minimally redesigns the selective SSM and projection layers of a single-layer Mamba, guided by four TSC-specific hypotheses. To address benchmarking limitations—restricted configurations, partial University of East Anglia (UEA) dataset coverage, and insufficiently reproducible setups—we re-evaluate 20 strong baselines across all 30 UEA datasets under a unified protocol. Our results show that MambaSL achieves state-of-the-art performance on the UEA benchmark among 21 models, with statistically significant average improvements over baselines while ensuring reproducibility via public checkpoints.

Disclaimer
This report is AI-generated using large language models and WisPaper (a scholarly search engine). It analyzes a paper's tasks and contributions against retrieved prior work. While the system identifies potential overlaps and novel directions, its coverage is not exhaustive and its judgments are approximate. These results are intended to assist human reviewers and should not be relied upon as a definitive verdict on novelty.
Note that some papers exist in multiple, slightly different versions (e.g., with different titles or URLs), and the system may retrieve several versions of the same underlying work. The current automated pipeline does not reliably align or distinguish these cases, so human reviewers will need to disambiguate them manually.
If you have any questions, please contact: mingzhang23@m.fudan.edu.cn

Overview

Overall Novelty Assessment

MambaSL proposes a minimally redesigned single-layer Mamba architecture guided by four TSC-specific hypotheses, targeting time series classification on the UEA benchmark. The paper resides in the Mamba-Based Architectures leaf, which contains four papers including the paper under review. This leaf sits within the broader Core State Space Model Architectures branch, indicating a moderately populated research direction focused on foundational SSM designs. The sibling papers (TSCMamba, HydraMamba, and Residual Mamba Encoder) share the selective state space mechanism but explore different architectural strategies such as multi-resolution processing and hierarchical designs.

The taxonomy reveals neighboring leaves addressing structured SSMs (S4 variants), recurrent networks, and hybrid architectures combining Mamba with attention or convolution. The Mamba-Based Architectures leaf explicitly excludes hybrid models and domain-specific applications, positioning MambaSL as a pure SSM approach rather than a fusion framework. Nearby branches like SSM-Attention Hybrid Models and Biomedical Signal Classification suggest alternative pathways for enhancing temporal modeling or specializing to physiological signals, yet MambaSL remains within the general-purpose architectural exploration cluster, emphasizing streamlined design over domain-specific customization or multi-mechanism integration.

Among thirty candidates examined, the analysis found one refutable pair for the state-of-the-art performance claim, while the architectural hypotheses and benchmarking protocol contributions showed no clear refutations across ten candidates each. The limited search scope means these statistics reflect top-K semantic matches rather than exhaustive coverage. The architectural refinements appear less contested in the examined literature, whereas the performance claim encounters at least one overlapping prior result. The benchmarking contribution—addressing reproducibility and dataset coverage—shows no direct refutation among the candidates, suggesting this methodological angle may be less explored in the immediate literature neighborhood.

Based on the limited search of thirty semantically similar papers, MambaSL's novelty appears strongest in its methodological rigor and architectural simplification rather than in introducing entirely new mechanisms. The taxonomy context shows a moderately active Mamba-based research area with four papers, indicating neither a saturated nor nascent field. The analysis does not cover exhaustive citation networks or domain-specific venues, so the assessment reflects proximity to top-ranked semantic neighbors rather than comprehensive field coverage.

Taxonomy

Core-task Taxonomy Papers: 50
Claimed Contributions: 3
Contribution Candidate Papers Compared: 30
Refutable Papers: 1

Research Landscape Overview

Core task: time series classification using state space models. The field organizes around several major branches that reflect different modeling philosophies and application contexts. Core State Space Model Architectures for Time Series focuses on foundational designs, ranging from classical linear state space formulations to modern Mamba-based architectures that leverage selective state mechanisms for efficient long-range dependency modeling. Integration Approaches and Hybrid Architectures explore how state space models combine with transformers, convolutional layers, or graph structures to balance expressiveness and computational cost. Application Domains and Task-Specific Models address specialized settings such as biomedical signal analysis (e.g., ECG, EEG, sleep staging), video understanding, and industrial anomaly detection, where domain constraints shape model design. Feature Representation and Transformation Methods examine preprocessing and embedding strategies that condition state space dynamics, while Latent Variable and Generative Models emphasize probabilistic inference and hidden Markov frameworks. Finally, Specialized Learning Paradigms and Optimization tackle training strategies, including few-shot learning, continual adaptation, and efficient parameterization.

Recent activity highlights a tension between architectural simplicity and task-specific customization. Many studies within the Mamba-based branch pursue scalable, general-purpose designs that handle diverse temporal scales with minimal inductive bias, as seen in TSCMamba[23] and HydraMamba[47], which emphasize modular scanning strategies and multi-resolution processing. In contrast, works like Ecgmamba[1] and EEG-SSM[33] tailor state transitions to physiological signal characteristics, trading generality for domain-aware feature extraction.
MambaSL[0] sits within the Mamba-based cluster, sharing the emphasis on selective state space mechanisms with TSCMamba[23] and Residual Mamba Encoder[48], yet it appears to prioritize streamlined architectures over the multi-head or hierarchical designs explored by neighbors like HydraMamba[47]. This positioning suggests an ongoing exploration of how much domain knowledge to encode directly into state space parameterizations versus relying on data-driven learning of temporal dependencies.

Claimed Contributions

Four TSC-specific hypotheses and architectural refinements for Mamba

The authors introduce four hypotheses (H1: scale input projection, H2: modularize time variance, H3: remove skip connection, H4: aggregate via adaptive pooling) that guide minimal redesigns of selective SSM components and projection layers in a single-layer Mamba framework for time series classification.
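To make the four hypotheses concrete, the sketch below shows where each would act in a single-layer selective SSM. The paper's actual parameterizations are not reproduced here; every shape, name, and design choice in this snippet (the softplus step sizes, diagonal state matrix, mean pooling standing in for multi-head adaptive pooling) is an illustrative assumption, not the authors' implementation.

```python
import numpy as np

def single_layer_selective_ssm(x, d_state=4, seed=0):
    """Illustrative single-layer selective-SSM classifier backbone.

    x: (T, D) multivariate series. All parameters are random here;
    the point is only to mark where hypotheses H1-H4 would apply.
    """
    rng = np.random.default_rng(seed)
    T, D = x.shape
    # H1 (assumed form): scaled input projection
    W_in = rng.standard_normal((D, D)) / np.sqrt(D)
    u = x @ W_in
    # H2 (assumed form): time variance modularized into a separate
    # projection that produces an input-dependent step size per time step
    W_dt = rng.standard_normal((D, 1)) / np.sqrt(D)
    dt = np.log1p(np.exp(u @ W_dt))            # softplus, shape (T, 1)
    A = -np.exp(rng.standard_normal(d_state))  # stable diagonal dynamics
    B = rng.standard_normal((D, d_state)) / np.sqrt(D)
    C = rng.standard_normal((d_state, D)) / np.sqrt(d_state)
    h = np.zeros(d_state)
    ys = np.empty((T, D))
    for t in range(T):
        decay = np.exp(A * dt[t])              # per-step discretization
        h = decay * h + dt[t] * (u[t] @ B)
        ys[t] = h @ C                          # H3 (assumed): no skip/residual add
    # H4 (simplified): pool over time before the classifier head;
    # the paper's multi-head adaptive pooling is reduced to a mean here
    return ys.mean(axis=0)
```

A classifier head (a linear layer over the pooled vector) would follow the pooling step; it is omitted to keep the sketch focused on the SSM itself.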

10 retrieved papers
Comprehensive and reproducible UEA benchmarking protocol

The authors establish a unified benchmarking protocol that re-evaluates 20 strong baselines across all 30 multivariate UEA datasets with extensive hyperparameter sweeps, addressing prior limitations in coverage, fairness, and reproducibility, and providing public checkpoints.
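A protocol of this kind hinges on making every sweep point reproducible and identifiable. The snippet below is a generic sketch of that bookkeeping, not the authors' harness: the grid values are placeholders, and the deterministic `run_id` (a hash of the config) is one common way to match checkpoints and logs across machines.

```python
import itertools, json, hashlib

# Hypothetical sweep grid; the paper's actual search space is not shown here.
GRID = {"lr": [1e-3, 3e-4], "d_model": [64, 128], "seed": [0, 1, 2]}

def sweep_configs(grid):
    """Yield one config dict per grid point, each with a deterministic
    run_id derived from the config contents, so a re-run on any machine
    produces the same identifiers for checkpointing and result tables."""
    keys = sorted(grid)
    for values in itertools.product(*(grid[k] for k in keys)):
        cfg = dict(zip(keys, values))
        digest = hashlib.sha1(json.dumps(cfg, sort_keys=True).encode()).hexdigest()
        cfg["run_id"] = digest[:8]
        yield cfg
```

Iterating this generator over all 30 UEA datasets and 20 baselines, with results keyed by `run_id`, is one way to get the coverage and reproducibility properties the contribution describes.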

10 retrieved papers
MambaSL framework achieving state-of-the-art TSC performance

The authors develop MambaSL, a single-layer Mamba framework for time series classification that achieves state-of-the-art performance on the UEA benchmark with statistically significant improvements over baselines, demonstrating Mamba's standalone capacity for TSC.
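The claim of statistically significant average improvements can be checked with a paired test over per-dataset accuracies. The exact test the paper uses is not specified in this report; the snippet below shows one generic option, a two-sided sign-flip permutation test on the per-dataset accuracy differences.

```python
import numpy as np

def paired_permutation_test(acc_a, acc_b, n_perm=10000, seed=0):
    """Two-sided sign-flip permutation test on paired per-dataset
    accuracies. Returns a p-value for the mean difference, using the
    standard +1 correction that counts the observed statistic itself."""
    rng = np.random.default_rng(seed)
    d = np.asarray(acc_a, dtype=float) - np.asarray(acc_b, dtype=float)
    obs = abs(d.mean())
    signs = rng.choice([-1.0, 1.0], size=(n_perm, d.size))
    null = np.abs((signs * d).mean(axis=1))
    return (1 + np.sum(null >= obs)) / (n_perm + 1)
```

With 30 datasets, a consistent improvement of a few accuracy points typically yields a small p-value, while mixed wins and losses do not; rank-based alternatives such as the Wilcoxon signed-rank test are also common in TSC benchmarking.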

10 retrieved papers
Can Refute

Core Task Comparisons

Comparisons with papers in the same taxonomy category

Contribution Analysis

Detailed comparisons for each claimed contribution

Contribution

Four TSC-specific hypotheses and architectural refinements for Mamba


Contribution

Comprehensive and reproducible UEA benchmarking protocol


Contribution

MambaSL framework achieving state-of-the-art TSC performance

