MambaSL: Exploring Single-Layer Mamba for Time Series Classification
Overview
Overall Novelty Assessment
MambaSL proposes a minimally redesigned single-layer Mamba architecture guided by four TSC-specific hypotheses, targeting time series classification on the UEA benchmark. The paper resides in the Mamba-Based Architectures leaf, which contains four papers including the original work. This leaf sits within the broader Core State Space Model Architectures branch, indicating a moderately populated research direction focused on foundational SSM designs. The sibling papers—TSCMamba, HydraMamba, and Residual Mamba Encoder—share the selective state space mechanism but explore different architectural strategies such as multi-resolution processing and hierarchical designs.
The taxonomy reveals neighboring leaves addressing structured SSMs (S4 variants), recurrent networks, and hybrid architectures combining Mamba with attention or convolution. The Mamba-Based Architectures leaf explicitly excludes hybrid models and domain-specific applications, positioning MambaSL as a pure SSM approach rather than a fusion framework. Nearby branches like SSM-Attention Hybrid Models and Biomedical Signal Classification suggest alternative pathways for enhancing temporal modeling or specializing to physiological signals, yet MambaSL remains within the general-purpose architectural exploration cluster, emphasizing streamlined design over domain-specific customization or multi-mechanism integration.
Among the thirty candidates examined, the analysis found one refutable pair for the state-of-the-art performance claim, while the architectural-hypotheses and benchmarking-protocol contributions showed no clear refutations across ten candidates each. Because the search covers only top-K semantic matches rather than the literature exhaustively, these statistics indicate proximity rather than proof: the architectural refinements appear uncontested among the examined papers, the performance claim overlaps with at least one prior result, and the benchmarking contribution (addressing reproducibility and dataset coverage) faces no direct refutation, suggesting this methodological angle is less explored in the immediate literature neighborhood.
Based on the limited search of thirty semantically similar papers, MambaSL's novelty appears strongest in its methodological rigor and architectural simplification rather than in introducing entirely new mechanisms. The taxonomy context shows a moderately active Mamba-based research area with four papers, indicating neither a saturated nor nascent field. The analysis does not cover exhaustive citation networks or domain-specific venues, so the assessment reflects proximity to top-ranked semantic neighbors rather than comprehensive field coverage.
Taxonomy
Research Landscape Overview
Claimed Contributions
The authors introduce four hypotheses (H1: scale input projection, H2: modularize time variance, H3: remove skip connection, H4: aggregate via adaptive pooling) that guide minimal redesigns of selective SSM components and projection layers in a single-layer Mamba framework for time series classification.
The authors establish a unified benchmarking protocol that re-evaluates 20 strong baselines across all 30 multivariate UEA datasets with extensive hyperparameter sweeps, addressing prior limitations in coverage, fairness, and reproducibility, and providing public checkpoints.
The authors develop MambaSL, a single-layer Mamba framework for time series classification that achieves state-of-the-art performance on the UEA benchmark with statistically significant improvements over baselines, demonstrating Mamba's standalone capacity for TSC.
Core Task Comparisons
Comparisons with papers in the same taxonomy category
[23] TSCMamba: Mamba Meets Multi-View Learning for Time Series Classification
[47] HydraMamba: An Efficient and High-Performance Architecture for Time Series Classification through Multi-Mechanism Fusion
[48] Time Series Class-Incremental Learning with Residual Mamba Encoder
Contribution Analysis
Detailed comparisons for each claimed contribution
Four TSC-specific hypotheses and architectural refinements for Mamba
The authors introduce four hypotheses (H1: scale input projection, H2: modularize time variance, H3: remove skip connection, H4: aggregate via adaptive pooling) that guide minimal redesigns of selective SSM components and projection layers in a single-layer Mamba framework for time series classification.
[5] Long movie clip classification with state-space video models
[11] Deep latent state space models for time-series generation
[17] Effectively modeling time series with simple discrete state spaces
[23] TSCMamba: Mamba Meets Multi-View Learning for Time Series Classification
[33] EEG-SSM: Leveraging State-Space Model for Dementia Detection
[51] Mamba-360: Survey of state space models as transformer alternative for long sequence modelling: Methods, applications, and challenges
[56] Vmamba: Visual state space model
[57] MS-SSM: A Multi-Scale State Space Model for Efficient Sequence Modeling
[58] InsectMamba: Insect Pest Classification with State Space Model
[59] Leto: Modeling Multivariate Time Series with Memorizing at Test Time
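The four hypotheses can be read as concrete edits to a standard selective-SSM block. The sketch below is a hypothetical NumPy illustration of how such a single layer might look; the layer sizes, the parameterization, and the exact realization of each hypothesis (e.g. the projection scale factor, mean pooling as the "adaptive" aggregator) are assumptions for illustration, not the paper's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def mamba_sl_forward(x, d_state=4, scale=2.0):
    """Minimal single-layer selective-SSM sketch annotated with the
    four TSC hypotheses (hypothetical; not the authors' code).
    x: (T, d_model) one multivariate series; returns a feature vector."""
    T, d_model = x.shape
    d_inner = int(scale * d_model)                # H1: scaled input projection
    W_in = rng.standard_normal((d_model, d_inner)) / np.sqrt(d_model)
    W_dt = rng.standard_normal((d_inner, 1)) / np.sqrt(d_inner)
    A = -np.exp(rng.standard_normal(d_state))     # stable diagonal state matrix
    B = rng.standard_normal((d_inner, d_state)) / np.sqrt(d_inner)
    C = rng.standard_normal((d_state, d_inner)) / np.sqrt(d_state)

    u = x @ W_in                                  # (T, d_inner)
    h = np.zeros(d_state)
    ys = []
    for t in range(T):
        # H2: time variance kept in one module -- the step size dt is
        # input-dependent (softplus keeps it positive).
        dt = np.log1p(np.exp(u[t] @ W_dt))
        h = np.exp(A * dt) * h + dt * (u[t] @ B)  # discretized state update
        ys.append(h @ C)                          # H3: no skip connection from u[t]
    y = np.stack(ys)                              # (T, d_inner)
    return y.mean(axis=0)                         # H4: adaptive pooling over time
```

The pooled feature vector would then feed a linear classification head; with random weights the sketch only demonstrates shapes and the flow of the four modifications.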
Comprehensive and reproducible UEA benchmarking protocol
The authors establish a unified benchmarking protocol that re-evaluates 20 strong baselines across all 30 multivariate UEA datasets with extensive hyperparameter sweeps, addressing prior limitations in coverage, fairness, and reproducibility, and providing public checkpoints.
[60] Timesead: Benchmarking deep multivariate time-series anomaly detection
[61] The elephant in the room: Towards a reliable time-series anomaly detection benchmark
[62] Optimized Temporal Denoised Convolutional Autoencoder for Enhanced ADHD Classification Using fMRI Data
[63] Enhanced ECG Signal Classification with CNN-LSTM Networks using Aquila Optimization
[64] Multitask LSTM for Arboviral Outbreak Prediction Using Public Health Data
[65] Graph Neural Network and Temporal Sequence Integration for AI-Powered Financial Compliance Detection
[66] Detection of breast cancer using machine learning on time-series diffuse optical transillumination data
[67] A very concise feature representation for time series classification understanding
[68] AUV Fault Diagnosis Based on Multidimensional Temporal Classification Transformer
[69] Learning hidden patterns from patient multivariate time series data using convolutional neural networks: A case study of healthcare cost prediction
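The unified protocol amounts to running every baseline through the same tuning loop on every dataset with fixed seeds. The skeleton below is a hypothetical sketch of that loop; the model names, dataset handles, and grid contents are placeholders, not the paper's actual 20 baselines, 30 UEA datasets, or sweep ranges.

```python
import itertools
import numpy as np

def unified_benchmark(datasets, models, grid, n_seeds=3):
    """Hypothetical unified benchmarking loop: each model is tuned over
    the same hyperparameter grid on each dataset, with accuracy averaged
    over fixed seeds for reproducibility.

    datasets: name -> dataset handle (opaque here)
    models:   name -> callable(dataset, config, seed) -> test accuracy
    grid:     hyperparameter name -> list of candidate values
    Returns {(dataset, model): (best mean accuracy, best config)}."""
    results = {}
    for ds_name, ds in datasets.items():
        for model_name, fit_eval in models.items():
            best = (-1.0, None)
            for values in itertools.product(*grid.values()):
                config = dict(zip(grid.keys(), values))
                # identical seed list for every model/dataset pair
                accs = [fit_eval(ds, config, seed) for seed in range(n_seeds)]
                mean_acc = float(np.mean(accs))
                if mean_acc > best[0]:
                    best = (mean_acc, config)
            results[(ds_name, model_name)] = best
    return results
```

Selecting configurations from one shared grid and seed list, rather than per-paper tuning budgets, is what makes the resulting accuracy table comparable across baselines.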
MambaSL framework achieving state-of-the-art TSC performance
The authors develop MambaSL, a single-layer Mamba framework for time series classification that achieves state-of-the-art performance on the UEA benchmark with statistically significant improvements over baselines, demonstrating Mamba's standalone capacity for TSC.
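Claims of statistically significant improvement on a multi-dataset benchmark are usually backed by a paired test over per-dataset accuracies. The paper's exact test is not stated here; as a simple stand-in, the sketch below implements a two-sided paired sign test (Wilcoxon signed-rank is the more common choice for TSC benchmarks and would account for effect sizes as well).

```python
from math import comb

def sign_test_p(acc_a, acc_b):
    """Two-sided paired sign test over per-dataset accuracies
    (illustrative stand-in, not the paper's reported test).
    Returns the p-value for H0: each method wins equally often."""
    wins = sum(a > b for a, b in zip(acc_a, acc_b))
    losses = sum(a < b for a, b in zip(acc_a, acc_b))
    n = wins + losses                          # ties are dropped
    k = min(wins, losses)
    # binomial tail probability of a result at least this lopsided
    tail = sum(comb(n, i) for i in range(k + 1)) / 2 ** n
    return min(1.0, 2 * tail)                  # two-sided
```

For example, winning on 9 of 10 non-tied datasets gives p ≈ 0.021, below the conventional 0.05 threshold, whereas identical accuracies give p = 1.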