Numerion: A Multi-Hypercomplex Model for Time Series Forecasting

ICLR 2026 Conference Submission
Anonymous Authors
Keywords: Time Series Forecasting, Hypercomplex Numbers, Hypercomplex Time Series Models, Multi-Hypercomplex Space
Abstract:

Many methods aim to enhance time series forecasting by decomposing the series through intricate model structures and prior knowledge, yet they are inevitably limited by computational complexity and by how well their prior assumptions hold. Our research uncovers that in the complex domain and higher-order hypercomplex spaces, the characteristic frequencies of time series naturally decrease. Leveraging this insight, we propose Numerion, a time series forecasting model based on multiple hypercomplex spaces. Specifically, grounded in theoretical support, we generalize linear layers and activation functions to hypercomplex spaces of arbitrary power-of-two dimensions and introduce a novel Real-Hypercomplex-Real Domain Multi-Layer Perceptron (RHR-MLP) architecture. Numerion uses multiple RHR-MLPs to map time series into hypercomplex spaces of varying dimensions, naturally decomposing and independently modeling the series, and adaptively fuses the latent patterns exhibited in different spaces through a dynamic fusion mechanism. Experiments validate the model's performance, achieving state-of-the-art results on multiple public datasets. Visualizations and quantitative analyses demonstrate the ability of multi-dimensional RHR-MLPs to naturally decompose time series and reveal the tendency of higher-dimensional hypercomplex spaces to capture lower-frequency features.

Disclaimer
This report is AI-generated using large language models and WisPaper (a scholar search engine). It analyzes an academic paper's tasks and contributions against retrieved prior work. While this system identifies potential overlaps and novel directions, its coverage is not exhaustive and its judgments are approximate. These results are intended to assist human reviewers and should not be relied upon as a definitive verdict on novelty.
Note that some papers exist in multiple, slightly different versions (e.g., with different titles or URLs). The system may retrieve several versions of the same underlying work. The current automated pipeline does not reliably align or distinguish these cases, so human reviewers will need to disambiguate them manually.
If you have any questions, please contact: mingzhang23@m.fudan.edu.cn

Overview

Overall Novelty Assessment

The paper proposes Numerion, a time series forecasting model that maps sequences into multiple hypercomplex spaces of varying power-of-two dimensions, introducing a Real-Hypercomplex-Real MLP architecture with generalized linear layers and activation functions. It resides in the Multi-Dimensional Hypercomplex Architectures leaf, which contains four papers including the original work. This leaf sits within the broader Hypercomplex Neural Network Architectures branch, indicating a moderately populated research direction focused on extending beyond quaternion-only methods to higher-dimensional algebras like octonions and sedenions.

The taxonomy reveals neighboring leaves addressing Quaternion-Based Neural Architectures (five papers) and Spatiotemporal Hypercomplex Networks (four papers), alongside parallel branches for Frequency-Domain methods, Data Fusion, and Statistical Modeling. Numerion's emphasis on arbitrary power-of-two dimensions and natural frequency decomposition positions it at the intersection of architectural innovation and transform-based insights, diverging from purely quaternion-focused designs and from methods that rely on explicit Fourier or wavelet transforms. The scope notes clarify that higher-dimensional methods belong here, while quaternion-specific work and general theory reside elsewhere.

Among the thirty candidates examined, none clearly refuted any of the three contributions: ten candidates were compared against Contribution A (the RHR-MLP generalization), and ten each against Contribution B (the Numerion model) and Contribution C (the frequency-characteristic insight), all with zero refutable overlaps. This suggests that within the limited semantic search scope, the specific combination of arbitrary-dimensional hypercomplex layers, multi-space fusion, and the frequency-decomposition principle appears distinct from prior work, though the search scale precludes exhaustive coverage of the broader hypercomplex forecasting literature.

Based on top-thirty semantic matches, the work appears to occupy a relatively novel position within multi-dimensional hypercomplex architectures, though the analysis cannot rule out relevant prior art outside this candidate set. The taxonomy context indicates a moderately active subfield where architectural choices and algebraic dimensionality remain open research questions, lending plausibility to the claimed contributions while acknowledging the inherent limitations of a bounded literature search.

Taxonomy

44 Core-task Taxonomy Papers
3 Claimed Contributions
30 Contribution Candidate Papers Compared
0 Refutable Papers

Research Landscape Overview

Core task: time series forecasting using hypercomplex spaces. The field leverages hypercomplex algebras (quaternions, octonions, and higher-order structures) to capture multi-dimensional dependencies and correlations in temporal data. The taxonomy reveals a diverse landscape organized around several main branches. Hypercomplex Neural Network Architectures for Time Series develop specialized layers and recurrent models that exploit quaternion or octonion arithmetic for richer feature interactions, as seen in works like Deep Hypercomplex Networks[4] and Deep Hypercomplex Spatiotemporal[5]. Frequency-Domain and Transform-Based Hypercomplex Methods apply wavelet or Fourier transforms in hypercomplex domains to extract spectral patterns, exemplified by Hyper-Complex Wavelet Expression[8] and Hyper-Complex Frequency Aggregation[6]. Meanwhile, Hypercomplex Statistical Modeling and ARMA Methods extend classical autoregressive frameworks to quaternionic settings, such as ARMA Quaternions[12], and Domain-Specific Applications demonstrate practical deployments in finance, environmental monitoring, and beyond, including Hypercomplex Stock Forecasting[2] and Quaternion Air Quality[39].

Additional branches address learning algorithms, data fusion strategies, graph-structured correlations, and theoretical foundations, collectively illustrating how hypercomplex representations unify multi-channel or spatiotemporal signals within a single algebraic framework. A particularly active line of work explores multi-dimensional hypercomplex architectures that scale beyond quaternions to octonions and sedenions, enabling the modeling of increasingly complex inter-variable relationships. Numerion[0] sits squarely within this branch, emphasizing higher-order algebras for time series forecasting and sharing conceptual ground with Octonion Rainfall Runoff[21] and Metacognitive Sedenion[22], which similarly exploit extended hypercomplex spaces for environmental and cognitive prediction tasks.
Compared to Metacognitive Octonion[23], which integrates metacognitive learning strategies with octonion representations, Numerion[0] appears to focus more directly on leveraging the algebraic structure itself for temporal pattern extraction. Across these studies, a recurring theme is the trade-off between representational richness and computational complexity: while higher-dimensional hypercomplex spaces can encode intricate dependencies, they also demand careful design of training algorithms and initialization schemes. Open questions remain about optimal algebra selection for different forecasting horizons and the interpretability of learned hypercomplex weights in practical deployment scenarios.

Claimed Contributions

Generalization of linear layers and Tanh activation to arbitrary power-of-two hypercomplex spaces with RHR-MLP architecture

The authors extend standard linear transformations and the Tanh activation function from real numbers to hypercomplex spaces of any power-of-two dimension (complex, quaternion, octonion, sedenion, etc.). They propose the RHR-MLP architecture that maps inputs to hypercomplex spaces, processes them with hypercomplex linear layers and a novel HNTanh activation, then maps back to real space.

10 retrieved papers
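The Cayley-Dickson construction gives one concrete way to realize multiplication, and hence linear layers, in a hypercomplex algebra of any power-of-two dimension. The sketch below is illustrative NumPy, not the authors' code: `hntanh` is a hypothetical magnitude-wise generalization of Tanh standing in for the paper's HNTanh, whose exact definition is not given in this report, and `hyper_linear` is one plausible layer parameterization.

```python
import numpy as np

def cd_conj(x):
    """Cayley-Dickson conjugate of a hypercomplex number stored as a length-2^k array."""
    if x.shape[0] == 1:
        return x.copy()
    h = x.shape[0] // 2
    return np.concatenate([cd_conj(x[:h]), -x[h:]])

def cd_mul(a, b):
    """Recursive Cayley-Dickson product: n=2 gives complex, n=4 quaternion, n=8 octonion, ..."""
    n = a.shape[0]
    if n == 1:
        return a * b
    h = n // 2
    a1, a2, b1, b2 = a[:h], a[h:], b[:h], b[h:]
    return np.concatenate([
        cd_mul(a1, b1) - cd_mul(cd_conj(b2), a2),
        cd_mul(b2, a1) + cd_mul(a2, cd_conj(b1)),
    ])

def hntanh(x, eps=1e-8):
    """Hypothetical magnitude-wise Tanh: squash the norm, keep the direction.
    (A stand-in for the paper's HNTanh, whose exact form may differ.)"""
    r = np.linalg.norm(x)
    return x * np.tanh(r) / (r + eps)

def hyper_linear(W, b, x):
    """Hypercomplex linear layer: x is a list of hypercomplex inputs,
    W an (out, in) grid of hypercomplex weights, b a list of hypercomplex biases."""
    return [sum((cd_mul(W[j][i], x[i]) for i in range(len(x))),
                start=np.zeros_like(b[j])) + b[j]
            for j in range(len(b))]

# Quaternion sanity check: i * j should equal k.
i_ = np.array([0., 1., 0., 0.])
j_ = np.array([0., 0., 1., 0.])
print(cd_mul(i_, j_))   # [0. 0. 0. 1.]
```

In an RHR-MLP, a real-valued window would first be packed into such length-2^k coefficient arrays, passed through `hyper_linear` and `hntanh`, then projected back to the reals.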
Numerion: a multi-hypercomplex time series forecasting model with natural multi-frequency decomposition

Numerion is a forecasting model that employs multiple RHR-MLPs operating in parallel across different hypercomplex spaces (real, complex, quaternion, octonion, sedenion). It naturally decomposes time series into multi-frequency components without complex model structures, then adaptively fuses predictions from each space using a learned fusion mechanism.

10 retrieved papers
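The parallel-branch-plus-fusion design can be sketched as follows. This is a minimal stand-in, not the authors' implementation: each branch is a plain linear map in place of a full RHR-MLP, and the softmax fusion weights are initialized rather than learned; all names and sizes are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
L_IN, L_OUT = 96, 24
SPACE_DIMS = [1, 2, 4, 8, 16]   # real, complex, quaternion, octonion, sedenion

# One branch per hypercomplex space. A plain linear map stands in for the
# RHR-MLP (real -> 2^k-dim hypercomplex -> real) of the actual model.
branches = [rng.normal(scale=0.05, size=(L_OUT, L_IN)) for _ in SPACE_DIMS]
fusion_logits = np.zeros(len(SPACE_DIMS))   # learned jointly in the real model

def forward(x):
    preds = np.stack([W @ x for W in branches])   # (n_spaces, L_OUT)
    w = np.exp(fusion_logits - fusion_logits.max())
    w /= w.sum()                                  # softmax fusion weights
    return w @ preds                              # adaptive weighted sum

x = rng.normal(size=L_IN)
y = forward(x)
print(y.shape)   # (24,)
```

The design choice the report highlights is that decomposition happens implicitly: each branch sees the same input but, by operating in a different algebra, specializes to a different frequency band before the fusion step recombines them.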
Demonstration that higher-dimensional hypercomplex spaces naturally capture lower-frequency temporal features

Through visualization and quantitative analysis, the authors show that mapping time series to higher-dimensional hypercomplex spaces causes characteristic frequencies to naturally decrease, with higher-dimensional spaces (e.g., sedenions) primarily modeling low-frequency trends while lower-dimensional spaces (real, complex) capture high-frequency fluctuations.

10 retrieved papers
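One simple way to quantify a signal's characteristic frequency, in the spirit of the quantitative analysis described above (the authors' exact metric may differ), is the magnitude-weighted spectral centroid of its FFT:

```python
import numpy as np

def characteristic_frequency(sig):
    """Spectral centroid: FFT-magnitude-weighted mean of the positive frequencies,
    a generic proxy for a signal's characteristic frequency (in cycles/sample)."""
    spec = np.abs(np.fft.rfft(sig - sig.mean()))
    freqs = np.fft.rfftfreq(len(sig))
    return float((freqs * spec).sum() / (spec.sum() + 1e-12))

t = np.arange(512)
trend = np.sin(2 * np.pi * 0.01 * t)   # slow, trend-like component
fluct = np.sin(2 * np.pi * 0.20 * t)   # fast fluctuation
print(characteristic_frequency(trend) < characteristic_frequency(fluct))   # True
```

Applied to the component signals recovered from each hypercomplex space, such a metric would let one verify numerically that centroids fall as the algebra's dimension grows, matching the claimed sedenion-trend versus real/complex-fluctuation split.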

Core Task Comparisons

Comparisons with papers in the same taxonomy category

Contribution Analysis

Detailed comparisons for each claimed contribution

Contribution

Generalization of linear layers and Tanh activation to arbitrary power-of-two hypercomplex spaces with RHR-MLP architecture


Contribution

Numerion: a multi-hypercomplex time series forecasting model with natural multi-frequency decomposition


Contribution

Demonstration that higher-dimensional hypercomplex spaces naturally capture lower-frequency temporal features
