Numerion: A Multi-Hypercomplex Model for Time Series Forecasting
Overview
Overall Novelty Assessment
The paper proposes Numerion, a time series forecasting model that maps sequences into multiple hypercomplex spaces of varying power-of-two dimensions, introducing a Real-Hypercomplex-Real MLP (RHR-MLP) architecture with generalized linear layers and activation functions. It resides in the Multi-Dimensional Hypercomplex Architectures leaf, which contains four papers including the original work. This leaf sits within the broader Hypercomplex Neural Network Architectures branch, indicating a moderately populated research direction focused on extending beyond quaternion-only methods to higher-dimensional algebras such as octonions and sedenions.
The taxonomy reveals neighboring leaves addressing Quaternion-Based Neural Architectures (five papers) and Spatiotemporal Hypercomplex Networks (four papers), alongside parallel branches for Frequency-Domain methods, Data Fusion, and Statistical Modeling. Numerion's emphasis on arbitrary power-of-two dimensions and natural frequency decomposition positions it at the intersection of architectural innovation and transform-based insights, diverging from purely quaternion-focused designs and from methods that rely on explicit Fourier or wavelet transforms. The scope notes clarify that higher-dimensional methods belong here, while quaternion-specific work and general theory reside elsewhere.
Among the thirty candidates examined, none clearly refuted any of the three contributions. For Contribution A (the RHR-MLP generalization), ten candidates were examined and no refuting overlap was found; Contributions B (the Numerion model) and C (the frequency-characteristic insight) were each checked against ten candidates with the same outcome. This suggests that, within the limited semantic search scope, the specific combination of arbitrary-dimensional hypercomplex layers, multi-space fusion, and the frequency-decomposition principle appears distinct from prior work, though the search scale precludes exhaustive coverage of the broader hypercomplex forecasting literature.
Based on the top thirty semantic matches, the work appears to occupy a relatively novel position within multi-dimensional hypercomplex architectures, though the analysis cannot rule out relevant prior art outside this candidate set. The taxonomy context indicates a moderately active subfield in which architectural choices and algebraic dimensionality remain open research questions, lending plausibility to the claimed contributions while acknowledging the inherent limitations of a bounded literature search.
Taxonomy
Research Landscape Overview
Claimed Contributions
The authors extend standard linear transformations and the Tanh activation function from real numbers to hypercomplex spaces of any power-of-two dimension (complex, quaternion, octonion, sedenion, etc.). They propose the RHR-MLP architecture that maps inputs to hypercomplex spaces, processes them with hypercomplex linear layers and a novel HNTanh activation, then maps back to real space.
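To make the construction concrete, the sketch below implements the two generalized building blocks for an arbitrary power-of-two dimension: a hypercomplex linear layer whose weights multiply inputs via the recursive Cayley–Dickson product, and a magnitude-wise tanh used as a guessed stand-in for the paper's HNTanh. The real-to-hypercomplex lift, the projection back to real space, and all layer sizes are illustrative assumptions rather than the paper's exact design.

```python
import torch
import torch.nn as nn

def cd_conj(x: torch.Tensor) -> torch.Tensor:
    """Cayley–Dickson conjugate along the last axis (size = power of two)."""
    d = x.shape[-1]
    if d == 1:
        return x
    return torch.cat([cd_conj(x[..., : d // 2]), -x[..., d // 2:]], dim=-1)

def cd_mul(x: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
    """Cayley–Dickson product: (a, b)(c, e) = (ac - conj(e)b, ea + b conj(c))."""
    d = x.shape[-1]
    if d == 1:
        return x * y  # dimension 1 is the real line: ordinary multiplication
    a, b = x[..., : d // 2], x[..., d // 2:]
    c, e = y[..., : d // 2], y[..., d // 2:]
    return torch.cat([cd_mul(a, c) - cd_mul(cd_conj(e), b),
                      cd_mul(e, a) + cd_mul(b, cd_conj(c))], dim=-1)

class HypercomplexLinear(nn.Module):
    """Linear layer whose weights are hypercomplex numbers of dimension `dim`."""
    def __init__(self, in_features: int, out_features: int, dim: int):
        super().__init__()
        self.weight = nn.Parameter(0.02 * torch.randn(out_features, in_features, dim))
        self.bias = nn.Parameter(torch.zeros(out_features, dim))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, in_features, dim) -> (batch, out_features, dim)
        return cd_mul(self.weight, x.unsqueeze(1)).sum(dim=2) + self.bias

def hn_tanh(x: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Tanh applied to the hypercomplex magnitude (assumed HNTanh stand-in)."""
    norm = x.norm(dim=-1, keepdim=True)
    return torch.tanh(norm) * x / (norm + eps)

class RHRMLP(nn.Module):
    """Real -> Hypercomplex -> Real MLP, structure inferred from the description."""
    def __init__(self, seq_len: int, pred_len: int, dim: int, hidden: int = 64):
        super().__init__()
        self.dim = dim
        self.lift = nn.Linear(seq_len, hidden * dim)   # real -> hypercomplex (assumed)
        self.body = HypercomplexLinear(hidden, hidden, dim)
        self.proj = nn.Linear(hidden * dim, pred_len)  # hypercomplex -> real (assumed)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = self.lift(x).view(x.shape[0], -1, self.dim)  # (batch, hidden, dim)
        h = hn_tanh(self.body(h))
        return self.proj(h.flatten(1))                   # (batch, pred_len)
```

The recursive product means the same layer definition covers complex (dim=2), quaternion (dim=4), octonion (dim=8), and sedenion (dim=16) arithmetic without per-algebra multiplication tables.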
Numerion is a forecasting model that employs multiple RHR-MLPs operating in parallel across different hypercomplex spaces (real, complex, quaternion, octonion, sedenion). It naturally decomposes time series into multi-frequency components without dedicated decomposition structures, then adaptively fuses the predictions from each space using a learned fusion mechanism.
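A minimal sketch of this parallel multi-space design, reusing the RHRMLP class from the sketch above, is shown below. One branch runs per algebra, and a softmax over learned per-branch logits stands in for the paper's adaptive fusion mechanism, whose exact form is not reproduced here.

```python
import torch
import torch.nn as nn

class NumerionSketch(nn.Module):
    """Parallel RHR-MLP branches over hypercomplex dimensions 1, 2, 4, 8, 16."""
    def __init__(self, seq_len: int, pred_len: int, dims=(1, 2, 4, 8, 16)):
        super().__init__()
        # One branch per algebra: real, complex, quaternion, octonion, sedenion.
        self.branches = nn.ModuleList(RHRMLP(seq_len, pred_len, dim=d) for d in dims)
        self.fusion_logits = nn.Parameter(torch.zeros(len(dims)))  # assumed fusion

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        preds = torch.stack([branch(x) for branch in self.branches], dim=-1)
        weights = torch.softmax(self.fusion_logits, dim=0)
        return (preds * weights).sum(dim=-1)  # weighted blend of branch forecasts

# Example: forecast 24 steps from a length-96 window.
model = NumerionSketch(seq_len=96, pred_len=24)
forecast = model(torch.randn(8, 96))  # shape (8, 24)
```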
Through visualization and quantitative analysis, the authors show that mapping time series to higher-dimensional hypercomplex spaces causes characteristic frequencies to naturally decrease, with higher-dimensional spaces (e.g., sedenions) primarily modeling low-frequency trends while lower-dimensional spaces (real, complex) capture high-frequency fluctuations.
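One way this kind of claim can be checked quantitatively is with a power-weighted spectral centroid as a proxy for "characteristic frequency". The helper below is illustrative analysis code, not the paper's metric, and branch_outputs is a hypothetical list of per-branch predictions.

```python
import numpy as np

def spectral_centroid(signal: np.ndarray, fs: float = 1.0) -> float:
    """Power-weighted mean frequency of a 1-D signal (DC component removed)."""
    spectrum = np.abs(np.fft.rfft(signal - signal.mean())) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    return float((freqs * spectrum).sum() / (spectrum.sum() + 1e-12))

# Under the paper's claim, the centroid should shrink as the branch's
# hypercomplex dimension grows (e.g., the sedenion branch tracks the trend):
# for d, y in zip((1, 2, 4, 8, 16), branch_outputs):
#     print(f"dim={d:2d}  characteristic frequency ~ {spectral_centroid(y):.4f}")
```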
Core Task Comparisons
Comparisons with papers in the same taxonomy category
[21] Rainfall–runoff modelling using octonion-valued neural networks PDF
[22] Metacognitive sedenion-valued neural network and its learning algorithm PDF
[23] Metacognitive octonion-valued neural networks as they relate to time series analysis PDF
Contribution Analysis
Detailed comparisons for each claimed contribution
Generalization of linear layers and Tanh activation to arbitrary power-of-two hypercomplex spaces with RHR-MLP architecture
The authors extend standard linear transformations and the Tanh activation function from real numbers to hypercomplex spaces of any power-of-two dimension (complex, quaternion, octonion, sedenion, etc.). They propose the RHR-MLP architecture that maps inputs to hypercomplex spaces, processes them with hypercomplex linear layers and a novel HNTanh activation, then maps back to real space.
[5] Deep hypercomplex networks for spatiotemporal data processing: Parameter efficiency and superior performance PDF
[11] Hypercomplex signal processing in digital twin of the ocean: theory and application PDF
[65] Quaternionic Convolutional Neural Networks with Trainable Bessel Activation Functions PDF
[66] … neural networks and their relationship with real and hypercomplex-valued neural networks: Incorporating intercorrelation between features into neural networks … PDF
[67] QCNN-H: Single-image dehazing using quaternion neural networks PDF
[68] Synchronization of hypercomplex neural networks with mixed time-varying delays PDF
[69] Towards Explaining Hypercomplex Neural Networks PDF
[70] Understanding vector-valued neural networks and their relationship with real and hypercomplex-valued neural networks: Incorporating intercorrelation between … PDF
[71] A survey of quaternion neural networks PDF
[72] Hypercomplex neural networks: Exploring quaternion, octonion, and beyond in deep learning PDF
Numerion: a multi-hypercomplex time series forecasting model with natural multi-frequency decomposition
Numerion is a forecasting model that employs multiple RHR-MLPs operating in parallel across different hypercomplex spaces (real, complex, quaternion, octonion, sedenion). It naturally decomposes time series into multi-frequency components without dedicated decomposition structures, then adaptively fuses the predictions from each space using a learned fusion mechanism.
[55] FEDformer: Frequency enhanced decomposed transformer for long-term series forecasting PDF
[56] A survey on deep learning based time series analysis with frequency transformation PDF
[57] KEDformer: Knowledge extraction seasonal trend decomposition for long-term sequence prediction PDF
[58] Frequency-domain MLPs are more effective learners in time series forecasting PDF
[59] A Spatial–Temporal Time Series Decomposition for Improving Independent Channel Forecasting PDF
[60] Time series online forecasting based on sequence decomposition learning networks PDF
[61] TimeKAN: KAN-based frequency decomposition learning architecture for long-term time series forecasting PDF
[62] Time Series Diffusion in the Frequency Domain PDF
[63] Learning hierarchical time–frequency representation for long-term time series forecasting PDF
[64] Autoformer: Decomposition transformers with auto-correlation for long-term series forecasting PDF
Demonstration that higher-dimensional hypercomplex spaces naturally capture lower-frequency temporal features
Through visualization and quantitative analysis, the authors show that mapping time series to higher-dimensional hypercomplex spaces causes characteristic frequencies to naturally decrease, with higher-dimensional spaces (e.g., sedenions) primarily modeling low-frequency trends while lower-dimensional spaces (real, complex) capture high-frequency fluctuations.