Enhanced Forecasting Model Using Transformers, Extended Long-Short-Term Memory, and Randomized Fuzzy Cognitive Maps
Abstract
This paper presents TxL-RHFCM, a hybrid deep learning model for time series forecasting. The architecture combines Transformers, extended Long Short-Term Memory (xLSTM), and randomized high-order fuzzy cognitive maps (R-HFCM), with the latter serving as a reservoir computing mechanism. The model is structured into four components. First, the Transformer block processes the raw time series, employing attention mechanisms to uncover relationships between sequence elements and produce refined representations. These transformed sequences are then passed to the xLSTM unit, which computes hidden states from the final tokens. Next, the R-HFCM module maps these hidden states to concept activation states. Finally, a feed-forward network extracts features from the R-HFCM output to generate the forecast. The proposed TxL-RHFCM model is evaluated on six benchmark case studies for electrical load forecasting. Experimental results demonstrate superior predictive accuracy compared to state-of-the-art deep learning, machine learning, and fuzzy time series approaches.
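The four-stage pipeline described above can be sketched end to end. This is a minimal NumPy illustration, not the authors' implementation: the xLSTM is stood in for by a plain recurrent cell, the attention block is single-head, and all names, dimensions, and weights are hypothetical; in the actual model only the reservoir (R-HFCM) weights are fixed at random while the other components are trained.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_block(X, Wq, Wk, Wv):
    # Stage 1: single-head self-attention producing refined token representations.
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = softmax(Q @ K.T / np.sqrt(K.shape[1]))
    return scores @ V

def recurrent_hidden(Z, Wx, Wh):
    # Stage 2: simplified stand-in for the xLSTM unit; returns the
    # hidden state after consuming the sequence (i.e., at the final token).
    h = np.zeros(Wh.shape[0])
    for z in Z:
        h = np.tanh(Wx @ z + Wh @ h)
    return h

def rhfcm_states(h, W_res):
    # Stage 3: randomized FCM as a reservoir -- fixed random concept
    # weights, sigmoid activation giving concept activation states.
    return 1.0 / (1.0 + np.exp(-(W_res @ h)))

# Toy dimensions: a window of 8 lagged load values embedded in 4 dims,
# mapped onto 16 reservoir concepts (all values are illustrative).
T, d, n_concepts = 8, 4, 16
X = rng.standard_normal((T, d))
Wq, Wk, Wv = (rng.standard_normal((d, d)) for _ in range(3))
Wx, Wh = rng.standard_normal((d, d)), rng.standard_normal((d, d))
W_res = rng.standard_normal((n_concepts, d))   # fixed, untrained reservoir
w_out = rng.standard_normal(n_concepts)        # trained readout in the real model

Z = attention_block(X, Wq, Wk, Wv)   # 1) Transformer block
h = recurrent_hidden(Z, Wx, Wh)      # 2) xLSTM-style hidden state
a = rhfcm_states(h, W_res)           # 3) R-HFCM concept activations
forecast = float(w_out @ a)          # 4) feed-forward readout (linear here)
print(forecast)
```

Here the feed-forward readout is collapsed to a single linear layer for brevity; in the paper it is a full network extracting features from the concept activations.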
Published
29/09/2025
How to Cite
ORANG, Omid; SILVA, Petrônio C. L.; GUIMARÃES, Frederico G. Enhanced Forecasting Model Using Transformers, Extended Long-Short-Term Memory, and Randomized Fuzzy Cognitive Maps. In: BRAZILIAN CONFERENCE ON INTELLIGENT SYSTEMS (BRACIS), 35., 2025, Fortaleza/CE. Anais [...]. Porto Alegre: Sociedade Brasileira de Computação, 2025. p. 424-438. ISSN 2643-6264.
