Gaoyong Lu1, Yang Ou1, Zhihong Wang2, Yingnan Qu2, Yingsheng Xia2, Dibin Tang2, Igor Kotenko3, Wei Li2,4,*
CMC-Computers, Materials & Continua, Vol. 85, No. 2, pp. 2403-2441, 2025. DOI: 10.32604/cmc.2025.068024. Published 23 September 2025.
Abstract: Deep learning (DL) has revolutionized time series forecasting (TSF), surpassing traditional statistical methods (e.g., ARIMA) and machine learning techniques in modeling complex nonlinear dynamics and long-term dependencies prevalent in real-world temporal data. This comprehensive survey reviews state-of-the-art DL architectures for TSF, focusing on four core paradigms: (1) Convolutional Neural Networks (CNNs), adept at extracting localized temporal features; (2) Recurrent Neural Networks (RNNs) and their advanced variants (LSTM, GRU), designed for sequential dependency modeling; (3) Graph Neural Networks (GNNs), specialized for forecasting structured relational data with spatial-temporal dependencies; and (4) Transformer-based models, leveraging self-attention mechanisms to…
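As a toy illustration of paradigm (1), the localized feature extraction a CNN performs over a series reduces to a sliding weighted window. The sketch below (plain Python, illustrative only; `conv1d` and the edge-detector kernel are assumptions for this example, not from the survey) shows a valid-mode 1D convolution highlighting local changes in a series:

```python
def conv1d(series, kernel):
    """Valid-mode 1D convolution (cross-correlation): each output value
    is a localized weighted sum over a sliding window of the series."""
    k = len(kernel)
    return [sum(series[i + j] * kernel[j] for j in range(k))
            for i in range(len(series) - k + 1)]

# A difference kernel [-1, 1] responds only where neighboring values change,
# mimicking how a learned CNN filter picks out localized temporal features.
series = [1.0, 1.0, 1.0, 5.0, 5.0, 5.0]
print(conv1d(series, [-1.0, 1.0]))  # -> [0.0, 0.0, 4.0, 0.0, 0.0]
```

In a trained CNN forecaster the kernel weights are learned rather than hand-set, and many such filters are stacked with nonlinearities, but the windowed computation is the same.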