transformerForecasting

Transformer Deep Learning Model for Time Series Forecasting

Time series forecasting is challenging because the data are often non-stationary, nonlinear, and chaotic. Traditional deep learning models such as the Recurrent Neural Network (RNN), Long Short-Term Memory (LSTM), and Gated Recurrent Unit (GRU) process data sequentially and become inefficient on long sequences. To overcome these limitations, the authors propose a transformer-based deep learning architecture whose attention mechanism processes all time steps in parallel, improving both prediction accuracy and efficiency. This package provides user-friendly code implementing that architecture. References: Nayak et al. (2024) <doi:10.1007/s40808-023-01944-7> and Nayak et al. (2024) <doi:10.1016/j.simpa.2024.100716>.
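The description singles out the attention mechanism as the source of the parallelism. As a concrete illustration (a minimal base-R sketch, not code from this package; all function and variable names below are mine), here is standard scaled dot-product attention, Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V:

# Row-wise softmax; subtracting the row maximum keeps exp() numerically stable.
softmax_rows <- function(m) {
  e <- exp(m - apply(m, 1, max))
  e / rowSums(e)
}

# Scaled dot-product attention: each row of the output is a weighted
# combination of the value vectors, with weights given by query-key similarity.
scaled_dot_product_attention <- function(Q, K, V) {
  d_k <- ncol(K)
  scores <- (Q %*% t(K)) / sqrt(d_k)  # pairwise similarity of queries and keys
  weights <- softmax_rows(scores)     # attention weights, one row per query
  weights %*% V                       # weighted sum of values
}

set.seed(1)
n <- 6; d <- 4                        # 6 time steps, 4-dimensional embeddings
X <- matrix(rnorm(n * d), n, d)
# Self-attention: queries, keys, and values all come from the same sequence,
# so every time step attends to every other step at once.
out <- scaled_dot_product_attention(X, X, X)
dim(out)                              # 6 x 4: one context-aware vector per step

Because the whole weight matrix is produced by a single pair of matrix multiplications rather than a step-by-step scan, all time steps are processed in parallel, which is what lets transformers sidestep the sequential recurrence of RNN, LSTM, and GRU models.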

Versions across snapshots

Version  Snapshot    Repository              File                                 Size
0.1.0    rolling     linux/jammy R-4.5       transformerForecasting_0.1.0.tar.gz  70.9 KiB
0.1.0    rolling     linux/noble R-4.5       transformerForecasting_0.1.0.tar.gz  70.8 KiB
0.1.0    rolling     source/ R-              transformerForecasting_0.1.0.tar.gz  68.9 KiB
0.1.0    latest      linux/jammy R-4.5       transformerForecasting_0.1.0.tar.gz  70.9 KiB
0.1.0    latest      linux/noble R-4.5       transformerForecasting_0.1.0.tar.gz  70.8 KiB
0.1.0    latest      source/ R-              transformerForecasting_0.1.0.tar.gz  68.9 KiB
0.1.0    2026-04-26  source/ R-              transformerForecasting_0.1.0.tar.gz  68.9 KiB
0.1.0    2026-04-23  source/ R-              transformerForecasting_0.1.0.tar.gz  68.9 KiB
0.1.0    2026-04-09  windows/windows R-4.5   transformerForecasting_0.1.0.zip     74.0 KiB
0.1.0    2025-04-20  source/ R-              transformerForecasting_0.1.0.tar.gz  68.9 KiB
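To pin a build to one of the dated snapshots above, point install.packages() at the snapshot's repository URL. The base URL below is a placeholder, not this hub's real address; the date-stamped path pattern is an assumption modeled on common snapshot-mirror conventions (e.g. Posit Package Manager), and the 2025-04-20 date is taken from the table.

# HYPOTHETICAL snapshot URL -- substitute the hub's actual address.
snapshot_repo <- "https://example.org/crandore/2025-04-20"

# Installs the 0.1.0 source tarball listed for that snapshot date.
install.packages("transformerForecasting", repos = snapshot_repo)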

Dependencies (latest)

Imports

Suggests