Engineering, Technology and Applied Science Research, vol. 16, no. 1, pp. 32524-32533, 2026 (Scopus)
The unpredictable and highly dynamic nature of cryptocurrency markets has driven researchers to develop advanced forecasting techniques that can support decision-making in trading and risk management. This study proposes a hybrid deep learning framework that combines a Transformer with recurrent models for multi-step Bitcoin price forecasting. The model operates on log-differenced closing prices and is evaluated for 7-, 14-, and 21-day-ahead prediction using a recursive multi-step forecasting scheme. In addition to the proposed Transformer-based architectures, a comprehensive comparison was conducted against four benchmark recurrent models, namely Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU), along with their bidirectional variants, Bidirectional LSTM (BiLSTM) and Bidirectional GRU (BiGRU). Model performance was assessed using standard regression metrics, namely Mean Absolute Error (MAE), Root Mean Square Error (RMSE), and Mean Absolute Percentage Error (MAPE), together with Directional Accuracy (DA) to quantify the accuracy of predicted price movements. The experimental results show that the Transformer-based models outperform standalone recurrent architectures across all forecasting horizons, with the Transformer-LSTM achieving the lowest error values and strong trend-tracking behavior. These findings highlight the effectiveness of hybrid attention-recurrent architectures for modeling the nonlinear and volatile dynamics of Bitcoin markets.
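The abstract names three concrete ingredients of the evaluation pipeline: log-differencing of closing prices, recursive multi-step forecasting, and the MAE/RMSE/MAPE/DA metrics. The sketch below illustrates how these pieces fit together in NumPy; it is a minimal illustration, not the paper's implementation, and the function names (`log_diff`, `recursive_forecast`, etc.) and the placeholder one-step `model` callable are assumptions introduced here for clarity.

```python
import numpy as np

def log_diff(prices):
    """Log-differenced series: r_t = ln(p_t) - ln(p_{t-1})."""
    return np.diff(np.log(prices))

def recursive_forecast(model, history, horizon):
    """Recursive multi-step forecasting: the model predicts one step
    ahead, and each prediction is appended to the input window and fed
    back in to produce the next step (hypothetical `model` is any
    callable mapping a window of log returns to a one-step prediction)."""
    window = list(history)
    preds = []
    for _ in range(horizon):
        step = float(model(np.array(window)))
        preds.append(step)
        window = window[1:] + [step]  # slide the window forward
    return np.array(preds)

# --- evaluation metrics named in the abstract ---
def mae(y_true, y_pred):
    return np.mean(np.abs(y_true - y_pred))

def rmse(y_true, y_pred):
    return np.sqrt(np.mean((y_true - y_pred) ** 2))

def mape(y_true, y_pred):
    # expressed as a percentage; assumes y_true has no zeros
    return np.mean(np.abs((y_true - y_pred) / y_true)) * 100.0

def directional_accuracy(y_true, y_pred):
    """Fraction of steps where the predicted sign of the movement
    matches the actual sign."""
    return np.mean(np.sign(y_true) == np.sign(y_pred))
```

Forecasting log returns rather than raw prices stabilizes the input scale; predicted prices can be recovered by cumulatively summing the predicted returns and exponentiating from the last observed price.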