Stock Price Forecasting Using Deep Learning: A Comparative Study of CNN, LSTM, and Transformer Models with Feature Engineering Insights

Authors

  • Jiyu Ning

DOI:

https://doi.org/10.61173/j8z47028

Keywords:

LSTM, Transformer, CNN, Stock Forecasting, Feature Engineering

Abstract

Accurate forecasting of stock prices remains a challenging yet essential task in financial modeling. This paper investigates the effectiveness of convolutional neural network (CNN), long short-term memory (LSTM), and Transformer architectures for predicting the average stock price over a 30-day horizon. Historical market data from 12 representative Hong Kong-listed companies are used to evaluate each model, both with and without traditional technical indicators (Type 1) and derived statistical features (Type 2). Results show that the Transformer model trained on raw price and volume data achieves the best performance, recording the lowest Root Mean Square Error (RMSE = 0.3799), the lowest Mean Absolute Error (MAE = 0.2909), and the highest directional accuracy (51.00%), outperforming the other models and exceeding the random baseline. In contrast, integrating the technical indicators decreases performance, suggesting potential overfitting or feature redundancy. The study underscores the effectiveness of attention-based architectures in financial time series prediction and highlights the importance of careful feature selection when incorporating domain-specific indicators. These findings offer practical guidance to quantitative analysts and financial data scientists on model architecture choices and feature engineering strategies for stock forecasting tasks.
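
The abstract evaluates models with three metrics: RMSE, MAE, and directional accuracy. The sketch below is a minimal, illustrative implementation of those metrics and is not taken from the paper; the use of NumPy, the array names, and the convention of judging direction against the last observed closing price are assumptions for illustration only.

```python
# Illustrative sketch of the evaluation metrics named in the abstract
# (RMSE, MAE, directional accuracy). Not the authors' code.
import numpy as np


def rmse(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    """Root Mean Square Error between actual and predicted 30-day average prices."""
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))


def mae(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    """Mean Absolute Error between actual and predicted 30-day average prices."""
    return float(np.mean(np.abs(y_true - y_pred)))


def directional_accuracy(y_true: np.ndarray, y_pred: np.ndarray,
                         last_price: np.ndarray) -> float:
    """Fraction of samples where the predicted move (up or down relative to the
    last observed price) matches the actual move. 0.50 is the random baseline."""
    actual_dir = np.sign(y_true - last_price)
    pred_dir = np.sign(y_pred - last_price)
    return float(np.mean(actual_dir == pred_dir))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    last = rng.uniform(50.0, 150.0, size=200)            # last observed closing price
    actual = last * (1 + rng.normal(0.0, 0.05, 200))     # actual 30-day average price
    pred = actual + rng.normal(0.0, 0.5, 200)            # hypothetical model forecast
    print(f"RMSE: {rmse(actual, pred):.4f}")
    print(f"MAE: {mae(actual, pred):.4f}")
    print(f"Directional accuracy: {directional_accuracy(actual, pred, last):.2%}")
```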

Published

2025-10-23

Section

Articles