PERTANIKA JOURNAL OF SCIENCE AND TECHNOLOGY

e-ISSN 2231-8526
ISSN 0128-7680


Stock Trend Prediction Using Multi-attention Network on Domain-specific and Domain-general Features in News Headline

Phaik Ching Soon, Tien-Ping Tan, Huah Yong Chan and Keng Hoon Gan

Pertanika Journal of Science & Technology, Volume 33, Issue 2, March 2025

DOI: https://doi.org/10.47836/pjst.33.2.13

Keywords: Domain-general features, domain-specific features, multi-attention network, news sentiment analysis, stock price trend

Published on: 2025-03-07

In stock market prediction, using news headlines to anticipate stock trends has become increasingly important. Analyzing the sentiment of these headlines makes it possible to predict the stock price trend of a targeted company and profit from the resulting trades. This study examines the impact of company-related news headlines on stock price trends. The objectives of this study are as follows. First, we propose a multi-attention network that combines the strengths of long short-term memory (LSTM) and bidirectional encoder representations from transformers (BERT) to model domain-specific and domain-general features in news headlines and predict the stock price trends of companies. Second, the proposed model can capture and evaluate the effect of news on the stock price trends of different companies. Third, we construct a Bursa Malaysia news headline dataset and automatically align headlines with the target companies and their stock price trends. This study proposes the LSTM WITH ATTENTION + BERT model, which uses domain-specific and domain-general features to predict stock price trends from news headlines. The proposed model is compared with several conventional machine learning models and deep learning models. The LSTM WITH ATTENTION + BERT model achieved an accuracy of 50.68%, a notable improvement over the other approaches: it surpassed the Decision Tree by 11.2%, Naïve Bayes by 20.13%, and the Support Vector Machine by 5.12%. Compared with the CNN, LSTM, and BERT models, the proposed model is 4.27%, 2.91%, and 1.64% higher in accuracy, respectively. These results highlight the strength of the proposed model.
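The abstract describes the architecture only at a high level. The following is a minimal PyTorch sketch of how an LSTM-with-attention branch (domain-specific token features) could be fused with a BERT sentence embedding (domain-general features) for trend classification. All dimensions, the three-class output, and fusion by concatenation are illustrative assumptions for this sketch, not the authors' implementation.

```python
# Minimal sketch; class count, dimensions, and concatenation-based fusion
# are assumptions, not the architecture reported in the paper.
import torch
import torch.nn as nn


class MultiAttentionStockTrend(nn.Module):
    """LSTM with attention over headline token embeddings (domain-specific),
    fused with a precomputed BERT sentence embedding (domain-general)."""

    def __init__(self, token_dim=128, hidden_dim=128, bert_dim=768, num_classes=3):
        super().__init__()
        self.lstm = nn.LSTM(token_dim, hidden_dim, batch_first=True)
        self.attn = nn.Linear(hidden_dim, 1)          # one attention score per time step
        self.classifier = nn.Linear(hidden_dim + bert_dim, num_classes)

    def forward(self, tokens, bert_cls):
        # tokens:   (batch, seq_len, token_dim)  domain-specific token embeddings
        # bert_cls: (batch, bert_dim)            domain-general BERT [CLS] embedding
        outputs, _ = self.lstm(tokens)                      # (batch, seq_len, hidden_dim)
        weights = torch.softmax(self.attn(outputs), dim=1)  # normalize over time steps
        context = (weights * outputs).sum(dim=1)            # attention-weighted summary
        fused = torch.cat([context, bert_cls], dim=-1)      # combine both feature views
        return self.classifier(fused)                       # logits over trend classes


# Usage with random tensors standing in for real embeddings.
model = MultiAttentionStockTrend()
logits = model(torch.randn(4, 20, 128), torch.randn(4, 768))
print(logits.shape)  # torch.Size([4, 3])
```

In this sketch the attention weights indicate which headline tokens drive the prediction, while the BERT embedding contributes general language context; the actual model in the paper may differ in how the two feature streams are attended over and combined.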