Bi-directional Encoder Representations from Transformers-Based Sentiment Analysis of Consumer Reviews


GÖKER H.

Sakarya University Journal of Computer and Information Sciences, vol. 8, no. 3, pp. 484-495, 2025 (Scopus)

Abstract

Structured data has a standardized format that makes it easy to access, organize, and categorize. However, approximately 95% of data, such as text files and online reviews, is unstructured and follows no standard rules. Analyzing unstructured data, especially when the volume of data is substantial, requires considerable effort, cost, and time, and classical statistical methods are often insufficient. Transformer models, the latest generation of models in natural language processing (NLP), are the strongest candidates to overcome these limitations. In this paper, we propose a solution based on bi-directional encoder representations from transformers (BERT) for sentiment analysis of consumer reviews. The dataset comprises 10,975 consumer reviews of technological products from an e-commerce platform and was transformed into a structured dataset through data preprocessing. We then compared the performance of the BERT transformer model with deep learning models, specifically convolutional neural networks (CNN), long short-term memory (LSTM), and bidirectional long short-term memory (B-LSTM). Experimental results confirmed that the BERT transformer model achieved a kappa of 96.6% and an overall accuracy of 97.78% for multi-class classification of consumer reviews. The proposed transformer-based model outperforms the state-of-the-art models, providing a reliable and efficient solution.
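The two evaluation metrics reported above, Cohen's kappa and overall accuracy, can be computed directly from true and predicted labels. A minimal sketch in plain Python follows; the three-class labels and toy predictions are illustrative placeholders, not data from the paper's dataset:

```python
from collections import Counter

def accuracy(y_true, y_pred):
    """Fraction of predictions that match the true labels."""
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def cohen_kappa(y_true, y_pred):
    """Agreement corrected for chance: (p_o - p_e) / (1 - p_e)."""
    n = len(y_true)
    labels = set(y_true) | set(y_pred)
    # Observed agreement
    p_o = sum(t == p for t, p in zip(y_true, y_pred)) / n
    # Expected agreement from the marginal label frequencies
    true_counts = Counter(y_true)
    pred_counts = Counter(y_pred)
    p_e = sum(true_counts[l] * pred_counts[l] for l in labels) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical multi-class sentiment labels for illustration only
y_true = ["pos", "neg", "neu", "pos", "neg", "pos"]
y_pred = ["pos", "neg", "neu", "pos", "pos", "pos"]
print(round(accuracy(y_true, y_pred), 3))     # → 0.833
print(round(cohen_kappa(y_true, y_pred), 3))  # → 0.714
```

Kappa is the stricter of the two metrics because it discounts agreement expected by chance, which is why papers on imbalanced review data often report it alongside raw accuracy.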