Bidirectional Encoder Representations from Transformers (BERT) and Serialized Multi-Layer Multi-Head Attention Feature Location Model for Aspect-Level Sentiment Analysis


I. Anette Regina, Dr. P. Sengottuvelan

Abstract

With the popularity of internet social platforms, sentiment analysis has become one of the most active topics in Natural Language Processing (NLP) and has attracted considerable attention in recent years. The purpose of Aspect-level Sentiment Classification (ASC) is to expose the sentiment polarity of users' opinions on a specific aspect in a text. ASC comprises two distinct subtasks: Aspect Extraction (AE) and labeling the Aspects with Sentiment Polarity (ALSA). However, existing ALSA methods mainly rely on attention mechanisms and recurrent neural networks; they lack sensitivity to the position of aspect words and tend to ignore long-term dependencies. In this paper, we argue that the prediction of aspect-level sentiment polarity depends on both the context and the target. A feature location method based on Bidirectional Encoder Representations from Transformers (BERT) and Serialized Multi-layer Multi-Head Attention (SMMHA) is proposed to solve the ALSA problem. Specifically, a pretrained BERT model is used to mine additional aspect-level auxiliary information from the comment context. To learn the expression features of aspect words and the interactive information between aspect words and their context, the SMMHA feature extraction method is introduced for ASC; it better captures the sentiment features in short texts. The method is evaluated on the Amazon Customer Review Dataset, collected from Amazon, by extracting aspect terms from each review and applying classification algorithms to score each review. The study shows that the proposed method achieves higher accuracy, precision, recall, and F1-score in sentiment prediction than existing methods.
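The abstract does not give implementation details, so the following is only a minimal sketch of the general idea it describes: a pretrained BERT encoder conditioned on both the review context and the target aspect, followed by a serialized stack of multi-head attention layers and a polarity classifier. It assumes PyTorch and the Hugging Face transformers library; the layer counts, head counts, pooling choice, and all names here are illustrative assumptions, not the authors' exact SMMHA model.

```python
# Sketch of a BERT + serialized multi-head attention classifier for
# aspect-level sentiment classification. Hyperparameters are assumptions.
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer


class BertSMMHAClassifier(nn.Module):
    """BERT encoder followed by multi-head attention layers applied in
    series (a "serialized" stack), then a sentiment-polarity classifier."""

    def __init__(self, num_polarities=3, num_layers=3, num_heads=8):
        super().__init__()
        self.bert = BertModel.from_pretrained("bert-base-uncased")
        hidden = self.bert.config.hidden_size  # 768 for bert-base
        # Serialized multi-layer multi-head attention: each layer attends
        # over the output of the previous one.
        self.attn_layers = nn.ModuleList(
            nn.MultiheadAttention(hidden, num_heads, batch_first=True)
            for _ in range(num_layers)
        )
        self.classifier = nn.Linear(hidden, num_polarities)

    def forward(self, input_ids, attention_mask):
        # Encode "[CLS] context [SEP] aspect [SEP]" as a sentence pair so
        # the representation depends on both the context and the target.
        hidden = self.bert(input_ids, attention_mask=attention_mask).last_hidden_state
        pad_mask = attention_mask == 0  # True where tokens are padding
        out = hidden
        for attn in self.attn_layers:
            out, _ = attn(out, out, out, key_padding_mask=pad_mask)
        # Mean-pool over real tokens, then predict sentiment polarity.
        mask = attention_mask.unsqueeze(-1).float()
        pooled = (out * mask).sum(1) / mask.sum(1)
        return self.classifier(pooled)


if __name__ == "__main__":
    tok = BertTokenizer.from_pretrained("bert-base-uncased")
    model = BertSMMHAClassifier()
    enc = tok("The battery life is great but the screen is dim.",
              "battery life", return_tensors="pt")
    logits = model(enc["input_ids"], enc["attention_mask"])
    print(logits.softmax(-1))  # e.g. negative / neutral / positive
```

Feeding the aspect term as the second segment of a BERT sentence pair is one common way to make the encoding target-aware; the stacked attention layers then refine how context tokens relate to the aspect before pooling.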


