
What is ABSA?

Aspect-based sentiment analysis (ABSA) is the task of determining the sentiment polarity (i.e., the negative or positive emotional implication) of specific aspect words in a sentence. While people may find this task obvious and simple, computational models find it significantly more difficult.

Why is LMIAN used for ABSA?

Existing studies have recognised the significance of interactive learning in ABSA and have created numerous approaches that model aspect words and their contexts through interactive learning. However, these systems usually use a superficial interactive approach to describe aspect words and their contexts, which may result in a loss of fine-grained sentiment information. To address this issue, we propose a Lightweight Multilayer Interactive Attention Network (LMIAN) for ABSA.

How does LMIAN approach ABSA?

We begin by initialising word embedding vectors with a pre-trained language model. Second, an interactive computational layer is designed to establish relationships between aspect words and their surrounding context; multiple computational layers built on neural attention models calculate the degree of correlation between them. Third, we employ a parameter-sharing technique among the computational layers. This enables the model to learn complex sentiment features while using less memory. Extensive experiments have shown that LMIAN provides a superior balance between model performance, size, and GPU memory consumption. In the future, we will further optimise our interactive attention model to achieve higher performance while using less GPU memory. Deep interaction of local features with aspect words is one of our ideas for improving model performance by reducing interference from irrelevant information.
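To make the description above more concrete, here is a minimal sketch in PyTorch of a multilayer interactive attention block with parameter sharing. It is an illustration of the general idea (context attends to the aspect, the aspect attends to the context, and one layer's weights are reused at every depth), not the authors' exact LMIAN architecture; all class names, dimensions, and the pooling/residual choices are assumptions made for the example.

```python
# Illustrative sketch only: a shared interactive attention layer stacked
# several times, loosely following the description of LMIAN above.
import torch
import torch.nn as nn
import torch.nn.functional as F


class InteractiveAttentionLayer(nn.Module):
    """One interactive computation layer: context words attend to a pooled
    aspect representation, and aspect words attend to a pooled context."""

    def __init__(self, hidden_dim):
        super().__init__()
        # Bilinear score projections for the two attention directions.
        self.ctx_to_aspect = nn.Linear(hidden_dim, hidden_dim, bias=False)
        self.aspect_to_ctx = nn.Linear(hidden_dim, hidden_dim, bias=False)

    def forward(self, context, aspect):
        # context: (batch, ctx_len, hidden); aspect: (batch, asp_len, hidden)
        asp_pool = aspect.mean(dim=1, keepdim=True)       # (batch, 1, hidden)
        ctx_pool = context.mean(dim=1, keepdim=True)      # (batch, 1, hidden)

        # Weight context words by their relevance to the pooled aspect.
        ctx_scores = torch.bmm(self.ctx_to_aspect(context),
                               asp_pool.transpose(1, 2))  # (batch, ctx_len, 1)
        ctx_weights = F.softmax(ctx_scores, dim=1)
        new_context = context + ctx_weights * context      # residual update

        # Weight aspect words by their relevance to the pooled context.
        asp_scores = torch.bmm(self.aspect_to_ctx(aspect),
                               ctx_pool.transpose(1, 2))  # (batch, asp_len, 1)
        asp_weights = F.softmax(asp_scores, dim=1)
        new_aspect = aspect + asp_weights * aspect

        return new_context, new_aspect


class SharedMultilayerInteraction(nn.Module):
    """Applies the same InteractiveAttentionLayer several times, reusing its
    parameters at every depth, so memory cost stays that of a single layer."""

    def __init__(self, hidden_dim, num_layers=3, num_classes=3):
        super().__init__()
        self.shared_layer = InteractiveAttentionLayer(hidden_dim)
        self.num_layers = num_layers
        self.classifier = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, context, aspect):
        for _ in range(self.num_layers):
            context, aspect = self.shared_layer(context, aspect)
        pooled = torch.cat([context.mean(dim=1), aspect.mean(dim=1)], dim=-1)
        return self.classifier(pooled)  # sentiment polarity logits


# Hypothetical usage: the inputs would come from a pre-trained language
# model (e.g. BERT-encoded context and aspect tokens, hidden size 768).
model = SharedMultilayerInteraction(hidden_dim=768)
ctx = torch.randn(8, 30, 768)   # batch of 30-token contexts
asp = torch.randn(8, 4, 768)    # batch of 4-token aspect phrases
logits = model(ctx, asp)        # (8, 3) polarity scores
```

The parameter-sharing choice is what makes this kind of stack "lightweight": repeating one layer adds depth (and richer interaction between aspect and context) without adding new weights, which is how the paper describes saving GPU memory while still learning complex sentiment features.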