Article

Multitasking Learning Model Based on Hierarchical Attention Network for Arabic Sentiment Analysis Classification

by Muath Alali 1, Nurfadhlina Mohd Sharef 1,2,*, Masrah Azrifah Azmi Murad 1, Hazlina Hamdan 1 and Nor Azura Husin 1

1 Intelligent Computing Research Group, Faculty of Computer Science and Information Technology, Universiti Putra Malaysia, Serdang 43400, Selangor, Malaysia
2 Laboratory of Computational Statistics and Operational Research, Institute of Mathematical Research, Universiti Putra Malaysia, Serdang 43400, Selangor, Malaysia
* Author to whom correspondence should be addressed.
Electronics 2022, 11(8), 1193; https://doi.org/10.3390/electronics11081193
Submission received: 27 November 2021 / Revised: 22 December 2021 / Accepted: 24 December 2021 / Published: 9 April 2022
(This article belongs to the Special Issue Emerging Application of Sentiment Analysis Technologies)

Abstract

Few approaches have addressed Arabic sentiment analysis as a five-point classification problem. Existing approaches rely on single-task learning with handcrafted features, which do not yield robust sentence representations. Recently, hierarchical attention networks have performed remarkably well. However, when trained as single-task models, they do not achieve superior performance or robust latent feature representations on small amounts of data, particularly for Arabic, which is considered a low-resource language. Moreover, single-task models cannot exploit related tasks, such as ternary and binary classification (cross-task transfer). Motivated by these shortcomings, we regard the five-point and ternary tasks as related. We propose a multitask learning model based on a hierarchical attention network (MTLHAN) to learn better sentence representations and improve model generalization, with a word encoder and attention network shared across both tasks, training the three-polarity and five-polarity Arabic sentiment analysis tasks alternately and jointly. Experimental results showed the outstanding performance of the proposed model, with accuracies of 83.98%, 87.68%, and 84.59% on the LABR, HARD, and BRAD datasets, respectively, and a minimum macro mean absolute error of 0.632% on the Arabic tweets dataset for the five-point Arabic sentiment classification problem.
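The core idea described above, a word encoder and attention layer shared between a ternary and a five-point sentiment head, can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the dimensions, the single-level attention (the full model is hierarchical), and all parameter names are assumptions made for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy dimensions; the paper does not prescribe these values.
VOCAB, EMB, HID = 100, 16, 8

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

class SharedEncoder:
    """Word encoder + word-level attention, shared by both tasks."""
    def __init__(self):
        self.emb = rng.normal(0, 0.1, (VOCAB, EMB))  # word embeddings
        self.W = rng.normal(0, 0.1, (EMB, HID))      # word-encoder projection
        self.u = rng.normal(0, 0.1, (HID,))          # attention context vector

    def __call__(self, token_ids):
        h = np.tanh(self.emb[token_ids] @ self.W)    # (seq_len, HID) word states
        alpha = softmax(h @ self.u, axis=0)          # attention weights over words
        return alpha @ h                             # attended sentence vector (HID,)

class TaskHead:
    """Task-specific softmax classifier (3-way or 5-way polarity)."""
    def __init__(self, n_classes):
        self.Wc = rng.normal(0, 0.1, (HID, n_classes))

    def __call__(self, sentence_vec):
        return softmax(sentence_vec @ self.Wc)

encoder = SharedEncoder()   # parameters shared across both tasks
head3 = TaskHead(3)         # ternary polarity head
head5 = TaskHead(5)         # five-point polarity head

# In alternating joint training, each step would draw a batch from one task,
# update that task's head plus the shared encoder, then switch tasks.
sentence = rng.integers(0, VOCAB, size=12)  # a toy tokenised sentence
s = encoder(sentence)
p3, p5 = head3(s), head5(s)                 # per-task class distributions
```

Because the encoder's gradients would flow from both heads during alternating training, the shared sentence representation is pushed to serve both granularities, which is the cross-task transfer effect the abstract describes.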
Keywords: Arabic sentiment analysis; multitask learning; ordinal classification; Arabic language

Share and Cite

MDPI and ACS Style

Alali, M.; Mohd Sharef, N.; Azmi Murad, M.A.; Hamdan, H.; Husin, N.A. Multitasking Learning Model Based on Hierarchical Attention Network for Arabic Sentiment Analysis Classification. Electronics 2022, 11, 1193. https://doi.org/10.3390/electronics11081193


Note that from the first issue of 2016, this journal uses article numbers instead of page numbers.
