Residential College | false |
Status | Forthcoming |
BMT-Net: Broad Multitask Transformer Network for Sentiment Analysis | |
Zhang, Tong1; Gong, Xinrong2; Chen, C. L.P.3 | |
2021 | |
Source Publication | IEEE Transactions on Cybernetics |
ABS Journal Level | 3 |
ISSN | 2168-2267 |
Abstract | Sentiment analysis applies automated cognitive methods to determine an author's or speaker's attitude toward an expressed object, or a text's overall emotional tendency. In recent years, the growing scale of opinionated text on social networks has made mining sentimental tendencies increasingly challenging. Pretrained language models, which learn contextual representations, outperform traditional word vectors. However, the two basic approaches to applying pretrained language models to downstream tasks, feature-based and fine-tuning methods, are usually considered separately. Moreover, a single task-specific contextual representation cannot handle different sentiment analysis tasks. To address these problems, we propose a broad multitask transformer network (BMT-Net). BMT-Net takes advantage of both feature-based and fine-tuning methods and is designed to exploit high-level information from robust contextual representations. Its multitask transformers make the learned representations universal across tasks, while a broad learning system, with its powerful capacity to search for suitable features in both deep and broad directions, learns robust contextual representations thoroughly. Experiments were conducted on two popular datasets: the binary Stanford Sentiment Treebank (SST-2) and SemEval Sentiment Analysis in Twitter (Twitter). Compared with other state-of-the-art methods, the improved representation achieves a better F1-score of 0.778 on Twitter and an accuracy of 94.0% on SST-2. These results demonstrate the network's recognition ability in sentiment analysis and highlight the significance of previously overlooked design decisions about searching for contextual features in deep and broad spaces. |
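The broad learning system mentioned in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation; it is a generic BLS classifier in NumPy, with random toy vectors standing in for the contextual embeddings a pretrained transformer would produce. The node counts, activation, and ridge parameter are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def bls_fit(X, Y, n_feature_nodes=20, n_enhance_nodes=40, reg=1e-3):
    """Fit a minimal broad learning system (BLS) classifier.

    X: (n_samples, d) input features (standing in for contextual
       embeddings from a pretrained language model).
    Y: (n_samples, n_classes) one-hot targets.
    """
    d = X.shape[1]
    # Random mapped-feature nodes (full BLS also sparsely tunes these maps).
    Wf = rng.standard_normal((d, n_feature_nodes))
    Z = np.tanh(X @ Wf)
    # Enhancement nodes widen the network in the "broad" direction.
    We = rng.standard_normal((n_feature_nodes, n_enhance_nodes))
    H = np.tanh(Z @ We)
    A = np.hstack([Z, H])
    # Output weights in closed form via ridge-regularized least squares.
    W = np.linalg.solve(A.T @ A + reg * np.eye(A.shape[1]), A.T @ Y)
    return Wf, We, W

def bls_predict(model, X):
    Wf, We, W = model
    Z = np.tanh(X @ Wf)
    H = np.tanh(Z @ We)
    return np.hstack([Z, H]) @ W

# Toy binary demo with well-separated synthetic "embeddings".
X = np.vstack([rng.normal(-2, 1, (50, 8)), rng.normal(2, 1, (50, 8))])
Y = np.zeros((100, 2)); Y[:50, 0] = 1; Y[50:, 1] = 1
model = bls_fit(X, Y)
acc = (bls_predict(model, X).argmax(1) == Y.argmax(1)).mean()
```

Because the output layer is solved in closed form rather than by gradient descent, adding nodes or tasks is cheap, which is the "broad" efficiency the abstract alludes to when pairing BLS with multitask transformers.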
Keyword | Analytical Models; Attention Mechanism; Bit Error Rate; Broad Learning System (BLS); Context Modeling; Feature Extraction; Multitask Learning (MTL); Representation Learning; Sentiment Analysis; Social Networking (Online); Task Analysis |
DOI | 10.1109/TCYB.2021.3050508 |
URL | View the original |
Language | English |
WOS ID | WOS:000732184600001 |
Scopus ID | 2-s2.0-85102263947 |
Document Type | Journal article |
Collection | University of Macau |
Affiliation | 1.School of Computer Science and Engineering, South China University of Technology, Guangzhou 510006, China, and also with the Pazhou Laboratory, Guangzhou 510335, China. 2.School of Computer Science and Engineering, South China University of Technology, Guangzhou 510006, China. 3.School of Computer Science and Engineering, South China University of Technology, Guangzhou 510006, China, also with the Pazhou Laboratory, Guangzhou 510335, China, and also with the Department of Computer and Information Science, Faculty of Science and Technology, University of Macau, Macau, China (e-mail: [email protected]). |
Recommended Citation GB/T 7714 | Zhang, Tong, Gong, Xinrong, Chen, C. L. P.. BMT-Net: Broad Multitask Transformer Network for Sentiment Analysis[J]. IEEE Transactions on Cybernetics, 2021. |
APA | Zhang, T., Gong, X., & Chen, C. L. P. (2021). BMT-Net: Broad Multitask Transformer Network for Sentiment Analysis. IEEE Transactions on Cybernetics. |
MLA | Zhang, Tong, et al. "BMT-Net: Broad Multitask Transformer Network for Sentiment Analysis." IEEE Transactions on Cybernetics (2021). |
Files in This Item: | There are no files associated with this item. |
Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.