Residential College | false |
Status | Published |
TriMLP: A Foundational MLP-Like Architecture for Sequential Recommendation |
Jiang, Yiheng1; Xu, Yuanbo1 |
2024-11 |
Source Publication | ACM Transactions on Information Systems |
ISSN | 1046-8188 |
Volume | 42 |
Issue | 6 |
Pages | 157 |
Abstract | In this work, we present TriMLP as a foundational MLP-like architecture for sequential recommendation, simultaneously achieving computational efficiency and promising performance. First, we empirically study the incompatibility between existing purely MLP-based models and sequential recommendation: the inherent fully-connected structure grants historical user-item interactions (referred to as tokens) unrestricted communication and overlooks the essential chronological order in sequences. Then, we propose the MLP-based Triangular Mixer to establish ordered contact among tokens and draw out the primary sequential modeling capability under the standard auto-regressive training fashion. It contains (1) a global mixing layer that drops the lower-triangle neurons in the MLP to block anti-chronological connections from future tokens, and (2) a local mixing layer that further disables specific upper-triangle neurons to split the sequence into multiple independent sessions. The mixer serially alternates these two layers to support fine-grained preference modeling, where the global layer focuses on long-range dependencies across the whole sequence and the local layer captures short-term patterns within sessions. Experimental results on 12 datasets of different scales from 4 benchmarks show that TriMLP consistently attains a favorable accuracy/efficiency trade-off on all validated datasets, with an average performance boost of up to 14.88% against several state-of-the-art baselines and a maximum inference-time reduction of 23.73%. These properties render TriMLP a strong contender to the well-established RNN-, CNN-, and Transformer-based sequential recommenders. Code is available at https://github.com/jiangyiheng1/TriMLP. |
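The triangular masking described in the abstract can be sketched as follows. This is an illustrative NumPy reconstruction based only on the abstract's description, not the authors' code (the reference implementation is in the linked GitHub repository); the mask orientation, the `session_len` parameter, and the `triangular_mix` function name are assumptions made for illustration.

```python
import numpy as np

def global_mixing_mask(n):
    # Causal mask for the global mixing layer: position i may only receive
    # contributions from tokens j <= i, blocking anti-chronological
    # connections from future tokens.
    return np.tril(np.ones((n, n)))

def local_mixing_mask(n, session_len):
    # Local mixing layer: additionally zero out cross-session connections,
    # splitting the sequence into independent sessions of `session_len`.
    sess = np.arange(n) // session_len
    same_session = sess[:, None] == sess[None, :]
    return global_mixing_mask(n) * same_session

def triangular_mix(x, w_global, w_local, session_len):
    # x: (seq_len, dim) token embeddings; w_global, w_local: (seq_len, seq_len)
    # token-mixing weights. The mixer serially alternates the two layers:
    # global mixing for long-range dependency, local mixing for short-term
    # patterns within sessions.
    n = x.shape[0]
    x = (w_global * global_mixing_mask(n)) @ x
    x = (w_local * local_mixing_mask(n, session_len)) @ x
    return x
```

Because both masks are lower-triangular, perturbing a future token cannot change the output at any earlier position, which is what makes the mixer compatible with auto-regressive training.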
Keyword | Sequential Recommendation; Data Mining; Multi-layer Perceptron |
DOI | 10.1145/3670995 |
Indexed By | SCIE |
Language | English |
WOS Research Area | Computer Science |
WOS Subject | Computer Science, Information Systems |
WOS ID | WOS:001358217900002 |
Publisher | ASSOC COMPUTING MACHINERY, 1601 Broadway, 10th Floor, NEW YORK, NY 10019-7434 |
Scopus ID | 2-s2.0-85207724751 |
Document Type | Journal article |
Collection | THE STATE KEY LABORATORY OF INTERNET OF THINGS FOR SMART CITY (UNIVERSITY OF MACAU) |
Corresponding Author | Xu, Yuanbo |
Affiliation | 1. Lab of Mobile Intelligent Computing (MIC), College of Computer Science and Technology, Jilin University, Changchun, China 2. Department of Computer and Information Science, The State Key Laboratory of Internet of Things for Smart City, University of Macau, Macau, China 3. Key Laboratory of Trustworthy Distributed Computing and Service (MoE), Beijing University of Posts and Telecommunications, Beijing, China 4. Institute of Artificial Intelligence, Beihang University, Beijing, China 5. Thrust of Artificial Intelligence, Hong Kong University of Science and Technology (Guangzhou), Hong Kong, Hong Kong |
Recommended Citation GB/T 7714 | Jiang, Yiheng,Xu, Yuanbo,Yang, Yongjian,et al. TriMLP: A Foundational MLP-Like Architecture for Sequential Recommendation[J]. ACM Transactions on Information Systems, 2024, 42(6), 157. |
APA | Jiang, Yiheng., Xu, Yuanbo., Yang, Yongjian., Yang, Funing., Wang, Pengyang., Li, Chaozhuo., Zhuang, Fuzhen., & Xiong, Hui (2024). TriMLP: A Foundational MLP-Like Architecture for Sequential Recommendation. ACM Transactions on Information Systems, 42(6), 157. |
MLA | Jiang, Yiheng,et al."TriMLP: A Foundational MLP-Like Architecture for Sequential Recommendation".ACM Transactions on Information Systems 42.6(2024):157. |