Residential College: false
Status: Published
KRAN: Knowledge Refining Attention Network for Recommendation
Zhang, Zhenyu1; Zhang, Lei1; Yang, Dingqi2; Yang, Liu1
2022-04-01
Source Publication: ACM Transactions on Knowledge Discovery from Data
ISSN: 1556-4681
Volume: 16
Issue: 2
Abstract

Recommender algorithms that combine knowledge graphs and graph convolutional networks have recently become increasingly popular. Specifically, attributes describing the items to be recommended are often used as additional information. These attributes, together with the items, are highly interconnected and intrinsically form a Knowledge Graph (KG). Such algorithms use KGs as an auxiliary data source to alleviate the negative impact of data sparsity. However, existing graph convolutional network based algorithms do not distinguish the importance of different neighbors of entities in the KG, even though, by the Pareto principle, the important neighbors account for only a small proportion. These traditional algorithms therefore cannot fully mine the useful information in the KG. To fully release the power of KGs for building recommender systems, we propose in this article KRAN, a Knowledge Refining Attention Network, which can subtly capture the characteristics of the KG and thus boost recommendation performance. We first introduce a traditional attention mechanism into KG processing to make knowledge extraction more targeted, and then propose a refining mechanism that improves the traditional attention mechanism to extract knowledge from the KG more effectively. More precisely, KRAN uses our proposed knowledge-refining attention mechanism to aggregate and obtain the representations of the entities (both attributes and items) in the KG. The mechanism first measures the relevance between an entity and its neighbors in the KG by attention coefficients, and then further refines these coefficients following a "richer-get-richer" principle, focusing on highly relevant neighbors while eliminating less relevant ones for noise reduction. In addition, for the item cold-start problem, we propose KRAN-CD, a variant of KRAN that further incorporates pre-trained KG embeddings to handle cold-start items. Experiments show that KRAN and KRAN-CD consistently outperform state-of-the-art baselines across different settings.
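The record contains no code, but the aggregation step the abstract describes can be illustrated with a minimal, hypothetical Python sketch. The relation-aware scoring function, the power-based "richer-get-richer" refinement, and all names below are assumptions for illustration, not the paper's actual formulation.

# Hypothetical sketch of knowledge-refining attention aggregation.
# Attention scores are sharpened by raising them to a power > 1 and
# renormalising, which boosts large coefficients and suppresses small
# ones, roughly in the spirit of a "richer-get-richer" refinement.
import numpy as np

def refined_attention_aggregate(entity, neighbors, relations, power=2.0):
    """Aggregate neighbor embeddings with refined attention coefficients.

    entity:    (d,)   embedding of the target entity
    neighbors: (n, d) embeddings of its KG neighbors
    relations: (n, d) embeddings of the connecting relations
    power:     refinement exponent (> 1 sharpens the distribution)
    """
    # 1. Relevance of each neighbor to the entity, conditioned on the
    #    relation (a hypothetical scoring function).
    scores = (neighbors * relations) @ entity          # shape (n,)
    # 2. Softmax -> standard attention coefficients.
    exp = np.exp(scores - scores.max())
    alpha = exp / exp.sum()
    # 3. Refinement: highly relevant neighbors gain weight, weakly
    #    relevant ones are pushed toward zero (noise reduction).
    refined = alpha ** power
    refined /= refined.sum()
    # 4. Aggregate the neighborhood into the entity's new representation.
    return refined @ neighbors                         # shape (d,)

# Toy usage with random embeddings.
rng = np.random.default_rng(0)
d, n = 16, 8
out = refined_attention_aggregate(rng.normal(size=d),
                                  rng.normal(size=(n, d)),
                                  rng.normal(size=(n, d)))
print(out.shape)  # (16,)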

Keywords: Attention Mechanism; Data Sparsity; Item Cold Start; Knowledge Graph; Refine
DOI: 10.1145/3470783
Indexed By: SCIE
Language: English
WOS Research Area: Computer Science
WOS Subject: Computer Science, Information Systems; Computer Science, Software Engineering
WOS ID: WOS:000696204600020
Scopus ID: 2-s2.0-85115038676
Document Type: Journal article
Collection: University of Macau
Corresponding Author: Yang, Liu
Affiliation:
1. Tianjin University, Beijing Dahua Electronic Instrument LLC, Tianjin, China
2. University of Macau, Macao
Recommended Citation
GB/T 7714: Zhang, Zhenyu, Zhang, Lei, Yang, Dingqi, et al. KRAN: Knowledge Refining Attention Network for Recommendation[J]. ACM Transactions on Knowledge Discovery from Data, 2022, 16(2).
APA: Zhang, Zhenyu, Zhang, Lei, Yang, Dingqi, & Yang, Liu (2022). KRAN: Knowledge Refining Attention Network for Recommendation. ACM Transactions on Knowledge Discovery from Data, 16(2).
MLA: Zhang, Zhenyu, et al. "KRAN: Knowledge Refining Attention Network for Recommendation." ACM Transactions on Knowledge Discovery from Data 16.2 (2022).
Files in This Item:
There are no files associated with this item.
Related Services
Google Scholar
Similar articles in Google Scholar
[Zhang, Zhenyu]'s Articles
[Zhang, Lei]'s Articles
[Yang, Dingqi]'s Articles
Baidu academic
Similar articles in Baidu academic
[Zhang, Zhenyu]'s Articles
[Zhang, Lei]'s Articles
[Yang, Dingqi]'s Articles
Bing Scholar
Similar articles in Bing Scholar
[Zhang, Zhenyu]'s Articles
[Zhang, Lei]'s Articles
[Yang, Dingqi]'s Articles

Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.