Status: Published
Redundancy-Free Message Passing for Graph Neural Networks.
Chen, Rongqin (1); Zhang, Shenghui (1); U, LEONG HOU (1); LI, Ye (2)
2022-11
Conference Name: 2022 Conference on Neural Information Processing Systems
Conference Date: 2022 Nov 28 - 2022 Dec 09
Conference Place: New Orleans, Ernest N. Morial Convention Center
Abstract

Graph Neural Networks (GNNs) resemble the Weisfeiler-Lehman (1-WL) test, which iteratively updates the representation of each node by aggregating information from its WL-tree. However, despite the computational efficiency of this iterative aggregation scheme, it introduces redundant message flows when encoding nodes. We find that this redundancy in message passing prevents conventional GNNs from propagating the information of long paths and from learning graph similarities. To address this issue, we propose the Redundancy-Free Graph Neural Network (RFGNN), in which the information of each path (of limited length) in the original graph is propagated along a single message flow. Our rigorous theoretical analysis demonstrates the following advantages of RFGNN: (1) RFGNN is strictly more powerful than 1-WL; (2) RFGNN efficiently propagates structural information in the original graph, avoiding the over-squashing issue; and (3) RFGNN captures subgraphs at multiple levels of granularity and is more likely to encode graphs with closer graph edit distances into more similar representations. Experimental evaluation on graph-level prediction benchmarks confirms our theoretical assertions, and RFGNN achieves the best results on most of the datasets.
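The redundancy-free idea in the abstract can be illustrated with a minimal sketch. The Python snippet below is an assumption-based illustration, not the paper's implementation: the function names (simple_paths_to, rf_message_passing) are hypothetical, and it aggregates path features with plain sums rather than the learned neural updates RFGNN uses. What it shows is the counting behaviour the abstract describes: each simple path of bounded length ending at a node is enumerated exactly once and therefore contributes exactly one message flow.

# Illustrative sketch only (not the authors' implementation).
def simple_paths_to(graph, target, max_len):
    """Enumerate simple paths (node tuples) of length <= max_len ending at `target`.
    `graph` is an adjacency dict: node -> set of neighbours (undirected)."""
    stack = [(target,)]
    while stack:
        path = stack.pop()
        yield path
        if len(path) - 1 < max_len:
            for nbr in graph[path[-1]]:
                if nbr not in path:          # simple path: no repeated nodes
                    stack.append(path + (nbr,))

def rf_message_passing(graph, features, max_len=3):
    """Redundancy-free aggregation: every bounded-length simple path that ends at a
    node sends a single message, here just the sum of the features along the path."""
    dim = len(next(iter(features.values())))
    out = {}
    for v in graph:
        agg = [0.0] * dim
        for path in simple_paths_to(graph, v, max_len):
            msg = [sum(vals) for vals in zip(*(features[u] for u in path))]
            agg = [a + m for a, m in zip(agg, msg)]
        out[v] = agg
    return out

# Tiny usage example: a triangle graph with 2-dimensional node features.
g = {0: {1, 2}, 1: {0, 2}, 2: {0, 1}}
x = {0: [1.0, 0.0], 1: [0.0, 1.0], 2: [1.0, 1.0]}
print(rf_message_passing(g, x, max_len=2))

By contrast, standard neighbour aggregation repeated over k layers re-propagates the same paths through overlapping WL-subtrees, which is the redundancy the paper argues obscures long-path information.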

Document Type: Conference paper
Collection: Department of Computer and Information Science, Faculty of Science and Technology
Corresponding Author: Chen, Rongqin; Zhang, Shenghui
Affiliation: 1. University of Macau; 2. Shenzhen Institutes of Advanced Technology
First Author Affiliation: University of Macau
Corresponding Author Affiliation: University of Macau
Recommended Citation
GB/T 7714
Chen, Rongqin, Zhang, Shenghui, U, LEONG HOU, et al. Redundancy-Free Message Passing for Graph Neural Networks[C]. 2022.
APA: Chen, Rongqin, Zhang, Shenghui, U, LEONG HOU, & LI, Ye (2022). Redundancy-Free Message Passing for Graph Neural Networks.
Files in This Item:
There are no files associated with this item.
Related Services
Google Scholar
Similar articles in Google Scholar
[Chen, Rongqin]'s Articles
[Zhang, Shenghui]'s Articles
[U, LEONG HOU]'s Articles
Baidu academic
Similar articles in Baidu academic
[Chen, Rongqin]'s Articles
[Zhang, Shenghui]'s Articles
[U, LEONG HOU]'s Articles
Bing Scholar
Similar articles in Bing Scholar
[Chen, Rongqin]'s Articles
[Zhang, Shenghui]'s Articles
[U, LEONG HOU]'s Articles

Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.