Residential College: false
Status: Published
UGTransformer: Unsupervised Graph Transformer Representation Learning
Xu, Lixiang1; Liu, Haifeng1; Cui, Qingzhe1; Luo, Bin2; Li, Ning1; Chen, Yan1; Tang, Yuanyan3
2023
Conference Name: International Joint Conference on Neural Networks (IJCNN)
Source Publication: Proceedings of the International Joint Conference on Neural Networks
Volume: 2023-June
Conference Date: June 18-23, 2023
Conference Place: Broadbeach, Australia
Abstract

This paper studies graph representation learning in unsupervised settings using Transformer models. Transformer architectures have been widely adopted across machine learning and deep learning, and their application to graph data has recently become popular, as has the field of graph representation learning itself. Graph-level representations are widely used in practice, for example in drug molecule design and disease classification in biochemistry. Traditional graph kernel methods, which design different kernels for different substructures, are simple but generalize poorly. More recent language-model-based methods such as graph2vec use a particular substructure as the graph representation; this resembles hand-crafted feature design and likewise limits generalization. In this paper, we propose the UGTransformer model, which builds on the standard Transformer architecture. We introduce several simple and effective structural encoding methods to encode the structural information of the graph into the model efficiently. Unsupervised graph representations are learned through a multi-head attention mechanism and powerful aggregation functions. We conducted experiments on a benchmark data set for graph classification, and the results validate the effectiveness of the proposed model.
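The recipe the abstract describes — a structural encoding folded into the node features, multi-head self-attention over nodes, and an aggregation (readout) that produces one vector per graph — can be sketched as follows. This is an illustrative NumPy sketch under stated assumptions, not the paper's implementation: the degree-based encoding, the random untrained weights, and the mean-pool readout are placeholders chosen for brevity.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def graph_embedding(adj, feats, num_heads=2, d_out=8, seed=0):
    """One multi-head self-attention layer over a graph's nodes,
    with a degree-based structural encoding, followed by a
    mean-pool readout into a single graph-level vector.
    (Hypothetical sketch; weights are random, not learned.)"""
    rng = np.random.default_rng(seed)
    # Structural encoding (assumption): append each node's degree
    # to its feature vector so attention can see local structure.
    deg = adj.sum(axis=1, keepdims=True).astype(float)
    x = np.hstack([feats.astype(float), deg])      # (n, d_in)
    d_in, d_head = x.shape[1], d_out // num_heads
    heads = []
    for _ in range(num_heads):
        Wq = rng.standard_normal((d_in, d_head)) / np.sqrt(d_in)
        Wk = rng.standard_normal((d_in, d_head)) / np.sqrt(d_in)
        Wv = rng.standard_normal((d_in, d_head)) / np.sqrt(d_in)
        q, k, v = x @ Wq, x @ Wk, x @ Wv
        # Scaled dot-product attention over all node pairs.
        attn = softmax(q @ k.T / np.sqrt(d_head), axis=-1)  # (n, n)
        heads.append(attn @ v)
    nodes = np.hstack(heads)       # (n, d_out) node representations
    return nodes.mean(axis=0)      # permutation-invariant readout

# Toy usage: a triangle graph with 2-dimensional node features.
adj = np.array([[0, 1, 1], [1, 0, 1], [1, 1, 0]])
feats = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
emb = graph_embedding(adj, feats)
```

Because the weights are shared across nodes and the readout is a mean, relabeling the nodes leaves the graph embedding unchanged, which is the property a graph-level representation needs for classification.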

Keyword: Graph Classification; Structural Encoding Methods; Transformer; Unsupervised Graph Representation Learning
DOI: 10.1109/IJCNN54540.2023.10192010
URL: View the original
Indexed By: CPCI-S
Language: English
WOS Research Area: Computer Science; Engineering
WOS Subject: Computer Science, Artificial Intelligence; Computer Science, Hardware & Architecture; Engineering, Electrical & Electronic
WOS ID: WOS:001046198707077
Scopus ID: 2-s2.0-85169567619
Fulltext Access
Citation statistics
Document Type: Conference paper
Collection: Department of Computer and Information Science
Corresponding Author: Tang, Yuanyan
Affiliation: 1. School of Artificial Intelligence and Big Data, Hefei University, Hefei, China
2. School of Computer Science and Technology, Anhui University, Hefei, China
3. Zhuhai UM Science and Technology Research Institute, FST, University of Macau, Macao
Corresponding Author Affiliation: Faculty of Science and Technology
Recommended Citation
GB/T 7714
Xu, Lixiang,Liu, Haifeng,Cui, Qingzhe,et al. UGTransformer: Unsupervised Graph Transformer Representation Learning[C], 2023.
APA Xu, Lixiang., Liu, Haifeng., Cui, Qingzhe., Luo, Bin., Li, Ning., Chen, Yan., & Tang, Yuanyan (2023). UGTransformer: Unsupervised Graph Transformer Representation Learning. Proceedings of the International Joint Conference on Neural Networks, 2023-June.
Files in This Item:
There are no files associated with this item.
Related Services
Usage statistics
Export to Endnote
Google Scholar
Similar articles in Google Scholar
[Xu, Lixiang]'s Articles
[Liu, Haifeng]'s Articles
[Cui, Qingzhe]'s Articles
Baidu academic
Similar articles in Baidu academic
[Xu, Lixiang]'s Articles
[Liu, Haifeng]'s Articles
[Cui, Qingzhe]'s Articles
Bing Scholar
Similar articles in Bing Scholar
[Xu, Lixiang]'s Articles
[Liu, Haifeng]'s Articles
[Cui, Qingzhe]'s Articles
Terms of Use
 

Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.