arXiv:2308.08235 [cs.LG]

The Expressive Power of Graph Neural Networks: A Survey

Bingxu Zhang, Changjun Fan, Shixuan Liu, Kuihua Huang, Xiang Zhao, Jincai Huang, Zhong Liu

Published 2023-08-16 (Version 1)

Graph neural networks (GNNs) are effective machine learning models for many graph-related applications. Despite their empirical success, a growing body of research focuses on the theoretical limitations of GNNs, i.e., their expressive power. Early works in this area mainly study the ability of GNNs to recognize graph isomorphism, while more recent works characterize expressive power through properties such as subgraph counting and connectivity learning, which are more practical and closer to real-world tasks. However, no survey paper or open-source repository comprehensively summarizes and discusses the models in this important direction. To fill this gap, we conduct the first survey of models that enhance expressive power under its different definitions. Concretely, the models are reviewed under three categories: graph feature enhancement, graph topology enhancement, and GNN architecture enhancement.
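The graph-isomorphism recognition ability mentioned above is conventionally measured against the 1-dimensional Weisfeiler-Leman (1-WL) color refinement test, which standard message-passing GNNs cannot exceed. The following minimal sketch (not taken from the paper; function names and the simple dict-based graph representation are illustrative assumptions) shows how 1-WL iteratively refines node colors and compares color histograms of two graphs.

```python
# Illustrative sketch of 1-WL color refinement (assumed helper, not from the survey).
from collections import Counter

def wl_refinement(adj, labels, iterations=3):
    """adj: dict node -> list of neighbors; labels: dict node -> initial label."""
    colors = dict(labels)
    for _ in range(iterations):
        new_colors = {}
        for v, nbrs in adj.items():
            # A node's new color hashes its own color together with the multiset
            # of its neighbors' colors, mirroring one round of message passing.
            signature = (colors[v], tuple(sorted(colors[u] for u in nbrs)))
            new_colors[v] = hash(signature)
        colors = new_colors
    return Counter(colors.values())

def maybe_isomorphic(adj1, labels1, adj2, labels2):
    # Equal color histograms mean 1-WL cannot distinguish the two graphs;
    # differing histograms certify that they are non-isomorphic.
    return wl_refinement(adj1, labels1) == wl_refinement(adj2, labels2)
```

Graphs that 1-WL cannot distinguish (e.g., certain regular graphs) are exactly the cases motivating the feature, topology, and architecture enhancements surveyed in the paper.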

Related articles:
arXiv:2312.08671 [cs.LG] (Published 2023-12-14)
Uplifting the Expressive Power of Graph Neural Networks through Graph Partitioning
arXiv:2003.04078 [cs.LG] (Published 2020-03-09)
A Survey on The Expressive Power of Graph Neural Networks
arXiv:2304.01575 [cs.LG] (Published 2023-04-04)
The expressive power of pooling in Graph Neural Networks