{ "id": "2006.05205", "version": "v1", "published": "2020-06-09T12:04:50.000Z", "updated": "2020-06-09T12:04:50.000Z", "title": "On the Bottleneck of Graph Neural Networks and its Practical Implications", "authors": [ "Uri Alon", "Eran Yahav" ], "categories": [ "cs.LG", "stat.ML" ], "abstract": "Graph neural networks (GNNs) were shown to effectively learn from highly structured data containing elements (nodes) with relationships (edges) between them. GNN variants differ in how each node in the graph absorbs the information flowing from its neighbor nodes. In this paper, we highlight an inherent problem in GNNs: the mechanism of propagating information between neighbors creates a bottleneck when every node aggregates messages from its neighbors. This bottleneck causes the over-squashing of exponentially-growing information into fixed-size vectors. As a result, the graph fails to propagate messages flowing from distant nodes and performs poorly when the prediction task depends on long-range information. We demonstrate that the bottleneck hinders popular GNNs from fitting the training data. We show that GNNs that absorb incoming edges equally, like GCN and GIN, are more susceptible to over-squashing than other GNN types. We further show that existing, extensively-tuned, GNN-based models suffer from over-squashing and that breaking the bottleneck improves state-of-the-art results without any hyperparameter tuning or additional weights.", "revisions": [ { "version": "v1", "updated": "2020-06-09T12:04:50.000Z" } ], "analyses": { "keywords": [ "graph neural networks", "practical implications", "structured data containing elements", "information", "bottleneck hinders popular gnns" ], "note": { "typesetting": "TeX", "pages": 0, "language": "en", "license": "arXiv", "status": "editable" } } }