{ "id": "1912.07832", "version": "v1", "published": "2019-12-17T06:02:59.000Z", "updated": "2019-12-17T06:02:59.000Z", "title": "Deep Iterative and Adaptive Learning for Graph Neural Networks", "authors": [ "Yu Chen", "Lingfei Wu", "Mohammed J. Zaki" ], "comment": "6 pages. Accepted at the AAAI 2020 Workshop on Deep Learning on Graphs: Methodologies and Applications (AAAI DLGMA 2020). Final Version", "categories": [ "cs.LG", "stat.ML" ], "abstract": "In this paper, we propose an end-to-end graph learning framework, namely Deep Iterative and Adaptive Learning for Graph Neural Networks (DIAL-GNN), for jointly learning the graph structure and graph embeddings simultaneously. We first cast the graph structure learning problem as a similarity metric learning problem and leverage an adapted graph regularization for controlling smoothness, connectivity and sparsity of the generated graph. We further propose a novel iterative method for searching for a hidden graph structure that augments the initial graph structure. Our iterative method dynamically stops when the learned graph structure approaches close enough to the optimal graph. Our extensive experiments demonstrate that the proposed DIAL-GNN model can consistently outperform or match state-of-the-art baselines in terms of both downstream task performance and computational time. The proposed approach can cope with both transductive learning and inductive learning.", "revisions": [ { "version": "v1", "updated": "2019-12-17T06:02:59.000Z" } ], "analyses": { "keywords": [ "graph neural networks", "deep iterative", "adaptive learning", "learned graph structure approaches close", "downstream task performance" ], "note": { "typesetting": "TeX", "pages": 6, "language": "en", "license": "arXiv", "status": "editable" } } }