{ "id": "2211.15322", "version": "v1", "published": "2022-11-28T14:00:50.000Z", "updated": "2022-11-28T14:00:50.000Z", "title": "Transductive Kernels for Gaussian Processes on Graphs", "authors": [ "Yin-Cong Zhi", "Felix L. Opolka", "Yin Cheng Ng", "Pietro Liò", "Xiaowen Dong" ], "categories": [ "cs.LG", "stat.ML" ], "abstract": "Kernels on graphs have had limited options for node-level problems. To address this, we present a novel, generalized kernel for graphs with node feature data for semi-supervised learning. The kernel is derived from a regularization framework by treating the graph and feature data as two Hilbert spaces. We also show how numerous kernel-based models on graphs are instances of our design. A kernel defined this way has transductive properties, which lead to an improved ability to learn from fewer training points, as well as better handling of highly non-Euclidean data. We demonstrate these advantages using synthetic data where the distribution of the whole graph can inform the pattern of the labels. Finally, by utilizing a flexible polynomial of the graph Laplacian within the kernel, the model also performs effectively in semi-supervised classification on graphs with various levels of homophily.", "revisions": [ { "version": "v1", "updated": "2022-11-28T14:00:50.000Z" } ], "analyses": { "keywords": [ "gaussian processes", "transductive kernels", "node feature data", "node-level problems", "hilbert spaces" ], "note": { "typesetting": "TeX", "pages": 0, "language": "en", "license": "arXiv", "status": "editable" } } }