arXiv Analytics

arXiv:2211.15322 [cs.LG]

Transductive Kernels for Gaussian Processes on Graphs

Yin-Cong Zhi, Felix L. Opolka, Yin Cheng Ng, Pietro Liò, Xiaowen Dong

Published 2022-11-28, Version 1

Kernel methods on graphs have offered limited options for node-level problems. To address this, we present a novel, generalized kernel for graphs with node feature data, aimed at semi-supervised learning. The kernel is derived from a regularization framework that treats the graph and the feature data as two Hilbert spaces, and we show that numerous existing kernel-based models on graphs are instances of our design. A kernel defined this way has transductive properties, which improves its ability to learn from few training points and to handle highly non-Euclidean data. We demonstrate these advantages on synthetic data in which the distribution of the whole graph informs the pattern of the labels. Finally, by employing a flexible polynomial of the graph Laplacian within the kernel, the model also performs effectively in semi-supervised classification on graphs with varying levels of homophily.
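The abstract describes combining graph structure with node features by placing a polynomial of the graph Laplacian inside the kernel. The paper's exact construction is not given here, but a minimal illustrative sketch of this general idea — a polynomial filter of the normalized Laplacian applied around an RBF kernel on node features — might look like the following. All function names, the RBF choice, and the default polynomial coefficients are hypothetical, chosen only to make the example self-contained:

```python
import numpy as np

def normalized_laplacian(A):
    """Symmetric normalized Laplacian L = I - D^{-1/2} A D^{-1/2}."""
    d = A.sum(axis=1)
    d_inv_sqrt = 1.0 / np.sqrt(np.maximum(d, 1e-12))
    return np.eye(len(A)) - d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]

def graph_poly_kernel(A, X, coeffs=(1.0, -0.5), lengthscale=1.0):
    """Illustrative transductive kernel K = P(L) K_feat P(L)^T,
    where P is a polynomial in the Laplacian (graph Hilbert space)
    and K_feat is an RBF kernel on node features (feature space).

    Because K is built from the Laplacian of the *entire* graph,
    every entry depends on all nodes, labeled or not -- this is
    what gives the construction its transductive character.
    """
    L = normalized_laplacian(A)
    # Polynomial graph filter P(L) = sum_i coeffs[i] * L^i.
    P = sum(c * np.linalg.matrix_power(L, i) for i, c in enumerate(coeffs))
    # RBF kernel on node features.
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K_feat = np.exp(-0.5 * sq / lengthscale**2)
    K = P @ K_feat @ P.T          # PSD by construction
    return 0.5 * (K + K.T)        # symmetrize against round-off

# Usage: kernel over all nodes of a 4-cycle with 2-d features.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
X = np.random.default_rng(0).normal(size=(4, 2))
K = graph_poly_kernel(A, X)
```

The resulting matrix can be dropped into a standard GP regression posterior; tuning the coefficients of the polynomial filter is what would let such a model adapt to different levels of homophily.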

Related articles:
arXiv:2312.07694 [cs.LG] (Published 2023-12-12)
GP+: A Python Library for Kernel-based learning via Gaussian Processes
arXiv:2206.11683 [cs.LG] (Published 2022-06-23)
A generalised form for a homogeneous population of structures using an overlapping mixture of Gaussian processes
arXiv:2403.11782 [cs.LG] (Published 2024-03-18, updated 2024-03-24)
A tutorial on learning from preferences and choices with Gaussian Processes