arXiv:1203.3495 [cs.LG]

Parameter-Free Spectral Kernel Learning

Qi Mao, Ivor W. Tsang

Published 2012-03-15Version 1

Due to the growing ubiquity of unlabeled data, learning with unlabeled data is attracting increasing attention in machine learning. In this paper, we propose a novel semi-supervised kernel learning method that seamlessly combines the manifold structure of unlabeled data with Regularized Least-Squares (RLS) to learn a new kernel. Interestingly, the new kernel matrix can be obtained analytically using the spectral decomposition of the graph Laplacian matrix. Hence, the proposed algorithm does not require any numerical optimization solvers. Moreover, by maximizing the kernel target alignment on the labeled data, we can also learn the model parameters automatically with a closed-form solution. For a given graph Laplacian matrix, our proposed method does not need to tune any model parameters, including the tradeoff parameter in RLS and the balance parameter for unlabeled data. Extensive experiments on ten benchmark datasets show that our proposed two-stage parameter-free spectral kernel learning algorithm achieves performance comparable to fine-tuned manifold regularization methods in the transductive setting, and outperforms multiple kernel learning in the supervised setting.
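To make the two-stage idea concrete, the following is a minimal sketch of spectral kernel learning with kernel target alignment, assuming a k-nearest-neighbor Gaussian affinity graph, the normalized graph Laplacian, and an unnormalized alignment criterion; the function name, graph construction, and the specific closed-form weighting are illustrative assumptions, not the paper's exact derivation (which additionally folds in the RLS objective).

```python
import numpy as np
from scipy.spatial.distance import cdist

def spectral_kernel_alignment(X, y_labeled, labeled_idx,
                              n_neighbors=10, n_eigs=20, sigma=1.0):
    """Illustrative two-stage spectral kernel learning sketch (assumed setup):
    (1) eigendecompose a graph Laplacian built from all labeled and unlabeled
        points, (2) weight the smoothest eigenvectors by maximizing kernel
        target alignment on the labeled subset."""
    n = X.shape[0]

    # Gaussian affinity restricted to the k nearest neighbors (assumed graph)
    D2 = cdist(X, X, "sqeuclidean")
    W = np.exp(-D2 / (2.0 * sigma ** 2))
    far = np.argsort(D2, axis=1)[:, n_neighbors + 1:]   # beyond k-NN (self excluded)
    for i in range(n):
        W[i, far[i]] = 0.0
    W = np.maximum(W, W.T)
    np.fill_diagonal(W, 0.0)

    # normalized graph Laplacian L = I - D^{-1/2} W D^{-1/2}
    d = np.maximum(W.sum(axis=1), 1e-12)
    d_inv_sqrt = 1.0 / np.sqrt(d)
    L = np.eye(n) - (d_inv_sqrt[:, None] * W) * d_inv_sqrt[None, :]

    # smoothest eigenvectors (smallest Laplacian eigenvalues) span the kernel
    eigvals, eigvecs = np.linalg.eigh(L)
    Phi = eigvecs[:, :n_eigs]                            # n x n_eigs

    # Kernel target alignment: with K = sum_i mu_i * phi_i phi_i^T and target
    # y y^T on labeled points, the alignment is linear in mu, so each spectral
    # coefficient comes out proportional to (phi_i^T y)^2 (automatically >= 0).
    Phi_l = Phi[labeled_idx]                             # rows for labeled points
    mu = (Phi_l.T @ y_labeled) ** 2
    mu /= mu.sum() + 1e-12                               # fix the kernel's scale

    # learned kernel matrix over all (labeled + unlabeled) points
    K = (Phi * mu) @ Phi.T
    return K
```

In this sketch the only inputs are the data, the labels of the labeled subset (e.g., +/-1), and the graph hyperparameters; once the Laplacian is fixed, both stages reduce to an eigendecomposition and a matrix-vector product, which reflects the paper's claim that no numerical optimization solver or tradeoff-parameter tuning is needed.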

Comments: Appears in Proceedings of the Twenty-Sixth Conference on Uncertainty in Artificial Intelligence (UAI2010)
Categories: cs.LG, stat.ML
Related articles:
arXiv:1904.11717 [cs.LG] (Published 2019-04-26)
Classification from Pairwise Similarities/Dissimilarities and Unlabeled Data via Empirical Risk Minimization
arXiv:1905.11866 [cs.LG] (Published 2019-05-28)
When can unlabeled data improve the learning rate?
arXiv:1811.04820 [cs.LG] (Published 2018-11-12)
Learning From Positive and Unlabeled Data: A Survey