arXiv:1206.0994 [cs.LG]

An Optimization Framework for Semi-Supervised and Transfer Learning using Multiple Classifiers and Clusterers

Ayan Acharya, Eduardo R. Hruschka, Joydeep Ghosh, Sreangsu Acharyya

Published 2012-04-20, Version 1

Unsupervised models can provide supplementary soft constraints to help classify new, "target" data since similar instances in the target set are more likely to share the same class label. Such models can also help detect possible differences between training and target distributions, which is useful in applications where concept drift may take place, as in transfer learning settings. This paper describes a general optimization framework that takes as input class membership estimates from existing classifiers learnt on previously encountered "source" data, as well as a similarity matrix from a cluster ensemble operating solely on the target data to be classified, and yields a consensus labeling of the target data. This framework admits a wide range of loss functions and classification/clustering methods. It exploits properties of Bregman divergences in conjunction with Legendre duality to yield a principled and scalable approach. A variety of experiments show that the proposed framework can yield results substantially superior to those provided by popular transductive learning techniques or by naively applying classifiers learnt on the original task to the target data.
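The core idea — refining a classifier's class-membership estimates with a target-side similarity matrix so that similar instances receive agreeing labels — can be sketched in a few lines. The following is a minimal illustration of that general idea only, using a simple squared-loss-style averaging update; the function name, the `alpha` trade-off parameter, and the update rule are assumptions for illustration, not the paper's full Bregman-divergence framework.

```python
import numpy as np

def consensus_labels(pi, S, alpha=0.5, n_iter=50):
    """Illustrative sketch: refine classifier soft labels using a
    cluster-ensemble similarity matrix.

    pi : (n, k) class-probability estimates from source classifiers
    S  : (n, n) nonnegative similarity matrix over the target data
    alpha : weight on the cluster-based smoothing term (assumed knob)
    """
    y = pi.copy()
    for _ in range(n_iter):
        # Similarity-weighted average of the current soft labels of
        # each instance's neighbours in the target data.
        row_sums = np.maximum(S.sum(axis=1, keepdims=True), 1e-12)
        neighbour_avg = (S @ y) / row_sums
        # Each refined label is a convex combination of the classifier
        # estimate and the neighbourhood consensus.
        y = (pi + alpha * neighbour_avg) / (1.0 + alpha)
    return y
```

In this toy setting, an instance the classifiers find ambiguous is pulled toward the class of the target-data cluster it belongs to, which is exactly the supplementary soft constraint the abstract describes.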

Related articles:
arXiv:1902.04151 [cs.LG] (Published 2019-01-26)
Evaluation of Transfer Learning for Classification of: (1) Diabetic Retinopathy by Digital Fundus Photography and (2) Diabetic Macular Edema, Choroidal Neovascularization and Drusen by Optical Coherence Tomography
arXiv:1906.02816 [cs.LG] (Published 2019-06-06)
Robust Attacks against Multiple Classifiers
arXiv:1910.07012 [cs.LG] (Published 2019-10-15)
Transfer Learning for Algorithm Recommendation