arXiv Analytics

arXiv:2402.01922 [cs.LG]

A General Framework for Learning from Weak Supervision

Hao Chen, Jindong Wang, Lei Feng, Xiang Li, Yidong Wang, Xing Xie, Masashi Sugiyama, Rita Singh, Bhiksha Raj

Published 2024-02-02, updated 2024-05-28 (version 2)

Weakly supervised learning generally faces two challenges: limited applicability across scenarios with diverse forms of weak supervision, and poor scalability due to the complexity of existing algorithms, both of which hinder practical deployment. This paper introduces a general framework for learning from weak supervision (GLWS) with a novel algorithm. Central to GLWS is an Expectation-Maximization (EM) formulation that accommodates various weak supervision sources, including instance partial labels, aggregate statistics, pairwise observations, and unlabeled data. We further present an advanced algorithm that significantly simplifies the EM computation by representing the weak supervision as a Non-deterministic Finite Automaton (NFA) and applying a forward-backward algorithm, reducing the time complexity from the quadratic or factorial cost often required by existing solutions to linear. Learning from arbitrary weak supervision thus reduces to modeling that supervision as an NFA. GLWS not only enhances the scalability of machine learning models but also demonstrates superior performance and versatility across 11 weak supervision scenarios. We hope our work paves the way for further advancements and practical deployment in this field.
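To illustrate the idea, here is a minimal sketch of a forward-backward pass over an NFA for one specific kind of weak supervision mentioned in the abstract, an aggregate statistic ("exactly k of these n instances are positive"). The NFA states track how many positives have been seen so far; the function name `nfa_marginals` and this particular constraint are illustrative assumptions, not the paper's actual implementation, which handles arbitrary weak supervision sources.

```python
import numpy as np

def nfa_marginals(probs, k):
    """Forward-backward over an NFA encoding the aggregate constraint
    'exactly k positives among n instances' (illustrative example only).

    probs: model's unconstrained per-instance P(y_i = 1).
    Returns the constrained marginals P(y_i = 1 | exactly k positives),
    as needed for the E-step, in O(n * k) time -- linear in n.
    """
    n = len(probs)
    # fwd[i, s]: probability of the first i labels containing s positives
    fwd = np.zeros((n + 1, k + 1))
    fwd[0, 0] = 1.0
    for i in range(n):
        p = probs[i]
        fwd[i + 1, :] += fwd[i, :] * (1 - p)    # y_i = 0: count unchanged
        fwd[i + 1, 1:] += fwd[i, :-1] * p       # y_i = 1: count advances
    # bwd[i, s]: probability the remaining labels bring the count from s to k
    bwd = np.zeros((n + 1, k + 1))
    bwd[n, k] = 1.0
    for i in range(n - 1, -1, -1):
        p = probs[i]
        bwd[i, :] += bwd[i + 1, :] * (1 - p)
        bwd[i, :-1] += bwd[i + 1, 1:] * p
    Z = fwd[n, k]  # total probability of satisfying the constraint
    # marginal P(y_i = 1 | constraint): sum over states where y_i = 1 fires
    return np.array([
        (fwd[i, :-1] * probs[i] * bwd[i + 1, 1:]).sum() / Z
        for i in range(n)
    ])
```

For example, with three instances at P(y_i = 1) = 0.5 and the constraint k = 1, symmetry gives each instance a constrained marginal of 1/3. Other supervision types (partial labels, pairwise observations) would correspond to different NFA state spaces and transitions, with the same forward-backward machinery on top.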

Related articles:
arXiv:2403.13249 [cs.LG] (Published 2024-03-20)
A Unified and General Framework for Continual Learning
arXiv:2310.18564 [cs.LG] (Published 2023-10-28)
A General Framework for Robust G-Invariance in G-Equivariant Networks
arXiv:2306.01658 [cs.LG] (Published 2023-06-02)
An Adaptive Method for Weak Supervision with Drifting Data