arXiv:1506.06318 [cs.LG]
Communication Efficient Distributed Agnostic Boosting
Shang-Tse Chen, Maria-Florina Balcan, Duen Horng Chau
Published 2015-06-21Version 1
We consider the problem of learning from distributed data in the agnostic setting, i.e., in the presence of arbitrary forms of noise. Our main contribution is a general distributed boosting-based procedure for learning an arbitrary concept space that is simultaneously noise tolerant, communication efficient, and computationally efficient. This improves significantly over prior work, which was either communication efficient only in noise-free scenarios or computationally prohibitive. Empirical results on large synthetic and real-world datasets demonstrate the effectiveness and scalability of the proposed approach.
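To make the distributed-boosting idea concrete, here is a minimal sketch of a generic AdaBoost-style protocol over partitioned data. This is an illustration only, not the paper's agnostic-boosting algorithm: the partition layout, decision-stump weak learner, and the centrally simulated error aggregation are all assumptions for the sake of a runnable example. The key point it illustrates is that each round only a constant-size message per machine (a stump description plus a weighted-error sum) needs to be communicated, which is the source of communication efficiency.

```python
import math

def stump_predict(threshold, sign, x):
    """Decision stump on 1-D data: predicts +sign for x >= threshold."""
    return sign if x >= threshold else -sign

def distributed_boost(partitions, rounds=10):
    """Boost over data split across machines (partitions of (x, y) pairs).

    In a real deployment each machine would scan only its own partition
    and send its local weighted-error sums to a coordinator; here the
    aggregation is simulated centrally for brevity.
    """
    data = [pt for part in partitions for pt in part]
    n = len(data)
    w = [1.0 / n] * n                       # global distribution over examples
    ensemble = []                           # list of (alpha, threshold, sign)
    thresholds = sorted({x for x, _ in data})
    for _ in range(rounds):
        # Find the stump with lowest global weighted error.
        best = None
        for t in thresholds:
            for s in (+1, -1):
                err = sum(wi for wi, (x, y) in zip(w, data)
                          if stump_predict(t, s, x) != y)
                if best is None or err < best[0]:
                    best = (err, t, s)
        err, t, s = best
        err = min(max(err, 1e-10), 1 - 1e-10)   # clamp to avoid log(0)
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((alpha, t, s))
        # Multiplicative weight update and renormalization (one broadcast).
        w = [wi * math.exp(-alpha * y * stump_predict(t, s, x))
             for wi, (x, y) in zip(w, data)]
        z = sum(w)
        w = [wi / z for wi in w]
    return ensemble

def predict(ensemble, x):
    """Weighted-majority vote of the boosted stumps."""
    score = sum(a * stump_predict(t, s, x) for a, t, s in ensemble)
    return 1 if score >= 0 else -1
```

Note that classical multiplicative reweighting of this kind is not robust to agnostic noise; the paper's contribution is precisely a boosting procedure that keeps the low per-round communication while tolerating arbitrary noise.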