arXiv Analytics


arXiv:1207.4132 [cs.LG]

MOB-ESP and other Improvements in Probability Estimation

Rodney Nielsen

Published 2012-07-11 (Version 1)

A key prerequisite for optimal reasoning under uncertainty in intelligent systems is to start with good class probability estimates. This paper improves on the current best probability estimation trees (Bagged-PETs, or B-PETs) and also presents a new ensemble-based algorithm (MOB-ESP). Comparisons are made using several benchmark datasets and multiple metrics. These experiments show that MOB-ESP outputs significantly more accurate class probabilities than either the baseline B-PETs algorithm or the enhanced version presented here (EB-PETs). These results are based on metrics closely associated with the average accuracy of the predictions. MOB-ESP also provides much better probability rankings than B-PETs. The paper further suggests how these estimation techniques can be applied in concert with a broader category of classifiers.
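To make the bagged-PET idea concrete: a bagged probability estimation tree ensemble trains trees on bootstrap resamples and averages smoothed leaf class frequencies to produce a probability estimate. The sketch below is an illustrative toy, not the paper's implementation: it uses one-level trees (stumps) on a single scalar feature, Laplace-corrected leaf frequencies (a common choice in PET work), and hypothetical function names.

```python
import random

def fit_stump(data):
    """Fit a one-level PET. data: list of (x, y) with scalar x, label y in {0, 1}."""
    xs = sorted(set(x for x, _ in data))
    # Candidate thresholds are midpoints between distinct feature values.
    thresholds = [(a + b) / 2 for a, b in zip(xs, xs[1:])] or [xs[0]]
    best = None
    for t in thresholds:
        left = [y for x, y in data if x <= t]
        right = [y for x, y in data if x > t]
        # Weighted Gini impurity of the split.
        gini = 0.0
        for side in (left, right):
            if side:
                p = sum(side) / len(side)
                gini += (len(side) / len(data)) * 2 * p * (1 - p)
        if best is None or gini < best[0]:
            best = (gini, t, left, right)
    _, t, left, right = best

    def laplace(ys):
        # Laplace-corrected probability of class 1 at a leaf: (n1 + 1) / (n + 2).
        return (sum(ys) + 1) / (len(ys) + 2)

    return t, laplace(left), laplace(right)

def bagged_pets(data, n_trees=25, seed=0):
    """Train n_trees stumps on bootstrap resamples; return a predict_proba function."""
    rng = random.Random(seed)
    stumps = []
    for _ in range(n_trees):
        boot = [rng.choice(data) for _ in data]  # bootstrap sample, with replacement
        stumps.append(fit_stump(boot))

    def predict_proba(x):
        # Average the per-tree leaf estimates -- the bagging step.
        ps = [pl if x <= t else pr for t, pl, pr in stumps]
        p1 = sum(ps) / len(ps)
        return (1 - p1, p1)

    return predict_proba
```

The abstract's ranking comparison corresponds to sorting instances by these averaged probabilities; better-calibrated leaf estimates give both lower probability error and better rankings.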

Comments: Appears in Proceedings of the Twentieth Conference on Uncertainty in Artificial Intelligence (UAI2004)
Categories: cs.LG, cs.AI, stat.ML