arXiv:2405.06627 [cs.LG]

Conformal Validity Guarantees Exist for Any Data Distribution

Drew Prinster, Samuel Stanton, Anqi Liu, Suchi Saria

Published 2024-05-10 (Version 1)

As machine learning (ML) gains widespread adoption, practitioners are increasingly seeking means to quantify and control the risk these systems incur. This challenge is especially salient when ML systems have autonomy to collect their own data, such as in black-box optimization and active learning, where their actions induce sequential feedback-loop shifts in the data distribution. Conformal prediction has emerged as a promising approach to uncertainty and risk quantification, but existing variants either fail to accommodate sequences of data-dependent shifts, or do not fully exploit the fact that agent-induced shift is under our control. In this work we prove that conformal prediction can theoretically be extended to any joint data distribution, not just exchangeable or quasi-exchangeable ones, although it is exceedingly impractical to compute in the most general case. For practical applications, we outline a procedure for deriving specific conformal algorithms for any data distribution, and we use this procedure to derive tractable algorithms for a series of agent-induced covariate shifts. We evaluate the proposed algorithms empirically on synthetic black-box optimization and active learning tasks.
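For context, here is a minimal sketch of the baselines the paper generalizes: standard split conformal prediction, and the weighted variant for a single known covariate shift (Tibshirani et al., 2019). This is an illustrative NumPy implementation under assumed function names, not the paper's algorithm for sequences of agent-induced shifts; the likelihood-ratio weights dP_test(x)/dP_cal(x) are assumed given.

```python
import numpy as np

def split_conformal_quantile(scores_cal, alpha=0.1):
    """Standard split conformal: the ceil((n+1)(1-alpha))/n empirical
    quantile of calibration nonconformity scores (e.g., |y - f(x)|)."""
    n = len(scores_cal)
    level = min(np.ceil((n + 1) * (1 - alpha)) / n, 1.0)
    return np.quantile(scores_cal, level, method="higher")

def weighted_conformal_quantile(scores_cal, w_cal, w_test, alpha=0.1):
    """Weighted conformal quantile for a known covariate shift
    (Tibshirani et al., 2019): w_cal and w_test are likelihood ratios
    dP_test(x)/dP_cal(x); the test point carries its mass at +infinity."""
    scores_cal = np.asarray(scores_cal, dtype=float)
    w = np.concatenate([np.asarray(w_cal, dtype=float), [float(w_test)]])
    p = w / w.sum()                    # normalize weights to a distribution
    order = np.argsort(scores_cal)
    cum = np.cumsum(p[:-1][order])     # weighted CDF over calibration scores
    idx = np.searchsorted(cum, 1 - alpha)
    # If the calibration mass never reaches 1 - alpha, the quantile is +inf.
    return scores_cal[order][idx] if idx < len(order) else np.inf

# Illustrative usage: a 90% prediction interval around a point prediction.
rng = np.random.default_rng(0)
residuals = np.abs(rng.normal(size=500))  # stand-in calibration scores
q = split_conformal_quantile(residuals, alpha=0.1)
y_pred = 2.3                              # stand-in model prediction
interval = (y_pred - q, y_pred + q)
```

The weighted quantile recovers the unweighted one when all weights are equal; the paper's contribution is extending such validity guarantees beyond exchangeable (or singly-shifted) settings to sequences of data-dependent, agent-induced shifts.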

Comments: ICML 2024. Code available at https://github.com/drewprinster/conformal-mfcs
Categories: cs.LG, cs.AI, stat.ML
Related articles:
arXiv:2404.08168 [cs.LG] (Published 2024-04-12)
Conformal Prediction via Regression-as-Classification
arXiv:2402.05806 [cs.LG] (Published 2024-02-08)
On Calibration and Conformal Prediction of Deep Classifiers
arXiv:2401.11810 [cs.LG] (Published 2024-01-22)
Generalization and Informativeness of Conformal Prediction