{ "id": "1209.3352", "version": "v4", "published": "2012-09-15T03:27:11.000Z", "updated": "2014-02-03T07:09:03.000Z", "title": "Thompson Sampling for Contextual Bandits with Linear Payoffs", "authors": [ "Shipra Agrawal", "Navin Goyal" ], "comment": "Improvements from previous version: (1) dependence on d improved from d^2 to d^{3/2} (2) Simpler and more modular proof techniques (3) bounds in terms of log(N) added", "categories": [ "cs.LG", "cs.DS", "stat.ML" ], "abstract": "Thompson Sampling is one of the oldest heuristics for multi-armed bandit problems. It is a randomized algorithm based on Bayesian ideas, and has recently generated significant interest after several studies demonstrated it to have better empirical performance compared to the state-of-the-art methods. However, many questions regarding its theoretical performance remained open. In this paper, we design and analyze a generalization of Thompson Sampling algorithm for the stochastic contextual multi-armed bandit problem with linear payoff functions, when the contexts are provided by an adaptive adversary. This is among the most important and widely studied versions of the contextual bandits problem. We provide the first theoretical guarantees for the contextual version of Thompson Sampling. We prove a high probability regret bound of $\\tilde{O}(d^{3/2}\\sqrt{T})$ (or $\\tilde{O}(d\\sqrt{T \\log(N)})$), which is the best regret bound achieved by any computationally efficient algorithm available for this problem in the current literature, and is within a factor of $\\sqrt{d}$ (or $\\sqrt{\\log(N)}$) of the information-theoretic lower bound for this problem.", "revisions": [ { "version": "v4", "updated": "2014-02-03T07:09:03.000Z" } ], "analyses": { "subjects": [ "68W40", "68Q25", "F.2.0" ], "keywords": [ "thompson sampling", "contextual bandits", "high probability regret bound", "stochastic contextual multi-armed bandit problem", "linear payoff functions" ], "note": { "typesetting": "TeX", "pages": 0, "language": "en", "license": "arXiv", "status": "editable", "adsabs": "2012arXiv1209.3352A" } } }