{ "id": "2003.11539", "version": "v1", "published": "2020-03-25T17:58:42.000Z", "updated": "2020-03-25T17:58:42.000Z", "title": "Rethinking Few-Shot Image Classification: a Good Embedding Is All You Need?", "authors": [ "Yonglong Tian", "Yue Wang", "Dilip Krishnan", "Joshua B. Tenenbaum", "Phillip Isola" ], "comment": "First two authors contributed equally. Code: http://github.com/WangYueFt/rfs/", "categories": [ "cs.CV", "cs.LG" ], "abstract": "The focus of recent meta-learning research has been on the development of learning algorithms that can quickly adapt to test time tasks with limited data and low computational cost. Few-shot learning is widely used as one of the standard benchmarks in meta-learning. In this work, we show that a simple baseline: learning a supervised or self-supervised representation on the meta-training set, followed by training a linear classifier on top of this representation, outperforms state-of-the-art few-shot learning methods. An additional boost can be achieved through the use of self-distillation. This demonstrates that using a good learned embedding model can be more effective than sophisticated meta-learning algorithms. We believe that our findings motivate a rethinking of few-shot image classification benchmarks and the associated role of meta-learning algorithms. Code is available at: http://github.com/WangYueFt/rfs/.", "revisions": [ { "version": "v1", "updated": "2020-03-25T17:58:42.000Z" } ], "analyses": { "keywords": [ "rethinking few-shot image classification", "outperforms state-of-the-art few-shot learning methods", "few-shot image classification benchmarks", "test time tasks", "low computational cost" ], "tags": [ "github project" ], "note": { "typesetting": "TeX", "pages": 0, "language": "en", "license": "arXiv", "status": "editable" } } }