{ "id": "1903.12117", "version": "v1", "published": "2019-03-28T17:03:04.000Z", "updated": "2019-03-28T17:03:04.000Z", "title": "Many Task Learning with Task Routing", "authors": [ "Gjorgji Strezoski", "Nanne van Noord", "Marcel Worring" ], "comment": "8 Pages, 5 Figures, 2 Tables", "categories": [ "cs.CV" ], "abstract": "Typical multi-task learning (MTL) methods rely on architectural adjustments and a large trainable parameter set to jointly optimize over several tasks. However, as the number of tasks increases, so do the complexity of the architectural adjustments and the resource requirements. In this paper, we introduce a method that applies a conditional feature-wise transformation over the convolutional activations, enabling a model to successfully perform a large number of tasks. To distinguish it from regular MTL, we introduce Many Task Learning (MaTL) as a special case of MTL in which more than 20 tasks are performed by a single model. Our method, dubbed Task Routing (TR), is encapsulated in a layer we call the Task Routing Layer (TRL), which, applied in a MaTL scenario, successfully fits hundreds of classification tasks into one model. We evaluate our method on 5 datasets against strong baselines and state-of-the-art approaches.", "revisions": [ { "version": "v1", "updated": "2019-03-28T17:03:04.000Z" } ], "analyses": { "keywords": [ "task routing", "task learning", "architectural adjustments", "matl scenario successfully fits hundreds", "large trainable parameter set" ], "note": { "typesetting": "TeX", "pages": 8, "language": "en", "license": "arXiv", "status": "editable" } } }