{
  "id": "2104.06970",
  "version": "v1",
  "published": "2021-04-14T16:53:13.000Z",
  "updated": "2021-04-14T16:53:13.000Z",
  "title": "Eluder Dimension and Generalized Rank",
  "authors": [
    "Gene Li",
    "Pritish Kamath",
    "Dylan J. Foster",
    "Nathan Srebro"
  ],
  "comment": "Technical Note",
  "categories": [ "cs.LG", "stat.ML" ],
  "abstract": "We study the relationship between the eluder dimension for a function class and a generalized notion of rank, defined for any monotone \"activation\" $\\sigma : \\mathbb{R} \\to \\mathbb{R}$, which corresponds to the minimal dimension required to represent the class as a generalized linear model. When $\\sigma$ has derivatives bounded away from $0$, it is known that $\\sigma$-rank gives rise to an upper bound on eluder dimension for any function class; we show however that eluder dimension can be exponentially smaller than $\\sigma$-rank. We also show that the condition on the derivative is necessary; namely, when $\\sigma$ is the $\\mathrm{relu}$ activation, we show that eluder dimension can be exponentially larger than $\\sigma$-rank.",
  "revisions": [
    { "version": "v1", "updated": "2021-04-14T16:53:13.000Z" }
  ],
  "analyses": {
    "keywords": [ "eluder dimension", "generalized rank", "function class", "upper bound", "minimal dimension" ],
    "note": {
      "typesetting": "TeX",
      "pages": 0,
      "language": "en",
      "license": "arXiv",
      "status": "editable"
    }
  }
}