papers:zhang-2023 [2023/08/29 06:15] (current) leonb created
===== Learning useful representations for shifting tasks and distributions =====

//Does the dominant approach to learn representations (as a side effect of optimizing an expected cost for a single training distribution) remain a good approach when we are dealing with multiple distributions? Our thesis is that such scenarios are better served by representations that are richer than those obtained with a single optimization episode. We support this thesis with simple theoretical arguments and with experiments utilizing an apparently naïve ensembling technique: concatenating the representations obtained from multiple training episodes using the same data, model, algorithm, and hyper-parameters, but different random seeds. These independently trained networks perform similarly. Yet, in a number of scenarios involving new distributions, the concatenated representation performs substantially better than an equivalently sized network trained with a single training run. This proves that the representations constructed by multiple training episodes are in fact different. Although their concatenation carries little additional information about the training task under the training distribution, it becomes substantially more informative when tasks or distributions change. Meanwhile, a single training episode is unlikely to yield such a redundant representation because the optimization process has no reason to accumulate features that do not incrementally improve the training performance.//
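The ensembling technique described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's implementation: each "training episode" is stood in for by a seed-dependent random-feature extractor (a real episode would be a full SGD run on the same data, model, and hyper-parameters), and all function names here are hypothetical.

```python
import numpy as np

def train_episode(X, seed, width=64):
    """One 'training episode': a seed-dependent random-feature extractor,
    a stand-in for a full training run that differs only in its seed."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], width)) / np.sqrt(X.shape[1])
    return lambda Z: np.maximum(Z @ W, 0.0)  # ReLU features

def concatenated_representation(X, seeds):
    """Concatenate the representations produced by several episodes
    that use the same data but different random seeds."""
    extractors = [train_episode(X, s) for s in seeds]
    return lambda Z: np.concatenate([f(Z) for f in extractors], axis=1)

# Toy data: 200 examples, 10 input dimensions.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))

phi = concatenated_representation(X, seeds=[1, 2, 3, 4])
features = phi(X)
print(features.shape)  # (200, 256): 4 episodes x 64 features each
```

A downstream task would then fit a new head (e.g. a linear probe) on `features`; the paper's point is that this concatenation helps most when the probe's task or distribution differs from the one the episodes were trained on.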
+ | |||
+ | {{ jianyus.png? | ||
+ | |||
<box 99% orange>
Jianyu Zhang and Léon Bottou: //Learning useful representations for shifting tasks and distributions//, International Conference on Machine Learning (ICML), 2023.

[[http://
[[http://
[[http://
</box>

@inproceedings{zhang-2023,
  title = {Learning useful representations for shifting tasks and distributions},
  author = {Zhang, Jianyu and Bottou, L{\'e}on},
  booktitle = {International Conference on Machine Learning},
  pages = {40830--40850},
  year = {2023},
  organization = {PMLR},
  url = {http://
}