Unifying distillation and privileged information

Abstract: Distillation (Hinton et al., 2015) and privileged information (Vapnik & Izmailov, 2015) are two techniques that enable machines to learn from other machines. This paper unifies the two into generalized distillation, a framework for learning from multiple machines and data representations. We provide theoretical and causal insight into the inner workings of generalized distillation, extend it to unsupervised, semi-supervised and multitask learning scenarios, and illustrate its efficacy through numerical simulations on both synthetic and real-world data.
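The generalized distillation recipe described in the paper has three steps: train a teacher on the privileged representation, soften its predictions with a temperature, and train a student on the regular representation against a mixture of the hard labels and the soft teacher labels. Below is a minimal self-contained sketch of that recipe in NumPy; the synthetic data, the logistic-regression learner, and all variable names (`x_star`, `lam`, `T`, etc.) are our own illustrative choices, not code from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic task (our construction): the privileged feature x_star determines
# the label; the student only sees a noisier regular feature x.
n = 500
x_star = rng.normal(size=(n, 1))                  # privileged representation
x = x_star + rng.normal(size=(n, 1))              # impoverished representation
y = (x_star[:, 0] > 0).astype(float)              # hard labels

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_logreg(X, targets, lr=0.5, steps=2000):
    """Logistic regression by gradient descent; targets may be soft labels."""
    Xb = np.hstack([X, np.ones((len(X), 1))])     # append bias column
    w = np.zeros(Xb.shape[1])
    for _ in range(steps):
        p = sigmoid(Xb @ w)
        w -= lr * Xb.T @ (p - targets) / len(X)   # cross-entropy gradient
    return w

def predict(w, X):
    Xb = np.hstack([X, np.ones((len(X), 1))])
    return sigmoid(Xb @ w)

# 1) Teacher learns from the privileged representation.
w_t = fit_logreg(x_star, y)

# 2) Teacher predictions are softened at temperature T.
T = 2.0
Xb_star = np.hstack([x_star, np.ones((n, 1))])
soft = sigmoid(Xb_star @ w_t / T)

# 3) Student minimizes (1 - lam) * CE(y, student) + lam * CE(soft, student)
#    on the regular features. Since the cross-entropy gradient is linear in
#    the target, this equals training on the blended target below.
lam = 0.5
targets = (1 - lam) * y + lam * soft
w_s = fit_logreg(x, targets)

acc = np.mean((predict(w_s, x) > 0.5) == (y > 0.5))
```

With this noise level the student cannot recover the labels perfectly, but the soft teacher labels carry graded information about how confidently each example is classified under the privileged view, which is the mechanism the paper analyzes.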

David Lopez-Paz, Léon Bottou, Bernhard Schölkopf and Vladimir Vapnik: Unifying distillation and privileged information, International Conference on Learning Representations (ICLR 2016), 2016.

iclr-2016.djvu iclr-2016.pdf iclr-2016.ps.gz

@inproceedings{lopez-paz-2016,
  author = {Lopez-Paz, David and Bottou, L\'{e}on and Sch\"{o}lkopf, Bernhard and Vapnik, Vladimir},
  title = {Unifying distillation and privileged information},
  booktitle = {International Conference on Learning Representations (ICLR 2016)},
  year = {2016},
  url = {http://leon.bottou.org/papers/lopez-paz-2016},
}