Abstract: We consider situations where training data is abundant and computing resources are comparatively scarce. We argue that suitably designed online learning algorithms asymptotically outperform any batch learning algorithm. Both theoretical and experimental evidence is presented.
@inproceedings{bottou-lecun-2004,
  author    = {Bottou, L\'{e}on and {LeCun}, Yann},
  title     = {Large Scale Online Learning},
  booktitle = {Advances in Neural Information Processing Systems 16},
  editor    = {Thrun, Sebastian and Saul, Lawrence and {Sch\"{o}lkopf}, Bernhard},
  publisher = {MIT Press},
  address   = {Cambridge, MA},
  year      = {2004},
  url       = {http://leon.bottou.org/papers/bottou-lecun-2004},
}
Complete proofs can be found in (Bottou and LeCun, 2004a).