Stochastic Approximations and Efficient Learning

Excerpt: The analysis of online algorithms is much more difficult than that of ordinary optimization algorithms. Practical successes in signal processing (Widrow and Stearns, 1985) motivated the creation of sophisticated mathematical tools known as stochastic approximations (Ljung and Soderstrom, 1983; Benveniste, Metivier and Priouret, 1990) […] The first section describes and illustrates a general framework for neural network learning algorithms based on stochastic gradient descent. The second section presents stochastic approximation results describing the final phase. The third section discusses the conceptual aspects of the search phase and comments on some of the newest results.
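
To make the excerpt concrete, below is a minimal sketch of a stochastic gradient descent loop in Python, assuming a squared-error loss on linear predictions and a classical decreasing learning-rate schedule. The names and parameter choices (loss_grad, sgd, gamma0, decay) are hypothetical illustrations and are not taken from the chapter.

import numpy as np

# Minimal sketch of stochastic gradient descent (hypothetical example,
# not the chapter's own code): minimize the expected loss E_z[Q(z, w)]
# by updating the weights with the gradient computed on a single
# example z_t = (x_t, y_t) drawn at each step.

def loss_grad(w, x, y):
    """Gradient of the squared-error loss Q((x, y), w) = 0.5 * (w.x - y)^2."""
    return (np.dot(w, x) - y) * x

def sgd(examples, dim, gamma0=0.2, decay=0.02):
    """One online pass over the examples with a decreasing learning rate."""
    w = np.zeros(dim)
    for t, (x, y) in enumerate(examples):
        gamma_t = gamma0 / (1.0 + decay * t)   # decreasing schedule
        w -= gamma_t * loss_grad(w, x, y)      # one stochastic update per example
    return w

# Usage: recover a noisy linear mapping from 500 randomly drawn examples.
rng = np.random.default_rng(0)
w_true = np.array([1.0, -2.0, 0.5])
examples = [(x, np.dot(x, w_true) + 0.01 * rng.standard_normal())
            for x in rng.standard_normal((500, 3))]
print("estimate:", sgd(examples, dim=3))
print("target:  ", w_true)

A decreasing learning-rate schedule of this kind is the standard ingredient in the stochastic approximation convergence results that the excerpt refers to for the final phase.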

Léon Bottou and Noboru Murata: Stochastic Approximations and Efficient Learning, The Handbook of Brain Theory and Neural Networks, Second edition, Edited by M. A. Arbib, The MIT Press, Cambridge, MA, 2002.

arbib-2002.djvu arbib-2002.pdf arbib-2002.ps.gz

@incollection{bottou-murata-2002,
  author = {Bottou, L\'{e}on and Murata, Noboru},
  title = {Stochastic Approximations and Efficient Learning},
  booktitle = {The Handbook of Brain Theory and Neural Networks, Second edition},
  editor = {Arbib, M. A.},
  publisher = {The MIT Press},
  address = {Cambridge, MA},
  year = {2002},
  url = {http://leon.bottou.org/papers/bottou-murata-2002},
}