Efficient Backprop

Abstract: The convergence of back-propagation learning is analyzed so as to explain common phenomena observed by practitioners. Many undesirable behaviors of backprop can be avoided with tricks that are rarely exposed in serious technical publications. This paper gives some of those tricks, and offers explanations of why they work. Many authors have suggested that second-order optimization methods are advantageous for neural net training. It is shown that most “classical” second-order methods are impractical for large neural networks. A few methods are proposed that do not have these limitations.
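As a concrete taste of the tricks the paper discusses, one of its best-known recommendations is to normalize the inputs before training: shift each input variable to zero mean and scale it to unit variance, so that gradient descent makes comparable progress along every weight direction. The sketch below is ours, not from the paper; the function name normalize_inputs and the samples-by-features array layout are assumptions made for illustration.

import numpy as np

def normalize_inputs(X):
    # Input-normalization trick: shift every input variable to zero
    # mean and rescale it to unit variance across the training set.
    # X is assumed to have shape (n_samples, n_features).
    mean = X.mean(axis=0)
    std = X.std(axis=0)
    std[std == 0] = 1.0   # leave constant features unscaled
    return (X - mean) / std

# Example: ten samples of three input variables on wildly different scales.
X = np.random.rand(10, 3) * np.array([1.0, 100.0, 0.01])
Xn = normalize_inputs(X)
print(Xn.mean(axis=0))  # close to 0 for every feature
print(Xn.std(axis=0))   # close to 1 for every feature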

Note: This paper follows the presentation “The BackPropagation CookBook” given during the NIPS 1996 workshop “Tricks of the Trade”, organized by Jenny Orr and Klaus-Robert Müller.

Yann Le Cun, Léon Bottou, Genevieve B. Orr and Klaus-Robert Müller: Efficient Backprop, Neural Networks, Tricks of the Trade, Lecture Notes in Computer Science LNCS 1524, Springer Verlag, 1998.

tricks-1998.djvu tricks-1998.pdf tricks-1998.ps.gz

@incollection{lecun-98x,
  author = {{Le Cun}, Yann and Bottou, L\'{e}on and Orr, Genevieve B. and M{\"{u}}ller, Klaus-Robert},
  title = {Efficient Backprop},
  booktitle = {Neural Networks, Tricks of the Trade},
  series = {Lecture Notes in Computer Science LNCS~1524},
  publisher = {Springer Verlag},
  year = {1998},
  url = {http://leon.bottou.org/papers/lecun-98x},
}