SGD-QN: Careful Quasi-Newton Stochastic Gradient Descent

Abstract: The SGD-QN algorithm is a stochastic gradient descent algorithm that makes careful use of second-order information and splits the parameter update into independently scheduled components. Thanks to this design, SGD-QN iterates nearly as fast as a first-order stochastic gradient descent but requires fewer iterations to achieve the same accuracy. This algorithm won the "Wild Track" of the first PASCAL Large Scale Learning Challenge.
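To convey the general idea, the sketch below shows a plain diagonal-scaled stochastic gradient step for a regularized linear SVM in which the per-coordinate scaling is refreshed on its own, less frequent schedule. It is only an illustration of the design described in the abstract, not the paper's exact update rule: the function name, the `skip` parameter, and the crude curvature refresh are assumptions, whereas SGD-QN itself uses a secant-style estimate of the diagonal second-order information.

import numpy as np

def diag_scaled_sgd(X, y, lam=1e-4, t0=1e4, skip=16, epochs=5):
    """Hypothetical sketch: SGD with a diagonal scaling matrix B that is
    updated only every `skip` iterations, on a schedule independent of
    the parameter update. Not the published SGD-QN algorithm."""
    n, d = X.shape
    w = np.zeros(d)
    B = np.ones(d) / lam              # diagonal scaling, crude initial guess
    t = 0
    for _ in range(epochs):
        for i in np.random.permutation(n):
            xi, yi = X[i], y[i]
            margin = yi * xi.dot(w)
            # subgradient of the L2-regularized hinge loss at (xi, yi)
            g = lam * w - (yi * xi if margin < 1 else 0.0)
            eta = 1.0 / (t + t0)
            w -= eta * B * g          # per-coordinate (diagonal) scaling
            if t % skip == 0:
                # Placeholder curvature refresh (assumption): regularizer
                # curvature plus a data-dependent term. The real SGD-QN
                # uses a secant-style diagonal estimate instead.
                B = 1.0 / (lam + xi * xi + 1e-8)
            t += 1
    return w

Because the scaling refresh runs on its own schedule, the inner loop costs little more than first-order SGD per iteration, which is the property the abstract emphasizes.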

Antoine Bordes, Léon Bottou and Patrick Gallinari: SGD-QN: Careful Quasi-Newton Stochastic Gradient Descent, Journal of Machine Learning Research, to appear, 2009.

@article{bordes-bottou-gallinari-2009,
  author = {Bordes, Antoine and Bottou, L\'{e}on and Gallinari, Patrick},
  title = {SGD-QN: Careful Quasi-Newton Stochastic Gradient Descent},
  journal = {Journal of Machine Learning Research},
  year = {2009},
  volume = {10},
  pages = {to appear},
  url = {http://leon.bottou.org/papers/bordes-bottou-gallinari-2009},
}