You can now download fast stochastic gradient optimizers for linear Support Vector Machines (SVMs) and Conditional Random Fields (CRFs). Stochastic Gradient Descent has historically been associated with back-propagation in multilayer neural networks, where the optimization problems are nonlinear, nonconvex, and can be very difficult. It is therefore instructive to see how Stochastic Gradient Descent performs on simple linear, convex problems such as these. The benchmarks speak for themselves!
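As a rough illustration (not the released code itself), training a linear SVM with Stochastic Gradient Descent amounts to taking one example at a time and stepping down the gradient of the L2-regularized hinge loss. The learning-rate schedule and the regularization constant `lam` below are illustrative assumptions, not values from the distributed package:

```python
import numpy as np

def svm_sgd(X, y, lam=0.01, epochs=20, eta0=0.1, seed=0):
    """Linear SVM trained by SGD on the objective
    lam/2 * ||w||^2 + mean_i max(0, 1 - y_i (w.x_i + b)).
    Labels y must be in {-1, +1}. (Illustrative sketch.)"""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    t = 0
    for _ in range(epochs):
        for i in rng.permutation(n):             # one pass over shuffled examples
            t += 1
            eta = eta0 / (1.0 + eta0 * lam * t)  # decreasing step size (assumed schedule)
            w *= (1.0 - eta * lam)               # shrinkage from the regularizer
            if y[i] * (X[i] @ w + b) < 1.0:      # hinge loss is active for this example
                w += eta * y[i] * X[i]
                b += eta * y[i]
    return w, b

# Toy usage: two well-separated Gaussian clusters.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(2.0, 1.0, (50, 2)), rng.normal(-2.0, 1.0, (50, 2))])
y = np.array([1.0] * 50 + [-1.0] * 50)
w, b = svm_sgd(X, y)
accuracy = np.mean(np.sign(X @ w + b) == y)
```

Each update touches a single example, so the cost per step is independent of the training-set size, which is exactly why SGD shines on large-scale linear problems.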