Neural Networks as Statistical Learning Systems

Building on earlier conceptual work by Yakov Z. Tsypkin and Vladimir Vapnik, I proposed one of the first interpretations of neural networks as statistical learning machines. This framework unifies a large number of neural network models and provides a common proof of convergence. I also proposed the concept of modular learning systems in which all components are trained globally by optimizing a single cost function. This global training approach was first applied to speech recognition systems integrating neural networks and Hidden Markov Models. Global training was then applied to various pattern recognition problems, such as speech recognition, optical character recognition, speaker recognition, and radar spot tracking. This led to the first characterization of the so-called label-bias problem in complex learning systems.
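As a toy illustration of the idea described above (not code from the thesis or its papers), the sketch below chains two modules, a linear feature extractor and a logistic output layer, and trains both globally by stochastic gradient descent on a single logistic cost. The data, module shapes, and learning rate are all invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 2-D points, label 1 when x0 + x1 > 0.
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

W1 = rng.normal(scale=0.1, size=(2, 3))   # module 1: linear feature extractor
w2 = rng.normal(scale=0.1, size=3)        # module 2: logistic classifier weights
lr = 0.5                                  # learning rate (arbitrary choice)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for epoch in range(20):
    for i in rng.permutation(len(X)):     # one example at a time: stochastic gradient
        h = X[i] @ W1                     # forward through module 1
        p = sigmoid(h @ w2)               # forward through module 2
        # One shared cost (logistic loss); its gradient flows through both modules,
        # so neither module is trained in isolation.
        g = p - y[i]                      # d(loss)/d(h @ w2)
        w2 -= lr * g * h                  # update module 2
        W1 -= lr * np.outer(X[i], g * w2) # update module 1 through module 2

acc = np.mean((sigmoid((X @ W1) @ w2) > 0.5) == (y > 0.5))
print(f"training accuracy: {acc:.2f}")
```

The point of the sketch is only the wiring: a single scalar cost at the output whose gradient reaches every trainable component, which is what distinguishes global training from training each module separately.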


Léon Bottou: Une Approche théorique de l'Apprentissage Connexionniste: Applications à la Reconnaissance de la Parole (A Theoretical Approach to Connectionist Learning: Applications to Speech Recognition), PhD thesis, Orsay, France, 1991.


Léon Bottou: Stochastic Gradient Learning in Neural Networks, Proceedings of Neuro-Nîmes 91, EC2, Nîmes, France, 1991.


Léon Bottou and Patrick Gallinari: A Framework for the Cooperation of Learning Algorithms, Advances in Neural Information Processing Systems, 3, Edited by D. Touretzky and R. Lippmann, Morgan Kaufmann, Denver, 1991.


research/thesis.txt · Last modified: 2011/01/03 12:52 by leonb