I am a research scientist with broad interests in machine learning and artificial intelligence. My work on large-scale learning and stochastic gradient algorithms has received attention in recent years. I am also known for the DjVu document compression system. I joined Facebook AI Research in March 2015.

Use the sidebar to navigate this site.



Patrice Simard and I have been friends since the old AT&T Bell Labs days. He eventually convinced me to work for him at Microsoft. He told me to expect “interesting times”.

I can see several reasons for these interesting times.

  • The scientific point of view. There are few places where I can find machine learning problems with similar scale, similar challenges, and similar impact. This practical experience will surely feed my future machine learning research. In fact, I believe such experience is necessary for research. One needs to see the world…
  • The social point of view. The Internet is the largest encyclopedia of knowledge ever known to mankind, and this is great. On the other hand, everything you do on the Internet is recorded by someone somewhere. Large online services such as Google or Microsoft concentrate unprecedented amounts of such information. Our society is not ready for that. Very good things or very bad things can happen equally easily. They will affect all of us. We cannot just watch from the sidelines and keep score.
  • The competitive point of view. Microsoft combines a difficult competitive position with considerable resources: it has both the will and the means to do new things on the scientific, engineering, economic, and social levels. How could I resist? Of course, nothing is ever certain…
2010/05/14 15:17


Rob Schapire and David Blei gave me the opportunity to teach the cos424 course at Princeton University for the spring 2010 semester. In fact, Rob is on sabbatical leave at Yahoo! and David is on parental leave. Running the orphan course was a useful experience. One thousand slides later, I am really eager to see the student projects…

2010/04/29 23:00

Semantic Extraction with a Neural Network Architecture

Use BLAS, not PERL!

It is the nineties again. Ronan Collobert from NEC Labs has just released a noncommercial version of his neural network system for semantic extraction. Given an input sentence in plain English, Senna outputs a host of Natural Language Processing (NLP) tags: part-of-speech (POS) tags, chunking (CHK), named entity recognition (NER), and semantic role labeling (SRL). Senna does this with state-of-the-art accuracy, roughly two hundred times faster than competing approaches.

The Senna source code represents about 2000 lines of C. This is probably one thousand times smaller than your usual natural language processing program. In fact all the Senna tagging tasks are performed using the same neural network simulation code.
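The idea of handling all tagging tasks with one shared network can be illustrated with a toy sketch. Everything here is invented for illustration (plain NumPy, made-up vocabulary and tag-set sizes, random weights); Senna's actual architecture, with its lookup tables and trained parameters, is described in the paper.

```python
# Toy sketch of a multi-task tagger: one shared representation,
# one small output head per tagging task. Sizes are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

vocab, embed, hidden = 100, 16, 32
tasks = {"pos": 45, "chk": 23, "ner": 9}  # hypothetical tag-set sizes

E = rng.normal(scale=0.1, size=(vocab, embed))    # word embeddings
W1 = rng.normal(scale=0.1, size=(embed, hidden))  # shared layer
heads = {t: rng.normal(scale=0.1, size=(hidden, n)) for t, n in tasks.items()}

def tag(word_ids, task):
    """Return one tag index per input word for the given task."""
    h = np.tanh(E[word_ids] @ W1)  # shared representation, reused by all tasks
    scores = h @ heads[task]       # only this projection is task-specific
    return scores.argmax(axis=1)

sentence = np.array([3, 17, 42, 8])   # a 4-word sentence as word indices
pos_tags = tag(sentence, "pos")
ner_tags = tag(sentence, "ner")
```

The point of the sketch is structural: adding a new tagging task means adding one output matrix, not another program, which is how a single small simulation codebase can cover POS, CHK, NER, and SRL.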

Download Senna here. A Senna paper has been submitted to JMLR.

2010/02/16 10:16


The SGD-QN paper has been published on the JMLR site. This variant of stochastic gradient descent obtained very good results in the first PASCAL Large Scale Learning Challenge. The paper explains the design of the algorithm in detail. Source code is available from Antoine's web site.
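For readers unfamiliar with the baseline, here is a minimal sketch of plain stochastic gradient descent on a toy least-squares problem, with the usual decreasing step size. This is only the starting point SGD-QN builds on (SGD-QN adds a diagonal rescaling of the step); the data and constants below are invented for illustration.

```python
# Plain SGD on least squares: one random example per update,
# step size decreasing as 1/(t0 + t). Toy data, illustrative only.
import numpy as np

rng = np.random.default_rng(0)
n, d = 1000, 5
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = X @ w_true + 0.01 * rng.normal(size=n)  # noisy linear targets

w = np.zeros(d)
t0 = 10.0
for t in range(5 * n):
    i = rng.integers(n)                 # pick one example at random
    g = (X[i] @ w - y[i]) * X[i]        # gradient of 0.5 * (x·w - y)^2
    w -= g / (t0 + t)                   # decreasing step size
```

Each update costs O(d) regardless of n, which is why stochastic gradient methods shine on large-scale problems like those of the PASCAL challenge.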

2009/08/07 10:32
start.txt · Last modified: 2018/08/21 16:35 by leonb
