I am a research scientist with broad interests in machine learning and artificial intelligence. My work on large-scale learning and stochastic gradient algorithms has received attention in recent years. I am also known for the DjVu document compression system. I joined Facebook AI Research in March 2015.
Use the sidebar to navigate this site.
Alex Peysakhovich and I represent Facebook on the organizing committee of the NYC Data Science Seminar Series. This rotating seminar, organized by Columbia, CornellTech, Facebook, Microsoft Research NYC, and New York University, has featured a number of prominent speakers.
I was scavenging my old emails a couple of weeks ago and found a copy of an early technical report that not only describes Graph Transformer Networks in a couple of pages but also explains why they are defined the way they are.
Why settle for 60000 MNIST training examples when you can have one trillion?
The MNIST8M dataset was generated using the elastic deformation code originally written for (Loosli, Canu, and Bottou, 2007). Unfortunately, the original MNIST8M files were accidentally deleted from the NEC servers a couple of weeks ago. Instead of regenerating the files, I have repackaged the generation code in a convenient form.
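For readers curious about the mechanism, here is a minimal sketch in Python of the elastic deformation idea used to mint new MNIST-like examples. This is not the actual generation code; the parameters alpha and sigma and the stand-in digit are illustrative only.

```python
# A minimal sketch of elastic deformation for image data augmentation.
# NOT the actual MNIST8M/infimnist code; alpha and sigma are illustrative.
import numpy as np
from scipy.ndimage import gaussian_filter, map_coordinates

def elastic_deform(image, alpha=8.0, sigma=3.0, rng=None):
    """Apply a random smooth displacement field to a 2-D image."""
    rng = np.random.default_rng() if rng is None else rng
    h, w = image.shape
    # Random per-pixel displacements, smoothed so nearby pixels move together.
    dx = gaussian_filter(rng.uniform(-1, 1, (h, w)), sigma) * alpha
    dy = gaussian_filter(rng.uniform(-1, 1, (h, w)), sigma) * alpha
    ys, xs = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    coords = np.vstack([(ys + dy).ravel(), (xs + dx).ravel()])
    # Bilinear interpolation of the image at the displaced coordinates.
    return map_coordinates(image, coords, order=1).reshape(h, w)

# Example: distort a 28x28 stand-in digit to produce a new training example.
digit = np.zeros((28, 28))
digit[10:18, 12:16] = 1.0
new_example = elastic_deform(digit)
```

Because the displacement field is drawn at random each time, the same training digit can be deformed over and over to yield an essentially unlimited stream of plausible new examples.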
Our paper “Counterfactual Reasoning and Learning Systems: The Example of Computational Advertising” has appeared in JMLR. This paper takes the example of ad placement to illustrate how one can leverage causal inference to understand the behavior of complex learning systems interacting with their environment.
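To give a flavor of this style of reasoning, here is a toy sketch of inverse-propensity weighting, one of the basic counterfactual estimators in this area: it estimates what a new policy would earn from logs collected under an old randomized policy. The data, policies, and numbers below are entirely hypothetical.

```python
# Toy sketch of counterfactual estimation via inverse propensity weighting.
# All data, policies, and constants are hypothetical, for illustration only.
import numpy as np

rng = np.random.default_rng(0)

# Logged data: context x, action a taken by the logging policy,
# probability p with which it was taken, and observed reward r.
n = 100_000
x = rng.uniform(size=n)
p = np.full(n, 0.5)                 # logging policy: show the ad half the time
a = rng.uniform(size=n) < p         # action actually taken
r = np.where(a, x, 1.0 - x) + rng.normal(scale=0.1, size=n)

def new_policy_prob(x):
    """Hypothetical target policy: show the ad only for high contexts."""
    return (x > 0.6).astype(float)

# Importance-weighted estimate of the new policy's expected reward:
# average of r * pi_new(a|x) / pi_log(a|x) over the logged samples.
w = np.where(a, new_policy_prob(x), 1.0 - new_policy_prob(x)) \
    / np.where(a, p, 1.0 - p)
print(f"estimated reward under new policy: {np.mean(r * w):.3f}")
```

The trick is that reweighting the logged rewards by the ratio of the two policies' action probabilities corrects for the fact that the data was collected under a different policy, so the new policy can be evaluated without deploying it.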