This lecture describes Graph Transformer Networks. It took place at the 2001 ICML workshop on Machine Learning for Spatial and Temporal Data, organized by Tom Dietterich. Graph Transformer Networks are one of the most powerful and successful methods for learning from sequential data. Roughly 10% to 20% of the checks written in the U.S. since 1996 have been processed by a Graph Transformer Network.

Graph Transformer Networks are related to Conditional Random Fields but have variable geometry and non-linear energies.
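To make the contrast concrete, here is a minimal sketch (hypothetical, not code from the lecture): a linear-chain CRF energy is linear in its parameters (`unary`, `trans`), whereas a GTN-style energy can pass features through a nonlinear network (here a one-hidden-layer `tanh` scorer with assumed parameters `W1`, `b1`, `w2`) before scoring a label sequence.

```python
import numpy as np

def crf_energy(unary, trans, labels):
    """Linear-chain CRF energy: a sum of unary and transition scores,
    linear in the parameter tables `unary` and `trans`."""
    e = sum(unary[t, labels[t]] for t in range(len(labels)))
    e += sum(trans[labels[t - 1], labels[t]] for t in range(1, len(labels)))
    return float(e)

def nonlinear_energy(features, W1, b1, w2, labels):
    """GTN-style sketch: the per-step energy is a nonlinear function
    of the input features (label-conditioned tanh hidden layer),
    so the total energy is non-linear in the parameters."""
    e = 0.0
    for t, y in enumerate(labels):
        h = np.tanh(W1[y] @ features[t] + b1[y])  # nonlinearity here
        e += float(w2[y] @ h)
    return e
```

The structural point is that training the second model requires backpropagating through the nonlinearity, which is the setting GTNs were designed for; a CRF's linear energy admits convex training instead.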

- See the short paper or the long paper.