Abstract: We develop a fast online kernel algorithm for classification that can be viewed as an improvement over the one suggested by Crammer et al. (2003). In that previous work, the authors introduced an on-the-fly compression of the number of examples used in the prediction function, using the size of the margin as a quality measure. Although their algorithm displays impressive results on relatively noise-free data, we show that it degrades on noisy problems. Using a new quality measure for an included example, namely the error it induces on a selected subset of the training data, we obtain improved compression rates and robustness to noise compared with the existing approach. Our method also extends to the batch training setting.
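To make the removal criterion concrete, below is a minimal Python sketch of a budgeted kernel perceptron that, when over budget, discards the support vector whose removal least increases the error on a held-out subset, in the spirit of the quality measure described in the abstract. The class name BudgetKernelPerceptron, the RBF kernel choice, and the brute-force pruning loop are illustrative assumptions, not the authors' exact algorithm.

import numpy as np

def rbf_kernel(x, y, gamma=1.0):
    # Gaussian RBF kernel between two vectors.
    d = x - y
    return np.exp(-gamma * np.dot(d, d))

class BudgetKernelPerceptron:
    # Hypothetical sketch: online kernel perceptron on a budget.
    # When the support set exceeds `budget`, prune the support vector
    # whose removal least increases the error on a held-out subset
    # (the "error on a selected subset" criterion sketched in the
    # abstract, as opposed to the margin criterion of Crammer et al., 2003).

    def __init__(self, budget, val_x, val_y, gamma=1.0):
        self.budget = budget
        self.val_x, self.val_y = val_x, val_y   # held-out validation subset
        self.gamma = gamma
        self.sv_x, self.sv_y = [], []           # support vectors and labels

    def _score(self, x, skip=None):
        # Kernel expansion score, optionally leaving out support vector `skip`.
        return sum(y_i * rbf_kernel(x_i, x, self.gamma)
                   for j, (x_i, y_i) in enumerate(zip(self.sv_x, self.sv_y))
                   if j != skip)

    def _val_errors(self, skip=None):
        # Validation errors of the hypothesis with support vector `skip` removed.
        return sum(y * self._score(x, skip) <= 0
                   for x, y in zip(self.val_x, self.val_y))

    def partial_fit(self, x, y):
        if y * self._score(x) <= 0:             # mistake: add as support vector
            self.sv_x.append(x)
            self.sv_y.append(y)
            if len(self.sv_x) > self.budget:    # over budget: prune one SV
                # Drop the SV whose removal hurts validation error least.
                worst = min(range(len(self.sv_x)), key=self._val_errors)
                del self.sv_x[worst], self.sv_y[worst]

    def predict(self, x):
        return 1 if self._score(x) >= 0 else -1

This brute-force pruning step costs one kernel expansion per (support vector, validation point) pair; a margin-based criterion in the style of Crammer et al. (2003) would instead rank support vectors by a margin computation, with no validation subset involved.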
@inproceedings{weston-bordes-bottou-2005,
  title = {Online (and Offline) on an Even Tighter Budget},
  pages = {413-420},
  author = {Weston, Jason and Bordes, Antoine and Bottou, L\'{e}on},
  booktitle = {Proceedings of the Tenth International Workshop on Artificial Intelligence and Statistics, January 2005, Barbados},
  editor = {Cowell, Robert G. and Ghahramani, Zoubin},
  year = {2005},
  publisher = {Society for Artificial Intelligence and Statistics},
  url = {http://leon.bottou.org/papers/weston-bordes-bottou-2005},
}