Learning Methods for Generic Object Recognition with Invariance to Pose and Lighting

Abstract: We assess the applicability of several popular learning methods for the problem of recognizing generic visual categories with invariance to pose, lighting, and surrounding clutter. A large dataset comprising stereo image pairs of 50 uniform-colored toys under 36 azimuths, 9 elevations, and 6 lighting conditions was collected (for a total of 194,400 individual images). The objects were 10 instances of 5 generic categories: four-legged animals, human figures, airplanes, trucks, and cars. Five instances of each category were used for training, and the other five for testing. Low-resolution grayscale images of the objects with various amounts of variability and surrounding clutter were used for training and testing. Nearest Neighbor methods, Support Vector Machines, and Convolutional Networks, operating on raw pixels or on PCA-derived features, were tested. Test error rates for unseen object instances placed on uniform backgrounds were around 13% for SVM and 7% for Convolutional Nets. On a segmentation/recognition task with highly cluttered images, SVM proved impractical, while Convolutional Nets yielded 14% error. A real-time version of the system was implemented that can detect and classify objects in natural scenes at around 10 frames per second.
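
As a concrete illustration of the baselines named in the abstract, the snippet below is a minimal sketch (in Python with scikit-learn, not the authors' original code) of a PCA + Nearest Neighbor and a PCA + SVM pipeline. The image size, the number of principal components, and the random placeholder arrays are illustrative assumptions standing in for the actual stereo image pairs, not values taken from the paper.

# Minimal sketch of PCA-derived features feeding a Nearest Neighbor classifier
# and a Gaussian-kernel SVM. All shapes and data below are placeholders.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_train, n_test, n_classes = 500, 200, 5
image_dim = 2 * 32 * 32  # a stereo pair of 32x32 grayscale images, flattened (assumed size)

# Random stand-ins for the training and test images and their category labels.
X_train = rng.random((n_train, image_dim))
y_train = rng.integers(0, n_classes, n_train)
X_test = rng.random((n_test, image_dim))
y_test = rng.integers(0, n_classes, n_test)

for name, model in [
    ("PCA + Nearest Neighbor", make_pipeline(PCA(n_components=100), KNeighborsClassifier(n_neighbors=1))),
    ("PCA + SVM (Gaussian kernel)", make_pipeline(PCA(n_components=100), SVC(kernel="rbf", gamma="scale"))),
]:
    model.fit(X_train, y_train)
    error = 1.0 - model.score(X_test, y_test)
    print(f"{name}: test error {error:.1%}")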

Yann LeCun, Léon Bottou and Jie HuangFu: Learning Methods for Generic Object Recognition with Invariance to Pose and Lighting, Proceedings of Computer Vision and Pattern Recognition (CVPR), IEEE, Washington, D.C., 2004.

cvpr-2004.djvu cvpr-2004.pdf cvpr-2004.ps.gz

@inproceedings{lecun-bottou-huangfu-2004,
  author = {{LeCun}, Yann and Bottou, L\'{e}on and HuangFu, Jie},
  title = {Learning Methods for Generic Object Recognition with Invariance to Pose and Lighting},
  year = {2004},
  booktitle = {Proceedings of Computer Vision and Pattern Recognition (CVPR)},
  publisher = {IEEE},
  address = {Washington, D.C.},
  url = {http://leon.bottou.org/papers/lecun-bottou-huangfu-2004},
}