Gender and Smile Classification Using Deep Convolutional Neural Networks

Kaipeng Zhang, Lianzhi Tan, Zhifeng Li, Yu Qiao; The IEEE Conference on Computer Vision and Pattern Recognition (CVPR) Workshops, 2016, pp. 34-38

Abstract


Facial gender and smile classification in unconstrained environments is challenging due to the large and diverse variations of face images. In this paper, we propose a deep model composed of GNet and SNet for these two tasks. We leverage multi-task learning and a general-to-specific fine-tuning scheme to enhance the performance of our model. Our strategies exploit the inherent correlation between face identity, smile, gender, and other face attributes to relieve the problem of over-fitting on small training sets and to improve classification performance. We also propose a task-aware face cropping scheme to extract attribute-specific regions. The experimental results on the ChaLearn'16 FotW dataset for gender and smile classification demonstrate the effectiveness of our proposed methods.
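The multi-task idea in the abstract — a shared feature representation feeding separate gender and smile classification heads, trained with a combined loss — can be sketched in a few lines. This is a minimal illustration, not the paper's architecture: the feature dimension, head shapes, random inputs, and equal loss weighting are all assumptions for the sake of the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    # Numerically stable row-wise softmax.
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def cross_entropy(probs, labels):
    # Mean negative log-likelihood of the true class per sample.
    return -np.log(probs[np.arange(len(labels)), labels] + 1e-12).mean()

# Shared "backbone" features (stand-in for the conv features a deep
# network such as GNet/SNet would produce; sizes are illustrative).
batch, feat_dim = 8, 16
features = rng.normal(size=(batch, feat_dim))

# Two task-specific linear heads: gender (2-way) and smile (2-way).
W_gender = rng.normal(scale=0.1, size=(feat_dim, 2))
W_smile = rng.normal(scale=0.1, size=(feat_dim, 2))

# Random stand-in labels for the two tasks on the same face batch.
gender_labels = rng.integers(0, 2, size=batch)
smile_labels = rng.integers(0, 2, size=batch)

gender_loss = cross_entropy(softmax(features @ W_gender), gender_labels)
smile_loss = cross_entropy(softmax(features @ W_smile), smile_labels)

# Multi-task objective: a weighted sum of per-task losses, so gradients
# from both tasks shape the shared features (equal weights assumed here).
total_loss = 0.5 * gender_loss + 0.5 * smile_loss
```

Because both heads read the same `features`, gradients from the smile task regularize the representation used for gender (and vice versa), which is the mechanism the abstract credits for reducing over-fitting on small training sets.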

Related Material


[bibtex]
@InProceedings{Zhang_2016_CVPR_Workshops,
author = {Zhang, Kaipeng and Tan, Lianzhi and Li, Zhifeng and Qiao, Yu},
title = {Gender and Smile Classification Using Deep Convolutional Neural Networks},
booktitle = {The IEEE Conference on Computer Vision and Pattern Recognition (CVPR) Workshops},
month = {June},
year = {2016}
}