Efficient feature selection for linear discriminant analysis and its application to face recognition

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

11 Citations (Scopus)

Abstract

Feature selection is an important issue in pattern recognition. In face recognition, a common state-of-the-art approach first applies a feature selection method (e.g., AdaBoost) to select the most discriminative features and then applies a subspace learning method (e.g., LDA) to learn a discriminant subspace for classification. However, in these methods the objectives of feature selection and subspace learning are not fully consistent, so their combination is not optimal. In this paper, we propose a novel and efficient feature selection method designed specifically for linear discriminant analysis (LDA). We use the Fisher criterion to select the most discriminative and appropriate features, so that the objectives of feature selection and classifier learning are consistent (both follow the Fisher criterion) and face recognition performance is expected to improve. Experiments on the FRGC v2.0 face database validate the efficacy of the proposed method.
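The pipeline the abstract describes — scoring individual features by the Fisher criterion (between-class variance over within-class variance), keeping the top-scoring ones, and then learning an LDA subspace on the retained features — can be sketched as follows. This is a minimal illustration of the general idea, not the authors' implementation; the function names and the choice of scikit-learn's `LinearDiscriminantAnalysis` are assumptions for the sake of a runnable example.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def fisher_scores(X, y):
    """Per-feature Fisher criterion: between-class variance / within-class variance."""
    classes = np.unique(y)
    overall_mean = X.mean(axis=0)
    between = np.zeros(X.shape[1])
    within = np.zeros(X.shape[1])
    for c in classes:
        Xc = X[y == c]
        mu_c = Xc.mean(axis=0)
        between += len(Xc) * (mu_c - overall_mean) ** 2
        within += ((Xc - mu_c) ** 2).sum(axis=0)
    return between / (within + 1e-12)  # small epsilon avoids division by zero

def select_and_fit(X, y, k):
    """Keep the k features with the highest Fisher score, then fit LDA on them."""
    idx = np.argsort(fisher_scores(X, y))[::-1][:k]
    lda = LinearDiscriminantAnalysis().fit(X[:, idx], y)
    return idx, lda
```

Because both the selection score and the LDA objective maximize between-class scatter relative to within-class scatter, the two stages optimize the same criterion — which is the consistency property the paper argues for.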

Original language: English
Title of host publication: ICPR 2012 - 21st International Conference on Pattern Recognition
Pages: 1136-1139
Number of pages: 4
Publication status: Published - 2012
Externally published: Yes
Event: 21st International Conference on Pattern Recognition, ICPR 2012 - Tsukuba, Japan
Duration: Nov 11 2012 - Nov 15 2012

Publication series

Name: Proceedings - International Conference on Pattern Recognition
ISSN (Print): 1051-4651

Conference

Conference: 21st International Conference on Pattern Recognition, ICPR 2012
Country/Territory: Japan
City: Tsukuba
Period: 11/11/12 - 11/15/12

ASJC Scopus subject areas

  • Computer Vision and Pattern Recognition
