Multi-view Deep Features for Robust Facial Kinship Verification

Automatic kinship verification from facial images is an emerging research topic in the machine learning community. In this paper, we propose an effective facial feature extraction model based on multi-view deep features. We use four pre-trained deep learning models, extracting eight feature layers (the FC6 and FC7 layers of each of the VGG-F, VGG-M, VGG-S and VGG-Face models), to train the proposed Multilinear Side-Information based Discriminant Analysis integrating Within-Class Covariance Normalization (MSIDA + WCCN) method. Furthermore, we show how metric learning methods that integrate WCCN improve the Simple Scoring Cosine similarity (SSC) method. Note that we used the SSC method in the RFIW'20 competition with the concatenation of the eight deep features. Integrating WCCN into the metric learning methods reduces the effect of intra-class variations introduced by the deep feature weights. We evaluate the proposed method on two kinship benchmarks, the KinFaceW-I and KinFaceW-II databases, using four parent-child relations (Father-Son, Father-Daughter, Mother-Son and Mother-Daughter). The proposed MSIDA + WCCN method improves on the SSC method by 12.80% and 14.65% on the KinFaceW-I and KinFaceW-II databases, respectively. The obtained results compare favorably with state-of-the-art methods, including those based on deep learning.
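To illustrate the two scoring ingredients mentioned in the abstract, the sketch below combines a WCCN-style within-class covariance normalization with cosine-similarity scoring. This is a minimal, generic sketch in NumPy, not the authors' implementation: the function names are hypothetical, the features are random stand-ins for the concatenated deep features, and MSIDA itself (the multilinear/tensor part) is not reproduced here.

```python
import numpy as np

def wccn_projection(features, labels, eps=1e-6):
    """Sketch of WCCN: return a projection that down-weights directions
    of high within-class (intra-class) variance."""
    d = features.shape[1]
    W = np.zeros((d, d))
    for c in np.unique(labels):
        Xc = features[labels == c]
        Xc = Xc - Xc.mean(axis=0)       # center each class
        W += Xc.T @ Xc                  # accumulate within-class scatter
    W /= len(features)
    W += eps * np.eye(d)                # regularize for numerical stability
    # Cholesky factor of the inverse within-class covariance
    return np.linalg.cholesky(np.linalg.inv(W))

def cosine_score(x, y):
    """Simple cosine-similarity score between two feature vectors."""
    return float(x @ y / (np.linalg.norm(x) * np.linalg.norm(y)))

# Toy data: random 8-D features standing in for deep features,
# with labels marking the (hypothetical) kin classes.
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 8))
y = np.repeat([0, 1], 20)

B = wccn_projection(X, y)
s = cosine_score(B.T @ X[0], B.T @ X[1])  # score a pair after WCCN
```

The design intuition, as described in the abstract, is that normalizing by the within-class covariance suppresses intra-class variation before the cosine score is taken, so the score reflects between-class (kin vs. non-kin) differences more strongly.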

Laiadi Oualid, Ouamane Abdelmalik, Benakcha Abdelhamid, Taleb-Ahmed Abdelmalik, Hadid Abdenour

Publication type:
A4 Article in conference proceedings

Place of publication:
2020 15th IEEE International Conference on Automatic Face and Gesture Recognition (FG 2020)

Keywords:
Kinship verification, tensor learning


Full citation:
O. Laiadi, A. Ouamane, A. Benakcha, A. Taleb-Ahmed and A. Hadid, “Multi-view Deep Features for Robust Facial Kinship Verification,” 2020 15th IEEE International Conference on Automatic Face and Gesture Recognition (FG 2020), Buenos Aires, Argentina, 2020, pp. 877-881, doi: 10.1109/FG47880.2020.00118
