Article

Average biased ReLU based CNN descriptor for improved face retrieval

Journal

MULTIMEDIA TOOLS AND APPLICATIONS
Volume 80, Issue 15, Pages 23181-23206

Publisher

SPRINGER
DOI: 10.1007/s11042-020-10269-x

Keywords

ReLU; AB-ReLU; VGGFace; Image retrieval; CNN

Funding

  1. IIIT Sri City, India through the Faculty Seed Research Grant
  2. NVIDIA Corporation

Abstract

This paper proposes using AB-ReLU to improve the discriminative ability of deep image representation models, achieving superior performance in face retrieval tasks.
Convolutional neural networks (CNNs) such as AlexNet, GoogleNet and VGGNet extract highly discriminative features for many computer vision problems. However, a CNN trained on one dataset may perform only reasonably well on another dataset of a similar type, where a hand-designed feature descriptor can outperform the same trained model. The Rectified Linear Unit (ReLU) layer discards all negative values in order to introduce non-linearity. In this paper, it is proposed that the discriminative ability of the deep image representation produced by a trained model can be improved by applying the Average Biased ReLU (AB-ReLU) at the last few layers. AB-ReLU improves discriminability in two ways: 1) it retains some of the discriminative negative information that ReLU discards, and 2) it suppresses irrelevant positive information that ReLU passes through. The VGGFace model, trained in MatConvNet over the VGG-Face dataset, is used as the feature descriptor for face retrieval on other face datasets. The proposed approach is evaluated in a retrieval framework over six challenging, unconstrained and robust face datasets (PubFig, LFW, PaSC, AR, FERET and ExtYale) as well as a large-scale face dataset (PolyUNIR). AB-ReLU outperforms ReLU when used with the pre-trained VGGFace model on these face datasets. The validation error obtained by training the network after replacing all ReLUs with AB-ReLUs is also favorable on each dataset. AB-ReLU even outperforms state-of-the-art activation functions such as Sigmoid, ReLU, Leaky ReLU and Flexible ReLU on all seven face datasets.
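
The following is a minimal NumPy sketch of the idea described in the abstract, not the authors' MatConvNet implementation: it assumes AB-ReLU replaces the fixed zero threshold of ReLU with a bias derived from the average of the incoming activations, so that above-average negative responses are retained and below-average positive responses are suppressed. The function names, the `alpha` scaling parameter and the per-volume mean are illustrative assumptions.

```python
import numpy as np

def relu(x):
    """Standard ReLU: every value below zero is discarded."""
    return np.maximum(x, 0.0)

def ab_relu(x, alpha=1.0):
    """Average Biased ReLU (illustrative sketch, not the paper's exact code).

    The threshold is shifted from 0 to a bias derived from the average of the
    input activations. If that average is negative, some negative (but
    above-average) responses survive; if it is positive, weak positive
    responses are dropped. `alpha` is a hypothetical scaling factor.
    """
    beta = alpha * x.mean()            # average bias computed from the input itself
    return np.maximum(x - beta, 0.0)   # ReLU applied to the biased activations

# Toy comparison of ReLU and AB-ReLU on a small activation vector.
if __name__ == "__main__":
    activations = np.array([-2.0, -0.5, 0.3, 1.5, 4.0])
    print("ReLU:   ", relu(activations))
    print("AB-ReLU:", ab_relu(activations))
```

In the retrieval setting described above, such a function would stand in for ReLU at the last few layers of the pre-trained VGGFace descriptor before computing feature distances between query and gallery faces.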
