arxiv:2003.03151

Demographic Bias in Presentation Attack Detection of Iris Recognition Systems

Published on Mar 6, 2020
Authors: Meiling Fang, Naser Damer, Florian Kirchbuchner, Arjan Kuijper

AI-generated summary

The study investigates demographic bias in iris presentation attack detection, finding that female users are less well protected than male users.

Abstract

With the widespread use of biometric systems, the problem of demographic bias has drawn increasing attention. Although many studies have addressed bias in biometric verification, no prior work analyzes bias in presentation attack detection (PAD) decisions. In this paper, we therefore investigate and analyze demographic bias in iris PAD algorithms. To enable a clear discussion, we adapt the notions of differential performance and differential outcome to the PAD problem. We study bias in iris PAD using three baselines (hand-crafted features, transfer learning, and training from scratch) on the NDCLD-2013 database. The experimental results indicate that female users are significantly less protected by the PAD than male users.
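The differential-outcome notion from the abstract can be made concrete with a small sketch. The paper's code is not published on this page, so the following is an assumption-laden illustration, not the authors' implementation: it uses the standard ISO/IEC 30107-3 PAD error rates (APCER and BPCER), hypothetical function names (`apcer_bpcer`, `differential_outcome`), and an assumed score convention where higher means more likely bona fide.

```python
import numpy as np

def apcer_bpcer(scores, labels, threshold):
    """PAD error rates at a fixed decision threshold (ISO/IEC 30107-3 style).

    Assumed convention: higher score = more likely bona fide;
    labels: 1 = bona fide presentation, 0 = presentation attack.
    APCER: proportion of attacks misclassified as bona fide.
    BPCER: proportion of bona fide samples misclassified as attacks.
    """
    scores = np.asarray(scores, dtype=float)
    labels = np.asarray(labels)
    apcer = float(np.mean(scores[labels == 0] >= threshold))
    bpcer = float(np.mean(scores[labels == 1] < threshold))
    return apcer, bpcer

def differential_outcome(scores, labels, groups, threshold):
    """Per-group error rates at one shared threshold.

    A gap in APCER/BPCER between demographic groups (e.g. "female"
    vs. "male") is a differential outcome in the paper's sense.
    """
    scores = np.asarray(scores, dtype=float)
    labels = np.asarray(labels)
    groups = np.asarray(groups)
    return {g: apcer_bpcer(scores[groups == g], labels[groups == g], threshold)
            for g in np.unique(groups)}

# Hypothetical usage with made-up scores:
scores = [0.9, 0.2, 0.8, 0.4, 0.7, 0.1]
labels = [1, 0, 1, 0, 1, 0]
groups = ["female", "female", "male", "male", "female", "male"]
print(differential_outcome(scores, labels, groups, threshold=0.5))
```

Comparing per-group error rates at one shared operating point captures the differential-outcome idea; differential performance would instead compare the groups' score distributions themselves, independent of any threshold.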
