Cross-sensor periocular biometrics in a global pandemic: Comparative benchmark and novel multialgorithmic approach
Halmstad University, School of Information Technology. ORCID iD: 0000-0002-1400-346X
Norwegian University of Science and Technology, Gjøvik, Norway. ORCID iD: 0000-0002-9489-5161
Norwegian University of Science and Technology, Gjøvik, Norway. ORCID iD: 0000-0001-8615-098X
Norwegian University of Science and Technology, Gjøvik, Norway. ORCID iD: 0000-0002-9159-2923
2022 (English). In: Information Fusion, ISSN 1566-2535, E-ISSN 1872-6305, Vol. 83-84, p. 110-130. Article in journal (Refereed). Published.
Abstract [en]

The massive availability of cameras and personal devices results in wide variability between imaging conditions, producing large intra-class variations and a significant performance drop when images from heterogeneous environments are compared for person recognition. However, as biometric solutions are extensively deployed, it will be common to replace acquisition hardware as it is damaged or newer designs appear, or to exchange information between agencies or applications operating in different environments. Furthermore, variations in imaging spectral bands can also occur. For example, face images are typically acquired in the visible (VIS) spectrum, while iris images are usually captured in the near-infrared (NIR) spectrum. Cross-spectrum comparison may nevertheless be needed if, for example, a face image obtained from a surveillance camera must be compared against a legacy database of iris imagery. Here, we propose a multialgorithmic approach to cope with periocular images captured with different sensors. With face masks on the front line of the fight against the COVID-19 pandemic, periocular recognition is regaining popularity, since the periocular area is the only region of the face that remains visible. As a solution to the mentioned cross-sensor issues, we integrate different biometric comparators using a score fusion scheme based on linear logistic regression. This approach is trained to improve the discriminating ability and, at the same time, to encourage fused scores to be represented as log-likelihood ratios. This allows easy interpretation of output scores and the use of Bayes thresholds for optimal decision-making, since scores from different comparators are mapped to the same probabilistic range. We evaluate our approach in the context of the 1st Cross-Spectral Iris/Periocular Competition, whose aim was to compare person recognition approaches when periocular data from visible and near-infrared images are matched.
The proposed fusion approach achieves reductions in the error rates of up to 30%–40% in cross-spectral NIR–VIS comparisons with respect to the best individual system, leading to an EER of 0.2% and an FRR of just 0.47% at FAR = 0.01%. It also represents the best overall approach of the mentioned competition. Experiments are also reported with a database of VIS images from two different smartphones, achieving even larger relative improvements and similar performance figures. We also discuss the proposed approach in terms of template size and computation time, with the most computationally heavy comparator playing an important role in the results. Lastly, the proposed method is shown to outperform other popular fusion approaches in multibiometrics, such as the average of scores, Support Vector Machines, and Random Forest. © 2022 The Authors
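The abstract describes fusing the scores of several biometric comparators with linear logistic regression, so that the fused score behaves as a log-likelihood ratio (LLR) and a fixed Bayes threshold can be used. The sketch below illustrates this general technique on synthetic data; the two hypothetical comparators, the score distributions, and the scikit-learn fit are illustrative assumptions, not the authors' actual implementation:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy genuine/impostor score pairs from two hypothetical comparators.
# Genuine pairs score higher on average than impostor pairs.
rng = np.random.default_rng(0)
gen = np.column_stack([rng.normal(2.0, 1.0, 200), rng.normal(1.5, 1.0, 200)])
imp = np.column_stack([rng.normal(0.0, 1.0, 200), rng.normal(0.0, 1.0, 200)])
X = np.vstack([gen, imp])
y = np.concatenate([np.ones(200), np.zeros(200)])  # 1 = genuine, 0 = impostor

# Linear logistic regression fusion: the linear decision function
# w0 + w1*s1 + w2*s2 is trained to approximate a log-likelihood ratio,
# so with equal priors and error costs the Bayes threshold is 0.
clf = LogisticRegression().fit(X, y)
fused_llr = clf.decision_function(X)

# Accept a comparison as genuine when the fused LLR exceeds the threshold.
accept = fused_llr > 0.0
```

Because all comparators are mapped into the same probabilistic (LLR) range, the decision threshold no longer depends on the scale of any individual comparator's scores.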

Place, publisher, year, edition, pages
Amsterdam: Elsevier, 2022. Vol. 83-84, p. 110-130
Keywords [en]
Cross-sensor, Cross-spectral, Linear logistic regression, Multibiometrics fusion, Ocular biometrics, Periocular recognition, Sensor interoperability
National Category
Signal Processing
Identifiers
URN: urn:nbn:se:hh:diva-46748
DOI: 10.1016/j.inffus.2022.03.008
ISI: 000794868000003
Scopus ID: 2-s2.0-85127807945
OAI: oai:DiVA.org:hh-46748
DiVA, id: diva2:1655781
Funder
Swedish Research Council, 2016-03497
Knowledge Foundation
Vinnova, 2018-00472
Note

Funding: Part of this work was done while F. A.-F. was a visiting researcher at the Norwegian University of Science and Technology in Gjøvik (Norway), funded by EU COST Action IC1106. Authors from HH thank the Swedish Research Council (project 2016-03497), the Swedish Knowledge Foundation (CAISR and SIDUS-AIR programs), and the Swedish Innovation Agency VINNOVA (project 2018-00472) for funding this research. Authors from UAM are funded by the projects PRIMA (MSCA-ITN-2019-860315), TRESPASS-ETN (MSCA-ITN-2019-860813), and BIBECA (RTI2018-101248-B-I00 MINECO).

Available from: 2022-05-03. Created: 2022-05-03. Last updated: 2023-08-21. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text · Scopus

Authority records

Alonso-Fernandez, Fernando; Bigun, Josef

Search in DiVA

By author/editor
Alonso-Fernandez, Fernando; Raja, Kiran B.; Raghavendra, R.; Busch, Christoph; Bigun, Josef; Vera-Rodriguez, Ruben
By organisation
School of Information Technology
In the same journal
Information Fusion
Signal Processing
