Press release: Masks must change how face biometrics are developed and applied

The widespread use of masks must mark a step change in how identity is managed and the way facial recognition biometrics are developed and applied, says the Biometrics Institute.

The latest report from the National Institute of Standards and Technology (NIST) on the accuracy of face recognition algorithms is a reminder that biometric solutions need careful evaluation and risk management, it says. And new biometric applications, such as contactless systems or those using only the periocular – or eye – region for recognition, must be carefully assessed.

In its first report published in July, NIST tested pre-pandemic algorithms that had already been submitted. It found that even the best of the 89 commercial facial recognition algorithms tested had error rates of between 5% and 50% when matching photos with digitally applied face masks against photos of the same person without a mask.

This second update builds on that report with tests on another 65 face recognition algorithms provided to NIST after mid-March this year.

With knowledge of this impairment, the Biometrics Institute says face biometrics deployed where masks are worn must be carefully risk-managed to maintain safety, public confidence and security.

The international members association is now calling on the biometrics community to ensure that new good practices are implemented with agility and existing guidelines are diligently followed.  

It says new processes to ensure safety are now paramount, for example the implementation of areas where people can temporarily remove their masks while using biometric systems.

Isabelle Moeller, chief executive of the Biometrics Institute says, ‘We welcome this research from NIST which enables our members to mitigate these issues, grow public trust and continue to keep people safe. Further research and testing may reveal that using additional sensor data, like high resolution, 3D or infrared can improve accuracy. However, a theme common to all NIST’s tests on face recognition is that each algorithm performs differently. Now more than ever, it’s vital that anyone using biometrics makes a point of understanding the limitations of their individual algorithm and thoroughly tests its performance both in their own environment and through an independent laboratory. And as new systems are developed, they must be independently tested before being presented to the public.’

The Biometrics Institute has recently released its ground-breaking new tool, the Good Practice Framework. This high-level, systematic pathway helps anyone planning to introduce a biometric system or develop an existing application to formulate sound policies and processes before they apply the technology.

The Biometrics Institute strongly recommends that anyone operating in biometrics read this latest report from NIST to ascertain how its findings affect their applications.


NOTES TO EDITORS:

The Biometrics Institute is the independent and impartial international membership organisation for biometric users and other interested parties. It was established in 2001 to promote the responsible use of biometrics and has offices in London and Sydney.

With more than a thousand members from 240 membership organisations spread across 30 countries, it represents a global and diverse multi-stakeholder community. This includes banks, airlines, government agencies, biometric experts, privacy experts, suppliers and academics.

The Biometrics Institute connects the global biometrics community. It shares knowledge with its members and key stakeholders and, most importantly, develops good practices and thought leadership for the responsible and ethical use of biometrics.

For more information, please email Claire Fox Baron: clairefb@biometricsinstitute.org

Lead the debate with us on the responsible use of biometrics