Press release: Response to NIST’s report on bias

19 December 2019 – for immediate release

The Biometrics Institute welcomes the latest research from the National Institute of Standards and Technology (NIST), which offers the biometric community further insights into bias – otherwise known as demographic differentials.

Ultimately, biometrics is an extremely accurate but probabilistic technology, incredibly useful for searching large datasets quickly in ways that humans could not achieve. It is constantly being improved through testing, including by independent organisations such as NIST. According to NIST, facial recognition technology today is 20 times more accurate than it was just a few years ago.

The issue of bias in biometric and AI systems has been a significant focus of public attention recently. Part 3 of NIST’s Face Recognition Vendor Test (FRVT) demonstrates that, for some algorithms, there can be situations where bias can arise.

The Biometrics Institute reinforces to its members the importance of knowing the algorithm they are working with. Biometrics Institute members, acting responsibly and ethically, should work with a good understanding of the strengths and weaknesses of the underlying technology. They should also have procedural safeguards and effective oversight in place to govern its application to protect human rights and privacy. In addition, they should consider independent testing of their algorithm.

“Biometric technology can be an effective tool to assist in identification and verification in an array of use cases. These range from the convenience of using your face to unlock your phone, to getting through passport control more quickly, to the reassurance that a face can be found in a crowd far faster with the assistance of technology than by relying on a human alone. However, when we think of the word bias, we tend to consider it a premeditated, closed-minded and prejudicial human trait. It’s important to remember that technology cannot behave in this way. So-called bias in biometric systems may exist because the data provided to train the system is not sufficiently diverse. That is why in cases including law enforcement and counter-terrorism the human in the loop – to verify the algorithm’s findings – is often a critical aspect of using the technology.”

Isabelle Moeller, Chief Executive, Biometrics Institute

The Biometrics Institute recognises that, as with any technology, the convenience of its use comes with challenges. It takes human rights, privacy, spoofing, morphing and bias seriously and works diligently with its members, expert groups and the wider biometrics community to provide new guidance, and update existing guidance, to mitigate the risks. It provides regular events and training courses around the world to share and grow knowledge.

Patrick Grother – one of the authors of this report – presents regular workshops on bias for the Biometrics Institute. He will be speaking on bias in biometrics and demographic differentials at the institute’s US Conference on 24 March 2020 in Washington DC.

ENDS


Further reading:

The UN Compendium for the Responsible Use of Biometrics in Counter-Terrorism, co-authored by the Biometrics Institute, which includes the section:

Ethical Principles for Biometrics

Notes to editors:

The Biometrics Institute’s US Conference is on 24-25 March 2020 in Washington DC. Confirmed speakers include:

  • Patrick Grother, Biometric Standards and Testing Lead, National Institute of Standards and Technology (NIST)
  • John Howard, Data Scientist, The Maryland Test Facility
  • Frank Torres, Senior Policy Director, Microsoft
  • Clare Garvie, Senior Associate, Center on Privacy and Technology, Georgetown University
  • Jay Stanley, Senior Policy Analyst, Speech, Privacy and Technology Program, American Civil Liberties Union (ACLU)
  • Arun Vemury, Director, Biometric and Identity Technology Center, US Department of Homeland Security


The Biometrics Institute is the independent and impartial international membership organisation for biometric users and other interested parties. It was established in 2001 to promote the responsible use of biometrics and has offices in London and Sydney.

With more than a thousand members from 240 membership organisations spread across 30 countries, it represents a global and diverse multi-stakeholder community. This includes banks, airlines, government agencies, biometric experts, privacy experts, suppliers and academics.

The Biometrics Institute connects the global biometrics community. It shares knowledge with its members and key stakeholders and, most importantly, develops good practices and thought leadership for the responsible and ethical use of biometrics.

For more information, please email Claire Fox Baron: clairefb@biometricsinstitute.org

Lead the debate with us on the responsible use of biometrics