20-year Anniversary Report: Juliet Lodge

 

Juliet Lodge: Towards an ethical and responsible biometric eco-system

Creating a biometrics eco-system to stimulate an ethical and responsible use of biometrics in all settings has been the hallmark of the Biometrics Institute’s work over the past 20 years.

Many of the challenges originally identified by the biometrics community remain. Then, as now, it is important to ensure transparent and privacy-respecting use in order to generate trust in the reliability, security and dependability of the technology, and of the private and public sector bodies using it, singly or in partnership, for both commercial and wider public purposes.

The focus has shifted over the years: biometrics are now deployed not only for domestic and commercial purposes and for physical border management at state boundaries, but also for accessing public and commercial services in the geopolitically borderless spaces of the digital world. The convenience gain is undeniable.

Playing catch-up

Managing physical borders using biometrics in passports was a first step that engaged the biometrics community in developing technical solutions to accelerate border controls and provide greater predictive certainty that the person presenting themselves at a border gate genuinely corresponded to the identity in the travel document. The attendant vulnerabilities, and the security technologies addressing them, evolved more swiftly than legislation.

It is still the case that applications of biometric technologies outpace legislative requirements and regulations. A guiding principle in using biometrics for diverse purposes has therefore been to advocate legitimate and ethical use, ensuring clear, privacy-respecting, proportionate, consensual, fair, secure, appropriate and responsible deployment.

This is encapsulated in the idea that just because something is possible with biometrics does not justify its use if the overall effect is disproportionate to the original goal. For example, biometric payments by children may be considered disproportionate when other, less intrusive means of paying are available to them.

Citizens or suspects

Concern persists over biometric applications treating children as ‘suspects’, linking biometric data to all manner of educational, medical and social information – replete with errors – and over the potential for abuse of a system intended to reassure all who use it that it is a credible and appropriate means of expediting otherwise slower, bureaucratic processes.

With phones unlocked by iris or fingerprint recognition, voice biometrics used for payments, and hand or fingerprint scans opening doors, biometrics seem to have become accepted by a wider public.

However, as the Biometrics Institute has signalled over the years, with greater reach comes greater responsibility for the outcomes. 

Beyond behavioural biometrics

In the early days of biometric enrolment and deployment, primarily for financial or physical border control management, the Biometrics Institute stressed the need for clarity over both the definition of what constitutes a biometric and the purpose of using biometrics. The need for privacy, and safeguards against function and mission creep, have since been entrenched in the EU’s GDPR, which has become a model of good practice.

But mission creep is widespread and instead of a biometric being defined as a digital representation of an element of a person’s physical characteristics (such as a fingerprint) it has become shorthand for everything they do.

Behavioural biometrics may be inferred from neurological data as well as from loose, social media-based information garnered anywhere, anytime.

As the EU now pushes ahead with realising a digital society, it is striking that once again many of the concerns raised by the Biometrics Institute’s members over the years have re-surfaced.

Biometrics for a trustable digital society

As Members of the European Parliament now stress, biometric tools must be used mindfully to sustain public trust. They are particularly exercised by biometric-enabled discrimination and mission creep – two issues that Biometrics Institute members stressed in early deliberations over the potentially privacy-intrusive impact on the integrity of citizens’ data.

But discrimination and differentiation lie at the heart of using biometrics for diverse purposes. The core question is now not simply whether biometrics should be used but whether the linkage of biometric data to other information about an individual is legitimate, proportionate, secure and beneficial to the individual and society.

In short, the ethical use of biometrics for strictly defined, specific purposes is crucial. Defining a biometric also remains problematic, as neurological and medical inferences drawn from biometrics may escape current guidelines and regulations.

Then as now, how access to biometrics may be skewed towards privileging certain groups over others, often inadvertently, is a public issue. It is raised in the context of building an inclusive and responsive digital society and ensuring that steps are in place to enable participation by the disadvantaged, disabled and marginalised.

Biometric bias

Whether biometrics should be required to unlock and link up information stored by commercial, financial and government bodies again requires reflection on the kind of biometric eco-system evolving during this decade, in which artificial intelligence derives decisions from scanning biometrics – and other information – to build new ‘pictures’ of individuals on which further decisions are taken, often without human intervention.

If the original algorithm is flawed, biased towards detecting certain biometrics, or based on fake, misleading or false information, citizens face even greater problems getting errors corrected. The onus on the creators of ever more precise and potentially intrusive biometrics – such as those allegedly predicting and recognising emotional states and thought patterns – to deliver reliable, dependable, secure and trustable systems has never been greater.

The reliability and accuracy of systems such as iBorderCtrl have recently been challenged in court, and the casual use of biometrics (or their scraping for unknown and unknowable purposes by unseen bots) shows how readily public trust in biometrics can be eroded by careless use or misrepresentation.

MEPs are rightly concerned about the potential consequences of sensitive identifying personal data (including biometric information stored in crisis situations, for instance) being misappropriated, compromised, re-spliced and sold. If biometric data collection is seen as enabling illegitimate surveillance, public trust in the applications and in both private and public authorities will suffer.

An ethical biometric eco-system fit-for-purpose

Biometric applications are neither value-free nor neutral in their impact. The challenge in refining the biometrics eco-system therefore lies in harnessing a deeper appreciation of the ways in which biometric applications benefit society. Working with policymakers and society in creating a digital society is therefore essential. The Biometrics Institute community has considered not just who provides a biometric but, crucially, who has access to it and for precisely what purpose. Meeting that challenge is as valid today as it was 20 years ago.

Juliet Lodge
Member of the Privacy Expert Group, Biometrics Institute
juliet@saher-uk.com
