BCS explains why abuses of facial biometric data mean we need to get serious about safeguards
The top concerns expressed by IT professionals in consultations BCS have carried out over the last 18 months highlight the severe risks of biometric data misuse. These are:
● Poor data governance that leaves companies unable to effectively monitor how data is used, who is using it, or where duplicates are stored, allowing unethical practices to go undetected.
● Lack of diversity in product development teams leading to hard-wired unconscious bias in new products or services that are data-dependent.
● Using incomplete data to incorrectly infer personal characteristics.
● Allowing data to be improperly shared within organisations.
● Improperly aggregating data from different sources to infer personal characteristics.
● Incorrectly cleaning data.
● Incorrectly restructuring data, resulting in the wrong data being associated with an individual.
● Incorrectly merging different data pipelines from third parties.
● Not conducting proper due diligence to ensure correct provenance of data through the supply chain (which may well be offshored and distributed across different national jurisdictions).
● Using data analysis methodologies that are invalid in a particular context.
● Applying poorly tested analytical models as part of decision-making processes (including, for example, inappropriate machine learning-based neural networks).
● Using invalid anonymisation techniques that do not provide enough protection against deanonymisation (a brief illustrative sketch follows this list).
● Storing data insecurely so that it is at risk of being misappropriated.
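To make the deanonymisation concern concrete, here is a minimal sketch, in Python, of why simply hashing an identifier is not true anonymisation. The identifiers and data below are invented for illustration; this is our example, not one drawn from the BCS consultations.

```python
# A hypothetical sketch of the deanonymisation risk named above.
# All identifiers here are invented; this is not any organisation's real data.
import hashlib

def naive_anonymise(identifier: str) -> str:
    """'Anonymise' a record by replacing the identifier with its SHA-256 hash."""
    return hashlib.sha256(identifier.encode("utf-8")).hexdigest()

# A dataset published with hashed National Insurance numbers in place of names:
published = naive_anonymise("QQ123456C")

# Because NI numbers follow a known, small format, an attacker can hash every
# plausible candidate and match it against the published value (a dictionary
# attack), re-identifying the individual. Hashing is pseudonymisation at best.
for candidate in ("QQ123456A", "QQ123456B", "QQ123456C"):
    if naive_anonymise(candidate) == published:
        print(f"Re-identified as {candidate}")
```

Because the space of valid inputs is small and structured, the "anonymised" value can be reversed by exhaustive guessing; genuine anonymisation requires techniques that remain robust even when the attacker knows the identifier format.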
Dr Mitchell says the feedback from these consultations has been quite clear: “Virtually every time we hear the same alarming worries about data governance practices. This directly links to worries about the current cavalier attitude to facial recognition technology. For instance, misappropriated facial biometric data could lead to opportunities for virtual doppelgängers, and poorly captured biometric data can lead to cases of mistaken identity that can have dire consequences that are hard to correct.
“Much of the concern has been focused on the immaturity of the technology. An even bigger concern is what your biometric data is used for, or rather misused for, once it’s been captured and added to a database.”
The concerns raised by the IT profession come after a series of recent revelations about the widespread use of facial recognition technology. These include the release of a report by Big Brother Watch, a civil liberties and privacy campaigning organisation, which says there is a facial recognition ‘epidemic’ across privately owned sites in the UK.
It says it has found major property developers, shopping centres, museums, conference centres and casinos using the technology. The Information Commissioner’s Office, the UK’s privacy watchdog, has also opened an investigation into the use of facial recognition cameras at Granary Square, a busy part of central London close to King’s Cross station.
Dr Mitchell said: “All of this should mean we treat facial recognition technology with extreme caution.
“If the police can’t get it to work properly, why should we assume that property developers, museums, or music festival organisers can make it work?”