Facing up to the challenges of technology
From unlocking phones, to tagging photos on Facebook, to using smart gates at border checks—automated facial recognition technology (FRT) is making our lives more streamlined and seamless.
However, Associate Professor Nessa Lynch says the increasingly ubiquitous technology also presents ethical risks and challenges, and requires thoughtful cross-sector collaboration to ensure we’re moving in the right direction.
Nessa is the principal investigator for a research project called ‘Facial Recognition Technology in New Zealand: Developing a Legal and Ethical Framework’, which has received a grant of more than $50,000 from the New Zealand Law Foundation. The project began in June this year and aims to examine the legal and ethical issues around the growing use of FRT in New Zealand.
FRT identifies an individual by analysing geometric facial features and comparing a biometric template generated from the captured image with one already stored. Many governments around the world already employ this technology to monitor their citizens, including Russia, where plans have been mooted to equip police with FRT glasses for identifying ‘people of interest’ in a crowd, and China, where FRT is used to scan CCTV footage as part of a ‘social credit system’ that assigns citizens scores according to their social behaviour.
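To make the matching step concrete, here is a minimal, purely illustrative sketch in Python. It assumes a face image has already been reduced to a numeric template (an ‘embedding’); the numbers, names, and similarity threshold below are invented for illustration and are not from any real FRT system.

```python
import math

def cosine_similarity(a, b):
    """Similarity between two templates; values near 1.0 mean a close match."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def match(probe, database, threshold=0.9):
    """Return the best-matching identity above the threshold, or None.

    The threshold is the key policy lever: lowering it catches more true
    matches but also produces more false positives.
    """
    best_name, best_score = None, threshold
    for name, template in database.items():
        score = cosine_similarity(probe, template)
        if score > best_score:
            best_name, best_score = name, score
    return best_name

# Hypothetical stored templates (e.g. from a watchlist or licence database).
stored = {
    "person_a": [0.9, 0.1, 0.3],
    "person_b": [0.2, 0.8, 0.5],
}

# Template computed from a newly captured image.
probe = [0.88, 0.12, 0.31]
print(match(probe, stored))  # close to person_a's template
```

The sketch shows why the debates described below matter: a match is never exact, only a similarity score above a chosen threshold, so every deployment embeds a trade-off between missed identifications and false accusations.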
As Orwellian as these scenarios may sound, Nessa says the technology itself isn’t inherently sinister—but we need to make sure it is properly regulated as it becomes increasingly commonplace in New Zealand. Nessa notes that there is currently no regulation in New Zealand that directly names or specifies FRT. “Obviously the technology has immense value in promoting societal interests such as efficiency and security, but it also represents a threat to some of our individual interests, particularly privacy and freedom from discrimination,” she says.
Nessa’s project is certainly timely—in September this year in Wales, human rights activist Ed Bridges lost the world’s first legal challenge over police use of FRT. Bridges argued that the use of FRT by the South Wales Police caused him distress and violated his privacy and data protection rights by processing images taken of him in public.
However, the case was ultimately dismissed by the High Court, which ruled that the police force’s use of the technology was lawful.
The project team comprises Nessa, Dr Marcin Betkier from the Faculty of Law, Professor Liz Campbell from Monash University, and Dr Joe Purshouse from the University of East Anglia. Nessa explains the project has several key aims and research questions. First, the team will undertake a mapping exercise to see how the technology is currently being used or may be used in New Zealand, informed by overseas experiences with the technology. “We know it’s increasingly being used here, but it’s such an emerging technology that we don’t really have a sense of how New Zealand is using it yet,” she says.
The group will also examine the rights and interests that may be affected by the use of FRT. The final part of the project will assess what regulation New Zealand has at the moment, and propose what potential forms of future regulation might look like. The group aims to produce a report in mid-2020.
In October, Nessa hosted international experts at the Law School for a series of events including a public panel and a stakeholder workshop. Nessa and the project team were joined by Rachel Dixon, Privacy and Data Deputy Commissioner for Victoria, Australia, and Claire Garvie from the Center on Privacy and Technology at Georgetown Law in Washington D.C.
One of the issues raised during the panel discussion was the lack of transparency in some jurisdictions around how FRT is used. Claire Garvie said that in the United States, where FRT is a very common investigative tool for law enforcement, driver’s licence databases provide police with an enormous library of facial images. “Over half of all American adults are enrolled in a database that’s accessible for criminal investigations thanks to getting a driver’s licence—this is not something that most Americans are aware of,” she said.
“There are no rules when it comes to the use of facial recognition by public agencies in the United States. With very few exceptions there are no laws that govern the use of this technology. As a result it has been implemented largely without transparency to the public, and without rules around auditing or public reporting.”
FRT can also have a ‘chilling effect’ on public assemblies, freedom of expression, and the general use of public space by certain communities and demographics—as Claire noted, FRT has already been used in Baltimore during a protest against police treatment of people in custody.
Another issue raised was inaccuracy and the potential for FRT to produce false positives, and Rachel Dixon spoke about the distorting effect this can have. “It’s very hard to shake somebody’s perception that a match is wrong after the machine has said it’s right. There’s a lot of research to show that the first thing people are told is the thing they believe,” she said.
“Like DNA matching, it’s not a perfect technology—it makes mistakes,” says Nessa. She notes a case where a Dunedin supermarket used FRT to run CCTV footage against a list of shoplifters and misidentified someone.
“There’s also the potential for it to be used disproportionately against different groups.”
As the project evolves, Nessa and her team will work with sector stakeholders including the New Zealand Police, Immigration, Customs and security agencies to identify the key threats and opportunities that may arise from the use of FRT in New Zealand. To sum up the journey ahead, Nessa refers to a quote from the Bridges decision in the Welsh High Court: ‘The algorithms of the law must keep pace with new and emerging technologies.’