NZ Police finally has facial recognition policy - but is it strict enough?

10:04 am today
Facial recognition technology. Photo: 123RF

  • First-ever police policy on facial recognition strict in some ways, loose in others
  • A ban on use on live footage remains, reflecting caution
  • But a lack of external oversight runs counter to Europe's world-first AI law

The police's first-ever policy on facial recognition bans its use, for now, on live camera footage.

While it is stricter on that front than Europe's new law, it is looser in other ways.

The police already use facial recognition in at least five different ways, but their lack of policy around it has been evident since a misguided trial of Clearview AI back in 2020.

The new six-page policy stated that its use must be approved and monitored, and deployed only on lawfully gathered images.

It said there must be a significant delay between gathering footage and analysing it, as the risks of live facial recognition "outweigh the potential benefits".

The EU's new AI law allows some use on live footage within strict limits, though the EU pulled back in May from curtailing its use far more.

The EU also imposes some external oversight, which is missing from New Zealand's new policy; here, the police audit themselves.


Globally, police use of facial recognition is spreading, often provoking controversy - such as in the UK, where police are boosting its deployment to curb far-right riots this month, over the protests of two dozen human rights groups.

While Europe has moved to legislate police use - and countries including France go further, with a complete ban - New Zealand is taking a different path: "Self-regulation through policy and practice guidelines," as Nessa Lynch - who led research into New Zealand police use of the technology several years ago - describes it.

The six-page policy said the use of facial recognition technology (FRT) would be "approved, controlled, monitored, and governed", with internal auditing where possible, and deployed only on images police have lawful access to.

"This policy ensures that appropriate safeguards are in place for Police's use of FRT and the storage of personal information, and that use of FRT is lawful, proportionate, and appropriate in a New Zealand policing context."

Lawful access is key: Police hold thousands and thousands of unlawfully gathered images - mostly of young Māori people - which they are trying to find a way to delete, on the orders of the Privacy Commissioner.

Also, police have access to thousands of privately owned CCTV cameras, many of them linked into networks that have facial recognition capability.

"Police does not use the live FRT capabilities of third-party systems," said the policy.

The policy comes three years after police set up an independent expert panel on emerging tech and began publishing a list of the various technologies they use, in an attempt to be more transparent. The list is not exhaustive, leaving off, for instance, the tech police use to scrape social media.

The new policy lists five ways police already use FRT, such as in investigations for retrospective matching against the several million images of people they have in databases; in missing person inquiries; to help identify dead people; and in registering firearms owners.

They use the BriefCam system, with facial recognition, to speed up reviewing hours of video. French police are in hot water over their use of BriefCam, a use that is unlawful under that country's ban.

A second FRT-capable tool used locally, from Israeli firm Cellebrite, can break mobile phone encryption. Social services in Australia were criticised last year for using it against welfare fraudsters.

Lynch and other researchers have found New Zealand police were comparatively cautious in rolling out facial recognition, especially compared to US law enforcement, and the new policy reflects this in its ban on live use.

"Police will not make decisions regarding the implementation of live FRT until the impacts from security, privacy, legal, and ethical perspectives are fully understood, and it has engaged with communities and understood their views," it said.

However, the EU is much more explicit, moving to ban some types of FRT among the biometric technologies it deems an "unacceptable risk" and a threat to people.

Its blacklist includes biometric identification and categorisation of people, and most forms of live FRT.

The EU's world-first AI Act provides an 'out' allowing real-time use for serious crimes.

It is stricter than New Zealand over delayed use, signalling that sign-off from a judge or other external authority would often be needed.

RNZ asked police for comment on the lack of external oversight in the new policy, and on its failure to set a "serious crimes" threshold for FRT use.

The local policy said proportionate use would need to account for human rights and privacy.

The tech - police use several systems - must have "sufficiently high accuracy and ... not operate with an unacceptable level of bias or discrimination".

Images must be retained, stored and destroyed in line with other police rules on image handling.

Images can often be kept for months, and FRT involves private companies' tech systems that sometimes store the data in cloud-computing centres in Australia.

The policy does not mention Māori or Te Tiriti at all. RNZ has asked police why not. Māori lawyers have said Māori consider biometrics a taonga.

The policy has been released while the Office of the Privacy Commissioner is still working on a code for biometrics (your face is one biometric, as are your fingerprints, and even your gait).

Scotland in 2021 became the first country to implement a legally binding code of practice for biometrics in policing.

In new research into controls, published in May, Lynch said: "There are many examples of good practice in terms of robust guidelines or oversight by independent observers and reviewers, but there is the ever-present risk of internal policy settings changing due to changes in leadership or attitudes."

Police said: "As part of the development of the policy, it was shared with the Understanding Policing Delivery independent panel. We sought input from our internal Māori, Pacific, and Ethnic Services group, as well as Māori members of the expert panel on emergent technologies.

"We have also kept the Office of the Privacy Commissioner informed throughout the process and are in regular contact with them on. There is broader work in progress around how police treat Māori data, which is also an ongoing conversation across many government agencies."
