(London) Simon Mackenzie, a security guard at a QD store in suburban London, was out of breath. He had just chased three thieves who had made off with several packets of laundry detergent. While waiting for the police, he went into the back room to do an important task: capture the thieves’ faces.

On an old computer, he pulled up the security camera footage and took an enlarged screenshot of each thief’s face. Then he uploaded the photos to Facewatch, a facial recognition service that QD Stores uses to identify shoplifters. The next time those thieves entered a nearby store that uses Facewatch, the staff would receive an alert.

“It’s like being told, ‘The person you caught last week just came back,’” Mr. Mackenzie says.

Police use of facial recognition technology is well documented, but what the private sector does with it has received less attention. As the technology improves and its cost falls, it is intruding more and more into people’s lives. Facial recognition is no longer the preserve of the state: it is increasingly used to identify shoplifters, difficult customers, and even opposing parties in litigation.

Facewatch, a British company, maintains a database used across the country by retailers plagued by petty crime. For £250 a month (about C$415), Facewatch runs a watchlist shared by stores in the same neighborhood. When Facewatch spots a face on file, a text alert notifies the store: employees can keep an eye on the person or ask them to leave.

Mr. Mackenzie adds one or two new faces a week, he says, mostly people who steal inexpensive goods: diapers, food, pet supplies and the like. He sympathizes with their economic hardship, but says theft had become so rampant that facial recognition was necessary. At least once a day, Facewatch alerts him that someone on the watchlist has just entered the store.

Facial recognition is spreading as the West struggles to regulate advances in artificial intelligence. The European Union is working on rules that would ban many uses of facial recognition. But in New York, Mayor Eric Adams is encouraging retailers to try it to curb crime. MSG Entertainment, owner of Madison Square Garden and Radio City Music Hall, has used facial recognition to deny entry to lawyers representing its adversaries in court.

Among democracies, the UK is the leader in large-scale facial recognition, a use sanctioned by courts and regulators. In London and Cardiff, police are piloting the technology to identify wanted criminals as they walk down the street. In May, it scanned the crowds at the coronation of King Charles III.

But its use by retailers has been denounced as disproportionate for minor offenses. Targeted people have little way of knowing they are on file, and little recourse to contest it. In a legal complaint filed in 2022, the advocacy group Big Brother Watch called the system “Orwellian in the extreme.”

Facewatch was founded in 2010 by Simon Gordon, owner of a swanky central London bar housed in a 19th-century wine cellar… and popular with pickpockets.

At the time, Mr. Gordon commissioned programmers to write a small piece of software that would send his surveillance videos to the police online. He hoped the police would save time, complaints would be handled more quickly, and arrests would increase.

The idea generated little interest, but Mr. Gordon remained passionate about the problem. Fascinated by the progress of facial recognition, he hit upon the idea of a database that retailers could feed and share. Something like the photos of shoplifters posted near the cash registers, but distributed widely and able to identify thieves in real time.

By 2018, Mr. Gordon judged the technology ready. So was his sales pitch.

He launched Facewatch, which licenses facial recognition software from RealNetworks and Amazon. The product is now used in nearly 400 stores in the UK. When a person walks in, Facewatch captures biometric data from their face and compares it against a database built from millions of photos and videos.
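In outline, a system of this kind reduces each face to a numeric embedding and compares new embeddings against those on a watchlist, raising an alert when the similarity crosses a threshold. The sketch below illustrates that general approach only; it is not Facewatch’s actual implementation, and the threshold, embedding size, and all names are hypothetical.

```python
import numpy as np

# Hypothetical cutoff; real systems tune this against false-match rates.
SIMILARITY_THRESHOLD = 0.92

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings; 1.0 means identical direction."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_watchlist(embedding, watchlist):
    """Return the ID of the best watchlist match above the threshold, else None."""
    best_id, best_score = None, SIMILARITY_THRESHOLD
    for entry_id, stored in watchlist.items():
        score = cosine_similarity(embedding, stored)
        if score > best_score:
            best_id, best_score = entry_id, score
    return best_id

# In a real deployment the embeddings would come from a face-recognition
# model; here random vectors stand in for them.
watchlist = {"entry-042": np.random.rand(128)}
visitor = np.random.rand(128)
match = match_watchlist(visitor, watchlist)
if match is not None:
    print(f"ALERT: watchlist entry {match} detected at entrance camera")
```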

The list of people on file with Facewatch is constantly growing as stores upload photos of shoplifters and problematic customers. The records are deleted after one year.

In October, a woman buying milk at a supermarket in Bristol, England, was stopped by an employee and ordered to leave. Facewatch had flagged her as a shoplifter, she was told.

The woman, who asked The New York Times to withhold her name to protect her privacy, said there had been a mistake. When she contacted Facewatch a few days later, the company apologized and confirmed it was a case of mistaken identity. Her account was corroborated by documents provided by her lawyer and by Facewatch.

After the woman threatened to sue, Facewatch dug into its records and discovered that she had been put on file after an incident ten months earlier involving £20 worth of merchandise. The system had “worked perfectly,” Facewatch concluded.

While the technology correctly identified the woman, the process left little room for human discretion. Neither Facewatch nor the store where the incident occurred had contacted her to tell her she was on file or to ask her side of the story.

The woman says she has no memory of the incident and has never shoplifted. Perhaps, she ventures, she walked out without realizing that her debit card payment had failed at a self-checkout.

Madeleine Stone, legal and policy officer at Big Brother Watch, denounced the situation.

Mr. Gordon declined to comment on the incident in Bristol.

In general, “errors are rare, but they do happen,” he says. “If one occurs, we acknowledge our mistake, we apologize, we delete all relevant data to prevent it from happening again, and we offer proportionate compensation.”

Civil liberties groups have raised concerns about Facewatch, arguing that its use to prevent petty crime may be illegal: UK privacy law requires that biometric technologies serve a “substantial public interest.”

The UK’s Information Commissioner’s Office, the privacy regulator, investigated Facewatch for a year. In March, it concluded that Facewatch’s system was legal, but only after pushing the company to change its practices.

Stephen Bonner, assistant commissioner for regulatory oversight, said the investigation prompted Facewatch to change its policies: more warning signs in stores, records shared between stores only for serious or violent incidents, and alerts issued only for repeat offenders. That means people will no longer be put on file after a single incident, as happened to the woman in Bristol.
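As a rough sketch of what the last two rules might look like in code, assuming a hypothetical record schema with an incident count and a severity flag:

```python
def should_alert(incident_count: int, serious_or_violent: bool) -> bool:
    """Hypothetical gate reflecting the revised policy described above:
    alert only for repeat offenders, or when a record involves a serious
    or violent incident."""
    return serious_or_violent or incident_count >= 2

# A single minor incident, like the Bristol woman's record, would no
# longer trigger an alert:
assert should_alert(incident_count=1, serious_or_violent=False) is False
assert should_alert(incident_count=2, serious_or_violent=False) is True
```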