(San Francisco) In 2021, Apple found itself at the center of controversy over a plan to scan iPhones for images of child sexual abuse. Privacy experts warned of the potential for government abuse, and the backlash was so strong that Apple eventually scrapped the project.

Two years later, Apple is facing criticism from child safety advocates and investors who are calling on the company to do more to protect children from online abuse.

A children’s advocacy group, Heat Initiative, has raised $2 million for a new national ad campaign calling on Apple to detect, report and remove child pornography from iCloud, its online storage platform.

Later this week, the group will run digital ads on websites popular with policymakers in Washington, such as Politico. It will also put up posters in San Francisco and New York that read: “Child sexual abuse material is stored on iCloud. Apple allows it.”

These criticisms reflect a dilemma Apple has faced for years. The company has made privacy a central part of its iPhone pitch to consumers. But that promise of security has also helped make its services and its devices, 2 billion of which are in use, useful tools for sharing images of child sexual abuse.

A group of two dozen investors managing nearly US$1 trillion in assets (more than C$1.3 trillion) has also asked Apple to publicly report the number of abusive images it catches across its devices and services.

Two investors — Degroof Petercam, a Belgian asset manager, and Christian Brothers Investment Services, a Catholic investment firm — will submit a shareholder proposal this month that would require Apple to provide a detailed report on how effective its safety tools are at protecting children.

“Apple seems stuck between privacy and action,” said Matthew Welch, an investment specialist at Degroof Petercam. “We thought a proposal would wake up management and get it to take this issue more seriously.”

Apple has been quick to respond to child safety advocates. In early August, its privacy officers met with the group of investors, Welch said. Then, on Thursday, the company responded to an email from Heat Initiative with a letter defending its decision not to scan iCloud. It shared the correspondence with Wired, a technology publication.

In Apple’s letter, Erik Neuenschwander, director of user privacy and child safety, said the company had concluded that it was “not practicable” to scan iCloud Photos data without “jeopardizing the security and privacy of [its] users.”

“Scanning for one type of content, for instance, opens the door to mass surveillance and could create a desire to search other encrypted messaging systems,” Neuenschwander said.

Apple, he added, has created a new default feature for all child accounts that triggers a warning if they receive or attempt to send images containing nudity. The feature is intended to prevent the creation of new child sexual abuse material and to limit the risk of predators pressuring or blackmailing children for money or nude images. These tools have also been made available to app developers.

In 2021, Apple said it would use a technology called “image hashing” to spot illegal content on iPhones and in iCloud.
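Broadly, image hashing works by reducing each photo to a compact fingerprint and comparing that fingerprint against a list of fingerprints derived from known abusive images, so the matching step never requires anyone to view the photos themselves. The sketch below is a minimal illustration of that matching step only, using an ordinary cryptographic hash and placeholder values; Apple’s proposed system was reported to rely on a perceptual hash designed to tolerate small edits, which this example does not attempt to reproduce.

```python
import hashlib
from pathlib import Path

# Hypothetical set of hex digests of known flagged images, of the kind a
# clearinghouse might supply; the value below is a placeholder, not real data.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def file_digest(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's contents."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def flag_matches(photo_dir: Path) -> list[Path]:
    """List files whose digest appears in the known-hash set."""
    return [p for p in photo_dir.iterdir()
            if p.is_file() and file_digest(p) in KNOWN_HASHES]
```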

But the company failed to clearly communicate the plan to privacy experts beforehand, which reinforced their skepticism and fueled fears that governments could misuse the technology, said Alex Stamos, director of the Stanford Internet Observatory at the Cyber Policy Center, who opposed the plan.

Last year, the company quietly scrapped its iCloud scanner project, catching child protection groups by surprise.

Apple has drawn praise from privacy and child safety groups for its efforts to limit the creation of new images containing nudity on iMessage and other services. But Stamos, who praised the company’s decision not to scan iPhones, said it could do more to prevent people from sharing problematic images in the cloud.

“You can have privacy if you store something for yourself, but if you share something with someone else, you don’t have the same privacy,” Stamos said.

Governments around the world are pressuring Apple to act. Last year, Australia’s eSafety Commissioner released a report criticizing Apple and Microsoft for not doing more to proactively monitor their services for illegal content.

In the United States, Apple made 160 reports in 2021 to the National Center for Missing and Exploited Children, a federally designated clearinghouse for illegal content. Google made 875,783 reports, while Facebook made 22 million. These reports do not always reflect genuinely illegal content; some parents have had their Google accounts suspended and been reported to the police over images of their children that were not criminal in nature.