By Michael Liedtke and Matt O'Brien

Apple is indefinitely delaying plans to scan iPhones in the U.S. for images of child sexual abuse following an outcry from security and privacy experts who warned the technology could be exploited for other surveillance purposes by hackers and intrusive governments.

The postponement announced Friday comes a month after Apple revealed it was getting ready to roll out a tool to detect known images of child sexual abuse. The tool would work by scanning files before they're uploaded to its iCloud backup storage system. Apple had also planned to introduce a separate tool to scan users’ encrypted messages for sexually explicit content.

Apple insisted its technology had been developed in a way that would protect the privacy of iPhone owners in the U.S. But the Cupertino, California, company was swamped with criticism from security experts, human rights groups and customers worried that the scanning technology would open a peephole exposing personal and sensitive information.

“Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features,” Apple said in an update posted above its original photo-scanning plans.

Apple never set a specific date for when the scanning technology would roll out, beyond saying it would occur sometime this year. The company is expected to unveil its next iPhone later this month, but it's unclear whether it will use that event to further discuss its change in plans for scanning the devices in the U.S.

The intense backlash to the scanning technology was particularly bruising for a company that has made personal privacy a marketing mantra. Apple contends it is more trustworthy than other major technology companies such as Google and Facebook that vacuum up information about people's interests and location to help sell digital ads. Apple CEO Tim Cook is known to repeat the catchphrase “Privacy is a fundamental human right.”

The photo scanning technology was “a really big about-face for Apple,” said Cindy Cohn, executive director for the Electronic Frontier Foundation, one of the most vocal critics of the company's plans. “If you are going to take a stand for people's privacy, you can't be scanning their phones.”

Cohn applauded Apple for taking more time to reassess its plans and urged the company to talk to a broader range of experts than it apparently did while drawing up its scanning blueprint in its typically secretive fashion.

Matthew Green, a top cryptography researcher at Johns Hopkins University and another outspoken critic of Apple, also supported the delay. He suggested the company talk to technical and policy communities and the general public before making such a big change that threatens the privacy of everyone’s photo library.

“You need to build support before you launch something like this,” Green said. “This was a big escalation from scanning almost nothing to scanning private files.”

When Apple announced the scanning technology last month, Green warned that the system could be used to frame innocent people by sending them seemingly innocuous images designed to trigger matches for child pornography. That could fool Apple’s algorithm and alert law enforcement.

Not long after Green and privacy advocates sounded warnings, a developer claimed to have found a way to reverse-engineer the matching tool, which works by recognizing the mathematical “fingerprints” that represent an image.
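Apple has not published the code for its matching system, and its NeuralHash algorithm is far more sophisticated than any simple illustration. But the basic idea of matching an image's mathematical "fingerprint" against a database of known hashes can be sketched in a few lines of Python. The toy "average hash" and the `KNOWN_HASHES` database below are hypothetical stand-ins for illustration only, not Apple's method:

```python
# Illustrative sketch only: Apple's actual NeuralHash is proprietary and
# neural-network based. This toy "average hash" shows the general idea of
# fingerprint matching -- and why near-matches (collisions) are possible.

def average_hash(pixels: list[list[int]]) -> int:
    """Toy perceptual hash: each bit records whether a pixel is brighter
    than the image's mean brightness."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Count of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

# Hypothetical database of fingerprints for known flagged images.
KNOWN_HASHES = {0b1011_0010_1110_0001}

def matches_known_image(pixels: list[list[int]], threshold: int = 2) -> bool:
    """Flag the image if its fingerprint is within `threshold` bits of a
    known hash. Similar-looking images produce nearby fingerprints, which
    is also why an adversarially crafted, innocuous-looking image could
    collide with a flagged one -- the risk Green warned about."""
    h = average_hash(pixels)
    return any(hamming_distance(h, known) <= threshold for known in KNOWN_HASHES)

# Example: a 4x4 grayscale image (pixel values 0-255) that happens to
# produce a fingerprint identical to the database entry above.
image = [
    [200,  40, 210, 220],
    [ 30,  25, 220,  20],
    [210, 230, 210,  30],
    [ 20,  15,  25, 200],
]
print(matches_known_image(image))  # True
```

The fuzzy threshold is the double-edged sword critics pointed to: it lets the system catch resized or recompressed copies of a known image, but it also means an attacker who can predict the hash function can engineer benign-looking images that fall within the match distance.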

Apple traditionally has rejected government demands for data and access to devices that it believes are fishing expeditions or risk compromising the security of its customers or devices.

In a highly publicized act of defiance, Apple resisted an FBI demand in 2016 that the company crack the code protecting an iPhone used by one of the killers during a mass shooting in San Bernardino, California. It argued at the time that it would be opening a digital backdoor that could be exploited by hackers and other unauthorized parties to break into devices. In that instance, Apple was widely praised by civil rights and privacy groups.

——

O'Brien reported from Providence, Rhode Island. AP Business Writer Kelvin Chan contributed to this story from London.
