Apple has quietly removed from its website all references to its child sexual abuse scanning feature, months after announcing that the new technology would be baked into iOS 15 and macOS Monterey.

Back in August, Apple announced that it would introduce the feature to allow the company to detect and report known child sexual abuse material, known as CSAM, to law enforcement.

At the time, Apple claimed that, unlike cloud providers that already offered blanket scanning to check for potentially illegal content, it could detect known illegal imagery while preserving user privacy, because the technology could identify known CSAM on a user's device without possessing the image or knowing its contents.

Apple faced a monumental backlash in response. Security experts and privacy advocates expressed concern that the system could be abused by highly resourced actors, such as governments, to implicate innocent victims or to manipulate the system, while others ridiculed it as ineffective at identifying images of child sexual abuse. This led to dozens of civil liberties groups calling on Apple to abandon plans to roll out the controversial feature.

Despite a publicity blitz that followed in an effort to assuage fears, Apple relented, announcing a delay to the rollout of the CSAM scanning feature.

Now it looks like the feature might have been scrapped altogether. The company said: “Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”