Apple Abandoning Privacy-Busting Child Porn ‘CSAM’ Scanner for iPhone

PJ Media


No one but the dregs of humanity is in favor of child porn, but Apple has apparently abandoned its plan to build “CSAM” scanning software into its mobile operating systems for iPhone and iPad.

“While nobody less awful than Jeffrey Epstein wants to protect predators, more than a few are wondering what has happened to the world’s most valuable company’s commitment to user privacy,” I wrote back in August when the Cupertino tech giant announced its privacy-busting feature. But a new report indicates that Apple has listened to customer pushback.

AppleInsider reported on Wednesday that the company has in the last few days “removed all signs of its CSAM initiative from the Child Safety webpage on its website.”

That’s a good indication that operating system-level photo scanning will not be coming in a future update to iOS 15 or iPadOS 15, as Apple had promised last summer.

The good parts of Apple’s Child Safety feature set will live on, including on-device detection of sexually explicit material on the phones of minors. The data never leaves the device and isn’t seen by anyone; the detection system merely tells kids how they can get help.
