Apple to Scan for CSAM

Posted on AllSides August 9th, 2021

Last week, “Apple unveiled plans to scan U.S. iPhones for images of child sexual abuse… The tool designed to detect known images of child sexual abuse, called ‘NeuralHash,’ will scan images before they are uploaded to iCloud. If it finds a match, the image will be reviewed by a human. If child pornography is confirmed, the user’s account will be disabled and the National Center for Missing and Exploited Children notified. Separately, Apple plans to scan users’ encrypted messages for sexually explicit content as a child safety measure.” AP News
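The matching flow the AP describes — hash an image on-device, check it against a database of known hashes, and queue matches for human review — can be sketched as follows. This is an illustrative toy only: NeuralHash is a proprietary neural perceptual hash that tolerates minor image edits, while the stand-in below uses an ordinary cryptographic hash, and all names and data here are hypothetical.

```python
import hashlib

def toy_hash(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash. A real perceptual hash (like
    NeuralHash) maps visually similar images to the same value; a
    cryptographic hash like SHA-256 does not, so this only models
    exact-match lookups."""
    return hashlib.sha256(image_bytes).hexdigest()

def scan_before_upload(images, known_hashes):
    """Return the images whose hashes appear in the known-hash
    database, i.e. those that would be flagged for human review
    before the iCloud upload proceeds."""
    return [img for img in images if toy_hash(img) in known_hashes]

# Hypothetical data: one image already in the database, one benign photo.
known_image = b"previously-catalogued-image-bytes"
benign_image = b"holiday-photo-bytes"
database = {toy_hash(known_image)}

flagged = scan_before_upload([known_image, benign_image], database)
```

Only the catalogued image is flagged; the benign photo hashes to a value absent from the database and passes through unexamined, which is the privacy property the on-device design is meant to preserve.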

Many...

