Apple to Scan for CSAM

Posted on AllSides, August 9th, 2021

Last week, “Apple unveiled plans to scan U.S. iPhones for images of child sexual abuse… The tool designed to detect known images of child sexual abuse, called ‘NeuralHash,’ will scan images before they are uploaded to iCloud. If it finds a match, the image will be reviewed by a human. If child pornography is confirmed, the user’s account will be disabled and the National Center for Missing and Exploited Children notified. Separately, Apple plans to scan users’ encrypted messages for sexually explicit content as a child safety measure.” AP News
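The flow described above is a hash match against a database of known images, followed by human review before any enforcement action. Below is a minimal sketch of that pipeline in Python; `image_hash`, `KNOWN_HASHES`, and `screen_before_upload` are illustrative names, not Apple's API, and SHA-256 is used only as a stand-in so the example runs self-contained. A real perceptual hash such as NeuralHash is designed to survive resizing and re-encoding, which a cryptographic hash is not.

```python
import hashlib

def image_hash(image_bytes: bytes) -> str:
    # Stand-in for a perceptual hash. NeuralHash derives a fingerprint
    # that stays stable across benign edits; SHA-256 does not, and is
    # used here only to keep the sketch runnable with the stdlib.
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical database of hashes of known abuse imagery (in practice,
# supplied by a body like NCMEC rather than computed locally like this).
KNOWN_HASHES = {image_hash(b"example-known-image")}

def screen_before_upload(image_bytes: bytes) -> str:
    """Return a disposition for an image queued for cloud upload."""
    if image_hash(image_bytes) in KNOWN_HASHES:
        # A match alone does not disable the account: per the described
        # process, the image goes to a human reviewer first.
        return "flag-for-human-review"
    return "upload"

if __name__ == "__main__":
    print(screen_before_upload(b"vacation-photo"))       # upload
    print(screen_before_upload(b"example-known-image"))  # flag-for-human-review
```

Note the design choice this illustrates: the tool matches hashes of already-known images rather than classifying image content, which is what distinguishes it from the separate message-scanning feature mentioned in the quote.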



The Flip Side

AllSides Media Bias Rating: Mixed
