Apple to Scan for CSAM
Last week, “Apple unveiled plans to scan U.S. iPhones for images of child sexual abuse… The tool designed to detect known images of child sexual abuse, called ‘NeuralHash,’ will scan images before they are uploaded to iCloud. If it finds a match, the image will be reviewed by a human. If child pornography is confirmed, the user’s account will be disabled and the National Center for Missing and Exploited Children notified. Separately, Apple plans to scan users’ encrypted messages for sexually explicit content as a child safety measure.” AP News
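For readers unfamiliar with hash-based matching, here is a minimal sketch of the general idea: an image is reduced to a fingerprint and compared against a database of fingerprints of known abuse material, with matches held for human review before any report is made. This is an illustration only, not Apple’s actual NeuralHash, which uses a perceptual hash over image features rather than a byte-level hash; the function names and the hash database here are assumptions for the example.

```python
import hashlib

# Placeholder database of fingerprints of known abuse material.
# In a real system these hashes would be supplied by an organization
# such as NCMEC; the entries here are hypothetical.
KNOWN_CSAM_HASHES: set[str] = set()


def image_fingerprint(image_bytes: bytes) -> str:
    """Fingerprint the raw image bytes.

    A real perceptual hash (like NeuralHash) would instead derive a
    fingerprint that survives resizing, cropping, and re-encoding;
    SHA-256 is used here only to keep the sketch self-contained.
    """
    return hashlib.sha256(image_bytes).hexdigest()


def flag_before_upload(image_bytes: bytes) -> bool:
    """Return True if the image matches a known fingerprint and should
    be queued for human review before the account is acted on."""
    return image_fingerprint(image_bytes) in KNOWN_CSAM_HASHES
```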
Many...