Apple will inspect every photo uploaded to the cloud by US users of iPhones and iPads to detect images of child sexual abuse, and will report any found to a nonprofit that investigates cases of child exploitation. The new measure has been praised by child welfare charities but condemned by privacy campaigners, who believe it opens the door to other types of surveillance from authoritarian governments.
Rather than examining the photographs themselves, Apple’s neuralMatch software will include an algorithm that creates …