The so-called NeuralHash technology would have scanned images on-device just before they were uploaded to iCloud Photos, matching them against a database of known child sexual abuse material maintained by the National Center for Missing and Exploited Children. If a match was found, it would have been manually reviewed by a human and, if required, steps taken to disable the user's account and report it to law enforcement. Following widespread criticism from privacy groups and others, who worried that the on-device scanning set a dangerous precedent, Apple delayed plans to roll out the detection technology, which would have scanned US users' iPhones in search of child sexual abuse material. Do you think privacy protection was the major concern here?
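The matching step described above rests on perceptual hashing: visually similar images produce similar hash values, so an image can be compared against a database of known hashes rather than the images themselves. The sketch below illustrates the general idea with a simple "average hash" and a Hamming-distance threshold. This is only a minimal illustration of the technique; it is not Apple's actual NeuralHash algorithm (which derives hashes from a neural network), and the function names and threshold are hypothetical.

```python
# Minimal perceptual-hash matching sketch (average hash), for illustration
# only. NOT Apple's NeuralHash; names and threshold are hypothetical.

def average_hash(pixels):
    """Hash an 8x8 grayscale image: one bit per pixel, set if above the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming_distance(h1, h2):
    """Count differing bits; a small distance means visually similar images."""
    return sum(a != b for a, b in zip(h1, h2))

def matches_database(image_hash, known_hashes, threshold=5):
    """Flag the image if its hash is within the threshold of any known hash."""
    return any(hamming_distance(image_hash, k) <= threshold for k in known_hashes)

# Demo: a lightly altered copy still matches; a different image does not.
img = [[(r * 8 + c) * 4 for c in range(8)] for r in range(8)]          # gradient
img_tweaked = [[min(255, p + 2) for p in row] for row in img]          # near-copy
img_other = [[255 - (r * 8 + c) * 4 for c in range(8)] for r in range(8)]

db = [average_hash(img)]
print(matches_database(average_hash(img_tweaked), db))  # True
print(matches_database(average_hash(img_other), db))    # False
```

The tolerance threshold is what distinguishes perceptual hashing from cryptographic hashing: a slightly re-compressed or resized copy of a known image still matches, which is exactly the property such a detection system needs.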