Before releasing its latest operating system, iOS 15, Apple announced that it would include new measures in its devices to combat child sexual abuse material (CSAM). These measures include alerts to parents when an iMessage containing a nude image is sent to their child’s device, as well as automated hashing that matches images uploaded to iCloud Photos against known CSAM, with confirmed matches reported to the National Center for Missing & Exploited Children (NCMEC). It is this second measure that has caused controversy among privacy advocates: some argue that it opens a security back door that could be abused by authorities, governments, or hackers, and that it undermines Apple’s long-standing pro-privacy stance. So is Apple simply trying to protect children?
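To make the matching mechanism concrete: the idea behind this kind of detection is a perceptual hash, which maps an image to a short fingerprint that stays (nearly) stable under small edits, and which is then compared against a database of fingerprints of known material. The sketch below is purely illustrative and uses a toy "average hash" on an 8x8 grayscale grid; Apple's actual system uses a proprietary neural-network-based hash ("NeuralHash") with additional cryptographic protections, none of which are reproduced here. All names and data in the example are hypothetical.

```python
def average_hash(pixels):
    """Toy perceptual hash: 64 bits from an 8x8 grayscale grid.

    Each bit records whether that pixel is at or above the mean
    brightness, so small brightness changes barely move the hash.
    """
    flat = [p for row in pixels for p in row]
    avg = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p >= avg else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def matches_known(image_hash, known_hashes, threshold=5):
    """Flag an image whose hash is within `threshold` bits of any known hash."""
    return any(hamming(image_hash, h) <= threshold for h in known_hashes)

# Hypothetical data: a synthetic 8x8 "image" and a one-entry hash database.
image = [[(r * 8 + c) % 256 for c in range(8)] for r in range(8)]
h = average_hash(image)
known_database = {h}  # pretend this fingerprint is already in the database
print(matches_known(h, known_database))  # True: exact match, distance 0
```

The Hamming-distance threshold is what lets the scheme tolerate re-encoding or resizing, but it is also the source of the privacy critics' concern: whoever controls the hash database controls what gets flagged.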

