Apple announces a new protection policy for children: beneficial for customers or a privacy intrusion?
Have you ever seen a system that blurs obscene content or warns you before you access it? On August 5th, Apple announced it would start issuing this kind of warning to prevent children from seeing sexual content. What seems like an unquestionably well-intentioned idea interferes with the end user's right to privacy.
Apple presents the policy as a technological advance for security and child safety. The new policy analyzes messages sent or received by a child over iMessage to warn about explicit content and/or alert their parents about sexually explicit photographs (the content will be blurred). Additionally, when the user has opted in to iCloud services, Apple will use an on-device algorithm to scan every photo about to be saved to iCloud for Child Sexual Abuse Material (CSAM): it creates a hash value of the picture and compares it against CSAM image hashes provided by the National Center for Missing and Exploited Children (NCMEC).
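The hash-matching step can be illustrated with a minimal sketch. Note the assumptions: Apple's actual system uses NeuralHash, a perceptual hashing algorithm robust to resizing and re-encoding, combined with cryptographic protocols such as private set intersection; the function names below and the use of SHA-256 are simplifications for illustration only.

```python
import hashlib


def image_hash(image_bytes: bytes) -> str:
    # Simplified stand-in: a cryptographic hash of the raw bytes.
    # Apple's NeuralHash is perceptual, so visually similar images
    # produce matching hashes even after compression or resizing.
    return hashlib.sha256(image_bytes).hexdigest()


def flag_for_review(image_bytes: bytes, known_hashes: set) -> bool:
    # On-device check before upload: does this photo's hash appear
    # in the set of known CSAM hashes (provided by NCMEC)?
    return image_hash(image_bytes) in known_hashes
```

A match alone does not trigger a report in Apple's design; only when a threshold number of matches accumulates can flagged material be reviewed.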
However, Apple is not the first company to adopt such a policy. Facebook and Google already do something similar, though they scan every photo uploaded to their cloud services rather than scanning on the device.
Some of the principal concerns customers raise are:
- How can we know that our data is secure?
- If the system can determine whether a photo is sexually explicit, what prevents it from collecting our personal information as well?
- Is this warning system really just meant to report explicit content, or does it make it easier to scan our private photos?
- We can see how Apple could improve its privacy policies to help us. But how can we be sure this is the only intention behind the change?
Various digital rights organisations and even Apple employees have raised concerns over these changes. Many privacy professionals see a first step toward weakening the privacy features of Apple devices, and fear that in the future Apple will comply with local authorities around the world to scan for other information within its ecosystem. The new policy lets Apple continue to promote its end-to-end encryption while analyzing content before transmission takes place.
However, Apple has demonstrated several times its sense of responsibility toward its clients and the law. Further, Data Protection Authorities and privacy professionals around the world will most likely follow the discussion in the US and/or get engaged in it. For that reason, it will be interesting to watch how the debate unfolds and how Apple finally implements the policy. Such policies need to be discussed openly to find common ground between personal privacy and a state's interest in law enforcement.