iPhones Scan Photos for Child Pornography – OK, But What’s Next?

A revelation swept across IT-security sites this week concerning Apple's apparent anti-paedophile algorithms, which scan iPhone devices for child pornography. The measure scans the user's photos as they are taken, and probably applies to downloaded pictures as well.

While the practice is hardly new, having been announced in the first month of 2020, for some reason it has only now reached a wider audience.

There is little doubt that such a noble cause will have its way, but a number of privacy advocates have questioned whether this amounts to a breach of confidentiality and, consequently, a breach of the terms of agreement between Apple and its users.

The fact is that, in theory, all communications are encrypted and Apple should have no means of decrypting the photos in order to match them against the datasets of nudes and children's faces that such a match-up would require. So, essentially, this is a low-key way of saying that all your data is routinely scanned by Apple.
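To make the mechanics concrete, here is a minimal sketch of how such a match-up could work in principle, assuming the scan compares perceptual hashes of the user's photos against a database of hashes of known prohibited images. The `average_hash` helper, the `KNOWN_HASHES` set and its placeholder value are purely illustrative assumptions; they show the general shape of on-device matching, not Apple's actual system.

```python
# Toy illustration of perceptual-hash matching (an "average hash").
# This is NOT Apple's algorithm; it only sketches the idea of comparing
# an image fingerprint against a database of known fingerprints.
from PIL import Image  # Pillow

def average_hash(path: str, size: int = 8) -> int:
    """Compute a 64-bit average hash: shrink, grayscale, threshold on the mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    avg = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > avg else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# Hypothetical database of hashes of known prohibited images (placeholder value).
KNOWN_HASHES = {0x8F3C21A055D10B7E}

def is_match(path: str, threshold: int = 5) -> bool:
    """Flag a photo if its hash is 'close enough' to any known hash."""
    h = average_hash(path)
    return any(hamming_distance(h, known) <= threshold for known in KNOWN_HASHES)
```

The point of the sketch is that this kind of matching runs on the image content itself, before any transport encryption enters the picture, which is exactly why the scanning can happen on the device regardless of how the photos are later encrypted in transit.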

Since no one can reasonably defend a standpoint of letting child pornography spread, the resourcefulness of the chaps at Apple has scored a point against all users. If you have nothing to hide, you have no reason to voice your concerns – or so they might say.

But we all know that sooner or later these very same machine-learning algorithms will scan (or are already scanning) for swastikas, guns, black faces, Asian faces, white faces and whatnot.

And this also means that iPhone users have entered an age where privacy is merely a notion and nothing more.

About the Author

Counter-AI Collective
