According to a BBC news article and the corresponding video, Chinese companies have filed patents for AI algorithms that detect facial micro-expressions in order to analyze the emotions of subjects. Allegedly, some patents point to systems, which may or may not be in use, that even monitor “minute changes” in the skin pores of a subject’s face as a means of detecting micro-expressions. The software could then be used to analyze the subject’s emotions.
While the aforementioned news article is placed in the wider context of minority issues in the People’s Republic of China, the technology involved does not seem to be a completely new approach. The use of AI to identify humans by their voice, DNA, face, motion and so on has been a booming business for the last fifteen years or so. With the leaps forward in digital camera technology, it is now even possible to identify someone by fingerprint – from a quite ordinary Facebook photo showing a finger.
The article suggests that Chinese law enforcement organizations (LEOs) have actually fielded and begun using a complex system, complete with a central database, presumably a network, and “emotion-sensing cameras” that can detect the corresponding changes when placed 3 metres from a subject’s face. While sympathies and political beliefs may vary, it is a fair guess that such systems are in use throughout the world, wherever authorities and LEOs are able to procure them within their respective legal and financial constraints.
It is even more worrisome that a host of start-ups and scale-ups (for example, iMotions and 4 Little Trees) are marketing a large number of promising AI solutions very much like this. Not to mention the bigger players, such as Amazon’s Rekognition or Facebook’s Face API. Some outlets even have a name for this phenomenon: emotion recognition technology (ERT). And rest assured: for scientists, this is anything but new.
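To give a sense of how low the barrier is: services such as Amazon Rekognition expose emotion detection as a plain API call (`detect_faces` with all attributes requested returns, among other things, a list of emotions with confidence scores per face). Below is a minimal sketch of processing such a response; the sample data is illustrative and made up for this example, not real service output.

```python
# Sketch: extracting the dominant emotion from a Rekognition-style
# detect_faces response. The real call would be along the lines of:
#   client = boto3.client("rekognition")
#   resp = client.detect_faces(Image={"Bytes": image_bytes},
#                              Attributes=["ALL"])
# The sample response below is illustrative, not actual API output.

sample_response = {
    "FaceDetails": [
        {
            "Emotions": [
                {"Type": "CALM", "Confidence": 62.1},
                {"Type": "SAD", "Confidence": 21.4},
                {"Type": "ANGRY", "Confidence": 9.8},
            ]
        }
    ]
}

def dominant_emotion(response):
    """Return (type, confidence) of the highest-scoring emotion
    for the first detected face, or None if no face was found."""
    faces = response.get("FaceDetails", [])
    if not faces:
        return None
    emotions = faces[0].get("Emotions", [])
    if not emotions:
        return None
    top = max(emotions, key=lambda e: e["Confidence"])
    return top["Type"], top["Confidence"]

print(dominant_emotion(sample_response))  # -> ('CALM', 62.1)
```

A dozen lines of glue code and a camera feed is all it takes to turn such an API into a surveillance tool.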
For if it were, they would not be drawing attention to its dangers on the pages of the venerable Nature magazine…
Counter-AI Collective Analysis:
The BBC, at the end of the article, quotes its interviewee as saying: “With artificial intelligence we have nowhere to hide“. And this is what it is all about. While the blanket surveillance of everyone is still a bit farther down the road, the willingness and the capabilities are already here, on both the engineering and the end-user sides.
So, is there a defence against these AI algorithms? Of course, if you are chained to a chair and forced to look into an emotion-sensing camera, they will quickly find out that you are unhappy at that moment. But if you are just a casual one-in-a-million passer-by, then chances are that listening to your favourite music will do the trick. And if you happen to spend a lot of time somewhere with a number of such devices, well, you should know where they are and how not to expose your face for too long.
Or wear a mask that completely removes suspicion.