The artificial intelligence field, built on irreproducible machine learning, has drawn plenty of skeptical remarks over the years, but this time some researchers have finally started to speak the truth. The main problem that puts ML/AI outside rigid scientific boundaries is "that researchers often can't replicate their own results – and virtually no one else can, either".
Ali Rahimi of Google's AI division even went as far as to call the entire AI field "alchemy". Ben Recht of the University of California, Berkeley, and Csaba Szepesvári of DeepMind in London argue that "The purpose of science is to generate knowledge. You want to produce something that other people can take and build on".
Apart from the reproducibility problem, there is the impossibility of finding out how a particular AI has come to its conclusions. This is not news: "according to a directive from the European Union, companies deploying algorithms that substantially influence the public must by next year create "explanations" for their models' internal logic. The Defense Advanced Research Projects Agency (DARPA), the U.S. military's blue-sky research arm, is pouring $70 million into a new program, called Explainable AI, for interpreting the deep learning that powers drones and intelligence-mining operations". Of course, researchers will have the remedy soon enough: the way to explain deep learning, it turns out, is simply to do more deep learning. Even though that makes no sense to us, it apparently does to them.
Rahimi, for his part, says that "without deep understanding of the basic tools needed to build and train new algorithms, researchers creating AIs resort to hearsay, like medieval alchemists. People gravitate around cargo-cult practices, relying on folklore and magic spells".
What is not said or written in their papers, however, is that we are already living in an era in which many up-and-coming services are built on utter nonsense. Billions of dollars are being spent on bullshit technologies, which will eventually lead to the bubble bursting. And what will remain of the inflated tech giants once the trust is gone?
Some sources and reading material: