01/19/2018 / By Ethan Huff
A professor of electrical and computer engineering at Oklahoma State University has issued a warning about the threat of artificial intelligence (AI) robots taking over the world. According to Dr. Subhash Kak, technological automation is already creating a “hellish dystopia” on earth, one in which every human job will eventually be performed by robot “employees” that work for free.
Though they may seem convenient now, helping to reduce overhead and streamline various processes, AI technologies could one day plunge the world into a global depression. Should fully functional AI robots eventually be released, Dr. Kak claims, there will be no stopping them from taking “literally all jobs,” leaving humans with no way to earn a living.
“The beginnings of the dystopia are already there,” Dr. Kak told the Daily Star Online, noting that human usefulness, at least from a pragmatic perspective, is threatened by AI technology. “There will be massive unemployment. People want to be useful and work provides meaning, and so the world will sink into despair.”
There’s already talk among some politicians of implementing a “universal basic income” (UBI) to counteract this onslaught of human-replacing robots. As explained by Mike Adams, the Health Ranger, this system of universal welfare (which is what it really is) would theoretically provide every human being with a minimum ration of nourishment, housing, and connectivity.
But truth be told, such a concept just won’t work in real life. And Dr. Kak wholeheartedly agrees, having stated:
“Policy makers have begun to speak of a minimum guaranteed income with everyone provided food, shelter, and a smart phone, and that will not address the heart of the problem.”
Before there were computers, let alone AI technology, human beings had to rely on their natural brainpower to survive. They had to apply critical thinking and process information logically – a uniquely human state of existence that some might argue is rapidly disappearing in the age of technology and automation.
Dr. Kak believes that this loss of natural human expression, so to speak, is causing many people to resort to unhealthy behaviors that threaten the stability of mankind as a whole. Such behaviors include things like drug abuse and radicalization of beliefs, both of which undermine the bedrock of civilization.
“In my view, the current opioid and drug epidemic in the US is a manifestation of this despair,” Dr. Kak is quoted as saying. “Likewise, phenomena such as ISIS are a response to the meaninglessness that people find in a world devoted only to the cult of the body.”
While some are hopeful that human beings will simply discover new types of jobs that we don’t yet know exist, even that hope is speculative. Worse, true AI robots capable of thinking independently of human input could prevent humans from ever creating those jobs, and could go “rogue” and enslave the world’s populations.
“If indeed machines become self-aware, they will be cunning and they will bide their time and choose the best moment to take over and enslave, if not kill, us,” Dr. Kak stated during an earlier interview with the Daily Star Online.
“If you believe that machines will become ‘conscious’ like us, then the possibility that such machines will kill off humans cannot be ruled out. I personally don’t believe that machines will become conscious and self-aware like we are and so we need not worry. But most computer scientists and physicists don’t agree with my position. They think there is nothing to consciousness but computation.”
Follow more news on the rise of the robots by visiting Robots.news.