The computer system that the National Security Agency (also known as the Illegal Spying Agency) uses to identify and track “terrorists” is seriously flawed, warns a shocking new report that takes a closer look at the leaked documents provided by former NSA contractor-turned-whistleblower Edward Snowden.
Many innocent people, particularly in Pakistan, are at serious risk of being killed by a ground-based death squad or unmanned drone strike because the NSA’s “SKYNET” program — think Terminator movies — doesn’t know how to accurately differentiate between legitimate threats and people whose behavior simply deviates from the norm.
The report indicates that the NSA’s metadata sweeps, which are really just code for illegal spying activities, involve putting an artificial intelligence-based computer algorithm in charge of determining whether or not somebody is a “threat.” That verdict is then used to decide whether or not the “threat” needs to be neutralized.
The Big Brother implications of this evil and highly corrupt system are enough to convince even the most stalwart skeptic that the U.S. government has way too much power — power that it continually uses to take innocent lives to advance the globalist agenda. And even those on the inside are starting to speak up about the horrors that are taking place under SKYNET’s watch.
According to Patrick Ball, executive director of the Human Rights Data Analysis Group, the NSA’s assessment of SKYNET is “ridiculously optimistic,” meaning the program ends up flagging all sorts of innocent people because, quite frankly, it’s a robot with a mind of its own.
“The program, cheekily called SKYNET after the humanity-destroying artificial intelligence from the Terminator franchise, tracks movements and known associates, then an algorithm analyzes all that Big Data and flags potential terrorists to be targeted for drone strikes,” explains New York Magazine.
“The problem is, a data expert told Ars Technica this week, that algorithm is ‘completely [bollocks].’”
A much more detailed explanation of SKYNET’s flaws is available at Ars Technica, but suffice it to say that for the system to learn anything useful, it needs a meaningful number of known terrorists to train on. As of this writing, the NSA’s training data reportedly contains only seven of them, set against some 55 million Pakistani mobile phone records.
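The base-rate problem described above can be sketched in a few lines. The figures are the article’s own — roughly seven known terrorists against some 55 million records, and the 0.008 percent false-positive rate the NSA itself concedes — and the calculation generously assumes the classifier catches every real terrorist:

```python
# Back-of-the-envelope sketch of the base-rate problem: even a tiny
# false-positive rate swamps a handful of true positives when applied
# to tens of millions of records.

def best_case_precision(true_positives: int, records: int, fp_rate: float) -> float:
    """Fraction of flagged people who are actual threats, assuming
    the classifier misses no real terrorist (100% recall)."""
    false_positives = records * fp_rate
    return true_positives / (true_positives + false_positives)

known_terrorists = 7          # the article's figure
records = 55_000_000          # Pakistani mobile phone records analyzed
fp_rate = 0.008 / 100         # the NSA's own 0.008 percent

print(f"False positives: {records * fp_rate:,.0f}")  # 4,400
print(f"Best-case precision: {best_case_precision(known_terrorists, records, fp_rate):.2%}")
```

Even under these charitable assumptions, well over 99 percent of the people the system flags would be innocent.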
This means that innocent people are being profiled and labeled as “terrorists” by the NSA even though they’ve done absolutely nothing wrong. And some of them are ending up dead as a result, the product of a perpetual, U.S.-led “war on terror” that is sowing carnage, misery, and death throughout the world.
Even the NSA admits that its system has a 0.008 percent false-positive rate, which translates to about 15,000 people in Pakistan alone who will be erroneously tagged as terrorists — and potentially killed in a drone or military strike.
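The arithmetic behind that 15,000 figure is worth checking. A 0.008 percent rate applied to the 55 million phone records cited earlier yields only about 4,400 false positives; the ~15,000 figure implies the rate was applied to Pakistan’s entire population, roughly 190 million at the time — a population figure assumed here for the check, not stated in the article:

```python
# Sanity check of the article's numbers: false positives = rate x pool size.
fp_rate = 0.008 / 100  # 0.008 percent as a fraction

print(f"{55_000_000 * fp_rate:,.0f}")   # applied to 55M phone records
print(f"{192_000_000 * fp_rate:,.0f}")  # applied to ~192M Pakistanis (assumed)
```

Either way, the scale of potential misidentification runs into the thousands.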
It’s the reason why so many people spoke out against NSA spying, warning that such activity represents a threat to everyone, not just Pakistanis. If it’s happening there, you can be sure it will eventually happen here, should the powers that be see a “need” to implement this type of “security” apparatus on domestic soil.
“Big Data being used to show you ads or recommend friends can certainly feel intrusive, but when Facebook gets it wrong, the worst consequence is that icky, uncanny valley feeling,” New York Magazine says.
“But that’s nothing compared to what can happen when machine learning goes wrong for a military-intelligence app in Pakistan. It can literally be life or death.”