
AI-enabled cameras said to predict crime before it happens… are “precrime” arrests next?


To some, the coming tsunami of devices equipped with “artificial intelligence” (AI) technology may seem like a welcome addition to the many things that make life easier on planet Earth.

But to a growing number of critics, AI represents the next opportunity for big government to encroach upon our daily lives, and even to prosecute and imprison us based on what a machine thinks we might do.

As NextGov reports, AI-capable cameras said to detect ‘behaviors’ that lead to crime are the next evolution in the surveillance technology that cities and governments deploy to monitor citizens 24/7/365.

“Imagine it were possible to recognize not the faces of people who had already committed crimes, but the behaviors indicating a crime that was about to occur,” NextGov noted, adding: 

Multiple vendors and startups attending ISC West, a recent security technology conference in Las Vegas, sought to serve a growing market for surveillance equipment and software that can find concealed guns, read license plates and other indicators of identity, and even decode human behavior.

A company called ZeroEyes out of Philadelphia markets a system to police departments that can detect when a person is entering a given facility carrying a gun. It integrates with any number of closed-circuit surveillance systems. 
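At its core, the kind of system described here is a pipeline: frames from existing closed-circuit feeds pass through a detection model, and a positive detection triggers an alert. The sketch below is purely illustrative and assumes nothing about ZeroEyes' actual software; the `detect_gun` function is a hypothetical stand-in for what, in a real product, would be a trained computer-vision model.

```python
def detect_gun(frame):
    """Hypothetical stand-in for an AI vision model; real systems
    run a trained neural network over the image pixels."""
    return frame.get("contains_gun", False)

def monitor(frames, alert):
    """Scan a stream of CCTV frames and fire an alert on each detection."""
    for frame in frames:
        if detect_gun(frame):
            alert(frame["camera_id"])

# Toy usage: two frames from an existing closed-circuit system.
alerts = []
monitor(
    [{"camera_id": "lobby-1", "contains_gun": False},
     {"camera_id": "entry-2", "contains_gun": True}],
    alerts.append,
)
```

The point of the structure is that the AI layer bolts onto whatever camera feeds already exist, which is why vendors pitch it as integrating with "any number" of closed-circuit systems.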

Under the guise of ‘ending mass shootings’ — it’s always about gun control with the authoritarian Left — the AI community seeks to develop ‘predictive’ technology that can allegedly spot not only behaviors indicating a gun crime is about to take place, but other violent criminal activity as well.


At present, precursors to predictive criminal behavior — facial recognition and license-plate reader technology — are being met with stiff resistance from privacy advocates. And in some cases, at least, judges have sided with them in limiting the use of such technology in the public square. 

As such, there can be no doubt that AI-driven pre-crime ‘recognition’ will be met with similar resistance from pesky constitutionalists who still believe in privacy rights and our founding legal principle of innocent until proven guilty. 

The inevitable will come: Pre-arrests for predictive crime

And why not? In addition to the probability of false positives, how long will it be before predictive crime technology leads to pre-incident arrests for criminal activities some machine thinks someone is about to engage in? (Related: “Minority Report” PRE-CRIME now real in Colorado as it becomes latest state to pass ‘red flag’ gun law.)

Farfetched? Hollywood has already made a movie about it, and not recently (“Minority Report” — 2002). As for pre-arrests for pre-crimes, Hollywood has that covered as well (“Pre-Crime” — 2017).

In the latter, according to the series’ official trailer, “A pre-emptive arrest is made of someone before they perform an act” — all based on a computer algorithm (powered by predictive artificial intelligence).

“Adoption of pre-crime tech is beginning to trend in the U.S. PredPol, one of the leading systems on the market, is already being used by law enforcement in California, Florida, Maryland and other states,” reports Bleeping Computer. “Aside from civil liberties concerns, however, a flaw found in the design of the type of software used indicates that predictive algorithms are to blame for a whole new set of problems.”

When U.S. researchers analyzed how PredPol actually predicts crime, they discovered that the software initiates a “feedback loop” in which cops are repeatedly directed to certain neighborhoods regardless of what the actual crime rates in those neighborhoods are.
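The feedback loop the researchers describe can be sketched with a deliberately simple toy model — this is not PredPol's algorithm, and the reinforcement rule and every number below are assumptions for illustration only. The dynamic is: patrols go where the model predicts crime, crime is only observed where patrols are present, and those observations reinforce the next round's predictions.

```python
def simulate_feedback(true_rates, steps=50, gain=5.0):
    """Toy predict-patrol-observe loop; returns each area's final
    share of patrols. `gain` controls how strongly observations
    reinforce the next prediction (an illustrative assumption)."""
    predicted = list(true_rates)
    predicted[0] *= 1.05  # a tiny historical-data bias toward area 0
    for _ in range(steps):
        total = sum(predicted)
        # Patrols allocated in proportion to predicted crime.
        shares = [p / total for p in predicted]
        # Crime is only observed where patrols are, so observations
        # scale with patrol presence, not with true rates alone.
        observed = [shares[i] * true_rates[i] for i in range(len(shares))]
        # Feedback step: observations reinforce the next prediction.
        predicted = [predicted[i] * (1 + gain * observed[i])
                     for i in range(len(predicted))]
    total = sum(predicted)
    return [p / total for p in predicted]

# Two neighborhoods with IDENTICAL true crime rates:
shares = simulate_feedback([0.3, 0.3])
```

Under these assumptions, the area with the initial 5 percent data bias ends up receiving nearly all the patrols, even though both areas have the same underlying crime rate — the self-reinforcing effect the researchers flagged.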

The problem? Human decisions go into designing software, including AI software, so human flaws are inherent in it.

Still, based on trial and error, researchers are bound to come up with a technology that functions on a certain level. The problem then becomes what to do about being able to predict crime before it happens — and that will inevitably lead to pre-arrests.

Read more about how big government plans to use future technology against us at Precrime.news and FutureTech.news.

Sources include:

NextGov.com

BleepingComputer.com

NewsTarget.com
