Study finds lasers can blind self-driving cars and make them crash into objects or pedestrians
By Arsenio Toledo // Nov 11, 2022

A study has found that a laser aimed at the guidance system of a self-driving car can disrupt its sensors and trick it into not seeing a pedestrian or another obstacle in its way.

In a pre-print study, researchers from the United States and Japan were able to trick a self-driving car – the "victim vehicle," as they called it – into not seeing an obstacle in front of it by pointing a laser at its LIDAR. (Related: Self-driving cars are causing traffic incidents all over San Francisco.)

LIDAR stands for "Light Detection and Ranging." It is a sensor technology that maps the environment around the vehicle: LIDAR sensors send out pulses of infrared light, measure the time it takes for the light to bounce off surrounding objects and return to the sensor, and build a three-dimensional map from that data.
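
For readers who want the numbers, here is a minimal sketch of the time-of-flight arithmetic a LIDAR relies on. The function name and the single-pulse model are illustrative only, not drawn from any particular sensor's firmware.

# A minimal sketch of the time-of-flight math behind LIDAR ranging.
SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def distance_from_round_trip(round_trip_seconds: float) -> float:
    """Convert a pulse's round-trip time into a one-way distance.

    The emitted pulse travels to the object and back, so the one-way
    distance is half the total path length.
    """
    return SPEED_OF_LIGHT_M_PER_S * round_trip_seconds / 2.0

# Example: a reflection returning after about 66.7 nanoseconds
# corresponds to an object roughly 10 meters away.
print(distance_from_round_trip(66.7e-9))  # ~10.0 m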

The "hack" works because a perfectly-timed laser aimed directly at a LIDAR can create a blind spot large enough for the infrared sensors to not see an object or a pedestrian in front of the autonomous vehicle.

"We mimic the LIDAR reflections with our laser to make the sensor discount other reflections that are coming in from genuine obstacles," noted University of Florida cybersecurity researcher and professor Sara Rampazzi. This "deletion" of data creates a false impression for the self-driving car that the road is safe to progress along, placing the car and the obstacle in a potentially dangerous collision course.

"The LIDAR is still receiving genuine data from the obstacle, but the data are automatically discarded because our fake reflections are the only ones perceived by the sensor," she continued.

More sophisticated equipment could make laser hacks of self-driving cars more prevalent

In the test, Rampazzi and her colleagues carried out the "laser attack" from the side of the road, no closer than 15 feet from the vehicle, and were able to replicate the results from as far as 10 meters (roughly 32 feet) away. But Rampazzi noted that the device used in the hack had to be perfectly timed and had to keep pace with the car's movements to keep the laser pointed at the right spot.

It would be feasible to produce similar results from farther away using more sophisticated equipment than the researchers deployed in the experiment.

The technology required to mount such an attack from a distance is "fairly basic," but the precise timing it demands makes a successful attack unlikely at the moment.

But if such an attack were to succeed, the consequences could be horrific, potentially resulting in the deaths of drivers, passengers and pedestrians.

The researchers have already approached autonomous vehicle manufacturers to warn them of this possibility and have suggested changes to the LIDAR software to minimize this problem.

"Revealing this liability allows us to build a more reliable system [for self-driving cars]," noted Yulong Cao, a computer scientist from the University of Michigan and the study's first author. "In our paper, we demonstrate that previous defense strategies aren't enough, and we propose modifications that should address this weakness.

One suggestion from the researchers is to drastically change how LIDAR software interprets raw sensor data. Another is for manufacturers to teach their LIDAR software to look for the telltale signatures of a laser attack, such as distinguishing spoofed reflections from genuine ones.
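
The paper's concrete defenses are not reproduced in this article, but the kind of consistency check the second suggestion points toward might look roughly like the following sketch, in which every threshold and name is invented for illustration.

# A generic sketch of a spoof-signature check: flag scan lines whose
# consecutive returns jump implausibly. All thresholds and names here
# are invented for illustration; the paper's actual detection logic
# is not reproduced.

def looks_spoofed(returns_ns, max_jump_ns=5.0):
    """Flag a scan line whose consecutive returns shift abruptly.

    Genuine surfaces tend to vary smoothly between neighboring pulses;
    injected laser pulses can show sudden, synchronized shifts.
    """
    for earlier, later in zip(returns_ns, returns_ns[1:]):
        if abs(later - earlier) > max_jump_ns:
            return True
    return False

print(looks_spoofed([67.0, 66.8, 60.1, 60.0]))  # True: abrupt 6.7 ns jump
print(looks_spoofed([67.0, 66.8, 66.9, 67.1]))  # False: smooth variation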

Read more news about technology hiccups at Glitch.news.

Watch this clip from "The David Knight Show" discussing how the self-driving cars of companies like Tesla and Waymo are glitching dangerously and becoming a threat to drivers and pedestrians alike.

This video is from the channel The David Knight Show on Brighteon.com.

More related stories:

Laser technology used in self-driving cars can damage cameras and human eyes.

Autonomous vehicles to stop, roll down windows and unlock doors for law enforcement.

Waymo self-driving taxi goes rogue in Arizona, blocking traffic and escaping rescue crew.

Self-driving vehicle legislation held up by the question of who to blame in a crash.

It's a "no" for driverless cars: Americans prefer driving themselves to work than relying on autonomous vehicles.

Sources include:

ZeroHedge.com

CosmosMagazine.com

Driving.ca

IOTWorldToday.com

Brighteon.com


