Friday, March 24, 2017 by Ethan Huff
With all the end-to-end encryption built into their framework, smartphones are impenetrable, save for advanced backdoor hacking, right? Wrong. Computer security researchers from the University of Michigan and the University of South Carolina say that simple sound waves, when utilized effectively, can successfully hack not only smartphones, but also wearable fitness monitors like Fitbit and even “smart” automobiles.
Presenting their findings at a recent meeting, the team explained how they were able to break into both a Fitbit and a smartphone and tamper with their accelerometers, the chips that gauge each device's movement and acceleration. In doing so, the researchers were able to alter the Fitbit's record of a person's steps, for instance, and on the smartphone, directly interfere with its ability to track movement and with apps that use the accelerometer to, say, control a radio-controlled toy car or airplane.
It’s a far-reaching security flaw that the experts who discovered it say makes it surprisingly easy to hijack smart devices and recalibrate them. And the whole thing is done using sound — frequencies or musical notes that trick smart devices into “thinking” they are receiving commands from their users, when in fact the commands come from a third-party attacker.
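One ingredient of this kind of acoustic trick is that the accelerometer samples its readings at a fairly low rate, so a tone far above that rate “aliases” down into a slow waveform the device cannot tell apart from real motion. The following is a minimal Python sketch of that aliasing effect only — the sample rate, tone frequency, and the naive step counter are illustrative assumptions, not the researchers' actual parameters or exploit:

```python
# Illustrative sketch: a high-frequency acoustic tone, undersampled by an
# accelerometer's ADC, aliases into a slow, walking-like signal.
# All numbers here are assumed for illustration.
import numpy as np

fs = 100.0                      # accelerometer sample rate in Hz (assumed)
t = np.arange(0, 5, 1 / fs)     # 5 seconds of sample times
f_tone = 2002.0                 # attacker's acoustic tone in Hz (assumed)

# What the accelerometer actually records: the tone, undersampled.
recorded = np.sin(2 * np.pi * f_tone * t + 0.3)

# Aliasing folds 2002 Hz down to |2002 - 20 * 100| = 2 Hz -- the very same
# samples a genuine 2 Hz motion (a brisk walking cadence) would produce.
walking_like = np.sin(2 * np.pi * 2.0 * t + 0.3)
assert np.allclose(recorded, walking_like)

# A naive step counter (one step per acceleration peak) is fooled:
steps = sum(
    1 for i in range(1, len(recorded) - 1)
    if recorded[i] > 0.5 and recorded[i - 1] < recorded[i] > recorded[i + 1]
)
print(steps)  # counts "steps" while the device sits perfectly still
```

Run over these 5 seconds of silence-plus-tone, the counter logs ten phantom steps, which is the flavor of result the researchers reported against a real Fitbit.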
“It’s like the opera singer who hits the note to break a wine glass, only in our case, we can spell out words” and enter commands rather than just shut down the phone, Kevin Fu, lead author of a new paper on the topic, explained to The New York Times (NYT). Fu works as an associate professor of electrical engineering and computer science at the University of Michigan, and also serves as the chief executive of Virta Labs, a company focused on cybersecurity in healthcare.
“You can think of it as a musical virus.”
It would be one thing if this glaring flaw were only present in smartphones. But according to Fu and his colleagues, it’s an issue in all sorts of products that rely on these accelerometer chips, from wearables to robotic systems. More than half of the 20 commercial electronic brands tested, containing chips from five different chip-makers, demonstrated this security flaw.
This is particularly concerning as more and more “smart” appliances with this type of technology are progressively unveiled. Refrigerators, stoves, microwaves, thermostats, and many other consumer appliances and electronics containing “smart” technologies continue to be released and are becoming the norm, which means more potential hacking problems down the road.
The same is true for “smart” cars and delivery trucks, both of which could be hacked and told to drive off the road, for instance. Terrorists could have a field day with this type of technology: backdoor access to a smart vehicle’s measurement and gauging instruments could potentially be used to steer it into a building or reroute it to another location.
“If an accelerometer was designed to control the automation of insulin dosage in a diabetic patient, for example, that might make it possible to tamper with the system that controlled the correct dosage,” writes John Markoff for the NYT about the risks involved with medical devices and healthcare tools.
There are many dire scenarios that Fu and his colleagues say are possible, but not necessarily likely. Still, cybersecurity challenges such as this will only multiply as society lets machines take over more tasks, and you can be sure that “smart” technologies are the inroad by which third parties could gain control over ever more aspects of people’s lives.
The paper is set to be presented at the IEEE European Symposium on Security and Privacy in Paris next month. Part of the presentation will cover hardware and software fixes that manufacturers can adopt to protect against these types of security breaches.