When they debuted in March, contact-tracing apps seemed poised to become technological saviors in the fight against the spread of the coronavirus (COVID-19). However, doubts have since been raised about both their effectiveness and their security.
Most of the apps in question use Bluetooth to detect nearby phones and ultimately let people know if they’ve been exposed to the virus. But rather than reassuring the public, the apps have raised fears about privacy and surveillance, and some wonder whether the technology is even effective.
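The rolling-identifier exchange that most of these Bluetooth apps rely on can be sketched in a few lines. The sketch below is an illustrative simplification, not any specific app’s protocol; real systems derive identifiers from cryptographic keys and rotate them every few minutes:

```python
import secrets

class Phone:
    """Toy model of a handset running a Bluetooth contact-tracing app."""

    def __init__(self):
        self.my_ids = []        # random IDs this phone has broadcast
        self.heard_ids = set()  # IDs observed from nearby phones

    def broadcast(self) -> bytes:
        # Broadcast a fresh random identifier (rotated frequently in practice).
        rid = secrets.token_bytes(16)
        self.my_ids.append(rid)
        return rid

    def observe(self, rid: bytes):
        # Record an identifier heard over Bluetooth from a nearby phone.
        self.heard_ids.add(rid)

    def exposed(self, published_ids) -> bool:
        # Check the local contact log against IDs released by infected users.
        return any(rid in self.heard_ids for rid in published_ids)

# Two phones come within Bluetooth range and exchange identifiers.
alice, bob = Phone(), Phone()
bob.observe(alice.broadcast())
alice.observe(bob.broadcast())

# Alice later tests positive and uploads the IDs she broadcast.
published = alice.my_ids
print(bob.exposed(published))  # Bob heard one of Alice's IDs -> True
```

Because only random identifiers ever leave the phone, no central party needs a list of who met whom; the matching happens on each handset.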
From San Francisco to Singapore, people are now saying that they don’t want to use them. They cite technical problems and skepticism that the privacy protections are real. A system developed by Google and Apple hasn’t seen widespread adoption because of limitations it imposes on governments. Instead, many countries have come up with their own solutions, which usually end up being incompatible with apps used elsewhere, making them useless for travelers.
Meanwhile, in India, Southeast Asia and much of Africa, smartphone adoption is too low for the apps to reach the threshold at which contact tracing becomes effective: roughly 60 percent of the population needs to be on such a system for it to work.
“Given the haphazard way many of these apps have been developed, there are obvious concerns about their efficacy and privacy implications,” says Samuel Woodhams, a digital rights researcher at internet research firm Top10VPN.
These contact-tracing apps are supposed to provide a technological solution for public-health officials, who would normally have to make phone calls and site visits to track down people who may have come into contact with coronavirus carriers. Given how fast the pandemic spreads, such methods are too slow.
Developers saw an opportunity for smartphone apps to assist these officials. However, putting the technology into action has not been easy.
In March, Singapore became the first country to roll out a voluntary contact-tracing app, called TraceTogether. People were initially eager to try it, with almost 20 percent of the island nation’s population rushing to download it. However, users soon complained that the app drained their phones’ batteries and often required restarting. They also said the app did nothing to stop a second wave of infections among Singapore’s migrant laborers, many of whom don’t own a smartphone.
Now, many prominent members of Singapore’s tech community are calling on officials to integrate TraceTogether with a new app called SafeEntry, which records a person’s name and phone number. The system works like a digital check-in and is mandatory for people looking to enter a business, such as a supermarket or a mall.
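A digital check-in of this kind can be modeled as a simple venue log. The class and fields below are assumptions made for illustration, not SafeEntry’s actual schema or storage design:

```python
from datetime import datetime, timezone

class CheckInLog:
    """Hypothetical SafeEntry-style check-in register: each visit to a
    venue is recorded with the visitor's name, phone number and time,
    so tracers can later ask who was at a venue during a given window."""

    def __init__(self):
        self.entries = []  # (venue, name, phone, timestamp) tuples

    def check_in(self, venue, name, phone, when=None):
        when = when or datetime.now(timezone.utc)
        self.entries.append((venue, name, phone, when))

    def visitors(self, venue, start, end):
        # Everyone recorded at `venue` between `start` and `end`.
        return [(n, p) for v, n, p, t in self.entries
                if v == venue and start <= t <= end]

log = CheckInLog()
t0 = datetime(2020, 6, 1, 12, 0, tzinfo=timezone.utc)
log.check_in("Supermarket", "Alice Tan", "+65-1234-5678", t0)
log.check_in("Mall", "Bob Lim", "+65-8765-4321", t0)

print(log.visitors("Supermarket", t0, t0))  # [('Alice Tan', '+65-1234-5678')]
```

Note that unlike the Bluetooth approach, this design is inherently centralized: real names and phone numbers sit in one log, which is exactly why it raises the data-handling questions discussed here.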
However, this still raises questions about how a user’s data is used and stored.
When Google and Apple announced their system, called Exposure Notification, they said that all of the information would be kept on an individual’s phone and not on a central database. However, governments have complained that not having access to this data hurts their efforts to fight the pandemic. Singapore’s system, for one, collects user data and stores it in a central database — although Singapore’s government has assured the public that this data will only be used for contact-tracing.
The U.K., too, has argued that its centralized system would help track outbreak patterns, enable follow-up testing and plan reopenings while keeping the data anonymous. However, security flaws recently discovered in the system may threaten its deployment.
Researchers identified several problems with the U.K. National Health Service’s (NHS) app. These include weaknesses in the registration process that would allow hackers to steal encryption keys, as well as the fact that user data stored on handsets was unencrypted.
The researchers warned that these flaws could also be exploited to monitor a user’s activities beyond just contact-tracing. For example, the unencrypted data makes it theoretically possible for law enforcement agencies to determine where two or more people meet.
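One common mitigation for data at rest, sketched below purely as an illustration (this is not the NHS app’s actual design or fix), is to store only hashes of observed identifiers on the handset. Exposure matching still works, because published identifiers can be hashed the same way before comparison, but someone who seizes the phone sees only opaque digests rather than raw contact records:

```python
import hashlib

def digest(rid: bytes) -> str:
    # One-way hash of an observed identifier.
    return hashlib.sha256(rid).hexdigest()

# What rests on the handset: digests, not the identifiers themselves.
stored = {digest(b"id-from-nearby-phone")}

def exposed(published_ids, stored_digests) -> bool:
    # Hash each published ID and look for it in the local store.
    return any(digest(rid) in stored_digests for rid in published_ids)

print(exposed([b"id-from-nearby-phone"], stored))  # True
print(exposed([b"some-other-id"], stored))         # False
```

Hashing alone does not solve every problem the researchers raised (the records can still reveal that two phones met if both sides’ data are seized), but it illustrates how much of the risk comes simply from leaving contact data in the clear.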
“In terms of the registration issues, it’s fairly low risk because it would require an attack against a well-protected server, which we don’t think is particularly likely,” Dr. Chris Culnane, the second author of the report, told the BBC.
“But the risk about the unencrypted data is higher, because if someone was to get access to your phone, then they might be able to learn some additional information because of what is stored on that,” he added.
The U.K.’s National Cyber Security Centre (NCSC) has since stated that it was aware of the issues and is working to address them.
“It was always hoped that measures such as releasing the code and explaining decisions behind the app would generate meaningful discussion with the security and privacy community,” an NCSC spokesman told the BBC.
“We look forward to continuing to work with security and cryptography researchers to make the app the best it can be.”