In the worldwide effort to respond to the COVID-19 pandemic, governments, tech companies and telcos have been quick to create solutions that help with tracking the spread of the disease. In doing so, they have assured us in no uncertain terms that privacy will not be compromised. But contact tracing by its very nature requires us to give up at least some privacy in order for it to work.
Apple and Google unveiled a potentially game-changing contact tracing effort on April 10, announcing that they were joining forces to create a Bluetooth-based contact tracing platform “to help governments and health agencies reduce the spread of the virus, with user privacy and security central to the design.”
The companies intend to develop application programming interfaces (APIs) and operating system-level technology to enable tracing of contacts after someone has been diagnosed with COVID-19. The effort will happen in two stages. In May, the companies will release APIs that enable interoperability between Android and iOS devices using apps from public health authorities, which will be downloadable from the Google and Apple app stores. Later, the companies will enable the contact tracing platform by building the functionality directly into their operating systems.
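To make the privacy claims concrete, it helps to see how a Bluetooth-based scheme of this kind can work without sharing location or identity. The sketch below is a deliberately simplified illustration in the spirit of the Apple-Google design, not their actual specification: the real protocol uses HKDF/AES-based key schedules and rolling 10-minute identifiers, and the function names and parameters here are assumptions for demonstration only.

```python
import hashlib
import os

# Simplified sketch of privacy-preserving exposure notification.
# Hypothetical names/parameters; NOT the actual Apple-Google spec.

def daily_key() -> bytes:
    """Each phone generates a random key per day; it never leaves
    the device unless the user opts in after a positive diagnosis."""
    return os.urandom(16)

def rolling_id(key: bytes, interval: int) -> bytes:
    """Derive a short-lived identifier to broadcast over Bluetooth.
    Without the daily key, observers cannot link one interval's
    identifier to the next, so passive tracking is frustrated."""
    return hashlib.sha256(key + interval.to_bytes(4, "big")).digest()[:16]

def exposure_check(reported_keys, observed_ids, intervals=144):
    """Runs locally on each phone: recompute identifiers from keys
    that diagnosed users chose to publish, and look for matches
    among identifiers this phone actually heard nearby."""
    observed = set(observed_ids)
    for key in reported_keys:
        for i in range(intervals):
            if rolling_id(key, i) in observed:
                return True  # possible exposure; alert the user
    return False

# Alice's phone broadcasts rotating IDs; Bob's phone, nearby for a
# few intervals, records what it hears without knowing who sent it.
alice_key = daily_key()
heard_by_bob = [rolling_id(alice_key, i) for i in (3, 4, 5)]
```

The design choice worth noting is that matching happens on the user's own device: health authorities distribute only the keys of users who voluntarily reported a diagnosis, which is exactly where the residual privacy trade-off discussed later in this piece comes in.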
There are other contact-tracing efforts underway as well, but the sheer ubiquity of a combined Apple and Google platform (there are more than 3 billion Android and iOS devices worldwide) makes it likely that it will be the most widely used and thus the most effective.
It will be private!
Apple and Google know what kind of scrutiny their plan will face, and indeed it should. In the five-paragraph blog announcing the alliance, the word ‘privacy’ occurs five times. ‘Security’, ‘transparency’, ‘consent’ and ‘opt-in’ are also mentioned. The emphasis on words in the following quote from the announcement is mine:
“[The contact tracing platform] is a more robust solution than an API and would allow more individuals to participate, if they choose to opt in, as well as enable interaction with a broader ecosystem of apps and government health authorities. Privacy, transparency, and consent are of utmost importance in this effort, and we look forward to building this functionality in consultation with interested stakeholders. We will openly publish information about our work for others to analyze.”
Telcos have been equally adamant that privacy is paramount in their efforts to help governments track the disease. In a blog about Vodafone’s efforts, for example, Joakim Reiter, External Affairs Director, Vodafone Group, points out the potential problems with tracking.
“European privacy and data protection values are being tested by the availability of individual tracking technology such as private mobile apps that automatically alert users if they have been near someone who has tested positive for this awful virus,” he writes. “Such tools are fraught with very complicated and sensitive issues. Vodafone will never voluntarily offer our customer data for any initiatives that remove the requirement to give consent.”
Reiter notes that it is likely some governments will want to use location-tracking technology, in which case he says four important conditions should be met:
- Mobile apps must be independent of operators and other private companies
- They must be developed and controlled by national health authorities
- They still must require consent
- State institutions must justify why location data or tracking is necessary and must comply with existing laws and regulations
“Even in the hardest of times, we must be clear about what rules – such as the fundamental right to privacy – should not be deviated from,” Reiter cautions. “The European values we uphold now will continue to define us, and our way of life, in the future.”
Is it really private?
I agree with Reiter, and I think his four conditions are an excellent litmus test for any state-sponsored contact-tracing effort. But contact tracing requires willing participants to give up at least some privacy, or it will not work. The Verge’s Russell Brandom explains this well in his article about the Apple-Google platform: “It’s hard to absolutely guarantee someone’s anonymity if they share that they’ve tested positive through this system. But in the system’s defense, this is a difficult guarantee to make under any circumstances. Under social distancing, we’re all limiting our personal contacts, so if you learn you were exposed on a particular day, the list of potential vectors will already be fairly short. Add in the quarantine and sometimes hospitalization that come with a Covid-19 diagnosis, and it’s very difficult to keep medical privacy completely intact while still warning people who may have been exposed. In some ways, that tradeoff is inherent to contact tracing. Tech systems can only mitigate it.
“Plus, the best method of contact tracing we have right now involves humans interviewing you and asking who you’ve been in contact with,” he adds. “It’s basically impossible to build a completely anonymous contact tracing system.”
For the greater good
I had to laugh when I read about research firm Oliver Wyman’s global survey of 3,600 people about data-sharing during this challenging time. The firm found that while large majorities in every country surveyed would want to know if someone in their neighborhood tested positive for COVID-19, only about half were willing to share their own positive result with public health authorities, and fewer than a third in nearly all countries said they would be willing to share the information with local authorities.
You cannot have it both ways. If you want to know whether your neighbor has tested positive for COVID-19, you must be willing to divulge your own positive diagnosis.
As compassionate and caring human beings, we very likely are going to have to set aside some of our privacy concerns for the greater good. We can attempt to strike a balance by insisting that participation in contact tracing be voluntary, but on the other side of the scale, we must, in fact, be willing to volunteer. It may be the only way to stop this pandemic.