As countries struggle to manage the COVID-19 crisis, governments around the world are looking to data-driven technologies to minimize cases or to ease the transition out of lockdown. The technologies claimed to protect, prevent, and track include symptom-tracking apps that help us understand the disease, digital contact-tracing apps that alert us when we may have been exposed to the virus, quarantine-enforcement apps that monitor people’s compliance (such as those in Hong Kong, Poland, and Kazakhstan, and those proposed in Russia and Egypt), and immunity certificates that identify individuals who have had COVID-19.

In more developed nations such as the United Kingdom and the United States, debates about data privacy are rife. For example, the UK National Health Service (NHS) is negotiating with Google and Apple over the efficacy of its proposed contact-tracing app, in light of the two tech giants’ stringent privacy requirements.

In the face of serious threats, many of us might say we have a reasonable expectation that our government will use technology to track and pre-empt risks. Just as we might expect protection from terrorist attacks, we may also accept such measures being employed during a public health crisis like the current pandemic.

But the vital question is: how can we ensure that these digital tools are deployed effectively and do not set a dangerous precedent of digital surveillance?

Photo courtesy: Brian McGowan from Unsplash.

A recent report by the Brookings Institution argues that the social benefits of an anonymised (mandatory) contact-tracing system are worth the temporary privacy costs. Privacy International recognises the potential of such apps but cautions that trust must be built through vigilant monitoring, scrutiny, and testing. We have already seen how these apps do and do not work. In South Korea, for example, while the health alert app has been lauded for successfully monitoring cases, messages tracing people’s movements have exposed private lives, leading to speculation about extramarital affairs and private medical appointments.
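To make the trade-off concrete, here is a minimal sketch, in Python, of how a decentralised, anonymised contact-tracing scheme can work: phones broadcast short-lived random identifiers rather than names or locations, and exposure matching happens on the device. The Device class, the 16-byte keys, and the interval scheme below are illustrative assumptions, loosely modelled on rolling-identifier designs such as the Apple/Google exposure notification framework, not the implementation of any specific app discussed here.

```python
import os
import hashlib

# Illustrative sketch of a decentralised, anonymised contact-tracing scheme.
# Key sizes, the interval scheme, and method names are assumptions made for
# clarity, not any real app's implementation.

class Device:
    def __init__(self):
        # A random daily key that stays on the phone unless its owner
        # tests positive and consents to publish it.
        self.daily_key = os.urandom(16)
        self.observed = []  # identifiers heard from nearby phones

    def rolling_id(self, interval: int) -> bytes:
        # Derive a short-lived identifier from the daily key; it changes
        # every interval, so observers cannot link broadcasts over time.
        return hashlib.sha256(self.daily_key + interval.to_bytes(4, "big")).digest()[:16]

    def hear(self, identifier: bytes) -> None:
        # Record identifiers broadcast by nearby devices; no names,
        # phone numbers, or locations are attached.
        self.observed.append(identifier)

    def check_exposure(self, published_daily_keys, intervals) -> bool:
        # When a positive case publishes their daily key, every phone can
        # recompute that key's identifiers locally and look for a match.
        for key in published_daily_keys:
            for i in intervals:
                rid = hashlib.sha256(key + i.to_bytes(4, "big")).digest()[:16]
                if rid in self.observed:
                    return True
        return False

# Usage: two phones pass each other during interval 42.
alice, bob = Device(), Device()
bob.hear(alice.rolling_id(42))
# Alice tests positive and consents to publish her daily key.
print(bob.check_exposure([alice.daily_key], intervals=range(144)))  # True
```

Even in this simplified form, the design choice at stake in the debate above is visible: the server never learns who met whom, but the scheme still depends on people trusting the authority that publishes the keys.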

Prior to the COVID-19 outbreak, the Carnegie Endowment for International Peace published a report on the expansion of surveillance technologies. It found that “the most important factor determining whether governments will deploy this technology for repressive purposes is the quality of their governance.”

In many of the countries where we work, political, economic, and security factors may not allow people to engage in robust debates about privacy and security. Therefore, there is a greater risk that the roll-out of contact tracing or disease-tracking technologies will set a dangerous precedent that in the long term will infringe on people’s rights. As the world continues to look for the best approaches to reopen economies while keeping populations safe, here are some factors we think are important to consider based on lessons we’ve learned from implementing digital development projects for more than a decade.

1. Trust

Trust in government is not a luxury afforded to populations in many countries, owing to rife corruption, institutionalized discrimination, lack of access to reliable resources, and other factors. Even in countries where the government is widely trusted, emergency measures may undermine that trust. In India, people returning from abroad to the state of Karnataka had their personal medical data published by the government, which many felt was a violation of their privacy. It is well documented that flagging someone as having been infected with COVID-19 could have longer-term implications. More generally, making information as personal as medical data public can affect a person’s social and economic life. In the context of COVID-19, someone who does not have ‘immunity’ may find their freedoms restricted. For marginalised communities or those living under repressive regimes, the government or other groups having their personal, location, or movement data could put them and their families at increased risk of harm.

2. Robust Data Protection Legislation

Trust also depends on whether people have avenues for redress and the legal backing to pursue it. Analysis by the United Nations Conference on Trade and Development found that 19 percent of countries have no data protection and privacy legislation, and only 43 percent of African countries have any data privacy laws. According to Privacy International, even in those African countries that do have data privacy laws, critics and advocates have raised concerns about the lack of sufficient protections and safeguards. Vulnerable communities, or those living in repressive states, may in particular feel that they cannot seek redress for violations of their privacy rights. In India, privacy advocates are increasingly concerned about the country’s lack of legislation and the potential for the government’s contact-tracing app to be used as a surveillance tool. In China, there is no law regulating the use of surveillance cameras, which in some cities are being installed without warning outside the homes of people under quarantine. Citizens therefore cannot seek redress for this type of increased monitoring.

3. Digital Literacy and Cyber Hygiene

In many of the countries where we work, digital literacy is relatively low and smartphones, where used, are relatively new to the market. Users who may be encountering digital platforms for the first time are particularly vulnerable to data privacy threats, such as accepting user agreements that compromise their data or downloading fake apps that are covers for ransomware (like COVIDLock). Already, the digital marketplace is full of fraudulent COVID-19 websites and applications. This crisis demands that we increase our investment in digital literacy. That means not only helping people learn how to use web and mobile apps, but also teaching basic practices that protect accounts and information: creating strong passwords, changing those passwords regularly, keeping software up to date, and avoiding pirated software.

4. Culture

Not only are there varying policy and regulatory environments around data protection, but there are also varying cultural attitudes concerning data privacy and the government’s role in such policies. There are already numerous analyses of different cultural attitudes towards the COVID-19 pandemic and response; for example, the extent to which some nations are culturally more individualistic or more deferential to authority, and how this affects adherence to quarantine measures.

The Chinese contact-tracing app, Health Code, is not much of a departure from the existing social credit system, which tracks data about citizens’ lives, such as whether bills are paid on time, and is used to grant access to services such as apartment rentals. This is not to say that the population is more willing to comply, but that oversight of everyday life is already commonplace. Attitudes towards privacy are an issue of both digital skills and culture. For example, Web Foundation research in Indonesia found that citizens generally do not see personal data and privacy as part of their rights as citizens.

During a pandemic, we don’t have time to resolve the issues around data protection policies (or the lack thereof) around the world. But this does not mean we should wait. As part of the digital development community, we understand the value of contact tracing and the benefits of using technology to do it. Nevertheless, we also know that these tools do not operate in a vacuum. Their successful implementation hinges on the strength of, and trust in, the institutions promoting them. Therefore, even in times of crisis, it is important to simultaneously support improved governance and design digital tools according to the principle of do no harm.