The success of a tracing app for Covid-19 hinges on compliance with data protection law. The app’s effectiveness will be determined by user adoption rates which will depend on the fairness, lawfulness and transparency of the underlying data processing. The app’s efficacy will be determined by the accuracy of the information that goes in and out of the app. This article explains how to make such an app successful from a privacy perspective.
When is the lockdown going to end?
This is the million-pound question that everyone asks and no one seems to be able to answer. Whether you are enjoying working remotely next to a fireplace in the countryside or you are furloughed in a 495 sq ft flat in East London, chances are you now really want to go back to work. While we cannot predict when that is going to happen, we can try to tell how that is going to happen.
Assuming no magic pill is discovered soon, according to the WHO, the formula is that, after we take control of transmission, we need to detect, test, isolate and treat every COVID-19 case and trace every contact. Tracing can be done in a myriad of different ways. In fact, tracing individuals has never been so easy, because over two thirds of us (the necessary user adoption rate) have a super-tracer in our pockets, namely a smartphone that is just waiting to be app-enabled (or is it?).
Any form of tracing, however, by its very nature, includes some form of surveillance, which (in non-totalitarian societies) is considered a privacy intrusion affecting human dignity. Laws such as the General Data Protection Regulation (‘GDPR’) and the Privacy and Electronic Communications Regulations (‘PECR’) are there to safeguard against unreasonable interference with privacy, but it is a myth that such laws prohibit any interference at all. Admittedly, making a legally compliant contact tracing app for COVID-19 with privacy by design is challenging because, in order for it to work, the app will inevitably gather and process some personal (including sensitive) data in an intrusive manner.
Here are some thoughts on how to make such an app viable from a privacy perspective:
1) State your purpose (fair processing notice and purpose limitation)
Tracing apps can have multiple purposes. Is the purpose to enable health authorities to identify the persons that have been in contact with another person infected by COVID-19 and ask them to self-quarantine? Or are the individuals supposed to self-quarantine and themselves inform authorities or other users via the app? Or is it the app itself that is supposed to notify the users that they need to self-quarantine? Would the service go beyond quarantine and also include the provision of advice on next steps, including what to do after developing symptoms? Who is going to be responsible for such advice? How is this advice going to be provided?
A transparent statement of purpose (and no deviation from it) is the first building block of GDPR compliance. An effective, fair, short and clear processing notice about the purpose for processing (and its limits) will be the decisive factor in user adoption rates, ie whether or not users will voluntarily download the app.
2) Ensure data is genuine (accuracy)
Compliance with the GDPR’s accuracy principle will determine the app’s efficacy in absolute terms. Its purpose will be defeated and unnecessary risk will arise if data that goes into or out of the app is inaccurate. Just imagine the consequences of sending out a hundred notifications or requests for self-quarantine on the basis of an incorrectly recorded contact.
A completely anonymous service where data accuracy is not (independently) verified is prone to error and possible abuse. Under the guise of anonymity, users may submit inaccurate information in bad faith. A prankster can tie a phone to a dog collar; a child may not understand the consequences of submitting a false health status; an employee with a grudge against an employer may potentially close an entire factory down by uploading wrong information.
3) Choose who is going to take overall responsibility for the data (the controller)
In the world of privacy regulation, there is no power without accountability. If the app is operated by a private business, that business will be held liable and may have to pay for any unnecessary harm caused. On the other hand, the liability risk may be transferred to the public authorities if the app is developed privately and then sold to or operated on behalf of a government, in which case the government will be the accountable party. But what is going to happen to user adoption if the central government acts as the sole controller?
If the app is perceived as a potential restriction of civil liberties or a form of government surveillance, user adoption rates may drop. Admirers of Edward Snowden and Yuval Harari certainly wouldn’t participate. History has shown that exceptions to civil liberties protections made in a time of crisis can often persist much longer than the crisis itself.
A way out of this conundrum may be to assign ownership to a group of reputable and accountable organisations where none of them is in full control and the consensus of all of them is required to make any use of the data. Could distributed ledger technology underpin such an arrangement?
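One way to make the all-must-agree arrangement described above concrete is n-of-n secret sharing: a decryption key for the data is split so that each organisation holds one share and all shares are needed to reconstruct it. The sketch below is a minimal Python illustration using simple XOR splitting; the five-partner count is an assumption for the example, and a real deployment would use an audited scheme (e.g. Shamir’s secret sharing) rather than this toy:

```python
import secrets
from functools import reduce

def xor_bytes(a: bytes, b: bytes) -> bytes:
    """XOR two equal-length byte strings."""
    return bytes(x ^ y for x, y in zip(a, b))

def split_key(key: bytes, n: int) -> list[bytes]:
    """Split a secret key into n shares; ALL n are needed to rebuild it."""
    shares = [secrets.token_bytes(len(key)) for _ in range(n - 1)]
    # The final share is chosen so that XOR-ing all shares yields the key.
    shares.append(reduce(xor_bytes, shares, key))
    return shares

def combine(shares: list[bytes]) -> bytes:
    """Reconstruct the key; works only with every share present."""
    return reduce(xor_bytes, shares)

key = secrets.token_bytes(16)
shares = split_key(key, 5)            # e.g. five partner organisations
assert combine(shares) == key         # all five together recover the key
```

Any subset short of the full set of shares is statistically independent of the key, so no single partner (or coalition short of all of them) can unlock the data alone.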
4) Justify what the app is doing (lawful basis)
In a democratic society, adoption of such apps has to be voluntary, hence the importance of data protection without which there may be no adoption at all. The choice has to be made by individuals, and the more individuals who collectively trust the app, the more authoritative it will be.
However, this does not mean that the app has to be designed to the detriment of user experience or prompt users to click on a dozen ‘I agree’ buttons like some annoying cookie banners do. The GDPR provides ample room for collecting and sharing even sensitive personal data where this is proportionate and there is a clear public interest. That being said, reliance on user consent for some specific processing activities may be required by PECR. Lawful set-up is potentially mission-critical to the app’s efficacy.
5) Choose what data to collect and where to store it (data minimisation)
While the app will not work if it is completely anonymous, this does not mean data collection should go any further than the bare minimum necessary. Cellular-tower-generated location data that reveals individual movement patterns and is retained on file long-term is certainly not part of that minimum. The latest thinking on the subject suggests that Bluetooth beacons transmitting (temporary?) identifiers stored on user devices may be the best way forward.
Decentralised storage of tracing events and identifiers is best for privacy. However, it does not allow public authorities to access any anonymised and aggregated information on social distancing or on the effectiveness of the app. This major drawback can be addressed by centralised storage of pseudonymous identifiers that are not linked to any other identifiers, so that user privacy is preserved as much as possible. Uploading identifiers to a central server may be made conditional on a confirmed infection, in line with the GDPR data minimisation principle.
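The Bluetooth-beacon approach can be illustrated with a short sketch: a key generated on the device derives a fresh broadcast identifier for each time interval, so bystanders cannot link one broadcast to the next without the key. This is a simplified, hypothetical illustration only; the 15-minute interval, key length and HMAC derivation are assumptions for the example, not the actual DP-3T or Apple/Google Exposure Notification design:

```python
import hashlib
import hmac
import secrets

ROTATION_MINUTES = 15  # hypothetical rotation interval

def daily_key() -> bytes:
    """Generate a random per-day key that never leaves the user's device."""
    return secrets.token_bytes(16)

def ephemeral_id(day_key: bytes, interval: int) -> bytes:
    """Derive a short-lived broadcast identifier for one time interval.

    The identifier is an HMAC of the interval number under the day key,
    so two broadcasts cannot be linked without knowing the key.
    """
    return hmac.new(day_key, interval.to_bytes(4, "big"), hashlib.sha256).digest()[:16]

# The device broadcasts a new identifier every ROTATION_MINUTES minutes.
key = daily_key()
ids = [ephemeral_id(key, i) for i in range(3)]
assert len(set(ids)) == 3  # the identifier changes each interval
```

Only after a confirmed infection (and with the user’s agreement) would the day keys be uploaded, letting other devices recompute the identifiers locally and check them against their own contact logs.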
6) Use bank-grade security (integrity and confidentiality)
It goes without saying that information security will be another decisive factor for user adoption rates. All information entering and leaving the app has to be encrypted in line with the highest standards. All information that has to be processed or shared in order to achieve the app’s purposes has to be pseudonymised. Any other data ought to be anonymised.
7) Have a plan for the personal data lifecycle and be upfront with it (storage limitation)
There is a valid concern that personal data collected during the pandemic will be kept for longer than originally proposed and repurposed once the lockdown is lifted. For that reason it is of utmost importance to have a clear plan for the permanent erasure of all such data once it no longer serves the original need. It is important to remember that genuinely anonymous information (information that can never be traced back to someone) is not personal data and is not covered by the GDPR.
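The line between personal and genuinely anonymous data can be made concrete: once individual records are erased, only coarse aggregate counts need survive, with small groups suppressed so that no count can single anyone out. A minimal sketch, in which the k-threshold of 5 and the postcode-district grouping are illustrative assumptions rather than any regulator's rule:

```python
from collections import Counter

K_THRESHOLD = 5  # hypothetical minimum group size before a count is released

def aggregate(events: list[str]) -> dict[str, int]:
    """Reduce per-user events to area-level counts, suppressing small groups.

    After the underlying records are deleted, only counts too coarse to be
    traced back to any one person remain.
    """
    counts = Counter(events)
    return {area: n for area, n in counts.items() if n >= K_THRESHOLD}

events = ["E1"] * 7 + ["SW3"] * 2   # postcode districts, illustrative only
print(aggregate(events))            # {'E1': 7} — the small SW3 group is dropped
```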
8) Allow others to verify the code (accountability)
The best way to demonstrate compliance with data protection law is to play with one’s cards on the table. Publishing the full code may be perceived as an obstacle to making a tracing app for profit. However, the app does not need to be distributed under an open source licence. The code can simply be made available for inspection only to potential buyers under strict confidentiality.
9) Engage with others
We are all in this together. The benefits of developing and implementing an effective solution for Britain may be reversed if other nations’ efforts don’t work out. If tracing efforts are not coordinated internationally, the virus will come back as soon as ports and airports re-open.
10) Embrace criticism and further regulation
If a contact tracing app for Covid-19 is not privacy protective and does not comply with data protection law, it simply won’t work. In other words, compliance is not a nice-to-have but a pre-condition of success. But even if this condition is met, criticism (and hence lower user adoption rates) may remain. Criticism of such apps has already emerged from the Electronic Frontier Foundation and the American Civil Liberties Union. A not (yet) binding framework for tracing apps has just been published by the European Commission. App developers must pre-emptively engage with such (well-intentioned) feedback and ensure their products reflect the concerns raised by it.