Just learned of The Coronavirus (Safeguards) Bill 2020: Proposed protections for digital interventions and in relation to immunity certificates. This is in addition to the UK’s Coronavirus Bill 2020, which is (as I understand it) running the show there right now.
This new bill’s lead author is Prof Lilian Edwards, Newcastle University. Other contributors: Dr Michael Veale, University College London; Dr Orla Lynskey, London School of Economics; Carly Kind, Ada Lovelace Institute; and Rachel Coldicutt, Careful Industries.
Here’s the abstract:
This short Bill attempts to provide safeguards in relation to the symptom tracking and contact tracing apps that are currently being rolled out in the UK; and anticipates minimum safeguards that will be needed if we move on to a roll out of “immunity certificates” in the near future.
Although no one wants to delay or deter the massive effort to fight coronavirus we are all involved in, there are three clear reasons to put a law like this in place sooner rather than later:
(a) Uptake of apps, crucial to their success, will be improved if people feel confident their data will not be misused, repurposed or shared to eg the private sector (think insurers, marketers or employers) without their knowledge or consent, and that data held will be accurate.
(b) Connectedly, data quality will be much higher if people use these apps with confidence and do not provide false information to them, or withhold information, for fear of misuse or discrimination eg impact on immigration status.
(c) The portion of the population which is already digitally excluded needs reassurance that apps will not further entrench their exclusion.
While data protection law provides useful safeguards here, it is not sufficient. Data protection law allows gathering and sharing of data on the basis not just of consent but a number of grounds including the very vague “legitimate interests”. Even health data, though it is deemed highly sensitive, can be gathered and shared on the basis of public health and “substantial public interest”. This is clearly met in the current emergency, but we need safeguards that ensure that sharing and especially repurposing of data is necessary, in pursuit of public legitimate interests, transparent and reviewable.
Similarly, while privacy-preserving technical architectures which have been proposed are also useful, they are not a practically and holistically sufficient or rhetorically powerful enough solution to reassure and empower the public. We need laws as well.
Download it here.
More context, from some tabs I have open:
- COVID-19 and Digital Rights, and EFF and COVID-19: Protecting Openness, Security, and Civil Liberties, both by the EFF.
- We Mapped How the Coronavirus Is Driving New Surveillance Programs Around the World—At least 28 countries are ramping up surveillance to combat the coronavirus, by Dave Gershgorn
- Coronavirus disease (COVID-19) technical guidance: Surveillance and case definitions, by the World Health Organization
- Coronavirus and the Future of Surveillance: Democracies Must Offer an Alternative to Authoritarian Solutions, by Nicholas Wright, in Foreign Affairs.
- Privacy Experts Say Responsible Coronavirus Surveillance Is Possible, by Sam Biddle, in The Intercept.
- About Face, in which last October I opposed facial recognition by entities other than people, their own devices, and their pets.
All of this is, as David Weinberger puts it in the title of his second-to-latest book, Too Big to Know. So, in faith that the book’s subtitle, Rethinking Knowledge Now That the Facts Aren’t the Facts, Experts Are Everywhere, and the Smartest Person in the Room Is the Room, is correct, I’m sharing this with the room.
I welcome your thoughts.