Commentary – POLITICO

Put privacy first in tech fight against coronavirus


Julian King is former European commissioner for security.

As Europe prepares for the next phase of the coronavirus pandemic, attention has naturally turned to the use of technology. Big data, artificial intelligence and mobile applications may allow us to lift the lockdowns and restart the economy while keeping the virus at bay.

But like all cutting-edge technology, this kind of high-tech epidemiological surveillance cuts in more than one direction. In addition to exploring how the technology can help us manage the outbreak, we need to be talking about how to ensure our values and rights are protected.

Across Asia and now Europe, countries are experimenting with using citizens’ data to track, monitor and prevent cases. Many hope that, along with robust testing, this technology could speed the return to “normal” life.

The problem is that such advanced surveillance technology can be highly invasive. How we handle personal data — especially health and biometric data — raises sensitive questions about the relationship between individuals and the authorities. In short, it determines the kind of society we live in. The author Yuval Noah Harari has warned that we may be about to experience a “watershed in the history of surveillance.”


Given the pace at which new technology is being rolled out, Europe urgently needs to have a wider, informed public debate about what is at stake.

The European Data Protection Supervisor has already said there is a clear difference between aggregated, anonymized data and more granular data about individuals. MEPs have pressed for a review of the different apps springing up across Europe. But this isn’t just a discussion for data protection authorities. Equally, given what’s at stake, we shouldn’t sit around and wait for the big tech companies to tell us what they think the answer should be.

The debate concerns national governments and their cyber specialists, of course, but also the tech sector and its burgeoning startups/SMEs, as well as health authorities, researchers and civil society.

Most people may be comfortable with the use of aggregated data, which can play an important role in mapping the spread and evolution of the virus (though we have to ensure that the collection and use of this data is done in a transparent and accountable way).

The trouble is that aggregated data may not be enough to move us out of lockdown. That will likely also require the use of more granular information on individual cases so that health authorities can trace our contacts. Making sure the epidemic doesn’t spread once an infection is detected will possibly also involve some kind of tracking.

This may be necessary as part of the fight against the pandemic, but that doesn’t mean it shouldn’t come with strong safeguards that ensure data isn’t mishandled and that tracking doesn’t infringe on civil liberties.

Europe’s data protection rules already allow for public health exceptions. What we must do now is define those exceptions and ensure they are proportionate, targeted, transparent, time-limited and, crucially, subject to independent scrutiny.

Not all of the measures proposed in EU countries meet these tests.

Another important measure will be to clearly define what informed consent means in these circumstances — so that we are sure people know what they are signing up to.

The European Commission has already launched an effort to produce guidance for governments on how to responsibly use aggregated and individual data — a so-called “toolbox” of possible measures and accompanying safeguards. We urgently need to speed up this process.

As the Commission reviews various apps rolled out in EU countries, the aim should be to limit the amount of data collected. For example, apps should track proximity data — which determines when you’ve been near someone with the virus — rather than provide a full record of your movements.

Data will need to be held securely, with strictly limited and controlled access, and for as short a time as possible. It’s difficult to see, frankly, why apps should hold on to data for years, as some have suggested.

Any use of biometric identification data raises particular concerns. We must set the bar at the highest level when it comes to justifying its use and ensuring the correct privacy standards are met. All this should also be subject to credible independent review and scrutiny.

It’s important that we get these things right — not least to maintain the public support and engagement that will be needed to make such measures effective.

Making a success of this kind of wider public debate and engagement on sensitive technical questions can be difficult. But there are precedents.

In the U.K. in the 1980s, the Warnock Committee on Human Fertilisation and Embryology brought together a wide range of experts to look at policies and safeguards, including the social, ethical and legal implications of the latest scientific and technological developments. Their work helped promote debate and define best practices in Europe and beyond, as well as build public understanding and support in a controversial area.


We need to be having a similar level of public debate about technology and health in the age of coronavirus, and about where biometric monitoring and surveillance technology might take us as a society.

We have to make sure that getting back to “normal” does not take us down a path from which it will be difficult to return. Technology will play an important role in our recovery, but it should not come at the cost of our rights.

