As Russian cities go into lockdown to try to contain coronavirus, Moscow is using the latest technology to keep track of residents.
City officials are using a giant network of tens of thousands of cameras – equipped with facial recognition software – which they plan to couple with digital passes on people’s mobile phones. It’s prompted concern about whether such widespread surveillance will ever be rolled back.
Sarah Rainsford explains how the system works in her own Moscow neighborhood.
Facial recognition surveillance systems are ominous. People see how these tools threaten privacy and civil liberties and consider ways they might resist being tracked and profiled everywhere they go. One option that is regularly tossed around is the idea of frustrating identification systems with clothing and accessories that obscure and distort our appearance.
Until now, it’s mostly been art installations and academic projects experimenting with face-jamming. But with the spread of COVID-19 fueling both expanded surveillance and a surge in the number of people wearing face masks, scarves and bandanas, there’s a flicker of hope that masks will make face recognition harder and harder to implement.
Wouldn’t it be nice to be able to opt-out so easily?
Unfortunately, more people covering their faces won’t meaningfully thwart face recognition technology, or make it any less urgent to grapple with the threats that go along with it.
Wired: Fighting Covid-19 Shouldn’t Mean Abandoning Human Rights
Democracies everywhere have tried to build legal protections for privacy and basic freedoms. But surveillance aimed at addressing the pandemic could dismantle them.
Governments around the world are racing to adopt new surveillance tools in response to the Covid-19 pandemic. Many are green-lighting dragnet monitoring systems, seeking real-time location data from mobile providers or deploying facial recognition and other emerging technologies.
These steps may usher in a long-term expansion of the surveillance state. Covid-19 has arrived after two decades of rapid technological change, in which both the public and private sectors exponentially increased their capacity to incorporate surveillance into various aspects of governance and commercial activity. Many democracies have tried, not always with success, to build legal barriers that constrain authorities’ ability to access and exploit the personal information collected by private companies. Coronavirus surveillance could dismantle these structures.
Allie Funk is a research analyst for technology and democracy at Freedom House. She is an expert on human rights in the digital age, focusing on the organization’s Freedom on the Net project.
To avoid such an uncontrolled shift, policymakers must ensure that any new surveillance program complies with human rights principles, like those outlined by Freedom House, which safeguard basic freedoms while allowing the government to do what is necessary to protect public health.
Testing for necessity and proportionality
International human rights standards give states some leeway to adopt surveillance measures in the current crisis, but a program must first be shown to be necessary for significantly limiting the spread of disease. If public health experts can confirm the monitoring’s effectiveness, then the program must next be narrowly tailored, minimizing what data is collected and using the least intrusive options to accomplish legitimate goals.
South Korea has been comparatively effective at containing its coronavirus outbreak, but its Infectious Disease Control and Prevention Act (IDCPA) allows authorities to tap into broad surveillance powers, raising questions about epidemiological necessity and proportionality. For example, officials have pulled information from credit card records, phone location tracking, and security cameras—all without court orders—and combined it with personal interviews for rapid contact tracing and monitoring of actual and potential infections. Importantly, the IDCPA requires that collected data “be destroyed without delay when the relevant tasks have been completed.”
Credit card histories reveal intimate details about people’s lives that go far beyond basic information for contact tracing, including sexual orientation and religious beliefs. Mobile-phone location data is also personal information, and some South Korean officials have publicized patients’ gender, age range, and where they have been and when, to notify other residents about potential exposure. These disclosures have meant that some South Koreans’ personal movements have been laid bare for public consumption, at times fueling online ridicule, scrutiny, and social stigma. Yet the data may not be precise enough to discern whether two people were at least 6 feet apart. This ambiguity is especially problematic if the records are cited to penalize people for not complying with quarantine or social-distancing rules.
Instituting independent oversight
Surveillance programs need robust and independent oversight that can assess what types of data are collected, who manages the collection, and how and by whom that information is used. As the pandemic evolves, an independent legislative review process should routinely keep tabs on programs to ensure they remain necessary and proportionate. An avenue for judicial review should also be available so that affected individuals can appeal disproportionate restrictions and seek redress for any abuses.
Worryingly, this essential oversight is lacking in some surveillance initiatives. Israel’s caretaker government, for example, used emergency regulations to grant police and security officials access to a secretly obtained trove of sensitive smartphone metadata, including geolocation data, without parliamentary approval. The existence of this database and its underlying legal framework was previously undisclosed. After the government’s unilateral move, the High Court intervened to impose a temporary injunction and require some legislative involvement. Security officials have since been allowed to continue the monitoring after a new parliamentary subcommittee was established, but these controls do not appear to be sufficiently robust.
Ensuring openness and transparency
Openness and transparency are crucial not only for keeping citizens safe during a health crisis, but also for helping them understand how and why their privacy is being affected. This builds public trust in the institutions tasked with curtailing the outbreak, while ensuring that surveillance programs and the officials running them remain accountable.
Many mobile applications that claim to track individuals’ movements and quarantine compliance fall short on transparency. They are generally opaque regarding how they collect and process data, and how and with whom they share that information.
In Poland, some residents are using the government’s new Home Quarantine app to prove they’re complying with isolation orders. Users first upload a profile image and are then sent periodic requests to upload a “selfie” for authorities. The app pulls a geolocation stamp from the selfie to confirm the time it was taken, while using facial recognition to match the image to the user’s original picture. It remains unclear how much information the app collects, and whether the data can be retained or made available for other private or public facial recognition initiatives.
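The check-in logic described above can be sketched in a few lines. This is a hypothetical illustration, not the actual implementation of Poland’s app: the names, thresholds, and the `face_match_score` field (standing in for whatever face-recognition backend the real system uses) are all assumptions.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Selfie:
    user_id: str
    taken_at: datetime        # timestamp stamped on the photo
    lat: float                # geolocation pulled from the image
    lon: float
    face_match_score: float   # 0..1, assumed output of a face-recognition backend

def verify_checkin(selfie: Selfie,
                   request_sent_at: datetime,
                   home_lat: float,
                   home_lon: float,
                   deadline: timedelta = timedelta(minutes=20),
                   match_threshold: float = 0.8,
                   max_offset_deg: float = 0.001) -> bool:
    """Accept a check-in only if the selfie was taken promptly after the
    request, near the registered quarantine address, and the face matches
    the user's original profile photo. All thresholds are illustrative."""
    on_time = timedelta(0) <= selfie.taken_at - request_sent_at <= deadline
    near_home = (abs(selfie.lat - home_lat) <= max_offset_deg and
                 abs(selfie.lon - home_lon) <= max_offset_deg)
    face_ok = selfie.face_match_score >= match_threshold
    return on_time and near_home and face_ok
```

The sketch also makes the article’s transparency concern concrete: every field the function consumes (timestamp, precise location, biometric match score) is data the operator must collect and could in principle retain or repurpose.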
Sunsetting and limiting data collection, access, and use
Surveillance programs should have unambiguous sunset clauses so that they cannot continue once the pandemic ends. Information collected during the outbreak should be firewalled from other governmental or commercial uses and then should generally be destroyed after the virus is brought under control.
Emergencies give governments a shortcut to access people’s personal information or roll out emerging surveillance technology that under normal circumstances would either not be allowed or would require significantly stronger judicial or legislative review. Moreover, indiscriminate monitoring and mass collection of sensitive information sidestep due process standards, treating everyone as a suspect of potential wrongdoing.
Authorities could collect sensitive information or deploy facial recognition systems under the guise of countering the outbreak, only to use them later for political purposes, such as repression of minority populations. Phone records can be, and have previously been, weaponized to track down and arrest journalists. Geolocation data could be used to identify and detain undocumented people for deportation. And police could repurpose data about people’s movement to identify civic organizing efforts and disrupt protests. Private entities such as insurance companies or advertising agencies may also seek to exploit such data for their own commercial ends.
In practice, many programs lack or are ambiguous about sunset and firewall provisions. Certain mobile providers in Belgium, Germany, and Italy have supplied aggregated and “anonymous” location data to authorities. South African mobile carriers have also agreed to hand over location data. In the United States, mobile advertising companies, not mobile service providers, are reportedly providing government agencies with similar information. It is unclear how and by whom this third-party data could be used during and after the outbreak, whether for law enforcement, immigration, or intelligence purposes.
Stopping the spread
While certain forms of monitoring—such as contact tracing—can be indispensable to containing Covid-19, they should remain in compliance with human rights standards. Aggressive expansion of surveillance programs without adequate checks could normalize privacy intrusions and create systems that may later be used for various forms of political and social repression.
Surveillance tools alone cannot solve a public health crisis. Enhanced technical monitoring does not provide rapid tests to patients, protective equipment to medical workers, or ventilators and staffing to hospitals. As democracies build out their responses to the pandemic, they should ensure that their efforts do not also institute a lasting deterioration in human rights.