Troubling Symptoms: The COVID-19 pandemic could help create a dystopian surveillance regime

Anurag Mehra , 01 Jun 2020
A screen shows a thermographic image of a man walking through a temperature-screening point at an entrance of the Shanghai Museum amid the coronavirus outbreak on 13 March 2020 in Shanghai, China. TANG YANJUN / CHINA NEWS SERVICE / GETTY IMAGES

On behalf of the Indian government, in late April, the public-sector company Broadcast Engineering Consultants India Limited floated a tender for procuring devices to fight the COVID-19 outbreak. The devices included COVID-19 patient-tracking wristbands, fever-scanning tools and hand-held thermal imaging systems. The expectations from these devices, as per a report in the Indian Express, bordered on the magical: “detect, prevent and investigate threats to national security using call data records, internet protocol detail record, tower and mobile phone forensics data”; “geofence an area of interest, such as meeting place, airport, mosque, railway station, bus stand”; “advanced analytics and intelligence software that uses telecom & internet data to identify suspect locations, associations & behaviour”; monitor “everyday behaviour of the person, including where s/he orders food from and the places s/he regularly visits, the multiple routes s/he could take”; “should be able to easily identify close contacts, frequent contacts as well as occasional contacts such as Uber drivers etc, and be able to collect information like where the suspect has spent most of his/her time and who all he or she has met.” To say that the company’s aims with these devices would infringe on the privacy of those being subjected to such surveillance would be an understatement.

The tender came at a time when the government was already facing criticism over the Aarogya Setu app, sponsored and heavily promoted by the central government. The app determines whether users are at risk of a COVID-19 infection by accessing the locations they have visited, checking whether they have had “contact” with someone carrying the infection, and through a self-assessment of symptoms. Initially, the government made use of the app mandatory for all public and private employees, and for those living in containment zones. It also proposed that all new smartphones come with the app pre-installed. Since then, there have been cases of housing societies insisting that residents install the app in order to access the common areas of their housing complexes. The Central Industrial Security Force has proposed making the app mandatory for access to public places such as the Delhi Metro. The states of Karnataka and Haryana have declared that those entering their territories should have the app installed. In Noida, a government order said that criminal penalties will apply if the app is not used during lockdowns. Downloading the app might even be made mandatory for air and train travel. However, in the guidelines for the lockdown from 17 to 31 May, the ministry of home affairs somewhat diluted its stand. Instead of making the app mandatory for all employees, it said that “employers on best effort basis should ensure that Aarogya Setu is installed by all employees having compatible mobile phones.”

Despite this minor relaxation, it will not be surprising to see this app go the Aadhaar way, in that it will effectively become “compulsory” for everyone, giving rise to issues of exclusion, where a service is denied if the app is not installed, and misidentification, where an uninfected person might be shown as high risk. The mandatory use of the app has been challenged in the Kerala High Court because it “takes away the right of a person to decide and control the use of information pertaining to him” and thus violates the 2017 Supreme Court judgment in Justice KS Puttaswamy (Retd) vs Union of India. The eminent jurist Justice BN Srikrishna, who led the committee that drafted India’s data-protection Bill, has called making the use of the Aarogya Setu app mandatory illegal.

While it may seem odd to worry about data extraction, surveillance and privacy in these anxious times, this is precisely the time when the public’s guard is low and it can be easily convinced to part with personal data for the greater good. Under normal circumstances, a citizen would be outraged if the local police or the municipality asked them to make their mobile-phone location available at all times. Now, under attack from a virus this contagious, the demand seems justifiable. Across the world, corporate and state agencies have found this to be the most opportune time to design intrusive systems for combating the public-health emergency. But in doing so, they often wilfully violate the spirit, if not the letter, of fundamental liberties guaranteed in a democracy.

Citizens have a right to know what data is being taken from them by an expanded surveillance system during the pandemic, and what should be claimed back after it ends. They must demand to know who is responsible for handling and securing their data, and what precise uses it will be put to. The expectation that this data should never be shared with any entity, private or public, is a reasonable one. Several apps designed to fight COVID-19 also want access to your contacts, images, videos and other data that is entirely irrelevant to the app’s stated aim. If the government makes these apps mandatory, the “consent” a person is asked for when they first start the app is immaterial: they have no option but to agree to all the terms and conditions. Additionally, the privacy policies of these apps often contain open-ended clauses such as “will be used for appropriate purposes” or “will be shared by relevant agencies.” In India, poor data security also poses the danger of data being leaked. In the long run, letting the state get away with this form of infringement on our privacy could bring to fruition a dystopian surveillance nightmare.

Globally, in the name of the pandemic, all kinds of systems using cameras, drones and mobile phones are being created to track people who are infected as well as suspected asymptomatic virus carriers. The simplest and most militaristic approach has been taken by Israel, which is using its internal intelligence agency, Shin Bet, to track infections through the location data of mobile users. New regulations permit collection of this data without a court order, and parliamentary formalities were bypassed in promulgating these rules. Shin Bet accesses this data directly from mobile-telephone operators, and thus requires no apps and no consent from users. An Israeli high court has criticised the agency’s actions, noting that the programme “severely violates the constitutional right to privacy,” and that the government would need to pass new legislation to authorise Shin Bet to continue with the operation. It also stated: “The choice to make use of the state’s preventative security agency to follow those who do not seek to do [the state] harm, without the subjects of the surveillance giving their permission, raises an extremely serious difficulty and a suitable alternative should be sought that fulfils the principles of privacy protection.” The court was alert enough to order extra protection for journalists, who will not be tracked by Shin Bet after they are tested, so that journalistic sources can be protected.

A few countries have asked their telecom providers to share people’s location data with the government, without specifying under what laws such data is being shared. This is also true of the European Union, though most of the data being shared with governments there is anonymised and aggregated. Many countries, states and cities have launched mobile-phone apps, resembling Aarogya Setu, to do this tracking.

The most extensive tracking has been implemented by South Korea. It uses a combination of security-camera footage, GPS data from cell phones and vehicles, as well as credit-card transactions, to trace a person’s movements. A lot of this data—including gender, age, district of residence and credit-card history, plus minute-to-minute movements with time stamps—is being publicly released to show a detailed path of confirmed virus carriers so that people can check if they ever crossed paths with them. Even though the public data is anonymised by stripping off personal identifiers, it is not too difficult to infer names and addresses. As a result, people are being harassed online, with social media full of comments on people’s activities—time spent in motels, attending religious meets, having affairs—and, of course, being vilified for getting the infection.

China’s tracking has been ruthless: it has deployed the mass-surveillance systems already in place, including CCTVs, drones and biometric readers, in addition to mobile-phone apps. China is aggressively installing more cameras, sometimes even inside people’s homes. A colour-coding scheme has been instituted that generates a “health code”—green, yellow or red—on a person’s cell phone. A green code means no infection, and the person is free to enter the metro, shopping malls or restaurants (it works like an e-pass); a yellow code indicates that the person needs to be quarantined, perhaps at home; and a red one indicates that the person is likely infected and must be isolated immediately. The project is sponsored by the government and administered through Ant Financial, owned by the Alibaba group. “Neither the company nor Chinese officials have explained in detail how the system classifies people,” the New York Times reported in March. “That has caused fear and bewilderment among those who are ordered to isolate themselves and have no idea why.” The health-code status is also shared with the local police. Apparently, the health code is generated by running AI algorithms on “big data” collected by “workers in train stations and outside residential buildings [who] record people’s names, national ID numbers, contact information and details about recent travel.”

China is now equipping its public-transport systems with thermal-imaging cameras which, in addition to a facial scan, also record body temperature. The face scan can immediately identify the person using the country’s vast collection of facial records. The possibility of India replicating a similar system, connected to the Aadhaar database, has been mentioned. Of course, at the moment, such a link-up with the Aadhaar database is not permissible in law.

Besides Aarogya Setu, many apps have been made available in various cities and states across India, and local authorities too are using these technologies in questionable ways. In Karnataka, for instance, those quarantined at home must post selfies to the app Quarantine Watch every hour from 7 am to 10 pm. Most such apps require people to upload symptoms, give feedback relating to testing, and give access to their location. Tamil Nadu’s CoBuddy app even uses facial recognition. All of these have poorly drafted privacy policies, ask for excessive permissions and often work erratically, causing hardship to uninfected people. The evidence of public authorities’ callous attitudes towards the privacy of citizens keeps mounting. In one case, the Karnataka government published the names and addresses of more than 19,000 residents of Bangalore, ostensibly because they were not following quarantine rules properly. Making such personal details public poses a threat to people’s lives in an environment where even the suspicion of being infected invites social stigma or physical violence. The problem of stigma and consequent discrimination is fairly widespread, and personal data acquired by intrusive apps, which can be released into the public sphere, makes the situation much worse.

These examples indicate that the installed apps can perform two primary “surveilling” functions, and Aarogya Setu includes both. First, a person’s location can be recorded at any given time. This is useful for keeping tabs on people, to check whether they are crossing a “geofenced” area such as a quarantine zone. Second, an app can keep a record of all the people a user has been in close proximity to—what is being called contact tracing. Bluetooth technology inside mobile phones, rather than GPS, is being used for this.
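The Bluetooth approach to contact tracing can be sketched in a few lines. The model below is purely illustrative and not the code of Aarogya Setu or any actual app: each phone periodically broadcasts a short-lived random token, logs the tokens it hears from nearby phones, and later checks its log against the tokens uploaded by confirmed patients.

```python
import os

TOKEN_ROTATION_SECONDS = 15 * 60  # broadcast a fresh random token every 15 minutes

def new_token() -> bytes:
    """Generate an unlinkable random token to broadcast over Bluetooth."""
    return os.urandom(16)

class ContactLog:
    """Each phone keeps (a) the tokens it broadcast and (b) the tokens it heard."""

    def __init__(self):
        self.sent = []   # (timestamp, token) pairs this device broadcast
        self.heard = []  # (timestamp, token) pairs received from nearby devices

    def broadcast(self, now: float) -> bytes:
        token = new_token()
        self.sent.append((now, token))
        return token

    def observe(self, now: float, token: bytes) -> None:
        self.heard.append((now, token))

    def check_exposure(self, infected_tokens: set) -> bool:
        """When a patient uploads the tokens they sent, every phone checks
        its own log locally for a match."""
        return any(tok in infected_tokens for _, tok in self.heard)
```

Because the tokens are random and rotate frequently, a phone that merely overhears them learns nothing about the identity of the sender; only a later match against an uploaded list reveals an exposure.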

If the collected data flowed only the way it is intended to, and were processed only for the purpose it was collected for, that would be somewhat reassuring. But that is not how it works in practice. Servers can be hacked, and data breaches can occur for many reasons: insufficient security layers, weak encryption, simple passwords, bad design of the app or the storage architecture, or sheer carelessness. Data can be stolen or exposed while in transit between devices and servers. An app sponsored by the state of Karnataka apparently revealed the addresses of patients. In Madhya Pradesh, a COVID-19 dashboard displayed the personal details of quarantined persons.

This data often makes its way into the marketplace from where anyone can access it: data brokers, marketeers, criminals, sundry organisations, insurance and finance companies, and so on. Location data can be de-anonymised with little effort, resulting in identification, and information about lifestyle, associates and activities.


But above all, what permits such data collection to morph into surveillance systems is the lack of legislative backing with constitutional safeguards, such as a sunset clause mandating that everything related to the app be deleted at the end of a specified period. The European Union has issued guidelines on how to process personal data acquired during the pandemic. The American Civil Liberties Union too has published a set of principles that apps, and their makers, should follow in order to respect individual freedoms, including privacy.

In India, an executive order specifying the protocol for how the response data obtained through the Aarogya Setu app is to be handled spells out which state agencies will have access to the “de-identified” data. But the list is long, and the wording so generic, that it can include almost any agency, which can then do pretty much whatever it wants with the data. This defeats the purpose of issuing the protocol in the first place: the list of entities that can access the data, and its precise uses, should have been specified sharply. The order also permits sharing of data with research institutions “registered in India” on the approval of an expert committee. The data can be retained for up to 180 days by the entities it is shared with. The penalties for improper use are left vague: “under applicable laws for the time being in force.” The order is to be reviewed in six months.
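The 180-day retention limit in the order could, in principle, be enforced mechanically rather than left to good faith. The sketch below is hypothetical, assuming only that each shared record carries a timestamp of when it was received; a genuine sunset rule would require the expired records to be deleted, not merely archived.

```python
from datetime import datetime, timedelta

RETENTION_DAYS = 180  # limit stated in the Aarogya Setu data-sharing protocol

def purge_expired(records: list, now: datetime) -> list:
    """Keep only records younger than the retention limit.

    Each record is assumed to be a dict with a 'shared_at' datetime marking
    when the entity received it; everything older than the cutoff is dropped.
    """
    cutoff = now - timedelta(days=RETENTION_DAYS)
    return [r for r in records if r["shared_at"] > cutoff]
```

The point of codifying the rule is auditability: an agency holding data past the cutoff is then in demonstrable breach, rather than merely in breach of a vaguely worded clause.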

In 2018, a demand was made by the National Crime Records Bureau that the agency be given access to Aadhaar data. The Unique Identification Authority of India turned down the proposal because the Aadhaar Act does not permit such access and this purpose was not consented to by people “giving” their personal information and biometrics to the Aadhaar system. Therefore, legislative oversight does add a layer of accountability to executive schemes, and should have been provided for in our “national app.”

We should be mindful of what we are being asked to do. We should appraise apps for their privacy-intrusion potential and specifically examine questions relating to the limited scope and purpose of data collection; security of data; scope of liability offered by app owners; and clear consent, as well as opt-in and opt-out features.

There is an important debate over two frameworks for the location and encryption of collected data: the Decentralised Privacy-Preserving Proximity Tracing, or DP-3T, versus the Pan-European Privacy-Preserving Proximity Tracing, or PEPP-PT. The former is considered far more privacy-preserving because server interaction, processing and storage are minimised, making it more decentralised. Germany has ditched the centralised approach. In the United Kingdom, experts have advised the government against its plan to use a centralised contact-tracing app. The just-released Google-Apple collaborative framework, upon which contact-tracing apps can be built for Apple and Android phones, is DP-3T compliant. The two companies have promised to remove this feature when the pandemic is over. Public authorities and governments should release the source code of their apps for public scrutiny and to build trust.
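The privacy difference between the two frameworks comes down to where matching happens. The simplified sketch below illustrates the decentralised idea; SHA-256 stands in for DP-3T’s actual cryptographic construction, and the constants are illustrative. An infected user publishes only their daily secret keys, and every phone re-derives the ephemeral IDs and compares them against its own log locally, so the server never learns who met whom.

```python
import hashlib
import os

EPHEMERAL_IDS_PER_DAY = 96  # e.g. a fresh broadcast ID every 15 minutes

def daily_key() -> bytes:
    """Each phone generates a fresh secret key for each day."""
    return os.urandom(32)

def ephemeral_ids(day_key: bytes) -> list:
    """Derive the day's broadcast IDs from the key by hashing a counter.
    (Real DP-3T uses a PRF/PRG construction; SHA-256 is a stand-in here.)"""
    return [
        hashlib.sha256(day_key + i.to_bytes(2, "big")).digest()[:16]
        for i in range(EPHEMERAL_IDS_PER_DAY)
    ]

def exposed(heard_ids: set, published_keys: list) -> bool:
    """Decentralised matching, done on each phone: re-derive the IDs for
    every published day key and intersect with the locally heard IDs."""
    for key in published_keys:
        if heard_ids & set(ephemeral_ids(key)):
            return True
    return False
```

In a centralised design such as PEPP-PT, by contrast, phones upload their contact logs and the server performs this intersection, which is precisely what gives the operator a social graph.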

Contrary to this spirit, a worrying development is the collaboration between government agencies and private technology companies notorious for designing intrusive technologies. The American technology company Clearview AI is in negotiations with US state agencies to provide facial-recognition services to track COVID-19 patients. The technology of Israel’s NSO Group, which created the well-known spying tool Pegasus, is being tested by many countries as a potential candidate for tracking people’s movements using mobile-phone location data. For these companies, the pandemic is a godsend: an opportunity to launder their reputations and advance their questionable products and practices.

Ideally, every policy, law and rule brought in for the emergency that the pandemic has created should be reviewed when it ends. The extracted data should be deleted. The retention of health, location and facial data on private and state servers will always make it a potential target for hackers. Corporate entities will seek access to it in order to make profits, and state agencies will be tempted to use it for surveillance. In contemporary India, surrendering privacy could expose people to mob violence for their political views.

At this moment, we need to vigorously campaign for data-protection laws. The European Union set the ball rolling in May 2018 by beginning to enforce its General Data Protection Regulation. The Indian data-protection law is still under debate because it allows state agencies a free run of citizens’ data. In the fight against the pandemic, we must not create systems of surveillance that those in power would find difficult to give up once the pandemic ends.



