Prajnya Gender Talks, August 2020 || Surveillance, COVID-19 & Women: How is surveillance being experienced by women at the margins during COVID-19? by Radhika Radhakrishnan


August 2020

Rapporteur: Suhasini Udayakumar

Introduction

Leading the resurrected Prajnya Gender Talk Series was this talk on “Surveillance, COVID-19 and Women”, conducted after the first COVID-19 lockdown of 2020. Even in the first few months of the pandemic, chilling stories about women’s domestic safety gave feminist researchers cause to consider its exact implications for women, especially those on the margins.

With that in mind, we organised this talk with Radhika Radhakrishnan, a feminist researcher with a master’s degree in Women’s Studies from TISS, India. Radhika was at the time studying the intersections of gender, sexuality, technology and politics at the Internet Democracy Project (IDP) and was excited to share recent research findings about the lived experiences of a variety of marginalities with respect to surveillance during COVID-19. 

What is data?

Radhika began the roundtable by asking the audience a fundamental question: “What do you think is the meaning of data?”

Participants responded with their ideas: privacy, evidence, thoughts, research, that which is authentic, useful information, statistics, etc. Radhika then explained that the purpose of her research – and this roundtable – was to show that data is so much more, especially in the COVID-19 context. The genesis of this research lay in the buzz in policy circles that described data purely as a resource and a public good. Commonly heard was the phrase “Data is the new oil.” But is that so? Does data even exist separately and autonomously from our own bodies? Radhika’s aim was to show why and how we need to reframe these dominant but inadequate notions of data.

Human beings take their bodies very seriously. We are cognizant of violations, infringements, and attacks when it comes to our bodies. Similarly, organisations and platforms take privacy violations seriously. But when there is a digital injury to our data or information, we take it far less seriously, because we view the data as separate from our bodies. Radhika’s research set out to investigate whether this idea was true.

Radhika began explaining her findings by defining surveillance as a process of controlling bodies through data. This is of critical importance to feminists since it offers a new dimension: historically, women’s bodies have been the objects of control – more so if one was a marginalised woman. This new, data-fied world enables even more control over women’s bodies. It was now important to find out exactly how data is used to control women, and how it affects various intersections, to give a complete view without erasing any identities.

The Methodology

This research started out as informal conversations with people about the difficulties they faced with respect to data. It later evolved into a more structured approach, in which the investigators conducted 25 interviews (in both Hindi and English) over four months and listened to the candid storytelling of the experiences of various communities such as sex workers, trans persons, queer persons, Muslim women, Dalit women, gig workers, ASHA workers, working-class people, women’s rights organisations and activists, activists working with Adivasis, and public health activists.

The four key findings from the research are as follows:

  • Surveillance led to increased control over bodies during COVID-19.
  • Data proved to be a key factor in the way surveillance was being carried out and experienced.
  • Within the field, conversations were limited to privacy issues with data. This needs to transform into an acknowledgement that with data privacy violations, fundamental rights such as bodily integrity, dignity, and autonomy are at stake.
  • Those at the margins experience surveillance most acutely, bearing its maximum costs.

The research categorised surveillance based on three spaces: 

  1. The private space (family)
  2. The public space (community)
  3. The state

It is important to look at all these spaces because of the differing power hierarchies within each. Women at the margins most often have to negotiate with this power to gain concessions that are otherwise inaccessible to them.

Space 1: The Private Space (Family)

During the pandemic, the outside was regarded as unsafe and the inside as safe. This was not the case for women, for whom the household was a space of constant surveillance, invasion, and abuse. During this time, women had to live in close proximity to those they most feared and needed protection and privacy from. As the pandemic wore on, domestic violence rates skyrocketed. While the rest of the country bemoaned the overuse of technology and the associated data risks, for women, data presented itself as the only escape route. Women facing abuse needed phones, the internet, and data about crisis centres to call helplines. But in lower socio-economic households, the phone is often a shared resource, and women were afraid to report crimes through the family phone. These dynamics of surveillance within the household are often overlooked.

Source: Point of View, Internet Democracy Project

Similarly, LGBTQIA+ persons run the risk of being outed when family members surveil their phones. Sex workers often use two phones: a personal one used openly and a professional one that is hidden and used in secret. They have to constantly worry about their families discovering the hidden phone, or have to deflect questions about their source of income. Their income and livelihood are also directly affected by their limited ability to use the phone. These are a few of the intersectional lenses through which we need to view data and tech in the household space.

Source: Point of View, Internet Democracy Project

Increasing control and restrictions over women’s use of data and technology have had devastating consequences. Society harbours a persistent fear that if women have access to data and tech, they will be liberated from patriarchal constraints: data is viewed as women’s escape from traditional confines. We can therefore trace a direct link between controlling women’s data and controlling women’s bodies, and we need to question the assumed distinction between data and bodies. Data is not just a resource. It is not independent of our bodies. And it is not just a good for human consumption. Control over data is experienced intimately; it extends beyond issues of privacy and beyond current policies on data harm. It extends to sexual, bodily, and financial autonomy – embodied experiences that are currently not captured.

Space 2: The Public Space (Community)

Surveillance of marginalities in the community shares commonalities with, and differs from, surveillance within the household. One example is the surveillance and ostracisation of transgender persons in Hyderabad during COVID, the effects of which were critical and life-threatening: the stigma associated with COVID was transferred onto the trans community.

Source: Internet Democracy Project

This type of community surveillance extends across several axes and marginalities: Kashmiris, Muslims, single women, queer persons, domestic workers, etc. In all cases, the dominant community surveils and harasses the marginalised community. In most communities, this type of surveillance was already happening; data only made it easier. CCTVs in resident welfare associations monitored movement, and community WhatsApp groups broadcast the details of COVID-positive patients.

The government was an ally in this process on several occasions. The government of Punjab, for example, created the CovApp, which the public could use to report mass gatherings; this encouraged communities to surveil on behalf of the state. In the case of Rajasthan’s Raj COVID Info app, the state even released watchlists with the names and information of COVID-positive patients to the masses.

During the pandemic, surveillance took on a heinous form: data was in the hands of powerful stakeholders in the community (smartphone apps and CCTV footage were invariably used by the more privileged to gaze upon the marginalized), and in all these situations there was no accountability when action was taken against the marginalized. When the mobility that domestic workers, Kashmiri women, and Muslim women were entitled to was restricted through the use of data, the distinction between data and bodies blurred.

Space 3: The State

The power differences that exist between the Indian state and its citizens are harder to question because, according to the Constitution, such power differences are not supposed to exist. The state uses surveillance data to create hierarchies of citizens, and thus not all are equal in the eyes of the state. Citizens should not have to negotiate with this kind of unfounded power.

During COVID, Aadhaar cards, identifiers that the state uses to surveil citizens, took on a larger role. People required them to access benefits such as rations, vaccinations, etc. A transgender person would communicate their identity through their body, not their Aadhaar card. But the only way they could avail of benefits was if the state authenticated their identity through the (dysphoric) data on their Aadhaar card. Bodies have become so data-fied that we need data to prove our bodily identities.

For minorities such as Muslims, police surveillance is not protective but dictatorial. With resources such as drones, the police do not even need to be physically present to surveil these marginalities. These bodies are policed through data and technology, no longer requiring the physical presence that governing previously demanded.

How do people respond to surveillance?

Data has proven to be a double-edged sword: as much as it is used for surveillance, it is also central to resisting surveillance. People have moved protests online; while this is convenient, it is also exclusionary, since only those with access to tech can join the resistance.

Another thing to keep in mind is that Foucauldian and Orwellian theories of “the one power entity” no longer apply, since there are now many powerful entities exerting their control across multiple spaces. These new types, spaces, and configurations of power demand newer strategies of resistance. Technology has made surveillance ubiquitous, and the resulting asymmetry in power and knowledge leads to even less control for individuals at the margins, if they are even aware of being monitored at all.

During the pandemic, surveillance was easily justified, making it harder to resist. In the post-lockdown world, surveillance continues and has even been normalised, requiring new forms of resistance to battle it.

Conclusion

Surveillance is regarded as a useful tool for disease control, but in the COVID-19 context, data became central to surveillance and technology was used against people, acting as a tool of control rather than of liberation. The overwhelming switch to reliance on technology also creates a loss of the “care” element, since people are no longer involved in the process. We need to reimagine data as being embodied and having intimate connections with our own physical existence. Only then will we respond to the manipulation of data with as much vehemence as we do to the manipulation of our bodies. Only then will rights and dignities remain guaranteed and intact, even during trying times like COVID-19.

Q&A

The first question in the Q&A that followed was about people’s difficulties in resisting apps. Radhika responded with a few examples: to access the Delhi government’s COVID drives, the Aarogya Setu app was mandatory. Some citizens were able to resist this due to their negotiating power, while others who wanted to couldn’t because of their marginalities.

At the time, Muslim compliance with these procedures was monitored especially closely. What’s more, Muslims couldn’t refuse to participate in practices such as banging thalis, lighting lamps, etc. because they would be attacked.

ASHA workers, who are mostly lower-class women, were at the forefront of the COVID-awareness drives. They were compelled by the government to have the Aarogya Setu app on their phones, which led to several protests. Many of these workers did not have smartphones and so were told to borrow phones and download the app. They had to go door-to-door to collect information for the apps, and faced severe physical, sexual, and epistemic violence because people were scared that the government was at their doorstep. Muslims in particular heavily distrusted the government because they thought their data would be used for the NRC database. Thus, ASHA workers stood at the intersection of several margins (gender, caste, class). While it is the government that organises the surveillance, it is its workers and representatives that bear the brunt of the backlash. To make matters worse, it was discovered that the ASHA workers weren’t being paid properly or given deserved promotions.

The COVID migrant labourers’ crisis is still fresh in our memories. Migrant labourers from Jharkhand who were stuck in Surat were supposed to benefit from the Chief Minister’s special assistance scheme, which was meant to facilitate their mobility. But this scheme came with the condition that the labourers have bank accounts registered in Jharkhand in order to get the services. These workers had been in Surat for many years and only had Surat accounts, which proved useless at this time. Moreover, due to low digital literacy, the workers were unable to understand the error message displayed on the app. They each spent INR 4,000-5,000 to take trains and buses back home, where they opened new bank accounts to receive the scheme’s benefits. But they noticed that even after several weeks, no money was credited. This was because the app was designed in such a way that if registration failed, one could neither receive the money nor lodge a complaint about the issue. Thus, data acted against the interests and lived realities of people.

The next question in the Q&A was about public-private partnerships that enabled data surveillance. To answer this, Radhika explained the case of Uber requiring drivers to set up the Aarogya Setu app on their phones while not requiring the same of passengers. In a similar fashion, Swiggy executives were mandated to have the app. But this did not benefit them at all, since they had no information about whether an order was placed from a red zone or not.

Another question was about feminist responsiveness to data and surveillance. Radhika’s reply was that feminists have seen tech as an institution of power for many decades and are vigorously studying how it affects women. The understanding, however, is tricky, because people struggle to understand and tackle tech as it is used in civil spaces.

The final question was about Radhika’s stance on the anonymity that accompanies digital spaces. She stated that anonymity has both pros and cons: for a resister or a victim, data anonymity may mean life or death, whereas for perpetrators, anonymity may serve as a tool for harassment.