Data is the snake oil that keeps the mysticism of computational networks as robust decision-making bodies intact. Our faith in the idea that predictive digital technologies are reliable is sustained by the constant conversation around data.
When a digital deployment does not work, or at least does not work as intended, we hear two pre-wired responses from the intermediaries and stakeholders who support the system: that we need more data to create better outputs, and that we need better data to generate more reliable results.
This quest for more data and better data naturalises three basic principles. First, that data is a neutral thing, a mere description of what is happening. Second, that data is inherently transactional, with no repercussions for the events and people it describes. Third, that data is a commodity and can hence be traded in exchange for services, benefits and conveniences. Together, these three principles present digital data as a digital object: born digital and only tenuously connected to the experiences and lives of the people it seeks to describe.
The historic emergence of the Digital Personal Data Protection (DPDP) Act 2023, which India has now adopted, is a late but welcome alignment with data protection regulations around the world. We are in good company in defining the use and scope of digital data by private and public institutions. The Act closely mimics the General Data Protection Regulation adopted by the European Union in 2016, the Data Protection Act enacted by the UK in 2018, and the Personal Information Protection Law passed by China in 2020. All of these regulatory instruments are critical in recognising that the extraction, exploitation, and exchange of data can create targeted profiles, segregated filter bubbles, and contained echo chambers that damage the social, personal, political, and economic well-being of the individuals targeted by data industries.
The DPDP Act 2023 underscores that data is not only a precious resource but that, without protections, it can lead to extraordinary harms to the people connected with a data set. The capacity of our algorithmic networks to make correlations, causal links, and critical connections that can manipulate, target, influence, and punish data subjects is alarming and expansive. The Act advances the cause of data privacy and individual data protection, sets limits on the use and scope of data, and even institutes the right to be forgotten, thereby placing temporal limits on how long data can be retained in a system.
These are all welcome changes which put greater accountability on private corporations and government institutions to make their data practices more transparent, consensual, and explainable. While it is impossible to think of a dismantling of the big-data structures that have come to rule our digital lives, the safeguards and the demands of fairness and justice that the DPDP mandates are much needed, though aspirational. Even though they might be difficult to operationalise, it will certainly be good to have them as benchmarks.
However, the DPDP Act 2023, like most other global data protection and privacy regulations, continues to reinforce the three principles of digital data, treating them as de facto truths.
In presenting data as a mere description of, or information about, people and phenomena, it fails to examine the ethics of data generation and harvesting. We leak digital data all the time. This data does not just describe what we do; it defines who we are and predicts what kind of subjects we are going to be. Data generation is as critical a site of regulation as data circulation and usage, and data regulation needs to address it more critically.
The excessive focus on the economic value and use of data reinforces the idea that data is a commodity. It does not question the need for certain kinds of data, or the mandate to resist the datafication of our personal and private lives. It presumes that all data can be generated and harvested, and valued only in economic terms. This ignores the fact that data is embodied: it affects, and is anchored in, our bodily practices. We need a better understanding of what can and cannot be translated into data, and limits on what should be.
The DPDP Act 2023 accepts the maxim that digital data travels and circulates. But while data has the potential to travel, it should not be a given that it can travel so fast and so far that it becomes alienated from the subject, creating a digital and temporal distance in which consent is no longer possible or required. We have to pay heed to the legal scholars and judges who have insisted that data, like dignity and life, must be seen as an inalienable resource. How far data can travel without losing its provenance and consent has to be reworked in the operationalisation of this Act.
While the DPDP Act 2023 is a historic landmark for India, which boasts the largest biometric citizen database in the world yet has lacked a general framework to protect digital data and personal privacy, it has to be seen as a first draft in establishing data equity and justice. Its aspirations will be met only if the operational realities question and critique the three principles that are so often used to weaponise data against those who are the most vulnerable and precarious.