India's Non-Personal Data (NPD) framework

Nadika N

@nadikanadja

Summary of Panel Discussion: Personal and Non-Personal Data regulations worldwide, and how India's NPD framework is positioned against them

Submitted Jan 28, 2021

Moderator: Jyoti Panday, Researcher, Internet Governance Project

Panellists:
- Annabel Lee, AWS Public Policy
- Raegan MacDonald, Public Policy, Mozilla
- Sean McDonald, FrontlineSMS

The panel discussion, held on Wednesday 27th Jan, looked at international regulations for governing data and the models developing around the world:
* A US-style free-market approach,
* A China-style model of governmental control, or
* A European Union-style rights-based approach with some governmental oversight.
It then positioned the NPD regulations emerging in India, and the conversations building around them, within these three models.

Moderator Jyoti Panday had the following questions for the panellists:

  1. How is the conceptualization of data as a resource with economic and societal benefits shaping data governance laws in Europe (the GDPR, the Digital Services Act (DSA) and the Digital Markets Act (DMA)), balancing market integration against rights protection?

  2. At the heart of the NPD framework is the question of who has access to data, and why. What are the models of data ownership emerging globally (community rights / data trusts / data commons)? Is the model of data trusts being explored under the NPD framework similar to, or diverging from, how data trusts are being modelled globally?

  3. What is the impact of various national data strategies on cross-border data flows and the operations of transnational companies? How are companies navigating the data laws emanating from the three big data blocs (US, EU and China)?

Jyoti also had this to say:
“At the heart of regulatory efforts for data governance, there are two key objectives. The first is enabling access to data held by private platforms. As without access to vital data, neither the public nor the private sector will be able to exploit the benefits of data. The second objective that policymakers seem to be pursuing is to tackle market challenges presented by data monopolies, and large platforms that become data monopolies.”

Some follow-up questions for the panel were:

  1. Is the separation between personal and non-personal data laws sustainable, and is such a binary separation desirable? Has non-personal data become the focus of regulation as a way to overcome the constraints created by data protection laws?

  2. What are some of the conflicts that arise when governments pursue regulatory objectives different from data protection law, such as enabling data access and sharing? How are these different regulatory goals shaping the development of data governance laws in India and beyond?

Raegan MacDonald, Head of Public Policy at Mozilla, responded first.
Raegan pointed out that the EU, India, and many governments around the world realise that data is an economic resource and product, and that there is an intent to regulate it to further public interests and to manage large monopolies. Raegan cautioned that this must be done with care. She pointed to the GDPR in the EU, which, while protecting and preserving individual rights, does not take a position on collective responsibility or collective harms.

Pointing to the Cambridge Analytica scandal, she said that there are large data-related harms that may not accrue to the individual but to a wider community, to society at large. The scandal, she noted, also affected those who had never given their data or had it taken.

So there is a definite felt need to manage and regulate data at a collective, community level, and such provisions are not currently present in the GDPR. Raegan said, however, that conversations around this are beginning in the EU: around data intermediaries that could mitigate or control the actions of data monopolies, and around antitrust legislation. The aim, though, is not to replace US tech monopolies with European ones.
Raegan also had a word of caution around privacy risks, saying there are very few examples of how data trust/stewardship models work, and that her advice to EU governments is to consult “often and openly, with experts, from everything from civil liberties, to security, to businesses, to better understand” how to shape these conversations in the future.

Sean McDonald, co-founder of Digital Public and FrontlineSMS, spoke next.
Sean stated at the outset that the distinction between personal and non-personal data is a precarious one, and that in today's computing and data-availability environment the lines are blurry. So to achieve some of what data regulations around the world are attempting, there must be serious engagement on the definition of, and the distinction between, personal data and non-personal data.
He also said that what matters most in this conversation is the rights to, and agency over, availing a service, feature or product, and over what this data means and enables.
“So my right to be represented in a particular way or my right to access a particular service, those rights are what convey my standing, and so that’s what gives me the right to bring a claim against something, or to participate in the decision making around how data gets governed and used.”

Saying that this edifice of regulations is built on perhaps unstable foundations, Sean said that the data-trusts model is more about stewarding data than stewarding rights.
Sean said it was good to see the idea of standards of care in the NPD framework, but that a ‘duty of loyalty’ was missing, and that it is this which, in a complex ecosystem, will better preserve the individual's rights.
Sean used the analogy of a table with no legs to explain this: while the NPD framework may seek to protect the table top, it is still missing some very important legs.

Annabel Lee, Public Policy Lead at Amazon Web Services, Singapore, spoke next.
Annabel began by stating that AWS is a cloud services provider with a shared security responsibility model: customers retain full control over their own data, including personal data, and AWS does not have visibility into or control over that data. She also mentioned that prior to her current role she worked with a government agency that had the dual roles of data protection regulator and promoter of the data innovation industry. She was involved in developing data protection legislation and in thinking about compliance with the law once it was in effect.

She stated that privacy and innovation aren't mutually exclusive, and that the primary goal of laws and regulations should be to ensure that data is secured and processed responsibly before products or services are built on top of it. Annabel said that while high-profile cases like Cambridge Analytica capture the imagination, and while malicious actors exist, they are in the minority. The majority of companies collect data for essential, routine business purposes, and often data breaches and misuse are unintentional, arising from a failure to govern data properly. As a regulator working with a data protection authority, Annabel said she had to talk to companies about how they governed data. “Privacy regulators do not like to think about security controls. It's a principle in the law, but it's very technical. And usually, they'll leave it to a security expert to think about it. What they'll ask for is that a reasonable standard of security is put in place, but may not provide more specific advice on how to do so.”

There are multiple data-security and data-governance regimes and blocs emerging in Asia, and this leads to fragmentation: a company operating in multiple jurisdictions will have to create a baseline compliance framework and then build on top of it, customised for different jurisdictions where necessary. This may be easier for a large company with resources, but for startups and small businesses it becomes quite difficult. Annabel noted that in AWS's view data security is an important consideration, but that often, in a bid merely to comply with onerous or confusing regulations, CSP customers pick lower levels of security even when a higher standard is necessitated, leading to a higher risk of breaches. Data protection regulation therefore needs to keep in mind the extent to which it can be easily and effectively complied with. Annabel said that the Indian government's aim of regulating non-personal data was a noble one.

“I think that’s a very noble aim and very noble outcome. But I think the thing that we’re not talking about is how scary it could be if the governance frameworks are not in place, when whoever it is, for whatever reason, using whatever mechanism is able to obtain that data.”
Pointing to the Indian Personal Data Protection Bill, Annabel said that while it was great that the duties of a fiduciary in protecting personal data were laid out clearly, the same for non-personal data was missing, and that this could lead to bad outcomes. This, she said, makes the focus on duty of care, duty of loyalty and security very important. She warned against the risks of re-identification when anonymised personal data is used as NPD, citing the example of weather data which, when recombined with other data, could be used to create granular profiles about data subjects and lead to significant harms.
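The re-identification risk Annabel describes is essentially a linkage attack: a dataset stripped of direct identifiers can often still be joined with public auxiliary data on shared quasi-identifiers (such as location, birth year and sex). A minimal sketch in Python illustrates the mechanics; all names, records and field choices here are hypothetical, not drawn from the panel:

```python
# Hypothetical illustration of a linkage attack: an "anonymized" dataset
# (direct identifiers removed, quasi-identifiers retained) is joined with
# public auxiliary data, re-identifying individual records.

# "Anonymized" dataset released as non-personal data.
anonymized = [
    {"zip": "560001", "birth_year": 1985, "sex": "F", "condition": "diabetes"},
    {"zip": "560034", "birth_year": 1992, "sex": "M", "condition": "asthma"},
]

# Public auxiliary data (e.g. a voter roll) carrying names alongside
# the same quasi-identifiers.
voter_roll = [
    {"name": "A. Kumar", "zip": "560034", "birth_year": 1992, "sex": "M"},
    {"name": "B. Rao", "zip": "560001", "birth_year": 1985, "sex": "F"},
]

def reidentify(anon_rows, aux_rows, keys=("zip", "birth_year", "sex")):
    """Link records whose quasi-identifiers match a unique auxiliary row."""
    matches = []
    for row in anon_rows:
        hits = [a for a in aux_rows if all(a[k] == row[k] for k in keys)]
        if len(hits) == 1:  # a unique match means the row is re-identified
            matches.append({"name": hits[0]["name"], **row})
    return matches

for m in reidentify(anonymized, voter_roll):
    print(m["name"], "->", m["condition"])
```

Each "anonymized" row here matches exactly one voter-roll entry, so the sensitive attribute is re-attached to a named person. This is why the panel's point stands that removing direct identifiers alone does not make data non-personal.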

Jyoti pointed out that the Personal Data Protection Authority has a remit of protecting data, while the Non-Personal Data Authority is charged with making that data available for economic purposes, and that going forward this could increasingly become a point of conflict between the two.

Raegan, answering Jyoti's question about the sustainability of maintaining the distinction between personal and non-personal data, said that there are perhaps two questions emerging: one is the technical challenge of whether anonymisation is feasible and sustainable; the other is whether personal and non-personal data deserve the same level of protection.

Raegan believes, as experts point out, that perfect anonymisation cannot be achieved, so the focus must be on risk mitigation and better security.
Raegan said that while the EU takes privacy and personal risks seriously, non-personal data is in a different realm, and thus perhaps not immediately governed by the GDPR. However, there are emerging conversations about a duty of care and a duty of loyalty to cover the risks that could arise from this situation.

Sean, speaking next, responded to Jyoti’s question of what frameworks can be produced for governing data, and who “owns” the data. In Jyoti’s words: “Is the Data Commons even possible with respect to data? And if we are to move in that direction, what are some of the principles we need to think through?”
Sean said it is important to see how duty flows through digital contracts, how it is maintained over perhaps many digital relationships and how we trace loyalty through various contracts.

Sean said that the idea of data stewardships is, in essence, outsourcing political and social risks to future professionals who will continue to be stuck with the same problems as now: access to courts, access to justice. He said that the NPD framework, and other regulations around the world, do not focus on the institutional investments required to protect rights, both current and emerging.

Annabel, who spoke next, had this to say: “The issue with NPD that's really harming innovation is two-fold: firstly, it's mandating the sharing of data; and secondly, it's doing so very broadly, with a vague notion that somebody called a ‘trustee’ would be able to ensure data is governed and protected to the right level.” The framework, she said, needs to provide assurance on how data sharing will be governed properly, and sharing must not be mandated, as the framework needs to account for scenarios in which companies have a very good reason not to share. Expanding on this, Annabel said this uncertainty could affect investment and growth, and could give VCs pause to consider the value of their investment in emerging companies.

Annabel believes this could perhaps be addressed by calling the regulation a “Data Sharing Framework” rather than a Non-Personal Data framework.
Annabel also spoke about the need for balance in every relationship: between an individual and the government, between an individual and a corporation, between government and corporation, between governments, and between big and small players. This balance requires a solid understanding of the concepts of data and justice.

The emerging consensus from the panel was that while the NPD regulation, and similar data governance models around the world, may stem from the right aim, that of managing and governing data for the benefit of a country and its people, they may not take into account all the complexities present, both in the legal system and in the global market, and that a solid rights-based approach to protecting the individual is required.
