Bhavani Seetharaman

Nadika N

Detailed findings: Implementation concerns

Submitted Sep 21, 2021

What we observe in this section is a continuation of the previous one, with issues arising from the lack of ethical considerations playing out at the level of implementation. Owing to the vague definitions of organizations and the broad scope created for the Rules, organizations must now undertake more tasks and activities to comply, irrespective of size or resources. The larger themes considered in this section are recruitment difficulties, compliance issues, and technology upheaval for organizations. In addition, we take a closer look at the effect of the Rules on the startup community in the country.

Hiring difficulties and personal liability

Most respondents we spoke to said that the IT Rules 2021 have created several implementation hurdles for organizations. Organizations anticipate that the personal liability clauses of the Rules will significantly increase the cost of hiring, especially at senior levels, as well as the cost of compliance and reporting. The extent of the organization's own liability also remains in question, as do the sole accountability and legal repercussions faced by the Chief Compliance Officer (CCO). There is also concern over what skill sets apply when hiring for such a role, the availability of candidates with those skill sets, and their willingness to face legal action. Given that the Rules hold the CCO – a senior managerial representative resident in India – personally accountable for compliance, organizations say this will create a hiring vacuum.

Representatives of social media organizations, public interest technologists, and privacy experts told us that larger platforms and social media intermediaries already have robust content monitoring capabilities, and that users are able to flag or report CSAM and similar content. They question whether clauses that attach personal liability to employees of social media platforms will work as intended, or whether they will merely complicate processes further.

Experts and industry leaders also say this clause, and the Rules overall, can adversely impact ‘innovation’ and thus investment in the overall tech ecosystem. With capital and earnings diverted into legal defence for employees, experts believe this will create a bottleneck in capital inflows into the country.

Impractical timelines for compliance

The IL Guidelines require social media intermediaries, Significant Social Media Intermediaries (SSMIs), and messaging services to take down any user-generated content deemed unlawful under the Guidelines within 36 hours, but to retain the data of the user and the content deemed objectionable for up to 180 days to aid in investigation.

Social media organizations and platforms may not always have the requisite details regarding user registration, and the 36-hour timeframe will not give respondent organizations enough time to collect such data. Additionally, users must be given redressal mechanisms, which adds to the timeline crunch and the growing administrative burden. In this scenario, requiring user notification, takedown of content, and the setting up of grievance redressal can become challenging for social media organizations, particularly small and medium ones. As the representative of a video sharing platform says,

“36 hours is the content takedown timeline, if the request is coming from the government agency or the platform has responded in 72 hours for law enforcement requests asking for user information. We may not have sufficient details in many cases. Putting an obligation on the platform to respond within those stipulated timelines, and having consequences – such as removing Safe Harbour, or holding CCOs personally liable – if the organizations don’t comply because of lack of information is concerning, something which requires more clarification. Can there be ‘stop the clock’ clauses here?”

Some representatives we spoke to also point to the requirement for periodic compliance reports (which involves monthly submissions for SSMIs), and say such a clause will increase the burden of compliance for the organization and its employees. While many social media platforms and services with user-generated content say they already have systems in place for monitoring content, and produce periodic reports on compliance, content filtering, and user-related information, they believe the strict deadlines will affect the day-to-day operations of their organizations, requiring a dedicated department.

Lack of clarity

Some organizations say they already produce transparency reports regularly; however, there is no clarity on what the mandated compliance reports under the IT Rules must cover, other than expanding on the grievances received. How or why the government wants such a report, on such strict deadlines, has not been detailed in the Rules, and is a cause of concern in addition to the deadlines themselves.

Representatives of digital news publications also highlighted that the Rules didn’t offer enough clarity on the compliance requirements. As the founder of a digital-only news website said,

“So, one thing I want to add is after the digital rules came into play, one thing that they had said is, within 30 days you have to register your organization. You have to send all the details of your funding, who runs the organization, who owns the company etc… A couple of us did write to the ministry, asking ‘Who are we supposed to give this information to?’ And there has been no reply. Those rules said we have to do it within 30 days after the rules come into effect. But we have received no information, no clarity from either ministry about who to report this to.”

Representatives of organizations say that this is a cumbersome procedure that adds to costs. Mandatory reporting at monthly intervals will further increase the burden, putting smaller organizations and startups at greater risk of non-compliance.

There is concern over who will govern digital news organizations. On the one hand, they are nominally classed under the Ministry of Information and Broadcasting, which defines the broadcast rules. However, the IT Rules are jointly notified and enforced by the Ministry of Electronics and Information Technology (MeitY) and the Ministry of Information and Broadcasting (MIB).

A senior journalist and a member of a large digital news publication says,

“We understand that the Rules are governed by MeitY. While news and broadcast rules are administered by MIB. So that confusion remains and I do not expect the government of the day or any government for that matter, will take a clear call. And that gray area is going to be extremely dangerous, especially when governments are absolutely obsessed with regulation and are obsessed with controlling communication. So you worry about each and every tweet and the next second, you want to react to that. So that is a big concern that there will be an overlap, there will be contradiction, and there will be confusion. And that must not be the case. This cannot be an executing process, there has to be a judicial process.”

Technology overhaul

Although many organizations already have internal processes to govern content on their platforms, the nature of the Rules will force them to completely overhaul their technology frameworks to comply.

A large organization with a significant presence in India believes that major re-tooling and restructuring of processes will be required to comply with the data retention policies of the new Rules, particularly the requirement to retain user registration data for up to 180 days. The organization states that the current timelines given under the Rules will not be sufficient. Due to the first-originator clause, organizations will need to scrap the basic architecture of their products, even though many have already been monitoring and protecting their users internally.

As a policy head of a large social media platform puts it,

“There is active sharing of (content) hashes that happens across these companies. We’ve been doing it (monitoring and removing pornographic/CSAM content) for a long time. There is this 24-hour turnaround time, we are cognizant of the objective of the GOI, but I think it is important to have some guard rails to understand what falls under this bucket. We do understand it’s needed given the number of cases that happen in India. But we need to have some guard rails to ensure that it (is) not abused. We need more clarity on the protocols.”

Another respondent echoed this and said,

“We have a strong internal process, every piece of content is reviewed. But here, there is a 24-hour return time. And there might be many frivolous complaints, and we need a process to understand how we weed out frivolous concerns. We require time to build this up.”

All this will add to the overall cost of operating, thus impacting the ease of doing business in India.

Startups affected

Further, smaller organisations will be at a disadvantage compared to larger organisations when it comes to compliance. Mandating reporting requirements for compliance will increase the compliance cost for smaller organizations and startups, but not affect larger organizations, resulting in a further unequal situation, respondents said. As the legal head of a large social media platform said,

“A startup might find it hard to hire three people. There’s a need for understanding the regulatory cost. I don’t think we can quantify it in any form. But it’s important to have that number. Large organizations may not find it a problem. It is the small intermediaries that will bear the brunt. I remember when GDPR discussions were happening, there were studies about the average cost of compliance. There is a need for that to happen here.”

In our roundtables, an added perspective on the problem of implementing traceability or content filtering emerged: that of employee and organizational bias, and the larger question of ethics.

As a senior startup founder put it,

“Typically, when we hire as an organization, we’re looking for people with big data knowledge or some tech stack knowledge. But there isn’t enough awareness beyond these. There is no formal education about ethics, biases, perspectives that we can hire for right now.”

Echoing this, a public interest technologist said,

“I feel like building this (compliance to IT Rules) overnight is not probably something that any organization will do. I feel like what is a good way of building these tools is to treat them as indicative rather than prescriptive, something that is built over time with enough people who can sort of understand the various issues that come up when modeling these systems. So you need a very keen eye, and you need to give it time to be able to grow these systems. Thirdly, I feel there is always some bias in every organization, no matter what you do.”

At the end of last year, India was estimated to have more than 41,000 startups employing over 4 lakh individuals, making it the world’s third largest startup ecosystem. Given the rising rates of unemployment and the lack of growth in other sectors of the country, it is prudent to ensure that such sectors remain protected, and essential that the Rules keep in mind the sizes of organizations and the human resources required for compliance.
