Social media networks and platforms, chat and messaging applications, and photo and video sharing services have radically transformed the internet landscape in India and elsewhere over the last decade. User-generated content has allowed diverse voices to create and share views, political opinions, dance videos, and movie and music commentary.

While platforms and networks have encouraged these voices, there is also growing concern [1] over the sharing of potentially offensive material such as pornographic content, child sexual abuse material (CSAM), hate speech, and violent content often not suitable for the wide audiences such platforms cater to.

The Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, notified by the Ministry of Electronics and Information Technology (MeitY) together with the Ministry of Information and Broadcasting (MIB), Government of India, under the IT Act, 2000, seek to monitor and control user-generated content and provide firm guidelines for social media intermediaries, digital news publications, and other organizations that host or transfer content on the internet.

The Rules were notified in February 2021 and went into effect in May 2021. Organizations and individuals have challenged the Rules on various counts [2], including their applicability under the parent law. Large platforms and social media networks have expressed concern about implementation and compliance.

Privacy Mode, a hub for conversations around privacy, data security and compliance, conducted a two-part research project seeking to understand the impact of the Rules on organizations and employees in the tech ecosystem who might be responsible for implementing the Rules and achieving compliance in tech and media products.

A qualitative study of social media platforms, digital news publications, and cloud computing service providers, examining the possible impact on encryption, traceability, compliance, and applicability of the law, among other areas, was conducted in May-June 2021; and a quantitative survey of tech workers across India, covering awareness, professional and personal impact, workflows and requirements, was conducted in June-July 2021.

This report is a comprehensive analysis of both studies and presents a rounded picture of the impact of the IT Rules 2021 on organizations and their employees. It also looks at larger questions and concerns about privacy and the freedom of speech and expression, given the ongoing debates around responsible tech, digital platforms and ethics, and their impact on society and individuals.

Executive Summary and Core Concerns

Scope of the law

By definition, the ‘Rules’ framed for any law in India are ‘Subordinate Legislation’ or ‘Delegated Legislation’. While laws are made by the Parliament/Legislature, Rules are made by the Executive, i.e., the Government of India, to fulfil the requirements of the parent law. In Indian democracy, only the Legislature can make laws; the Executive can only implement them. If the law says ‘XYZ has to be accomplished’, rules can frame the methods by which ‘XYZ’ can be accomplished. However, in the case of the IT Rules 2021, the Rules are seen as overreaching and exceeding the parent law.

Notified under the Information Technology Act, 2000 [3], which provides ‘Safe Harbour’ status to digital intermediaries, the Rules are ultra vires of the parent Act and seek to regulate activities that have no mention in it. Further, bringing digital news publishers under the ambit of the Rules is unconstitutional and ultra vires of the IT Act, as news websites do not fit the definition of ‘intermediaries’ given under the Act [4].

Further, the activities of news publishers and media are regulated by the Ministry of Information and Broadcasting (MIB) [5], and are thus excluded from the ambit of the IT Act. Concerns emerged that the Rules – which did not pass through the legislative body – sought to curtail rights and laws that did emerge from due legislative process.

Further, with existing guidelines under the Press Council Act already governing news organizations, the Rules are seen as overreaching and drafted to censor specific media channels and outlets.

The Rules require intermediaries to identify the first originator of messages deemed objectionable. This implies that messaging platforms and social networking sites will have to significantly alter their products (and the technology underlying them) to comply. This, again, is not governed by the parent Act, and is therefore unconstitutional. The Rules also operate from a position of assumed guilt, where all conversations and communications are expected to be scanned for potentially offensive material and traced back to the original sender. This runs against the presumption of innocence enshrined in the country's legal system.

Breaking encryption and implementing traceability, a fundamental requirement of the new Rules, has international legal implications, as messaging services and social media platforms will need to alter the underlying technical architecture of their products or services, or at least offer a different product and user experience for Indian users. Since this cannot be implemented for users in India alone and will affect every user of these services across the world, these social media intermediaries will be in violation of international laws governing user privacy and security, thus inviting legal costs.

Freedom of expression and natural justice

The Rules are seen as violating the freedom of expression guaranteed in the Indian Constitution by mandating traceability, which breaks encryption. Privacy, also a fundamental right as determined by the Supreme Court of India, is increasingly seen as a ‘make-or-break’ feature of all websites, apps, products, and services. Privacy operates from a presumption of the user's innocence. The Rules, by enforcing traceability, violate the fundamental rights of Indian citizens by reducing privacy to a conditional service rather than a constitutional guarantee.

Cost of compliance

When the IT Rules came into effect in May 2021, they were criticized for imposing high costs of compliance, including legal and personal liability attached to employees of social media organizations. In the case of the office of the Chief Compliance Officer (CCO), liability extended even after the CCO retired from office. Every social media and news organization surveyed during this research pointed to the personal liability attached to the roles of the CCO and the grievance and nodal officers as imposing financial and legal costs on their organizations.

Proactive content filtering requirements will affect human resource requirements and demand changes in product and business operations, thereby significantly increasing costs. The traceability clauses of the Rules require an extensive overhaul of the core architecture of messaging services and social networking platforms, demanding significant monetary and human resource investment.

Further, respondents in the Focus Group Discussions (FGDs) believed that ease of doing business will diminish given the stringent compliance regime and employee impact.

The Rules are also framed vaguely and arbitrarily, leading to confusion over operating clauses, and they impose stringent reporting requirements. This will affect all organizations, especially small and medium enterprises, financially and otherwise.

Skill and competency of Industry

In addition to the legal and ethical concerns emerging from implementation of the Rules, there are knowledge, awareness, and skill gaps across a representative sample of the IT industry, which may impact the ability of organizations to comply with the IT Rules.

Software developers in junior and mid-level roles in IT organizations believe their workload will increase with the IT Rules. Respondents indicated that their jobs will now require more documentation and reporting, and that their role in achieving compliance for the company's product will add to their workload.

Industry representatives, however, felt that tech workers and product managers will fundamentally need knowledge of, or retraining in, privacy features, content filtering, and user experience in order to fully comply with the Rules. Experts in the industry believe that beyond technical skills or knowledge, what is missing is perspective: an understanding of how executing the Rules will impact users of media and tech products.

As noted above, the encryption and traceability requirements of the Rules will mean major changes to products, especially to user experience, and an inability to safeguard the privacy of Indian users. Implementing features such as voluntary verification will require product managers to acquire new skills and knowledge, and tech workers will need to learn to work in coordination with legal teams. Under the IT Rules, each content takedown request will have to be serviced on a case-by-case basis. This will impact scale and standard operating procedures in organizations, or will push organizations to rely more heavily on automation to censor content proactively (and avoid being served takedown notices). In both cases, users of these products will bear the brunt, as their freedom of speech and expression will be drastically reduced.

Individual chapters and sections of the report are presented as submissions.

About the principal researchers

Nadika Nadja is a researcher at Hasgeek. She has worked across advertising, journalism, TV and film production as a writer, editor and researcher.

Bhavani S is a Research Associate at Hasgeek. She has previously worked for the Centre for Budget and Policy Studies (CBPS), Microsoft Research India, and the University of Michigan, Ann Arbor.

Support team

Anish TP illustrated the report. Satyavrat KK provided research and editorial support. David Timethy and Zainab Bawa were project managers for producing this report. Kiran Jonnalagadda and Zainab Bawa advised on research design and execution.

Acknowledgements

We would like to thank the following individuals who provided feedback during different stages of the research. Their feedback helped the team fine-tune and bring rigour to the research process.

  1. Suman Kar, founder of security firm Banbreach, for reviewing early drafts of the quantitative research questionnaire, and providing detailed inputs on survey design.
  2. Prithwiraj Mukherjee, Assistant Professor of Marketing at IIM-Bangalore, for reviewing early drafts of the quantitative research questionnaire, and providing detailed inputs on survey design.
  3. Chinmayi SK, Founder of The Bachchao Project, for reviewing and providing feedback on the final report and conclusions.

While Hasgeek sought funding from organizations, the research itself was conducted – with full disclosure at all stages – independently and objectively. The findings do not reflect any individual organization’s needs.


  1. UNICEF: Growing concern for well-being of children and young people amid soaring screen time (2021) - https://www.unicef.org/press-releases/growing-concern-well-being-children-and-young-people-amid-soaring-screen-time

  2. LiveLaw: Supreme Court Lists Centre’s Transfer Petitions, Connected Cases After 6 Weeks - https://www.livelaw.in/top-stories/it-rules-2021-supreme-court-lists-centres-transfer-petitions-connected-cases-after-6-weeks-181032

  3. India Code: The Information Technology Act, 2000 - https://www.indiacode.nic.in/bitstream/123456789/1999/3/A2000-21.pdf

  4. India Code: IT Act definitions - https://www.indiacode.nic.in/show-data?actid=AC_CEN_45_76_00001_200021_1517807324077&sectionId=13011&sectionno=2&orderno=2

  5. Ministry of Information and Broadcasting: About the ministry - https://mib.gov.in/about-us/about-the-ministry

Hosted by

Deep dives into privacy and security, and understanding the needs of the Indian tech ecosystem through guides, research, collaboration, events and conferences.

Supported by

Google’s mission is to organize the world’s information and make it universally accessible and useful.


Detailed findings: Implementation concerns

Submitted Sep 21, 2021

This section continues from the previous one: issues arising from the lack of ethical considerations play out here in terms of implementation. Given the vague definitions of organizations and the broad scope created for the Rules, organizations must now undertake more tasks and activities to comply, irrespective of size or resources. Some of the larger themes considered in this section are recruitment difficulties, compliance issues, and technology upheaval for organizations. In addition, we take a closer look at the effect of the Rules on the startup community in the country.

Hiring difficulties and personal liability

Most respondents we spoke to said that the IT Rules 2021 have created several implementation hurdles for organizations. Organizations anticipate that the personal liability clauses of the Rules will significantly increase the cost of hiring, especially at senior levels, along with the cost of compliance and reporting. The extent of the organization's own liability also remains in question, as do the single-point accountability and legal repercussions faced by the Chief Compliance Officer (CCO). There is also concern about what skill sets apply when hiring for such a role, the availability of candidates with those skill sets, and their willingness to face legal action. Given that the Rules hold the CCO – a senior managerial representative resident in India – personally accountable for compliance, organizations say this will create a hiring vacuum.

Representatives of social media organizations, public interest technologists, and privacy experts told us that larger platforms and social media intermediaries already have robust content monitoring capabilities, and that users are able to flag or report CSAM and similar content. They question whether clauses that attach personal liability to employees of social media platforms will work as intended, or whether they will merely complicate processes further.

Experts and industry leaders also say this clause, and the Rules overall, can adversely impact ‘innovation’, and thus investment, in the overall tech ecosystem. With capital and earnings going into legal defence for employees, experts believe this will create a bottleneck in capital inflows into the country.

Impractical timelines for compliance

The IT Rules require social media intermediaries, Significant Social Media Intermediaries (SSMIs) and messaging services to take down any user-generated content deemed unlawful under the Rules within 36 hours, while retaining the data of the user and the content deemed objectionable for up to 180 days to aid investigation.
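Purely by way of illustration, the two statutory timelines above can be computed from the timestamp of an order. The function and field names below are ours, not anything prescribed by the Rules:

```python
from datetime import datetime, timedelta, timezone

TAKEDOWN_WINDOW = timedelta(hours=36)    # window to take down flagged content
RETENTION_PERIOD = timedelta(days=180)   # period to retain user and content data

def compliance_deadlines(order_received_at: datetime) -> dict:
    """Return the takedown deadline and data-retention end date
    for a takedown order received at the given time."""
    return {
        "takedown_deadline": order_received_at + TAKEDOWN_WINDOW,
        "retain_data_until": order_received_at + RETENTION_PERIOD,
    }

# Example: an order received at midnight UTC on 1 June 2021
order_time = datetime(2021, 6, 1, 0, 0, tzinfo=timezone.utc)
deadlines = compliance_deadlines(order_time)
# takedown_deadline -> 2021-06-02 12:00 UTC (36 hours later)
# retain_data_until -> 2021-11-28 00:00 UTC (180 days later)
```

As the sketch makes plain, the 36-hour clock starts running from receipt of the order regardless of whether the organization has the information needed to act, which is precisely the concern respondents raise below.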

Social media organizations and platforms may not always have the requisite details regarding user registration, and the 36-hour timeframe will not be enough for respondent organizations to collect such data. Additionally, users are given redressal mechanisms, which adds to the timeline crunch and the growing administrative load. In this scenario, the requirements of user notification, content takedown, and setting up grievance redressal can become challenging for social media organizations, particularly small and medium ones. As the representative of a video sharing platform says,

“36 hours is the content takedown timeline, if the request is coming from the government agency or the platform has responded in 72 hours for law enforcement requests asking for user information. We may not have sufficient details in many cases. Putting an obligation on the platform to respond within those stipulated timelines, and having consequences – such as removing Safe Harbour, or holding CCOs personally liable – if the organizations don’t comply because of lack of information is concerning, something which requires more clarification. Can there be ‘stop the clock’ clauses here?”

Some representatives we spoke to also point to the requirement for periodic compliance reports (which involve monthly submissions for SSMIs [1]), and say such a clause will increase the burden of compliance for the organization and its employees. While many social media platforms and services hosting user-generated content say they already have systems in place for monitoring content, and produce periodic reports on compliance, content filtering, and user-related information, they believe the strict deadlines will affect the day-to-day operations of their organizations, requiring a dedicated department.
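The periodic compliance reports broadly cover grievances received and action taken. A minimal sketch of the kind of aggregation such a monthly report involves is below; the categories and field names are our illustrative assumptions, not anything the Rules specify:

```python
from collections import Counter

# Hypothetical grievance records for one month (fields are illustrative).
grievances = [
    {"category": "impersonation", "action": "content removed"},
    {"category": "nudity",        "action": "content removed"},
    {"category": "impersonation", "action": "no action"},
    {"category": "copyright",     "action": "content removed"},
]

def monthly_summary(records):
    """Aggregate grievances by category and by action taken."""
    return {
        "total": len(records),
        "by_category": dict(Counter(r["category"] for r in records)),
        "by_action": dict(Counter(r["action"] for r in records)),
    }

report = monthly_summary(grievances)
# report["total"] -> 4
# report["by_category"] -> {"impersonation": 2, "nudity": 1, "copyright": 1}
```

Even in this toy form, the exercise assumes grievances are already logged in a structured way, which is part of why respondents say a dedicated department may be needed.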

Lack of clarity

Some organizations say they already produce transparency reports regularly; however, there is no clarity on what the mandated compliance reports under the IT Rules must cover, other than expanding on the grievances received. The Rules do not detail how or why the government wants such a report on such strict deadlines, which is a cause for concern in addition to the deadlines themselves.

Representatives of digital news publications also highlighted that the Rules didn’t offer enough clarity on the compliance requirements. As the founder of a digital-only news website said,

“So, one thing I want to add is after the digital rules came into play, one thing that they had said is, within 30 days you have to register your organization. You have to send all the details of your funding, who runs the organization, who owns the company etc… A couple of us did write to the ministry, asking ‘Who are we supposed to give this information to?’ And there has been no reply. Those rules said we have to do it within 30 days after the rules come into effect. But we have received no information, no clarity from either ministry about who to report this to.”

Representatives of organizations say that this is a cumbersome procedure with impact on costs. Mandatory reporting at monthly intervals will further increase the burden. Smaller organizations and startups will thus be at greater risk of non-compliance.

There is concern over who will govern digital news organizations. On the one hand, they are nominally classed under the Ministry of Information and Broadcasting (MIB), which defines the broadcast rules. However, the IT Rules are jointly notified and enforced by the Ministry of Electronics and Information Technology (MeitY) and the MIB.

A senior journalist and member of a large digital news publication says,

“We understand that the Rules are governed by MeitY. While news and broadcast rules are administered by MIB. So that confusion remains and I do not expect the government of the day or any government for that matter, will take a clear call. And that gray area is going to be extremely dangerous, especially when governments are absolutely obsessed with regulation and are obsessed with controlling communication. So you worry about each and every tweet and the next second, you want to react to that. So that is a big concern that there will be an overlap, there will be contradiction, and there will be confusion. And that must not be the case. This cannot be an executing process, there has to be a judicial process.”

Technology overhaul

Although many organizations already have internal processes to govern content on their platforms, the nature of the Rules will force them to completely overhaul their technology frameworks to comply.

A large organization with a significant presence in India believes that major re-tooling and restructuring of processes will be required to comply with the data retention policies of the new Rules, particularly the retention of user registration data for up to 180 days. The organization states that the timelines given under the Rules will not be sufficient. Due to the first-originator clause, organizations will need to scrap the basic architecture of their products, even though many have already been monitoring content and protecting their users internally.

As a policy head of a large social media platform puts it,

“There is active sharing of (content) hashes that happens across these companies. We’ve been doing it (monitoring and removing pornographic/CSAM content) for a long time. There is this 24-hour turnaround time, we are cognizant of the objective of the GOI, but I think it is important to have some guard rails to understand what falls under this bucket. We do understand it’s needed given the number of cases that happen in India. But we need to have some guard rails to ensure that it (is) not abused. We need more clarity on the protocols.”
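The hash-sharing this respondent describes can be illustrated in a much-simplified form: platforms compare fingerprints of files against an industry-shared list of known-bad hashes, so the offending material itself never has to be exchanged. Real systems use perceptual hashes (such as Microsoft's PhotoDNA) rather than the plain SHA-256 used here, and this sketch is ours, not any platform's implementation:

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Cryptographic fingerprint of a file's contents."""
    return hashlib.sha256(data).hexdigest()

# An industry-shared list of hashes of known-offending files.
# (Illustrative: we just hash two placeholder byte strings.)
shared_hash_list = {
    fingerprint(b"known-bad-file-1"),
    fingerprint(b"known-bad-file-2"),
}

def is_flagged(upload: bytes) -> bool:
    """True if an uploaded file exactly matches a known-bad hash."""
    return fingerprint(upload) in shared_hash_list
```

Note that a cryptographic hash changes completely if even one byte of the file changes, which is why production systems rely on perceptual hashing; the exact-match sketch above only conveys the workflow of sharing hashes rather than content.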

Another respondent echoed this and said,

“We have a strong internal process, every piece of content is reviewed. But here, there is a 24-hour return time. And there might be many frivolous complaints, and we need a process to understand how we weed out frivolous concerns. We require time to build this up.”

All this will add to the overall cost of operating, thus impacting the ease of doing business in India.

Startups affected

Further, smaller organizations will be at a disadvantage compared to larger ones when it comes to compliance. Mandatory reporting requirements will increase the compliance cost for smaller organizations and startups while barely affecting larger organizations, resulting in a further unequal situation, respondents said. As the legal head of a large social media platform said,

“A startup might find it hard to hire three people. There’s a need for understanding the regulatory cost. I don’t think we can quantify it in any form. But it’s important to have that number. Large organizations may not find it a problem. It is the small intermediaries that will bear the brunt. I remember when GDPR discussions were happening, there were studies about the average cost of compliance. There is a need for that to happen here.”

In our roundtables, an added perspective on the problem of implementing traceability or content filtering emerged: that of employee and organizational bias, and the larger question of ethics.

As a senior startup founder put it,

“Typically, when we hire as an organization, we’re looking for people with big data knowledge or some tech stack knowledge. But there isn’t enough awareness beyond these. There is no formal education about ethics, biases, perspectives that we can hire for right now.”

Echoing this, a public interest technologist said,

“I feel like building this (compliance to IT Rules) overnight is not probably something that any organization will do. I feel like what is a good way of building these tools is to treat them as indicative rather than prescriptive, something that is built over time with enough people who can sort of understand the various issues that come up when modeling these systems. So you need a very keen eye, and you need to give it time to be able to grow these systems. Thirdly, I feel there is always some bias in every organization, no matter what you do.”

At the end of last year, India was estimated to have more than 41,000 startups, employing over 4 lakh individuals, making it the world's third largest startup ecosystem [2]. Given the rising rates of unemployment and the lack of growth in other sectors of the country [3], it is prudent to ensure that this sector remains protected, and essential that the Rules keep in mind the sizes of organizations and the human resources required for compliance.

