Social media networks and platforms, chat and messaging applications, and photo and video sharing services have radically transformed the internet landscape in India and elsewhere over the last decade. User-generated content has allowed diverse voices to create and share views, political opinions, dance videos, and movie and music commentaries.

While the platforms and networks have encouraged these voices, there is also growing concern [1] over the sharing of potentially offensive material such as pornographic content, child sexual abuse material (CSAM), hate speech, and violent content often not suitable for the wide audiences such platforms cater to.

The Information Technology (Guidelines for Intermediaries and Digital Media Ethics Code) Rules, 2021, notified by the Ministry of Electronics and Information Technology (MEITY) together with the Ministry of Information and Broadcasting (MIB), Government of India, under the IT Act 2000, seek to monitor and control user-generated content and provide firm guidelines for social media intermediaries, digital news publications, and other organizations that host or transfer content on the internet.

The Rules were notified in February 2021, and went into effect in May 2021. Organizations and individuals have challenged the Rules on various counts [2] – including their applicability under the parent law. Large platforms and social media networks have expressed concern about implementation and compliance.

Privacy Mode, a hub for conversations around privacy, data security and compliance, conducted a two-part research project seeking to understand the impact of the Rules on organizations and employees in the tech ecosystem who might be responsible for implementing the Rules and achieving compliance in tech and media products.

A qualitative study of social media platforms, digital news publications, and cloud computing service providers, examining the possible impact on encryption, traceability, compliance, and applicability of the law, among other issues, was conducted in May-June 2021; a quantitative survey of tech workers across India, looking at awareness, professional and personal impact, workflows, and requirements, was conducted in June-July 2021.

This report is a comprehensive analysis of both surveys and presents a rounded picture of the impact of the IT Rules 2021 on organizations and their employees. It also looks at larger questions and concerns about privacy, freedom of expression and speech, given the discursive debates around responsible tech, digital platforms and ethics, and their impact on society and individuals.

Executive Summary and Core Concerns

Scope of the law

By definition, the ‘Rules’ framed for any law in India are ‘Subordinate Legislation’ or ‘Delegated Legislation’. While laws are made by the Parliament/Legislature, Rules are made by the Executive, i.e., the Government of India, to fulfill the requirements of the parent law. In Indian democracy, only the Legislature can make laws; the Executive can only implement them. If the law says ‘XYZ has to be accomplished’, rules can frame the methods by which ‘XYZ’ can be accomplished. In the case of the IT Rules 2021, however, the Rules are seen as overreaching and exceeding the parent law.

Notified under the Information Technology Act, 2000 [3], which provides ‘Safe Harbour’ status to digital intermediaries, the Rules are ultra vires of the parent Act and seek to regulate activities that find no mention in it. Further, bringing digital news publishers under the ambit of the Rules is unconstitutional and ultra vires of the IT Act, as news websites do not fit the definition of ‘Intermediaries’ given under the Act [4].

Further, the activities of news publishers and media are regulated by the Ministry of Information and Broadcasting (MIB) [5], and thus excluded from the ambit of the IT Act. Concerns emerged that the Rules – which did not pass through the legislative body – sought to curtail rights and laws that did emerge from due legislative process.

Further, with existing guidelines under the Press Council Act already governing news organizations, the Rules are seen as overreaching and drafted to censor specific media channels and outlets.

The Rules require intermediaries to identify the first originator of messages deemed objectionable. This implies that messaging platforms and social networking sites will have to significantly alter their products (and the technology underlying them) to comply. This requirement finds no basis in the parent Act, and is therefore unconstitutional. The Rules also operate from a position of assumed guilt, where all conversations and communications are expected to be scanned for potentially offensive material and traced back to the original sender. This is against the presumption of innocence enshrined in the country's legal system.

Breaking encryption to implement traceability, a fundamental requirement of the new Rules, has international legal implications, as messaging services and social media platforms will need to alter the underlying technical architecture of their products or services – or at least offer a different product and user experience to Indian users. Since this cannot be implemented for users in India alone and will affect every user of these services across the world, social media intermediaries will be in violation of international laws governing user privacy and security, thus inviting legal costs.

Freedom of expression and natural justice

The Rules are seen as violating freedom of expression guaranteed in the Indian constitution by implementing traceability, which breaks encryption. Privacy, also a fundamental right as determined by the Supreme Court of India, is increasingly seen as a ‘make-or-break’ feature of all websites, apps, products, and services. Privacy operates from a position of assumption of innocence of the user. The Rules, by enforcing traceability, violate the fundamental rights of Indian citizens by reducing privacy to a conditional service, and not a constitutional guarantee.

Cost of compliance

When the IT Rules came into effect in May 2021, they were criticized for imposing high costs of compliance, including legal and personal liability attached to employees of social media organizations. In the case of the office of the Chief Compliance Officer (CCO), liability extended even after the CCO retired from office. Every social media and news organization surveyed during this research pointed to the personal liability attached to the role of the CCO, grievance and nodal officers as imposing financial and legal costs on their organizations.

Proactive content filtering requirements will increase human resource needs and demand changes in product and business operations, thereby significantly increasing costs. The traceability clauses of the Rules require an extensive overhaul of the core architecture of messaging services and social networking platforms, demanding significant monetary and human resource investment.

Further, respondents in the Focus Group Discussions (FGDs) believed that ease of doing business will diminish given the stringent compliance regime and employee impact.

The Rules are also framed vaguely and arbitrarily, leading to confusion over operative clauses. Additionally, they impose stringent reporting requirements. This will affect all organizations, especially small and medium enterprises, financially and otherwise.

Skill and competency of Industry

In addition to the legal and ethical concerns emerging from implementation of the Rules, there are knowledge, awareness, and skill gaps across a representative sample of the IT industry, which may impact the ability of organizations to comply with the IT Rules.

Software developers in junior and mid-level roles in IT organizations believe their workload will increase with the IT Rules. Respondents indicated that their jobs will now require more documentation and reporting, and that their role in achieving compliance for the company's products will add to their workload.

Industry representatives, however, felt that tech workers and product managers will fundamentally need knowledge of, or retraining in, privacy features, content filtering, and user experience in order to fully comply with the Rules. Experts in the industry believe that what is missing is not just technical skill or knowledge, but also perspective on, and understanding of, how executing the Rules will impact users of media and tech products.

As noted above, the encryption and traceability requirements of the Rules will mean major changes to products, especially to user experience, and an inability to safeguard the privacy of Indian users. Implementing features such as voluntary verification will require product managers to acquire new skills and knowledge. Tech workers will need to learn how to work in coordination with legal teams. Under the IT Rules, each content takedown request will have to be serviced on a case-by-case basis. This will impact scale and standard operating procedures in organizations, or will push organizations to rely more heavily on automation to censor content proactively (and avoid being served takedown notices). In both cases, users of these products will bear the brunt, as their freedom of speech and expression is drastically reduced.

Individual chapters and sections of the report are presented as submissions.

About the principal researchers

Nadika Nadja is a researcher at Hasgeek. She has worked across advertising, journalism, TV and film production as a writer, editor and researcher.

Bhavani S is a Research Associate at Hasgeek. She has previously worked for the Centre for Budget and Policy Studies (CBPS), Microsoft Research India, and the University of Michigan, Ann Arbor.

Support team

Anish TP illustrated the report. Satyavrat KK provided research and editorial support. David Timethy and Zainab Bawa were project managers for producing this report. Kiran Jonnalagadda and Zainab Bawa advised on research design and execution.


We would like to thank the following individuals who provided feedback during different stages of the research. Their feedback helped the team fine-tune and bring rigour to the research process.

  1. Suman Kar, founder of security firm Banbreach, for reviewing early drafts of the quantitative research questionnaire, and providing detailed inputs on survey design.
  2. Prithwiraj Mukherjee, Assistant Professor of Marketing at IIM-Bangalore, for reviewing early drafts of the quantitative research questionnaire, and providing detailed inputs on survey design.
  3. Chinmayi SK, Founder of The Bachchao Project, for reviewing and providing feedback on the final report and conclusions.

While Hasgeek sought funding from organizations, the research itself was conducted – with full disclosure at all stages – independently and objectively. The findings do not reflect any individual organization’s needs.

  1. Unicef: Growing concern for well-being of children and young people amid soaring screen time (2021)

  2. LiveLaw: Supreme Court Lists Centre’s Transfer Petitions, Connected Cases After 6 Weeks

  3. India Code: The Information Technology Act 2000

  4. India Code: IT Act Definitions

  5. Ministry of Information and Broadcasting: About the ministry

Hosted by

Deep dives into privacy and security, and understanding needs of the Indian tech ecosystem through guides, research, collaboration, events and conferences.

Supported by

Google’s mission is to organize the world’s information and make it universally accessible and useful.


GitHub is the developer company. As the home to more than 65 million developers from across the globe, GitHub is where developers can create, share, and ship the best code possible.


Nadika N

Bhavani Seetharaman


Detailed findings: Legal concerns

Submitted Sep 22, 2021

IT Rules are Ultra Vires of IT Act 2000

The IT Rules are notified under the IT Act, 2000 [1], which provides ‘Safe Harbour’ status to digital intermediaries. That is, an intermediary will not be held liable for content that is merely hosted or transmitted by it, provided it is not the creator or owner of that content.
By nature, subordinate legislation such as the IT Rules 2021 cannot regulate activities that find no mention in the parent law. Nor can the Rules alter definitions or extend the scope of the parent law.

Therefore, respondents from digital media organizations and social media organizations said that the Rules are ultra vires of the parent Act.

The definition of Intermediaries given under the Act is:

“intermediary”, with respect to any particular electronic records, means any person who on behalf of another person receives, stores or transmits that record or provides any service with respect to that record and includes telecom service providers, network service providers, internet service providers, web-hosting service providers, search engines, online payment sites, online-auction sites, online-marketplaces and cyber cafes

Journalists and digital media representatives who participated in the qualitative interviews pointed out that the activities of news publishers and media are regulated by the Ministry of Information and Broadcasting (MIB) [2], and thus excluded from the ambit of the IT Act.

Further, digital news publishers and journalists are governed under the Press Council Act of 1978 [3] and follow editorial and journalistic ethics. They are thus held responsible for the content that they publish. They do not fit the definition of an intermediary under the IT Act.

The Rules curtail rights and mandate the restructuring of processes (around user privacy, censorship of content deemed offensive by the government, etc.) for digital-only news publishers, effectively reducing them to the status of intermediaries, which they are not. Such mandates and requirements find no mention in the parent law. Journalists and digital news publishers have challenged the IT Rules in court on these counts [4].

Digital news publishers are also most concerned about the impact of the Rules on their freedom as a free press that can call out issues in the way governments function.

With existing guidelines under the Press Council Act already governing news organizations, the Rules are seen as overreaching and drafted to censor specific media channels and outlets. As a senior journalist and a member of a trade body representing digital media houses puts it:

“The fact that they had included us in these IT Rules as news publishers was my biggest problem. Because we are a completely different entity, we have a different functioning, we have a separate role as a public service, as news content producers ... everything was wrong with it. Specifically, we are concerned that a committee of bureaucrats will sit and decide what is defamatory. This bypasses the judicial process. For instance, we are already in court with two people who have sued us for defamation. Now (with the IT Rules) those people need not go to the court … They can just go to this committee and get a takedown order on what they see as defamatory content.”

Another senior journalist echoed this sentiment, and added,

“On the Intermediary Liability issue, the IT Rules are very broad, and can be interpreted in any way that vested interests want. It is the one apprehension that everyone has. That is why it is very important that a proper framework, which is crystal clear, brings out the transparent ways to operate.”

Safe harbour status for intermediaries has been considered a gold standard, but the Rules remove it. As subordinate legislation, however, the Rules cannot do this, and are thus ultra vires.

The IT Rules also distinguish between intermediaries based on arbitrary thresholds. Mainly, the Rules differentiate between a Social Media Intermediary and a Significant Social Media Intermediary (SSMI) based on a user-base threshold: SSMIs are defined as organizations with a user base of five million and above. Again, such distinctions and thresholds are not mentioned in the parent law and therefore exceed it, respondents in the qualitative interviews and FGDs said. Some of the SSMI representatives who participated in the qualitative research felt that this was an arbitrary and vague definition. As a public interest technologist puts it:

“I have a problem with this large company and small company differentiation, especially with intermediaries. No company starts off by saying I want to remain a small company. When someone starts a social media company, they want to be the biggest. That is obviously their goal. So if they’re going to have Rules, then the Rules should be consistent for everyone.”
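The mechanics of the distinction are simple. As a hypothetical sketch (the function and constant names below are illustrative, not language from the Rules), the entire classification reduces to a single numeric comparison, which is what respondents characterized as arbitrary:

```python
# Illustrative sketch of the Rules' user-base threshold.
# SSMI_THRESHOLD reflects the five-million figure cited in this report.
SSMI_THRESHOLD = 5_000_000  # registered users

def classify_intermediary(registered_users: int) -> str:
    """Classify a platform under the user-base threshold the Rules introduce."""
    if registered_users >= SSMI_THRESHOLD:
        # SSMIs attract additional due-diligence obligations.
        return "significant social media intermediary"
    return "social media intermediary"

# One user either side of the line changes a platform's obligations entirely.
print(classify_intermediary(4_999_999))  # social media intermediary
print(classify_intermediary(5_000_000))  # significant social media intermediary
```

The sketch makes the objection concrete: compliance burdens change discontinuously at a single number, with no graduated obligations in between.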

To summarize, the IT Rules are unconstitutional because:

  1. They intend to regulate activities that find no mention in the parent Act.
  2. They subject news publishing organizations to censorship, preventing the growth of a free press.
  3. They curtail, and even attempt to replace, the authority of the Press Council as the governing body for news organizations.
  4. They remove safe harbour, violating accepted norms for internet service providers and networking organizations and putting them at risk of prosecution.
  5. They create new categories of ‘media organizations’ based on arbitrary user-base thresholds, thereby exceeding the scope of the parent law.

Infringing the fundamental right to privacy

In 2017, the Supreme Court of India declared Privacy a Fundamental Right of citizens, in the Puttaswamy v. Union of India [5] case. Although India doesn’t currently have a personal data protection law, several countries across the world have implemented such laws – like the General Data Protection Regulation (GDPR) in the European Union – and require tech companies to abide by them.

Messaging services and platforms have increasingly responded to international laws as well as user requirements for privacy, and have implemented safeguards such as end-to-end encryption and restrictions on data collection and retention.

However, the IT Rules demand that messaging services break encryption in order to identify the ‘originator of messages’ – whether or not a crime has been committed. Not only does this affect how tech companies function internationally; breaking encryption and implementing traceability will also seriously harm citizens.

A public interest technologist says,

“The way the Rules have defined the originator law means that there will be no end-to-end encryption. Because you will need some form of fingerprinting… If I want to know who this particular person is who first started this meme, I have to know all the recipients this meme was sent to. If I know that this person has sent this meme to another person, then it’s not encrypted anymore.”
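The concern in the quote above can be made concrete with a minimal sketch. Assuming a hash-based traceability scheme (a hypothetical illustration; none of the names below come from the Rules or from any platform's actual design), identical message content always produces the identical fingerprint, so a platform that can answer "who first sent this?" necessarily holds information derived from message content:

```python
import hashlib

def fingerprint(message: str) -> str:
    # Identical plaintext always yields the identical digest.
    return hashlib.sha256(message.encode("utf-8")).hexdigest()

# Hypothetical platform-side ledger mapping fingerprints to first senders.
ledger: dict[str, str] = {}

def record_send(sender: str, message: str) -> None:
    # Only the first sender of a given piece of content is recorded
    # as its "originator"; forwards do not overwrite the entry.
    ledger.setdefault(fingerprint(message), sender)

record_send("alice", "a viral meme")
record_send("bob", "a viral meme")  # a forward of the same content

# The platform can now trace any copy of the message back to "alice".
print(ledger[fingerprint("a viral meme")])
```

Because the ledger is keyed on content-derived digests, operating it requires the service to compare message content (or hashes of it) across users – exactly the information an end-to-end encrypted service promises never to hold, which is the technologist's point that "it's not encrypted anymore."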

In the FGD with public interest technologists on the IT Rules, a public policy representative explained the dilemma for big tech:

“... companies like WhatsApp and Signal have two choices: either they implement traceability by massively increasing the data collection, or breaking into an encryption and associating metadata of messages with users, number one, or two, they don’t do it. And then because of that, eventually, they get banned by the Indian government or get taken to court. So in practice, I would say that that’s the end game we’re moving towards. Who gives in first, what is actually taped together, is what we have to see.”

Principles of natural justice violated

Encryption and user privacy assume good faith and goodwill on the part of all users. This implies that some bad-faith actors may go undetected in the network, but the majority of users are presumed and treated as innocent. Respondents who participated in the qualitative interviews and round tables said that traceability, or the breaking of encryption, operates from a position of assumed guilt on the part of all users: breaking encryption and violating user privacy flips the position and treats all users as guilty until proven innocent. This is against the principles of natural justice.

By fingerprinting everybody on a messaging platform or social media site, the IT Rules effectively deem that everyone is under suspicion, that everyone is guilty. This has severe consequences for free speech and freedom of expression because any individual can be targeted for anything they said or even for forwarding messages.

A public interest technologist expressed concerns about the lack of a data protection law as a safeguard to the right to privacy.

“… when the Supreme Court ruled that privacy is a fundamental right, this is a right which is not explicitly written in the Constitution. There are no laws around it, compared to all the other fundamental rights. … this is our newest fundamental right, which has been declared by the Supreme Court based on existing fundamental rights. … ideally, what should have happened is the government should have come up with some privacy safeguard (?), or a data privacy act for citizens. But considering that this government argues that Indians do not have a fundamental right to privacy, they are not going to come up with that law. Under the IT Rules, every aspect of privacy will need to be litigated.”

Many of the participants in the qualitative research felt that the Rules – which are in essence meant to clarify how a law is implemented – are vague, and perhaps deliberately so.

As a veteran of the social media sector, and a senior activist, puts it:

“There is very little by way of transparency of orders that are going to be passed under these Rules, as is already becoming quite evident. And users are not going to know. To my mind, this is equal to buying a house or renting a house and the government always having a key to that house with the assurance that ‘Oh, but you know, this is for your own good.’”

A participant from the media industry said:

“The laws have been deliberately framed so vaguely that you don’t know what the scope is or what the scope could be.”

Section 4 of the IT Rules, titled Additional Due Diligence of Social Media Intermediaries, requires social media organizations and services to employ a Chief Compliance Officer (CCO), who will be the single point of contact between the intermediary and the government. This officer, whose role includes responding to the government’s takedown notices for content deemed objectionable or offensive, will be held personally accountable for any violations. The Rules also state that legal proceedings can be carried out against the CCO beyond their term as an employee of the social media organization, with multiple parties, including state and Union governments, able to initiate proceedings.

Back in May 2021, many SSMIs were concerned about hiring for the position of CCO, given the personal liability associated with the role. Who will apply for a role that they know could expose them to prosecution for the rest of their lives?

The legal and policy head of a video sharing platform stated that placing liability on a single individual, i.e., the CCO, is an unprecedented practice. Organizations have a responsibility to protect their employees from legal risks.

“Any platform will be cautious to not expose their employees to a certain legal risk, a criminal liability in this case. I think this will have an impact on freedom of expression, on excessive takedowns … That is the biggest risk.”

The above statement indicates that organizations will trade off users’ interests to protect their employees, i.e., organizations might over-censor content to avoid litigation that would jeopardize employees such as the CCO. Overall, therefore, freedom of expression will be compromised as organizations prioritize protecting their employees.

  1. India Code: The Information Technology Act 2000

  2. Ministry of Information and Broadcasting: About the ministry

  3. India Legislative: Press Council Act

  4. The News Minute: Centre’s new IT Rules challenged in court

  5. Supreme Court of India: Justice Puttaswamy and others


