On June 6, 2022, the Ministry of Electronics and Information Technology (MEITY) released the draft amendments to the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 (IT Rules, 2021) for public feedback.

The Rules were notified in February 2021 and went into effect in May 2021. A detailed study of the IT Rules and Intermediary Liability (IL) Guidelines is published at https://hasgeek.com/PrivacyMode/it-rules-2021/sub

This project covers the key concerns regarding the draft amendments. Privacy Mode has, through continuous discussions with members of the tech and policy community, drawn up a set of recommendations for the amendments. The recommendations were shared with concerned ministers and officials at MeitY on July 5, 2022. Submissions on these recommendations and the trajectory of the IT Rules will be added to this page.

This project aims to delve deeper into issues regarding the current (and future) amendments to the IT Rules, and to amplify conversations around them.


IT Rules Amendments - History, Concerns, and Recommendations

Anwesha Sen (@anwesha25)

Submitted Jul 22, 2022

On June 6, 2022, the Ministry of Electronics and Information Technology (MeitY) released draft amendments to the IT Rules, 2021 for public feedback. Privacy Mode organized a stakeholder meeting with representatives from significant social media intermediaries (SSMIs) and industry bodies on July 1, 2022, to discuss key concerns regarding the draft amendments, as well as recommendations to address them.

The release of the draft amendments was a result of comments and recommendations from organizations and groups such as Privacy Mode. In 2021, Privacy Mode conducted a research project to understand the impact of the Rules on organizations and employees in the tech ecosystem who might be responsible for implementing the Rules and achieving compliance in tech and media products. The report is available here.

Following the release of the draft amendments, MeitY announced that it would accept public comments till the first week of July 2022. Privacy Mode submitted another report to MeitY with recommendations during this period, as discussed in the stakeholder meeting. These recommendations are tabulated in Table 1.

This article is relevant for anyone who is interested in tech policy, cybersecurity, privacy, and/or works in an SME. It covers the following aspects of the IT Rules amendments:

  1. The origin of the IT Rules and recent amendments
  2. Key concerns and major challenges as a result of these regulations
  3. Privacy Mode’s recommendations and the reasoning behind them, using inputs from intermediary industry representatives.

Table 1. Key concerns and corresponding recommendations

Amendment 1: Addition of rules 3(1)(m) and 3(1)(n)
Proposed text: “3(1)(m) the intermediary shall take all reasonable measures to ensure accessibility of its services to users along with reasonable expectation of due diligence, privacy and transparency; 3(1)(n) the intermediary shall respect the rights accorded to the citizens under the Constitution of India.”
Key concerns:
  • Regulatory uncertainty is created by rules 3(1)(m) and 3(1)(n), which place the enforcement of individuals’ fundamental rights on intermediaries.
Recommendations:
  • Refine the regulations to increase specificity and avoid misunderstandings or misinterpretations.

Amendment 2: Creation of a Grievance Appellate Committee (GAC) to provide an appeal mechanism to users
Proposed text: “It is proposed to create an appellate body called ‘Grievance Appellate Committee’ under rule 3(3) of the IT Rules 2021 by invoking section 79 of the IT Act having regard to additional guidelines as may be prescribed by the Central Government. Users will have the option to appeal against the grievance redressal process of the intermediaries before this new appellate body. The Committee will endeavor to address the user’s appeal within a period of 30 days. This is made necessary because currently there is no appellate mechanism provided by intermediaries nor is there any credible self-regulatory mechanism in place.”
Key concerns:
  • The GAC is a direct intervention by the government, which could lead to increased censorship and regulation, as well as biased decision-making on appeals.
Recommendations:
  • Members of the committee should include those with expertise in understanding and adjudicating issues related to social media intermediaries, as well as technical expertise: lawyers, professors, engineers, journalists, etc.
  • Members should be drawn from each industry vertical, since a uniform GAC will not be able to adequately address grievances from all users; members must also understand the nuances of each vertical.
  • Clearly specify the legal backing and punitive powers of the committee so as to avoid ambiguity and uncertainty.
  • Ensure transparency around decision-making, with specific metrics in place to ascertain unbiased decisions by the committee.
  • Constitute a quasi-judicial appellate authority to regulate/enforce free speech while providing reasoned judgements on takedowns/de-platforming of user content/users, alongside better community-level tools to flag offensive content.1

Amendment 3: Addition of two provisos under rule 3(2) (takedown timelines and safeguards against misuse of grievance redressal)
Proposed text: “a. The first proviso will require any complaint for removal of any content under rule 3(1)(b) to be addressed within 72 hours of the receipt of the user’s complaint, because of the very nature of cyber space providing instant communication, outreach and virality. Any other grievance will continue to be addressed within 15 days. This will help to ensure that problematic content is removed expeditiously, and does not become viral over a sustained period of time. b. The second proviso allows intermediaries to implement any safeguards to prevent any misuse of the grievance redressal mechanism by users. For example, where a user submits any inappropriate, trivial or inauthentic complaint, the intermediary may exercise due diligence to prevent such misuse.”
Key concerns:
  • The amendments presume that social media intermediaries have practical knowledge of all the content that users post, at all times, which isn’t the case. As a result, intermediaries cannot be held responsible for users’ compliance.
  • The revised content removal timelines and the self-regulation proviso amount to a form of censorship and oversight.
  • Intermediaries lack the resources to self-regulate and remove content within the given timelines, and will be overburdened.
Recommendations:
  • Exempt intermediaries that do not promote viral content from the compliance measures stated in the IT Rules.
  • Have local community guidelines in place to aid moderation.
  • Clarify the type of safeguards that intermediaries can use to regulate content and favourably deplatform individuals/groups.
  • Use a harms-based approach that allows intermediaries to operate autonomously on takedowns of seriously harmful content, and to act in different ways with respect to content with varying degrees of harm. Intermediaries can be required to proactively disclose details of the content they take down to both regulatory bodies and the public (to educate them about the harms).

Amendment 4: Impact on ‘safe harbour’ status
Key concerns:
  • ‘Safe harbour’ status for intermediaries is directly threatened, as they can be held liable or be banned for failing to comply with takedown notices.
Recommendations:
  • Reiterate the ‘safe harbour’ rules explicitly in the new draft.

IT Rules 2021 - A Timeline

With the advent of messaging apps and social media platforms came the era of end-to-end encryption, which prevents unauthorized access to one’s messages and data. From a national security standpoint, this technology raised concerns, as it became harder for law enforcement and governments to decrypt messages from suspicious individuals and groups. As a result, governments saw a need to regulate data and encryption, which led to the introduction of data protection laws all over the world.2

Such policies ask for encryption backdoors that give governments access to data on intermediary platforms. This is incredibly harmful for the privacy of personal and sensitive data, as it opens platforms up to the possibility of data breaches.3 To address these privacy harms, data protection legislation such as the General Data Protection Regulation (GDPR) in the EU and the draft Data Protection Bill in India was formulated.

Currently, the law that governs data protection in India is the Information Technology Act, 2000 (IT Act), under which the Information Technology (Intermediaries Guidelines) Rules, 2011 (2011 Rules) were enacted to offer further clarity.

The IT Rules 2021 were released to supersede the 2011 Rules. These rules were introduced to regulate significant social media intermediaries (SSMIs), which include any “service provider which transmits, hosts, and publishes user content without exercising editorial control over the content like traditional publishers do”.4

The regulations require SSMIs to appoint a Chief Compliance Officer (CCO), enable identification of the first originator of content on social media platforms, and deploy technology-based measures to identify certain types of content, among other requirements.

Key concerns regarding the IT Rules 2021 are as follows:

  • High cost of compliance. Proactive content filtering requirements also have cost implications.
  • Personal liability on the CCO is of grave concern.
  • Ease of doing business will be impacted, given the stringent compliance regimes and their impact on employees.
  • Vagueness of the rules opens them up to arbitrary interpretation.
  • Tracing the originator/first sharer of content is seen as invasive and raises concerns about privacy and freedom of expression; it could require breaking the encryption currently used by service providers.
  • User notification of content takedowns and provisions for removal of user content can open intermediaries up to arbitration. This increases the cost of operations and of user retention for social media organizations. It also violates freedom of speech and expression.5

As mentioned previously, draft amendments to these rules were released, which include clauses to appoint a ‘Grievance Appellate Committee’ (GAC), add timelines for content removal and user takedowns, provide for self-regulation, etc.

In addition to lacking clarity, the amendments add another layer of government control over content shared on intermediary platforms. They also further diminish the ‘safe harbour’ status of intermediaries and open them up to legal consequences if they fail to regulate content posted on their platforms.

The following section discusses concerns and recommendations regarding the same in detail.

Detailed findings

On regulatory uncertainty

The addition of rules 3(1)(m) and 3(1)(n) places the enforcement of individuals’ fundamental rights onto private bodies. Under these rules, intermediaries are required to “take all reasonable measures to ensure accessibility of its services to users” and “respect the rights accorded to the citizens under the Constitution of India”.

The two rules are quite vague about how they are to be achieved. They assume that intermediary platforms are aware of and have knowledge of all the content posted by users, which is impractical. Additionally, guaranteeing every citizen’s rights is an impossible task for an intermediary. For example,

“If two individuals provide separate content for publication to a private publisher and the platform chooses to publish only one, it can do so. The content creator cannot insist otherwise, claiming his freedom of speech is being infringed by the private publisher.”6

These regulations need to be refined to increase specificity and avoid misunderstandings or misinterpretations. Additionally, since intermediaries are expected to do end-to-end compliance, creating a shared service model or shared resources can help SMEs comply.

On the introduction of the ‘Grievance Appellate Committee’

Under rule 3(3), it is proposed to create an appellate body called ‘Grievance Appellate Committee’ (GAC). Users will have the option to appeal against the grievance redressal process of the intermediaries before this new appellate body.

This committee is reminiscent of the Oversight Board that Facebook (now Meta) had created for grievance redressal. This board was external to Facebook and functioned outside of the platform’s moderation mechanism. The GAC may also function in the same way.7

Such a committee adds a level of regulation beyond the grievance officer. It also adds a layer of control that the Indian government can use to regulate and censor content on platforms, since the committee will be government-backed. Additionally, the lack of independent members and transparency would lead to increased censorship of content and biased decision making.8

Another concern regarding the GAC is that it may not have the capacity to deal with a high volume of grievances.

Members of this committee should include those with expertise in understanding and adjudicating issues related to social media intermediaries, as well as technical expertise, such as lawyers, professors, engineers, journalists, etc. This will help to bring in perspectives from various verticals, which in turn may lead to sound judgements regarding complaints. Understanding the nuances of each industry vertical, and hence adequately addressing grievances from all users, will only be possible with a diverse committee.

The committee could also be established as a self-regulatory body which would “operate as an appellate forum for all content moderation decisions, regardless of the platform from which the appeal originates”.9

However, the legal backing of even an industry-led committee is questionable and remains a grey area. Clarity is required on this aspect, as well as on the punitive powers held by the committee. Creating a set of metrics on which the committee will base its decisions will help make the process more transparent.

Having a quasi-judicial appellate committee that enforces algorithmic accountability and provides reasoned judgements on takedowns and de-platforming of user content or users will also aid in regulating and enforcing free speech, while keeping the process transparent. In addition to this, intermediaries developing better community-level tools to flag offensive content can also help reduce biased decision-making and targeting of specific types of content.10

On takedown/removal duration revision and self-regulation

The amendments to the IT Rules state that intermediaries should “complete” takedowns and removals within 36 hours, and notify users. The previous rules allowed intermediaries to “act within 36 hours” and only work with the user or owner of the content “when applicable.”

Intermediaries are also required to provide information or assistance to law enforcement agencies within a 72-hour time limit. Previous iterations of the rule did not contain a time limit.
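To make the operational implication of these timelines concrete, here is a minimal sketch of how a compliance team might track the statutory clocks. It is illustrative only: the category names, the GRIEVANCE_DEADLINES mapping, and the due_by helper are hypothetical and not part of the Rules or of any real compliance system.

```python
from datetime import datetime, timedelta

# Hypothetical mapping of grievance categories to the response windows
# described in the IT Rules 2021 and the draft amendments (illustrative only).
GRIEVANCE_DEADLINES = {
    "content_removal_complaint": timedelta(hours=72),   # complaints under rule 3(1)(b)
    "takedown_order": timedelta(hours=36),              # takedowns/removals to be "completed"
    "law_enforcement_request": timedelta(hours=72),     # information/assistance to agencies
    "other_grievance": timedelta(days=15),              # all remaining grievances
}

def due_by(category: str, received_at: datetime) -> datetime:
    """Return the latest time by which a grievance of this category must be resolved."""
    try:
        return received_at + GRIEVANCE_DEADLINES[category]
    except KeyError:
        raise ValueError(f"Unknown grievance category: {category!r}")

if __name__ == "__main__":
    received = datetime(2022, 7, 1, 10, 0)
    print(due_by("takedown_order", received))             # 2022-07-02 22:00:00
    print(due_by("content_removal_complaint", received))  # 2022-07-04 10:00:00
```

Even this toy model shows that every complaint category now carries a hard clock; scaling such tracking across large volumes of user reports is the resourcing concern intermediaries describe below.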

The second proviso “allows intermediaries to implement any safeguards to prevent any misuse of the grievance redressal mechanism by users”, i.e. they can self-regulate content on their platforms.

The primary concern regarding the 36-hour timeframe is that it may not even be enough for intermediaries to procure user registration data, which means user notification, content takedown, and setting up grievance redressal require more time than the window allows.

Additionally, the reduced timelines are unworkable even for companies that have processes and teams in place for taking down content. Over time, the volume of takedowns and removals will go up, and the compressed timelines will push intermediaries towards over-censorship.

There is also a need for clarity regarding the impact of these rules on Communications Service Providers (CSPs). CSPs have intermediaries as customers, and they should not be required to take down an intermediary’s platform in the event of the intermediary’s non-compliance.

While the amendment allows intermediaries to self-regulate content, it also requires intermediaries to respond to any user complaint within 15 days or face censure. This leads to arbitrary censorship of digital news media as well as OTT platforms, wherein any content that isn’t favorable to particular groups of people will have to be taken down.

The onus to regulate content is put on intermediaries, which may lead to “platforms preemptively taking content down to comply with the law, causing chilling effects on free speech”.11 It is also unclear what type of safeguards intermediaries can use to regulate content and favourably deplatform individuals or groups.

It is recommended that intermediaries be allowed to use a harm-based approach that allows them to operate autonomously on content takedowns of serious harms and in different ways with respect to content with varying degrees of harms.

Intermediaries can also be required to proactively disclose details of the content they take down to both regulatory bodies and the public (to educate them about the harms). It is helpful to have local community guidelines in place to aid moderation. Proactive algorithmic transparency and explainability, and disclosure of how intermediaries carry out content moderation, are also recommended. Koo’s public explanation of its algorithms is a good example of such practices.12
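As an illustration of what such proactive disclosure might look like, the sketch below models a single takedown-disclosure record. The TakedownDisclosure fields, categories, and example values are assumptions made for illustration; they are not drawn from the Rules or from any intermediary’s actual transparency report.

```python
from dataclasses import dataclass, asdict
from datetime import date
import json

@dataclass
class TakedownDisclosure:
    """One entry in a hypothetical public transparency report on content takedowns."""
    disclosed_on: date
    content_category: str      # e.g. "impersonation", "incitement to violence"
    harm_level: str            # e.g. "serious", "moderate" under a harms-based approach
    rule_invoked: str          # the IT Rules provision or community guideline applied
    action_taken: str          # e.g. "content removed", "account suspended"
    appeal_available: bool     # whether the user can appeal the decision

entry = TakedownDisclosure(
    disclosed_on=date(2022, 7, 15),
    content_category="impersonation",
    harm_level="serious",
    rule_invoked="rule 3(1)(b)",
    action_taken="content removed",
    appeal_available=True,
)

# Publishing structured records like this to regulators and the public
# keeps moderation decisions auditable without exposing user data.
print(json.dumps(asdict(entry), default=str, indent=2))
```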

Certain intermediaries that do not promote viral content, such as dating apps, should be exempt from such compliance measures which are aimed at slowing down the spread of harmful viral content.

On effects on the ‘safe harbour’ status of intermediaries

As a result of these amendments, the ‘safe harbour’ status for intermediaries is further diminished. With ‘safe harbour’ status, intermediaries are not held liable for content shared on their platforms as long as they didn’t create it.

The amendments regarding the Grievance Appellate Committee and self-regulation will lead to intermediaries being held responsible for removal of content as requested by users and/or the committee. Additionally, the limited time frame of 36 hours puts further pressure on intermediaries to comply with such requests. Failure to comply would result in intermediaries being exposed to further liabilities or even being banned.13

These amendments also contradict rules 3(1)(m) and 3(1)(n), which require intermediaries to guarantee the fundamental rights of individuals. For example, an intermediary having to take down a user’s content because it expresses an opinion different from the complainant’s encroaches upon the user’s right to freedom of speech.

Intermediaries feel that it is imperative that the ‘safe harbour’ status be reinstated and laws ensuring it be written explicitly in the new IT Rules amendments. This proposed amendment is antithetical to the safe harbour status and it is important to preserve safe harbour to prevent over-censorship.

Major challenges

In addition to concerns specific to the various rules and clauses, the draft amendments and existing IT Rules bring about three major challenges, especially for SMEs:

  • They lead to an increased cost of operations and business uncertainty. In the current SME ecosystem, there is an existing capacity gap that needs to be filled. Adding such regulations further increases the gap and also creates an atmosphere of psychological pressure with penalizing provisions. These aspects make it even harder for organizations to hire for roles in related fields.
  • SMEs (web hosting providers, merchants running e-commerce websites, communities and providers of website development hosting services) should be concerned about the IT Rules because of the possibility of takedown if there is any violation of content moderation. The (false) assumption that SMEs have knowledge of all the content shared on their platforms increases the possibility of accidental non-compliance, which would have drastic implications for the organization and may even lead to a takedown of the platform.
  • Given the aforementioned challenges, the regulations also have a negative impact on innovation. Through the Open Innovation Project, we saw that policy initiative to encourage innovation in Open Source Software is lacking in India.14 These regulations further alienate creators and potential innovators as they are not conducive to business and also pose a threat in the case of unintentional non-compliance.

Conclusion

Public consultations and providing recommendations on policies such as the IT Rules are imperative to improve the state of data protection in India. Privacy Mode is invested in facilitating further interactions between MeitY and representatives from the SME community and Public Interest technologists. In its current form, the IT Rules and draft amendments require a lot more clarity and reformulation to make them business-friendly as well as privacy-friendly. For this, it is crucial to take into account perspectives and experiences of a wide range of stakeholders.


  1. Nadja, Nadika, and Bhavani Seetharaman. “IT Rules 2021 Intermediary Guidelines: Concerns and Recommendations from the Community.” Hasgeek, 2021, https://hasgeek.com/PrivacyMode/it-rules-2021/sub/it-rules-2021-intermediary-guidelines-concerns-and-7swxnPZrCPooJUdLPPqz2u. ↩︎

  2. Green, Matthew D. “Keynote: End-to-End Encryption: State of Technical and Policy Debates.” Data Privacy Conference, Hasgeek, 2021, https://hasgeek.com/rootconf/data-privacy-conference/schedule/end-to-end-encryption-state-of-the-technical-and-policy-debate-R7V65dTuMkA669f8ByFpHb. ↩︎

  3. Green, Matthew D. “Keynote: End-to-End Encryption: State of Technical and Policy Debates.” Data Privacy Conference, Hasgeek, 2021, https://hasgeek.com/rootconf/data-privacy-conference/schedule/end-to-end-encryption-state-of-the-technical-and-policy-debate-R7V65dTuMkA669f8ByFpHb. ↩︎

  4. “Explainer: How the New IT Rules Take Away Our Digital Rights.” The Wire, 2021, https://thewire.in/tech/explainer-how-the-new-it-rules-take-away-our-digital-rights. ↩︎

  5. Nadja, Nadika, and Bhavani Seetharaman. “IT Rules 2021 Intermediary Guidelines: Concerns and Recommendations from the Community.” Hasgeek, 2021, https://hasgeek.com/PrivacyMode/it-rules-2021/sub/it-rules-2021-intermediary-guidelines-concerns-and-7swxnPZrCPooJUdLPPqz2u. ↩︎

  6. Rudra, Tapanjana. “Draft IT Rules Amendment: Legal Experts Call for an Independent Grievance Appellate Committee.” Inc42 Media, 18 June 2022, https://inc42.com/buzz/draft-it-rules-amendment-legal-experts-call-for-an-independent-grievance-appellate-committee/. ↩︎

  7. Jain, Anushka. “Summary of Proposed IT Rules Amendments, Now Withdrawn by Meity.” MediaNama, 4 June 2022, https://www.medianama.com/2022/06/223-meitys-proposed-amendment-to-it-rules-2021/. ↩︎

  8. Jain, Anushka. “Summary of Proposed IT Rules Amendments, Now Withdrawn by Meity.” MediaNama, 4 June 2022, https://www.medianama.com/2022/06/223-meitys-proposed-amendment-to-it-rules-2021/. ↩︎

  9. Matthan, Rahul. “An Ideal Approach to Social Media Grievance Redressal.” Mint, 14 June 2022, https://www.livemint.com/opinion/columns/an-ideal-approach-to-social-media-grievance-redressal-11655225995146.html. ↩︎

  10. Nadja, Nadika, and Bhavani Seetharaman. “IT Rules 2021 Intermediary Guidelines: Concerns and Recommendations from the Community.” Hasgeek, 2021, https://hasgeek.com/PrivacyMode/it-rules-2021/sub/it-rules-2021-intermediary-guidelines-concerns-and-7swxnPZrCPooJUdLPPqz2u. ↩︎

  11. Ganesan, Aarathi. “Experts Flag Free Speech and Self-Censorship in India’s Amended IT Rules.” MediaNama, 22 June 2022, https://www.medianama.com/2022/06/223-it-rules-amendments-india-free-speech-big-tech/. ↩︎

  12. “Algorithms@Koo.” Koo, 7 Apr. 2022, https://info.kooapp.com/algorithms-at-koo. ↩︎

  13. Fatterpekar, Shrey. “IT Rules 2021 Explained: Non-Compliance Will Expose WhatsApp, Facebook, Twitter to Significant Liability.” Firstpost, 27 May 2021, https://www.firstpost.com/india/it-rules-2021-explained-non-compliance-will-expose-whatsapp-facebook-twitter-to-significant-liability-9661461.html. ↩︎

  14. Seetharaman, Bhavani. “Mozilla Open Innovation Project: Understanding Innovation in the Indian Tech Ecosystem.” Hasgeek, 2022, https://hasgeek.com/OpenInnovation/mozilla-open-innovation-project-understanding-innovation-in-the-indian-tech-ecosystem/. ↩︎

