Social media networks, chat and messaging applications, and photo and video sharing services have radically transformed the internet landscape in India and elsewhere over the last decade. User-generated content has allowed diverse voices to create and share views, political opinions, dance videos, and movie and music commentary.

While these platforms and networks have encouraged such voices, there is also growing concern1 over the sharing of potentially offensive material such as pornographic content, child sexual abuse material (CSAM), hate speech, and violent content often unsuitable for the wide audiences such platforms cater to.

The Information Technology (Guidelines for Intermediaries and Digital Media Ethics Code) Rules, 2021, notified by the Ministry of Electronics and Information Technology (MeitY) together with the Ministry of Information and Broadcasting (MIB), Government of India, under the IT Act, 2000, seek to monitor and control user-generated content and provide firm guidelines for social media intermediaries, digital news publications, and other organizations that host or transfer content on the internet.

The Rules were notified in February 2021, and went into effect in May 2021. Organizations and individuals have challenged the Rules on various counts2 – including their applicability under the parent law. Large platforms and social media networks have expressed concern about implementation and compliance.

Privacy Mode, a hub for conversations around privacy, data security and compliance, conducted a two-part research project seeking to understand the impact of the Rules on organizations and employees in the tech ecosystem who might be responsible for implementing the Rules and achieving compliance in tech and media products.

A qualitative study of social media platforms, digital news publications, and cloud computing service providers, examining the possible impact on encryption, traceability, compliance, and applicability of the law, among other issues, was conducted in May–June 2021. A quantitative survey of tech workers across India, looking at awareness, professional and personal impact, workflows, and requirements, was conducted in June–July 2021.

This report is a comprehensive analysis of both surveys and presents a rounded picture of the impact of the IT Rules 2021 on organizations and their employees. It also examines larger questions and concerns about privacy and freedom of speech and expression, given the ongoing debates around responsible tech, digital platforms and ethics, and their impact on society and individuals.

Executive Summary and Core Concerns

Scope of the law

By definition, the ‘Rules’ framed under any law in India are ‘subordinate legislation’ or ‘delegated legislation’. While laws are made by the Parliament/Legislature, Rules are made by the Executive, i.e., the Government of India, to fulfill the requirements of the parent law. In Indian democracy, only the Legislature can make laws; the Executive can only implement them. If the law says ‘XYZ has to be accomplished’, the Rules can frame the methods by which ‘XYZ’ is accomplished. In the case of the IT Rules 2021, however, the Rules are seen as overreaching and exceeding the parent law.

Notified under the Information Technology Act, 2000,3 which provides ‘Safe Harbour’ status to digital intermediaries, the Rules are ultra vires of the parent Act and seek to regulate activities that find no mention in it. Further, bringing digital news publishers under the ambit of the Rules is unconstitutional and ultra vires of the IT Act, as news websites do not fit the definition of ‘intermediaries’ given under the Act4.

Further, the activities of news publishers and media are regulated by the Ministry of Information and Broadcasting (MIB)5, and thus excluded from the ambit of the IT Act. Concerns emerged that the Rules – which did not pass through the legislative body – sought to curtail rights and laws that did emerge from due legislative process.

Further, with existing guidelines under the Press Council Act already governing news organizations, the Rules are seen as overreaching and drafted to censor specific media channels and outlets.

The Rules require intermediaries to identify the first originator of messages deemed objectionable. This implies that messaging platforms and social networking sites will have to significantly alter their products (and the technology underlying them) to comply. This requirement, too, is not grounded in the parent Act, and is therefore unconstitutional. The Rules also operate from a position of assumed guilt, where all conversations and communications are expected to be scanned for potentially offensive material and traced back to the original sender. This runs against the presumption of innocence enshrined in the country’s legal system.

Breaking encryption to implement traceability, a fundamental requirement of the new Rules, has international legal implications. Messaging services and social media platforms will need to alter the underlying technical architecture of their products or services, or at least offer a different product and user experience for Indian users. Since such changes cannot be implemented for users in India alone and will affect every user of these services across the world, social media intermediaries risk violating international laws governing user privacy and security, thus inviting legal costs.

Freedom of expression and natural justice

The Rules are seen as violating the freedom of expression guaranteed in the Indian Constitution by implementing traceability, which breaks encryption. Privacy, also a fundamental right as determined by the Supreme Court of India, is increasingly seen as a ‘make-or-break’ feature of websites, apps, products, and services. Privacy operates from a presumption of the user’s innocence. By enforcing traceability, the Rules violate the fundamental rights of Indian citizens, reducing privacy to a conditional service rather than a constitutional guarantee.

Cost of compliance

When the IT Rules came into effect in May 2021, they were criticized for imposing high costs of compliance, including legal and personal liability attached to employees of social media organizations. In the case of the Chief Compliance Officer (CCO), liability extends even after the CCO leaves office. Every social media and news organization surveyed during this research pointed to the personal liability attached to the roles of the CCO, grievance officer, and nodal officer as imposing financial and legal costs on their organizations.

Proactive content filtering requirements will increase human resource needs and demand changes in product and business operations, thereby significantly increasing costs. The traceability clauses under the Rules require an extensive overhaul of the core architecture of messaging services and social networking platforms, demanding significant monetary and human resource investment.

Further, respondents in the Focus Group Discussions (FGDs) believed that ease of doing business will diminish given the stringent compliance regime and employee impact.

The Rules are also framed vaguely and arbitrarily, leading to confusion over operating clauses, and they impose stringent reporting requirements. This will affect all organizations, especially small and medium enterprises, both financially and otherwise.

Skill and competency of Industry

In addition to the legal and ethical concerns emerging from implementation of the Rules, there are knowledge, awareness, and skill gaps across a representative sample of the IT industry, which may impact the ability of organizations to comply with the IT Rules.

Software developers in junior and mid-level roles in IT organizations believe their workload will increase under the IT Rules. Respondents indicated that their jobs would now require more documentation and reporting, and that their role in achieving compliance for the company’s products would add to their workload.

Industry representatives, however, felt that tech workers and product managers will need knowledge of, or retraining in, privacy features, content filtering, and user experience in order to fully comply with the Rules. Industry experts believe that what is missing, beyond technical skills or knowledge, is perspective: an understanding of how executing the Rules will impact users of media and tech products.

As noted above, the encryption and traceability requirements of the Rules will force major changes in products, especially in user experience, and will leave platforms unable to safeguard the privacy of Indian users. Implementing features such as voluntary verification will require product managers to acquire new skills and knowledge, and tech workers will need to learn to work in coordination with legal teams. Under the IT Rules, each content takedown request will have to be serviced on a case-by-case basis. This will impact scale and standard operating procedures in organizations, or will push organizations to rely more heavily on automation to censor content proactively (and to avoid being served takedown notices). In both cases, users of these products will bear the brunt, with their freedom of speech and expression drastically reduced.

Individual chapters and sections of the report are presented as submissions below.

About the principal researchers

Nadika Nadja is a researcher at Hasgeek. She has worked across advertising, journalism, TV and film production as a writer, editor and researcher.

Bhavani S is a Research Associate at Hasgeek. She has previously worked for the Centre for Budget and Policy Studies (CBPS), Microsoft Research India, and the University of Michigan, Ann Arbor.

Support team

Anish TP illustrated the report. Satyavrat KK provided research and editorial support. David Timethy and Zainab Bawa were project managers for producing this report. Kiran Jonnalagadda and Zainab Bawa advised on research design and execution.

Acknowledgements

We would like to thank the following individuals who provided feedback during different stages of the research. Their feedback helped the team fine-tune and bring rigour to the research process.

  1. Suman Kar, founder of security firm Banbreach, for reviewing early drafts of the quantitative research questionnaire, and providing detailed inputs on survey design.
  2. Prithwiraj Mukherjee, Assistant Professor of Marketing at IIM-Bangalore, for reviewing early drafts of the quantitative research questionnaire, and providing detailed inputs on survey design.
  3. Chinmayi SK, Founder of The Bachchao Project, for reviewing and providing feedback on the final report and conclusions.

While Hasgeek sought funding from organizations, the research itself was conducted – with full disclosure at all stages – independently and objectively. The findings do not reflect any individual organization’s needs.


  1. Unicef: Growing concern for well-being of children and young people amid soaring screen time (2021) - https://www.unicef.org/press-releases/growing-concern-well-being-children-and-young-people-amid-soaring-screen-time ↩︎

  2. LiveLaw: Supreme Court Lists Centre’s Transfer Petitions, Connected Cases After 6 Weeks https://www.livelaw.in/top-stories/it-rules-2021-supreme-court-lists-centres-transfer-petitions-connected-cases-after-6-weeks-181032 ↩︎

  3. India Code: The Information Technology Act 2000 https://www.indiacode.nic.in/bitstream/123456789/1999/3/A2000-21.pdf ↩︎

  4. India Code: IT Act Definitions https://www.indiacode.nic.in/show-data?actid=AC_CEN_45_76_00001_200021_1517807324077&sectionId=13011&sectionno=2&orderno=2 ↩︎

  5. Ministry of Information and Broadcasting: About the ministry https://mib.gov.in/about-us/about-the-ministry ↩︎


Detailed findings: Ethical concerns

Submitted Sep 19, 2021

Broad definitions of unlawful content

The government claims the IT Rules are framed as a response to illegal or unlawful content, including hate speech, pornography, and Child Sexual Abuse Material (CSAM), among others. Under the Rules, unlawful content also includes content deemed against national interest or national security, including content that tarnishes the image of the nation.

The vague, broad-brush nature of such definitions has been called out by activists, journalists, public interest technologists, and representatives of social media intermediaries1. They believe that the Rules effectively curtail freedom of speech and give the Government the power to classify dissent and criticism as objectionable content and demand that it be taken down.

We have already seen instances of posts on social media platforms (most notably Twitter) being taken down since the IT Rules came into effect in May 2021. Popular opposition politicians and activists have also been denied access to their social media accounts2.

As an expert on the social media space, with a historical understanding of the sector, says,

“To de-platform sitting Members of Parliament, to de-platform journalists and other critics for tweets that by any objective analysis were only critical of the government’s handling of the farmers’ protests and were not invoking violence or insurrection. And there was no direct linkage of offline violence with online speech. But the intent of the government is such that it can use any of these Rules to issue any orders, and get any kind of compliance out of these platforms. And at the end of it, you basically have users who suffer the consequences of this, be it from the content being withheld in particular geographies, or the content being forced to be deleted, or their accounts being deleted, or even worse, legal action against these folks.”

Given the broad nature of these definitions and the difficulties of implementation, one must ask what recourse exists for individuals unfairly penalised by the law. This is especially true of content creators. A senior journalist highlights:

“In the intermediary rules, you have this definition where misinformation or harmful content is not defined. Is my criticism of Yogi Adityanath harmful content? If a sub-inspector in Raebareli thinks it is, then it is, and I have no recourse, because this has not been defined in a way that it doesn’t leave any scope for interpretation by those enforcing these laws. It automatically leads you to believe that it is going to be prone to misuse, and which it will be. And it goes back to the question of how bureaucracies are made. And, who knows what’s going to come under the purview of these orders. It really will depend on the imagination of a cop seeking a promotion, or seeking a quick favour with a political overlord, or a political overlord putting the pressure on a cop to score some quick points with bigger political overlords and this chain doesn’t stop.”

Many of our interview and FGD respondents wondered what the overall priority of the law would be, considering how it has been implemented in the last few months, with the deplatforming of opposition leaders and the intense focus on social media intermediaries. An industry veteran and activist said:

“What’s our priority there? Is it the safety and security of society? Are we looking down at society in a kind of a patronizing manner? And to think that people cannot actually engage in consensual activity on the internet and decide for themselves? Or do we really need to offer that level of protection to people? Are we dealing with adults here or kids?”

An industry veteran, journalist and activist said:

“I think instinctively I feel like it looks at the online sphere as an extension of the public sphere. And exactly how it polices the public sphere with terrible laws, it has gone on to extend to the online sphere as well. So I see very little disconsonance in the government’s approach to how to police our online and offline lives.”

Freedom of speech curtailed

Respondents also felt that the IT Rules are designed to curtail freedom of speech and expression. Digital publications will now have to follow a three-tier self-regulatory model which includes a grievance officer within the organization, a self-regulating body headed by a retired Supreme Court or High Court judge or an eminent person from the industry, and finally the Union government’s oversight at the top.

This model – with the Union government being the ultimate arbiter of what content is allowed to be published and what is deemed objectionable – has rightly been called out for stifling freedom of speech and placing limits on a free press.

As a journalist covering tech and business sectors for an international publication says,

“It’s no secret that press freedom in the country has been plummeting over the last few years. Forcing a rule like that (IT Rules) will put massive powers in the hands of the government to basically force digital media outlets to take off content. Is that not a pressing issue? I mean what is more of a pressing issue than that right now? You know, the whole world is talking about the shrinking space for dissent and free speech in India.”

A tech journalist felt that the onus of pushback rested on the public. He said,

“On the one hand, they (the platforms) do have to sort of legally comply with most of these IT Rules. But on the other hand, the IT rules force them to not stand up for their users’ free speech. I think you are going to see a lot of pushback from people about the platforms, on the platform themselves. I think this is a good thing. Because, if there is one thing that platforms are really sensitive to is this bad press, and I think often bad press, if done well, is quite effective in getting them to take a harder look at their policies around these things.”

Another point raised was that many platforms already have internal mechanisms for content filtering based on their own mandates and codes. Even small and upcoming platforms have implemented such practices, with some smaller startups relying on third-party contractors to aid in content moderation3. However, with the large-scale implementation of this legislation, more content will have to be reviewed, and the scale of such operations may drive greater dependence on AI/ML for review. Such automated practices are considered to lack the accuracy and reliability of human reviewers4.

A public interest technologist opined that larger entities have already invested in the manpower such requirements demand:

“I think it overlooks the fact that Facebook already has armies that are on a daily basis filtering content, according to their content posting guidelines 10s of 1000s of employees, whose basic job is to just go through picture after picture and decide whether it should be posted or not to be allowed to be posted or not... But at the same time, we get into a bit of a gray area. What’s our priority there? Is it safety and security of the society? Are we looking down at society in a kind of a patronizing manner? And think that people cannot actually engage in consensual activity on the internet and decide for themselves? Or do we need, do we really need to offer that level of protection to people? Are we dealing with adults here or kids?”

As a senior journalist puts it,

“There are many vloggers (video bloggers) on the platform doing a good job. Some give life experiences, some do some factual reporter jobs. There is one guy putting whatever reports he has and makes about $300 a month, which is 1/10 of the salary he was making in a mainstream newspaper before he lost his job. I’m talking about that guy also, and I’m talking about engineering graduates sitting in small towns. There are hundreds and thousands of them. They have their heart in the right place, and they are doing a decent job and sincerely. Now all of them will come under this …... How will that reporter hire a compliance officer? He is not even covering the costs of creating this content, he is not getting back the money he’s put into creating it that. Smaller, individual players are many in this country. They all will have problems, they all will be liable for whatever they publish. All I’m saying is, have rules, but don’t put this cost on them. They can’t hire a compliance officer.”

Broad scope and arbitrary governance

The IT Rules’ stated aim is to govern digital news publishers and online curated content providers. With many traditional and legacy media houses having a strong digital presence, there was uncertainty over whether the Rules would apply to them. The Rules as notified in the Gazette of India did not mention the digital arms of traditional media houses, and it appeared that the content these arms produce or transmit would not be regulated under the Rules. Digital news publishers were very quick to respond and file cases challenging the Rules5, while legacy media stayed relatively quiet.

Representatives of digital news organizations felt that although they operate under the Press Council Act and are subject to regulatory oversight by the Press Council of India, they were still being targeted by the government. A month later, in June, the government issued a clarification – after pushback from digital media houses that had already begun complying with the Rules – that the digital arms of legacy/mainstream media would also be governed under the Rules6.

Digital news organizations, it appeared, had anticipated such legislation by observing the government’s decisions. As the founder of a digital news organization, and an office bearer of the industry body for digital news publishers, puts it,

“We did know that the government was trying to do something for a while. They had been giving hints that digital media will be brought under Rules, that there will be some control over digital media. We understood at that time that the first rule that the government will bring in is to somehow put clauses and control on funding, especially funding from abroad. So we knew that much… The Union Government at least twice said that if they had to bring in rules, it will be for the digital media first, because ‘we’ don’t have any rules. I think most of us anticipated that these rules will be brought in as soon as possible. And it did happen like that.”

Similarly, others raised concerns over the different rules for those deemed significant social media intermediaries (SSMIs) and other social media intermediaries (SMIs). As a public interest technologist explained,

“They have brought in the distinction between significant social media, and social media - where significant is 50 lakh (users) or something. But that doesn’t actually make sense. You can have lots of harms, even in small communities. There are forums online where there are only like thousand users and they share child pornography etc. So, it doesn’t mean that size is necessarily the right parameter.”

These examples suggest that regulation should consider not just the size of a platform, but also the type of content it hosts and its distribution mechanisms, requiring closer scrutiny of the nature of the content itself.


  1. Firstpost: IT rules 2021 add to fears over online speech, privacy; critics believe it may lead to ‘outright censorship’ https://www.firstpost.com/tech/news-analysis/it-rules-2021-add-to-fears-over-online-speech-privacy-critics-believe-it-may-lead-to-outright-censorship-9810571.html ↩︎

  2. Reuters: India’s Rahul Gandhi says blocked by Twitter for political reasons https://www.reuters.com/world/india/indias-rahul-gandhi-says-blocked-by-twitter-political-reasons-2021-08-13/ ↩︎

  3. The Ken: The cracks in ShareChat, Moj’s content moderation machine https://the-ken.com/story/the-cracks-in-sharechat-mojs-content-moderation-machine/ ↩︎

  4. New America: The limitations of Automated Tools in Content Moderation https://www.newamerica.org/oti/reports/everything-moderation-analysis-how-internet-platforms-are-using-artificial-intelligence-moderate-user-generated-content/the-limitations-of-automated-tools-in-content-moderation/ ↩︎

  5. The News Minute: Centre’s new IT rules challenged in court, Delhi HC issues notice to I&B Ministry https://www.thenewsminute.com/article/centre-s-new-it-rules-challenged-court-delhi-hc-issues-notice-ib-ministry-144890 ↩︎

  6. The Wire: Govt says mainstream media not exempt from new IT Rules https://thewire.in/media/govt-says-mainstream-media-not-exempt-from-new-it-rules-asked-to-comply-with-provisions ↩︎

