Bhavani Seetharaman
@Bhavani_21

Nadika N
@nadikanadja

Detailed findings: Ethical concerns

Submitted Sep 19, 2021

Broad definitions of unlawful content

The government claims the IT Rules are framed as a response to illegal or unlawful content, including hate speech, pornography and Child Sexual Abuse Material (CSAM), among others. Under the Rules, unlawful content also includes content deemed against national interest or national security, including content that tarnishes the image of the nation.

The vague, broad-brush nature of these definitions has been called out by activists, journalists, public interest technologists, and representatives of social media intermediaries[1]. They believe that the Rules effectively curtail freedom of speech and give the Government the power to classify dissent and criticism as objectionable content and demand that it be taken down.

We have already seen instances of posts on social media platforms (most notably Twitter) being taken down since the IT Rules came into effect in May 2021. Popular opposition politicians and activists have been denied access to their social media accounts[2].

As an expert on the social media space, with a historical understanding of the sector, puts it:

“To de-platform sitting Members of Parliament, to de-platform journalists and other critics for tweets that by any objective analysis were only critical of the government’s handling of the farmers’ protests and were not invoking violence or insurrection. And there was no direct linkage of offline violence with online speech. But the intent of the government is such that it can use any of these Rules to issue any orders, and get any kind of compliance out of these platforms. And at the end of it, you basically have users who suffer the consequences of this, be it from the content being withheld in particular geographies, or the content being forced to be deleted, or their accounts being deleted, or even worse, legal action against these folks.”

Given the broad nature of the definitions and the difficulties of implementation, one must ask what recourse is available to individuals unfairly penalised under the law. This is especially true of content creators. A senior journalist highlights:

“In the intermediary rules, you have this definition where misinformation or harmful content is not defined. Is my criticism of Yogi Adityanath harmful content? If a sub-inspector in Raebareli thinks it is, then it is, and I have no recourse, because this has not been defined in a way that it doesn’t leave any scope for interpretation by those enforcing these laws. It automatically leads you to believe that it is going to be prone to misuse, and which it will be. And it goes back to the question of how bureaucracies are made. And, who knows what’s going to come under the purview of these orders. It really will depend on the imagination of a cop seeking a promotion, or seeking a quick favour with a political overlord, or a political overlord putting the pressure on a cop to score some quick points with bigger political overlords and this chain doesn’t stop.”

Many of our interview and FGD respondents wondered what the overall priority of the law really is, given how it has been implemented in the last few months, with the deplatforming of opposition leaders and the intense focus on social media intermediaries. An industry veteran and activist said:

“What’s our priority there? Is it the safety and security of society? Are we looking down at society in a kind of a patronizing manner? And to think that people cannot actually engage in consensual activity on the internet and decide for themselves? Or do we really need to offer that level of protection to people? Are we dealing with adults here or kids?”

An industry veteran, journalist and activist said:

“I think instinctively I feel like it looks at the online sphere as an extension of the public sphere. And exactly how it polices the public sphere with terrible laws, it has gone on to extend to the online sphere as well. So I see very little disconsonance in the government’s approach to how to police our online and offline lives.”

Freedom of speech curtailed

Respondents also felt that the IT Rules are designed to curtail freedom of speech and expression. Digital publications will now have to follow a three-tier self-regulatory model which includes a grievance officer within the organization, a self-regulating body headed by a retired Supreme Court or High Court judge or an eminent person from the industry, and finally the Union government’s oversight at the top.

This model, with the union government as the ultimate arbiter of what content may be published and what is deemed objectionable, has rightly been called out for stifling freedom of speech and placing limits on a free press.

As a journalist covering tech and business sectors for an international publication says,

“It’s no secret that press freedom in the country has been plummeting over the last few years. Forcing a rule like that (IT Rules) will put massive powers in the hands of the government to basically force digital media outlets to take off content. Is that not a pressing issue? I mean what is more of a pressing issue than that right now? You know, the whole world is talking about the shrinking space for dissent and free speech in India.”

A tech journalist felt that the onus of pushback rested on the public. He said,

“On the one hand, they (the platforms) do have to sort of legally comply with most of these IT Rules. But on the other hand, the IT rules force them to not stand up for their users’ free speech. I think you are going to see a lot of pushback from people about the platforms, on the platform themselves. I think this is a good thing. Because, if there is one thing that platforms are really sensitive to is this bad press, and I think often bad press, if done well, is quite effective in getting them to take a harder look at their policies around these things.”

Another point respondents raised is that many platforms already have internal mechanisms for content filtering, based on their own mandates and codes. Even small and upcoming platforms have implemented such practices, with some smaller startups relying on third-party contractors to aid in content moderation[3]. However, the large-scale review that this legislation demands may push platforms towards a greater dependence on AI/ML for reviewing content, and such automated systems are considered to lack the accuracy and reliability of human moderators[4].

A public interest technologist opined that larger entities have already invested in the manpower needed to comply with such laws:

“I think it overlooks the fact that Facebook already has armies that are on a daily basis filtering content, according to their content posting guidelines 10s of 1000s of employees, whose basic job is to just go through picture after picture and decide whether it should be posted or not to be allowed to be posted or not... But at the same time, we get into a bit of a gray area. What’s our priority there? Is it safety and security of the society? Are we looking down at society in a kind of a patronizing manner? And think that people cannot actually engage in consensual activity on the internet and decide for themselves? Or do we need, do we really need to offer that level of protection to people? Are we dealing with adults here or kids?”

As a senior journalist puts it,

“There are many vloggers (video bloggers) on the platform doing a good job. Some give life experiences, some do some factual reporter jobs. There is one guy putting whatever reports he has and makes about $300 a month, which is 1/10 of the salary he was making in a mainstream newspaper before he lost his job. I’m talking about that guy also, and I’m talking about engineering graduates sitting in small towns. There are hundreds and thousands of them. They have their heart in the right place, and they are doing a decent job and sincerely. Now all of them will come under this …... How will that reporter hire a compliance officer? He is not even covering the costs of creating this content, he is not getting back the money he’s put into creating it that. Smaller, individual players are many in this country. They all will have problems, they all will be liable for whatever they publish. All I’m saying is, have rules, but don’t put this cost on them. They can’t hire a compliance officer.”

Broad scope and arbitrary governance

The IT Rules' stated aim is to govern digital news publishers and online curated content providers. With many traditional and legacy media houses having a strong digital presence, there was uncertainty over whether the Rules would apply to them. The Rules as notified in the Gazette of India did not mention the digital arms of traditional media houses, and it appeared that the content they produce or transmit would not be regulated under the Rules. Digital news publishers were quick to respond and challenge the Rules in court[5], while legacy media stayed relatively quiet.

Representatives of digital news organisations felt that although they operate under the Press Act of India and are subject to regulatory oversight by the Press Council of India, they were still being targeted by the government. A month later, in June, the government issued a clarification – after pushback from digital media houses that had already begun complying with the Rules – that the digital arms of legacy/mainstream media would also be governed under the Rules[6].

It appeared that digital news organisations, having observed the government's decisions, were prepared for such legislation. As the founder of a digital news organisation, and an office bearer of the industry body for digital news publishers, puts it:

“We did know that the government was trying to do something for a while. They had been giving hints that digital media will be brought under Rules, that there will be some control over digital media. We understood at that time that the first rule that the government will bring in is to somehow put clauses and control on funding, especially funding from abroad. So we knew that much… The Union Government at least twice said that if they had to bring in rules, it will be for the digital media first, because ‘we’ don’t have any rules. I think most of us anticipated that these rules will be brought in as soon as possible. And it did happen like that.”

Similarly, others raised concerns over the differing rules for those deemed significant social media intermediaries (SSMIs) and other social media intermediaries (SMIs). As a public interest technologist explained:

“They have brought in the distinction between significant social media, and social media - where significant is 50 lakh (users) or something. But that doesn’t actually make sense. You can have lots of harms, even in small communities. There are forums online where there are only like thousand users and they share child pornography etc. So, it doesn’t mean that size is necessarily the right parameter.”

Given these examples, it is important that regulations take into consideration not just the size of a platform but also the type of content and its distribution mechanisms, which requires deeper reflection on the nature of the content itself.


1. Firstpost: IT rules 2021 add to fears over online speech, privacy; critics believe it may lead to ‘outright censorship’. https://www.firstpost.com/tech/news-analysis/it-rules-2021-add-to-fears-over-online-speech-privacy-critics-believe-it-may-lead-to-outright-censorship-9810571.html
2. Reuters: India’s Rahul Gandhi says blocked by Twitter for political reasons. https://www.reuters.com/world/india/indias-rahul-gandhi-says-blocked-by-twitter-political-reasons-2021-08-13/
3. The Ken: The cracks in ShareChat, Moj’s content moderation machine. https://the-ken.com/story/the-cracks-in-sharechat-mojs-content-moderation-machine/
4. New America: The limitations of automated tools in content moderation. https://www.newamerica.org/oti/reports/everything-moderation-analysis-how-internet-platforms-are-using-artificial-intelligence-moderate-user-generated-content/the-limitations-of-automated-tools-in-content-moderation/
5. The News Minute: Centre’s new IT rules challenged in court, Delhi HC issues notice to I&B Ministry. https://www.thenewsminute.com/article/centre-s-new-it-rules-challenged-court-delhi-hc-issues-notice-ib-ministry-144890
6. The Wire: Govt says mainstream media not exempt from new IT Rules. https://thewire.in/media/govt-says-mainstream-media-not-exempt-from-new-it-rules-asked-to-comply-with-provisions
