Detailed findings: Legal concerns
The IT Rules are notified under the IT Act, 2000[1], which provides ‘Safe Harbour’ status to digital intermediaries. That is, the intermediary will not be held liable for content that is merely hosted or transmitted by it, provided it is not the creator or owner of that content.
By nature, subordinate legislation, such as the IT Rules 2021, cannot regulate activities that find no mention in the parent law. Nor can the Rules alter definitions or extend the scope of the parent law.
Therefore, respondents from digital media organizations and social media organizations said that the Rules are ultra vires of the parent Act.
The definition of Intermediaries given under the Act is:
“intermediary”, with respect to any particular electronic records, means any person who on behalf of another person receives, stores or transmits that record or provides any service with respect to that record and includes telecom service providers, network service providers, internet service providers, web-hosting service providers, search engines, online payment sites, online-auction sites, online-marketplaces and cyber cafes
Journalists and digital media representatives who participated in the qualitative interviews pointed out that the activities of news publishers and media are regulated by the Ministry of Information and Broadcasting (MIB)[2], and thus excluded from the ambit of the IT Act.
Further, digital news publishers and journalists are governed under the Press Council Act of 1978[3] and follow editorial and journalistic ethics. They are thus held responsible for the content that they publish. They do not fit the definition of an intermediary under the IT Act.
For digital-only news publishers, the Rules curtail rights (such as user privacy) and mandate compliance processes (such as takedown of content the government deems offensive), effectively reducing them to the status of intermediaries, which they are not. Such mandates and requirements find no mention in the parent law. Journalists and digital news publishers have challenged the IT Rules in court on these counts[4].
Digital news publishers are also deeply concerned about the impact of the Rules on press freedom, and on their ability to call out issues in the way governments function.
With existing guidelines under the Press Council Act already governing news organizations, the Rules are seen as overreaching and drafted to censor specific media channels and outlets. As a senior journalist and a member of a trade body representing digital media houses puts it:
“The fact that they had included us in these IT Rules as news publishers was my biggest problem. Because we are a completely different entity, we have a different functioning, we have a separate role as a public service, as news content producers … everything was wrong with it. Specifically, we are concerned that a committee of bureaucrats will sit and decide what is defamatory. This bypasses the judicial process. For instance, we are already in court with two people who have sued us for defamation. Now (with the IT Rules) those people need not go to the court … They can just go to this committee and get a takedown order on what they see as defamatory content.”
Another senior journalist echoed this sentiment, and added,
“On the Intermediary Liability issue, the IT Rules are very broad, and can be interpreted in any way that vested interests want. It is the one apprehension that everyone has. That is why it is very important that a proper framework, which is crystal clear, brings out the transparent ways to operate.”
Safe Harbour status for intermediaries has been considered a gold standard, but the Rules remove it. As subordinate legislation, however, the Rules cannot do this, and are thus ultra vires.
The IT Rules also distinguish between intermediaries based on arbitrary thresholds. Mainly, the Rules differentiate between a Social Media Intermediary and a Significant Social Media Intermediary (SSMI) based on a user-base threshold: SSMIs are defined as organizations with a user base of five million and above. Again, such distinctions and thresholds find no mention in the parent law and therefore exceed its scope, respondents in the qualitative interviews and FGDs said. Some of the SSMI representatives who participated in the qualitative research felt that this was an arbitrary and vague definition. As a public interest technologist puts it:
“I have a problem with this large company and small company differentiation, especially with intermediaries. No company starts off by saying I want to remain a small company. When someone starts a social media company, they want to be the biggest. That is obviously their goal. So if they’re going to have Rules, then the Rules should be consistent for everyone.”
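The threshold-based distinction the respondents object to can be reduced to a few lines of code. The 5,000,000 figure is the one stated in the Rules; the function and category strings below are purely illustrative, not drawn from any official source:

```python
# Illustrative sketch of the user-base threshold that separates a Social Media
# Intermediary (SMI) from a Significant Social Media Intermediary (SSMI).
# The 5,000,000 figure is the threshold notified under the IT Rules 2021;
# the function name and labels are hypothetical, for illustration only.

SSMI_THRESHOLD = 5_000_000  # registered users

def classify_intermediary(registered_users: int) -> str:
    """Return the category a platform would fall into under the Rules."""
    if registered_users >= SSMI_THRESHOLD:
        return "Significant Social Media Intermediary (SSMI)"
    return "Social Media Intermediary (SMI)"

print(classify_intermediary(4_999_999))  # → Social Media Intermediary (SMI)
print(classify_intermediary(5_000_000))  # → Significant Social Media Intermediary (SSMI)
```

The sketch makes the respondents’ point visible: a single additional registered user flips a platform into the SSMI category and its heavier compliance obligations, even though the parent Act recognizes no such category at all.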
To summarize, the IT Rules are unconstitutional because:
- They seek to regulate activities that find no mention in the parent Act.
- They subject news publishing organizations to censorship, preventing the growth of a free press.
- They curtail, and even attempt to replace, the authority of the Press Council as the governing body for news organizations.
- They remove Safe Harbour, violating accepted norms for internet service providers and networking organizations and putting them at risk of prosecution.
- They create new categories of ‘media organizations’ based on arbitrary user-base thresholds, thereby exceeding the scope of the parent law.
In 2017, the Supreme Court of India declared Privacy a Fundamental Right of citizens, in the Puttaswamy v. Union of India[5] case. Although India doesn’t currently have a personal data protection law, several countries across the world have implemented such laws – like the General Data Protection Regulation (GDPR) in the European Union – and require tech companies to abide by them.
Messaging services and platforms have increasingly responded to international laws as well as user requirements for privacy, and have implemented safeguards such as end-to-end encryption and restrictions on data collection and retention.
However, the IT Rules demand that messaging services break encryption in order to identify the ‘originator of messages’ – whether or not a crime has been committed. Not only does this affect how tech companies operate internationally; breaking encryption and implementing traceability will also seriously harm citizens.
A public interest technologist says,
“The way the Rules have defined the originator law means that there will be no end-to-end encryption. Because you will need some form of fingerprinting… If I want to know who this particular person is who first started this meme, I have to know all the recipients this meme was sent to. If I know that this person has sent this meme to another person, then it’s not encrypted anymore.”
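The fingerprinting mechanism the technologist describes can be sketched concretely. If a platform must answer “who first sent this content?”, one assumed implementation is to store a stable hash of every message’s plaintext alongside sender metadata. The sketch below is conceptual and hypothetical – it is not any platform’s actual design – but it shows why such a ledger is incompatible with end-to-end encryption: identical content becomes linkable across users and conversations.

```python
import hashlib
from datetime import datetime, timezone
from typing import Dict, List, Optional, Tuple

# Conceptual sketch of content-hash traceability (NOT any platform's real design).
# To answer "who originated this message?", the server must keep a stable
# fingerprint of each message's content, mapped to sender metadata.

ledger: Dict[str, List[Tuple[str, str]]] = {}  # fingerprint -> [(sender, timestamp), ...]

def fingerprint(plaintext: str) -> str:
    """A stable hash of the message content: identical text -> identical hash."""
    return hashlib.sha256(plaintext.encode("utf-8")).hexdigest()

def record_send(sender: str, plaintext: str) -> None:
    """What a traceability mandate forces the server to log on every send."""
    fp = fingerprint(plaintext)
    ledger.setdefault(fp, []).append((sender, datetime.now(timezone.utc).isoformat()))

def first_originator(plaintext: str) -> Optional[str]:
    """The traceability query the Rules demand: the earliest sender of this content."""
    entries = ledger.get(fingerprint(plaintext))
    return entries[0][0] if entries else None

# The same meme forwarded by three users is linkable across all of them,
# even if each copy travelled inside a nominally encrypted channel.
record_send("alice", "a viral meme")
record_send("bob", "a viral meme")
record_send("carol", "a viral meme")
print(first_originator("a viral meme"))  # → alice
```

The point of the sketch is the side effect: the ledger necessarily associates message content (via its hash) with every sender, so the service can no longer claim to hold no knowledge of who said what to whom – which is precisely the guarantee end-to-end encryption exists to provide.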
In the FGD with public interest technologists on the IT Rules, a public policy representative explained the bind this places big tech in:
“… companies like WhatsApp and Signal have two choices: either they implement traceability by massively increasing the data collection, or breaking into an encryption and associating metadata of messages with users, number one, or two, they don’t do it. And then because of that, eventually, they get banned by the Indian government or get taken to court. So in practice, I would say that that’s the end game we’re moving towards. Who gives in first, what is actually taped together, is what we have to see.”
Encryption and user privacy assume good faith and goodwill on the part of all users. This implies that some bad-faith actors may go undetected in the network, but the majority of users are presumed and treated as innocent. Respondents who participated in the qualitative interviews and round tables said that traceability, or the breaking of encryption, flips this position: it operates from assumed guilt, treating all users as guilty until proven innocent. This is against the principles of natural justice.
By fingerprinting everybody on a messaging platform or social media site, the IT Rules effectively deem that everyone is under suspicion, that everyone is guilty. This has severe consequences for free speech and freedom of expression because any individual can be targeted for anything they said or even for forwarding messages.
A public interest technologist expressed concerns about the lack of a data protection law as a safeguard to the right to privacy.
“… when the Supreme Court ruled that privacy is a fundamental right, this is a right which is not explicitly written in the Constitution. There are no laws around it, compared to all the other fundamental rights. … this is our newest fundamental right, which has been declared by the Supreme Court based on existing fundamental rights. … ideally, what should have happened is the government should have come up with some privacy safeguard (?), or a data privacy act for citizens. But considering that this government argues that Indians do not have a fundamental right to privacy, they are not going to come up with that law. Under the IT Rules, every aspect of privacy will need to be litigated.”
Many of the participants in the qualitative research felt that the Rules – which are in essence meant to clarify how a law is implemented – are vague, and perhaps deliberately so.
As a veteran of the social media sector, and a senior activist, puts it:
“There is very little by way of transparency of orders that are going to be passed under these Rules, as is already becoming quite evident. And users are not going to know. To my mind, this is equal to buying a house or renting a house and the government always having a key to that house with the assurance that ‘Oh, but you know, this is for your own good.’”
A participant from the media industry said:
“The laws have been deliberately framed so vaguely that you don’t know what the scope is or what the scope could be.”
Section 4 of the IT Rules, titled Additional Due Diligence of Social Media Intermediaries, requires social media organizations and services to employ a Chief Compliance Officer (CCO), who is the single point of contact between the intermediary and the government. This officer, whose role includes responding to the government’s takedown notices for content deemed objectionable or offensive, is held personally accountable for any violations. The Rules also state that legal proceedings can be initiated against the CCO even beyond their term as an employee of the social media organization, by multiple parties including state and Union governments.
Back in May 2021, many SSMIs were concerned about hiring for the position of CCO, given the personal liability attached to the role. Who would apply for a role that they know could leave them personally liable for the rest of their lives?
The legal and policy head of a video sharing platform stated that placing liability on a single individual, i.e. the CCO, is an unprecedented practice, and that organizations have a responsibility to protect their employees from legal risks:
“Any platform will be cautious to not expose their employees to a certain legal risk, a criminal liability in this case. I think this will have an impact on freedom of expression, on excessive takedowns … That is the biggest risk.”
The above statement indicates that organizations will trade user interests for employee protection: to avoid litigation that could jeopardize employees such as the CCO, organizations might over-censor content. Freedom of expression will thus be compromised as organizations prioritize protecting their employees.
[1] India Code: The Information Technology Act, 2000. https://www.indiacode.nic.in/bitstream/123456789/1999/3/A2000-21.pdf
[3] India Legislative: Press Council Act, 1978. https://legislative.gov.in/sites/default/files/A1978-37.pdf
[4] The News Minute: Centre’s new IT Rules challenged in court. https://www.thenewsminute.com/article/centre-s-new-it-rules-challenged-court-delhi-hc-issues-notice-ib-ministry-144890
[5] Supreme Court of India: Justice Puttaswamy and others. https://main.sci.gov.in/supremecourt/2012/35071/35071_2012_Judgement_24-Aug-2017.pdf
Detailed findings: Ethical concerns
Broad definitions of unlawful content: The government claims the IT Rules are framed as a response to illegal or unlawful content, including hate speech, pornography and Child Sexual Abuse Material (CSAM), among others. Under the Rules, unlawful content will also include content that is deemed against national interest or national security, including content that tarnishes the image of the nation.