Social media networks and platforms, chat and messaging applications, and photo and video sharing services have radically transformed the internet landscape in India and elsewhere in the last decade. User-generated content has allowed diverse voices to create and share views, political opinions, dance videos, and movie and music commentaries.

While the platforms and networks have encouraged these voices, there is also a growing concern [1] over the sharing of potentially offensive material such as pornographic content, child sexual abuse material (CSAM), hate speech and violent content, often not suitable for the wide audience such platforms cater to.

The INFORMATION TECHNOLOGY (INTERMEDIARY GUIDELINES AND DIGITAL MEDIA ETHICS CODE) RULES, 2021, notified under the IT Act 2000 by the Ministry of Electronics and Information Technology (MEITY) together with the Ministry of Information and Broadcasting (MIB), Government of India, seek to monitor and control user-generated content and provide firm guidelines for social media intermediaries, digital news publications and other organizations that host or transfer content on the internet.

The Rules were notified in February 2021, and went into effect in May 2021. Organizations and individuals have challenged the Rules on various counts [2] – including their applicability under the parent law. Large platforms and social media networks have expressed concern about implementation and compliance.

Privacy Mode, a hub for conversations around privacy, data security and compliance, conducted a two-part research project seeking to understand the impact of the Rules on organizations and employees in the tech ecosystem who might be responsible for implementing the Rules and achieving compliance in tech and media products.

A qualitative study of social media platforms, digital news publications, and cloud computing service providers, examining the possible impact on encryption, traceability, compliance and applicability of law, among other issues, was conducted in May-June 2021; and a quantitative survey of tech workers across India, looking at awareness, professional and personal impact, workflows and requirements, was conducted in June-July 2021.

This report is a comprehensive analysis of both surveys and presents a rounded picture of the impact of the IT Rules 2021 on organizations and their employees. The report also looks at larger questions and concerns about privacy and freedom of expression and speech, given the discursive debates around responsible tech, digital platforms and ethics, and their impact on society and individuals.

Executive Summary and Core Concerns

Scope of the law

By definition, the ‘Rules’ framed for any law in India are ‘Subordinate Legislation’ or ‘Delegated Legislation’. While laws are made by the Parliament/Legislature, Rules are made by the Executive, i.e., the Government of India, to fulfill the requirements of the parent law. In India’s democracy, only the Legislature can make laws; the Executive can only implement them. If the law says ‘XYZ has to be accomplished’, rules can frame the methods by which ‘XYZ’ can be accomplished. In the case of the IT Rules 2021, however, the Rules are seen as overreaching and exceeding the parent law.

Notified under the Information Technology Act, 2000 [3], which provides ‘Safe Harbour’ status to digital intermediaries, the Rules are ultra vires of the parent Act and seek to regulate activities that have no mention in it. Further, bringing digital news publishers under the ambit of the Rules is unconstitutional and ultra vires of the IT Act, as news websites do not fit the definition of ‘Intermediaries’ given under the Act [4].

Further, the activities of news publishers and media are regulated by the Ministry of Information and Broadcasting (MIB) [5], and are thus excluded from the ambit of the IT Act. Concerns emerged that the Rules – which did not pass through the legislative body – sought to curtail rights and laws that did emerge from due legislative process.

Further, with existing guidelines under the Press Council Act that govern news organizations, the Rules are seen as overreaching and drafted to censor specific media channels and outlets.

The Rules require intermediaries to identify the first originator of messages deemed objectionable. This implies that messaging platforms and social networking sites will have to significantly alter their products (and the technology underlying those products) to comply. This is again not governed by the parent Act, and is therefore unconstitutional. The Rules also operate from a position of assumed guilt, where all conversations and communications are expected to be scanned for potentially offensive material, and traced back to the original sender. This is against the assumption of innocence enshrined in the legal system operating in the country.

Breaking encryption and implementing traceability, a fundamental requirement of the new Rules, have international legal implications, as messaging services and social media platforms will need to alter the underlying technical architecture of their products or services - or at least have a different product and user experience for Indian users. Since this cannot be implemented for users in India alone and will affect every user of the services across the world, these social media intermediaries will be in violation of international laws governing user privacy and security, thus inviting legal costs.

Freedom of expression and natural justice

The Rules are seen as violating the freedom of expression guaranteed in the Indian Constitution by mandating traceability, which breaks encryption. Privacy, also a fundamental right as determined by the Supreme Court of India, is increasingly seen as a ‘make-or-break’ feature of all websites, apps, products, and services. Privacy operates from a position of assumption of innocence of the user. The Rules, by enforcing traceability, violate the fundamental rights of Indian citizens by reducing privacy to a conditional service, and not a constitutional guarantee.

Cost of compliance

When the IT Rules came into effect in May 2021, they were criticized for imposing high costs of compliance, including legal and personal liability attached to employees of social media organizations. In the case of the office of the Chief Compliance Officer (CCO), liability extended even after the CCO retired from office. Every social media and news organization surveyed during this research pointed to the personal liability attached to the role of the CCO, grievance and nodal officers as imposing financial and legal costs on their organizations.

Proactive content filtering requirements will impact human resource needs and demand changes in product and business operations, thereby significantly increasing costs. The traceability clauses under the Rules require an extensive overhaul of the core architecture of messaging services and social networking platforms, requiring significant monetary and human resource investment.

Further, respondents in the Focus Group Discussions (FGDs) believed that ease of doing business will diminish given the stringent compliance regime and employee impact.

The Rules are also framed vaguely and arbitrarily, leading to confusion over operating clauses. Additionally, they have stringent reporting requirements. This will affect all organizations, especially small and medium enterprises, financially and otherwise.

Skill and competency of Industry

In addition to the legal and ethical concerns emerging from implementation of the Rules, there are knowledge, awareness, and skill gaps across a representative sample of the IT industry, which may impact the ability of organizations to comply with the IT Rules.

Software developers in junior and mid-level roles in IT organizations believe their workload will increase under the IT Rules. Respondents indicated that their jobs will now require more documentation and reporting, and that their role in achieving compliance in the company’s products will add to their workload.

Industry representatives, however, felt that tech workers and product managers will fundamentally need knowledge of, or retraining in, privacy features, content filtering and user experience in order to fully comply with the Rules. Experts in the industry believe that, beyond technical skills or knowledge, what is missing is perspective and an understanding of how executing the Rules will impact users of media and tech products.

As noted above, the encryption and traceability requirements of the Rules will mean major changes in products, especially to user experience, and an inability to safeguard the privacy of Indian users under the IT Rules. Implementing features such as voluntary verification will require product managers to acquire new skills and knowledge. Tech workers will need to learn how to work in coordination with legal teams. Under the IT Rules, each content takedown request will have to be serviced on a case-by-case basis. This will impact scale and standard operating procedures in organizations, or will result in organizations relying more heavily on automation to censor content proactively (and to avoid being served takedown notices). In both cases, users of these products will bear the brunt, and their freedom of speech and expression will be reduced drastically.

Individual chapters and sections of the report are presented as submissions.

About the principal researchers

Nadika Nadja is a researcher at Hasgeek. She has worked across advertising, journalism, TV and film production as a writer, editor and researcher.

Bhavani S is a Research Associate at Hasgeek. She has previously worked for the Centre for Budget and Policy Studies (CBPS), Microsoft Research India, and the University of Michigan, Ann Arbor.

Support team

Anish TP illustrated the report. Satyavrat KK provided research and editorial support. David Timethy and Zainab Bawa were project managers for producing this report. Kiran Jonnalagadda and Zainab Bawa advised on research design and execution.

Acknowledgements

We would like to thank the following individuals who provided feedback during different stages of the research. Their feedback helped the team fine-tune and bring rigour to the research process.

  1. Suman Kar, founder of security firm Banbreach, for reviewing early drafts of the quantitative research questionnaire, and providing detailed inputs on survey design.
  2. Prithwiraj Mukherjee, Assistant Professor of Marketing at IIM-Bangalore, for reviewing early drafts of the quantitative research questionnaire, and providing detailed inputs on survey design.
  3. Chinmayi SK, Founder of The Bachchao Project, for reviewing and providing feedback on the final report and conclusions.

While Hasgeek sought funding from organizations, the research itself was conducted – with full disclosure at all stages – independently and objectively. The findings do not reflect any individual organization’s needs.


  1. Unicef: Growing concern for well-being of children and young people amid soaring screen time (2021) - https://www.unicef.org/press-releases/growing-concern-well-being-children-and-young-people-amid-soaring-screen-time

  2. LiveLaw: Supreme Court Lists Centre’s Transfer Petitions, Connected Cases After 6 Weeks - https://www.livelaw.in/top-stories/it-rules-2021-supreme-court-lists-centres-transfer-petitions-connected-cases-after-6-weeks-181032

  3. India Code: The Information Technology Act 2000 - https://www.indiacode.nic.in/bitstream/123456789/1999/3/A2000-21.pdf

  4. India Code: IT Act Definitions - https://www.indiacode.nic.in/show-data?actid=AC_CEN_45_76_00001_200021_1517807324077&sectionId=13011&sectionno=2&orderno=2

  5. Ministry of Information and Broadcasting: About the ministry - https://mib.gov.in/about-us/about-the-ministry


Nadika N

IT Rules 2021 Intermediary Guidelines: Concerns and recommendations from the community

Submitted May 25, 2021

Introduction

The INFORMATION TECHNOLOGY (INTERMEDIARY GUIDELINES AND DIGITAL MEDIA ETHICS CODE) RULES, 2021, notified by the Ministry of Electronics and Information Technology (MEITY) together with the Ministry of Information and Broadcasting (MIB), Government of India, have a significant impact on what was perceived as the ‘safe harbour’ for organizations providing communications and information services to users in India.

To understand the specific impact of the Rules and the concerns they raise, Hasgeek spoke to representatives from the affected sectors, as well as public interest technologists, lawyers, and users of these services.

This report is a summary of opinions, concerns and recommendations emerging from these conversations, and is a part of a larger Hasgeek study into the implementation of the IT Rules.

Summary: The newly notified Intermediary Guidelines will require organizations to significantly change their operations, and will impact user policies, data retention and privacy policies, and impose costs on employees. From concerns over personal liabilities to timelines for compliance, the Rules are perceived as too stringent by organizations operating in the social media, messaging, digital news and content streaming sectors in India.

Prominent concerns and recommendations

  • Concern: Cost of compliance high; personal liability on Chief Compliance Officer (CCO) of grave concern.

  • Recommendation: Remove personal liability clause and consider other disincentives.

  • Recommendation: Relax timelines for implementation of Rules.

  • Concern: Ease of doing business will be impacted given the stringent compliance regime and employee impact. Proactive content filtering requirements have cost implications.

  • Recommendation: Organizations and social media intermediaries already proactively filter CSAM, pornographic and offensive material, and hate speech. Allow for greater control at user/community level.

  • Concern: Vagueness of rules; open to arbitrary interpretation.

  • Recommendation: Produce Standard Operating Procedures (SOPs) and case studies; redraft operational clauses for clarity. Provide qualifying definitions and limits. Institute representative, quasi-judicial body to oversee compliance from a ‘privacy first’ perspective.

  • Concern: Tracing the originator/first sharer of content deemed offensive opens up concerns about privacy and freedom of expression; could lead to breaking encryption currently used by service providers.

  • Recommendation: Strengthen existing laws such as POCSO to handle CSAM and child pornography; give users control to flag material or messaging of that nature. Allow platforms/organizations to implement full end-to-end encryption on the Assumption of Innocence principle.

  • Recommendation: Have training and knowledge sharing sessions for SSMIs, social media organizations and public-interest groups on identifying and reviewing CSAM and hate-speech in local languages.

  • Concern: User notification of content takedown and provisions for removal of user content can open up arbitration, increase the cost of operations and of retaining users, and impose costs on social media organizations. They can also violate freedom of speech/expression.

  • Recommendation: Quasi-judicial appellate authority to regulate/enforce free speech while providing reasoned judgements on takedowns/de-platforming of user content/users, better community-level tools to flag offensive content.

Note on methodology

This is preliminary qualitative research on awareness, sentiments and opinions about the Intermediary Liabilities (IL) Guidelines of the IT Rules 2021. It has been created through semi-structured interviews and Focus Group Discussions (FGD), led by Privacy Mode, a forum that hosts conversations on privacy and technology, operated by Hasgeek.

The Privacy Mode team identified and shortlisted business leaders, startup founders from social media, messaging, and video-sharing sectors, public interest technologists, lawyers and policy experts. We interviewed 18 industry leaders in May 2021. The team shortlisted interviewees by looking at domain diversity, scale of operations of the startups, and gender diversity. The Privacy Mode team reached out to interviewees with background materials, an interview questionnaire, and an informed-consent form.

The Privacy Mode team also produced a short primer covering the major points of the IT Rules, and provided links to media reports so that participants were better informed ahead of the interviews and FGDs.

Privacy Mode Research Team:
Nadika Nadja and Bhavani Seetharaman are the principal researchers. Satyavrat KK provided research and editorial support. David Timethy and Zainab Bawa were project managers for producing this report. Anish TP designed the final report. Kiran Jonnalagadda and Zainab Bawa advised on research design and execution.

Detailed findings

Personal liability and hiring impact, cost of compliance and periodic reporting

Section 4 of the new IT Rules, titled Additional Due Diligence of Social Media Intermediaries, requires social media organizations and services to employ a Chief Compliance Officer (CCO) who will be the single point of contact between the intermediary and the government. This officer, whose role includes responding to the government’s takedown notices for content deemed objectionable or offensive, will be held personally accountable for any violations.

The Rules also provide for legal proceedings against the CCO to extend beyond their term as an employee of the social media organization, with multiple parties, including state and union governments, able to initiate proceedings. Given that the Rules hold the CCO - a senior managerial representative resident in India - personally accountable for compliance, organizations say this will create a hiring vacuum.

Further, the Rules require social media intermediaries to publish monthly compliance reports, with data points that many smaller organizations may not have. This is another cost implication, and failure to comply may impact Safe Harbour status.

Every representative Privacy Mode spoke to said that these guidelines and requirements are overly strict, with a high cost of compliance. In particular, they pointed to the personal liability associated with the role of the CCO, and felt that this is unnecessarily burdensome.

The legal and policy head of a video sharing platform felt that there were certain benefits to social media intermediaries having local presence in the country. However, placing liability on a single individual is an unprecedented practice, and organizations have a responsibility to protect their employees from legal risks. The respondent further articulated that such a degree of liability will have an impact on freedom of expression.

“Any platform will be cautious to not expose their employees to a certain legal risk, a criminal liability in this case. So I think this will have an impact on freedom of expression, on excessive takedowns… So that’s the biggest risk.”

Organizations anticipate that the personal liability clauses of the IT Rules will significantly increase the cost of hiring, especially at senior levels in the organization, and increase the cost of compliance and reporting. There is also concern about what skill sets such a role requires, the availability of candidates with those skills, and their willingness to face legal action. As the legal head of a large social media platform said,

“A startup might find it hard to hire three people. There’s a need for understanding the regulatory cost. I don’t think we can quantify it in any form. But it’s important to have that number. Large organizations may not find it a problem. It is the small intermediaries that will bear the brunt. I remember when GDPR discussions were happening, there were studies about the average cost of compliance. There is a need for that to happen here.”

All this will add to the overall cost of operating, thus impacting the ease of doing business in India.

Some representatives who Privacy Mode spoke to also point to the requirement for periodic compliance reports, currently set at monthly intervals, and say such a clause will increase the burden of compliance for the organization and its employees. Strict deadlines will also affect the day-to-day operations of an organization, requiring a dedicated department. There is no clarity on what such a report must cover, other than expanding on the grievances received. How or why the government wants such a report on such strict deadlines has not been detailed in the Rules, and this is a cause of concern in addition to the deadlines themselves.

Many social media platforms and services with user-generated content say they already have systems in place for monitoring content, and produce periodic reports on compliance, content filtering, and user related information. Some organizations also produce transparency reports regularly.

Representatives of organizations say that this is a cumbersome procedure with impact on costs. Mandatory reporting at monthly intervals will further increase the burden. Smaller organizations and startups will thus be at greater risk of non-compliance.

User content and data, takedown notification, and processes for data retention and deletion

The IL Guidelines require social media intermediaries, Significant Social Media Intermediaries (SSMIs) and messaging services to take down any user-generated content that is deemed defamatory, offensive, pornographic, false or misleading, or a threat to national interests. The intermediary must remove flagged offensive or pornographic content within a period of 36 hours, but retain the data of the user and the content deemed objectionable for a period of up to 180 days to aid investigation. Further, the Rules require the intermediary to notify users of such takedowns and provide a grievance redressal forum, if required.
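To make these timelines concrete, here is a minimal sketch, in Python, of how an intermediary’s tooling might track the deadlines a takedown order triggers. The function and field names are hypothetical; the 36-hour takedown window and 180-day retention period are the figures described above.

```python
from datetime import datetime, timedelta, timezone

TAKEDOWN_WINDOW = timedelta(hours=36)    # remove flagged content within 36 hours
RETENTION_PERIOD = timedelta(days=180)   # retain user data and removed content for investigation

def compliance_deadlines(order_received_at: datetime) -> dict:
    """Hypothetical helper: the key deadlines triggered by one takedown order."""
    return {
        "takedown_by": order_received_at + TAKEDOWN_WINDOW,
        "retain_until": order_received_at + RETENTION_PERIOD,
    }

if __name__ == "__main__":
    order = datetime(2021, 6, 1, 10, 0, tzinfo=timezone.utc)
    print(compliance_deadlines(order))
```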

The respondents who Privacy Mode spoke to pointed out that the provisions on what content can be flagged as objectionable are vaguely defined. Organizations say they already have internal teams to monitor, review, and take down content that violates their community guidelines and user policies. Further, there is widespread acknowledgement that content of a pornographic nature, including Child Sexual Abuse Material (CSAM) or paedophilia, and religious hate speech are a danger to safe and free societies, and that this understanding extends to online platforms and social media spaces too. All respondents said that strict measures must be taken to remove such content from their respective platforms. However, some social media intermediaries and public interest groups believe that this process must be carved out of existing laws such as POCSO or the IT Act, and should not be vaguely defined as has currently been done under the IT Rules 2021.

Some organizations also said that larger platforms and social media intermediaries already have robust content monitoring capabilities, and that users are able to flag or report CSAM and similar material. They question whether adding further clauses and attaching personal liability to employees of social media platforms will work as intended, or only complicate processes further. As the policy head of a large social media platform puts it,

“There is active sharing of (content) hashes that happens across these companies. We’ve been doing it (monitoring and removing pornographic/CSAM content) for a long time. There is this 24 hour turnaround time, we are cognizant of the objective of the GOI, but I think it is important to have some guard rails to understand what falls under this bucket. We do understand it’s needed given the number of cases that happen in India. But we need to have some guard rails to ensure that it (is) not abused. We need more clarity on the protocols.”

Another respondent echoed this and said,

“We have a strong internal process, every piece of content is reviewed. But here, there is a 24-hour return time. And there might be many frivolous complaints, and we need a process to understand how we weed out frivolous concerns. We require time to build this up.”
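The hash sharing the respondents describe above usually means checking uploads against an industry-maintained list of digests of known illegal media. Below is a minimal sketch of the idea, assuming a plain SHA-256 hash list; production systems use perceptual hashes (such as PhotoDNA) so that re-encoded or slightly altered copies still match.

```python
import hashlib

def digest(data: bytes) -> str:
    """Exact-match digest; real deployments use perceptual hashing instead."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical industry-shared set of digests of known illegal media,
# seeded here with a placeholder entry purely for illustration.
KNOWN_BAD_HASHES = {digest(b"placeholder-known-bad-media")}

def should_block(upload: bytes) -> bool:
    """Return True if the upload matches a known-bad digest."""
    return digest(upload) in KNOWN_BAD_HASHES

if __name__ == "__main__":
    print(should_block(b"placeholder-known-bad-media"))  # True
    print(should_block(b"ordinary user photo"))          # False
```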

Respondents also pointed to the requirements for user notification of content takedowns, including the 36-hour and 72-hour timelines for reporting and takedown, as potential pitfalls that may prove counterproductive to the stated aims of the Rules. For instance, one organization believed that notifying users of the takedown of patently illegal or threatening content could result in situations where platforms may be alerting terror networks or Child Sexual Abuse (CSA) rings to impending state action.

Additionally, social media organizations and platforms may not always have the requisite details regarding user registration, and the 36-hour timeframe provided will not be enough for respondent organizations to collect such data. In this scenario, requiring user notification, takedown of content, and setting up grievance redressal can become challenging for social media organizations, particularly small and medium organizations. As the representative of a video sharing platform says,

“36 hours is content takedown timeline, if the request is coming for the government agency or the platform has responded in 72 hours for law enforcement requests asking for user information. We may not have sufficient details in many cases. Putting an obligation on the platform to respond within those stipulated timelines, and having consequences – such as removing Safe Harbour, or holding CCOs personally liable – if the organizations don’t comply because of lack of information is concerning, something which requires more clarification. Can there be ‘stop the clock’ clauses here?”

Another large organization with significant presence in India believes that major re-tooling and restructuring of processes will be required to comply with the data retention policies of the new Rules. This is particularly true in retaining user registration data for up to 180 days. The organization states that the current timelines given under the Rules will not be sufficient.

Further, organizations say that these clauses – 180 days for retention – are against global norms followed by social media platforms, and that this will violate existing user policies. They recommend that existing policies regarding user data retention and consent management be followed where required.

Privacy, encryption and principles of natural justice

One of the biggest concerns over the new IT Rules is the requirement for social media intermediaries, especially those providing messaging services, to implement traceability – identifying the first originator of a message.

Privacy has been deemed a fundamental right in the country, and constitutional guarantees protect freedom of expression and speech. The traceability requirement is thus a violation of the right to privacy and free speech. Messaging services and platforms have increasingly responded to user requirements for privacy and have implemented safeguards such as end-to-end encryption and restrictions on data collection and retention. Newer technologies and platforms provide these features as “basic” and “by right” to users. Further, encryption and user privacy assume “good faith” and goodwill on the part of all users, and regard every user of a platform on that merit. This might mean that some bad-faith actors may go undetected in the network, but the majority of users are presumed and treated as innocent. Altering this foundational assumption will require a fundamental change in how users and organizations respond to each other and how systems are used.

In this scenario, requiring social media organizations to break encryption in order to identify the first originator of a message is seen as a foundational overreach by the authorities.

End-to-end encryption protects freedom of expression, but traceability allows only conditional guarantees for expression, reducing it to a “service” provided or rescinded by a government or law enforcement authority periodically. Respondents said that traceability, or the breaking of encryption, operates from a position of assumed guilt on the part of all users. The widespread belief is that regulation must be proportionate and not presume all users are fundamentally guilty until proven otherwise. As a startup founder put it,

“The first is, when we talk about traceability we are fundamentally asking these companies to implement a system designed around efficiency, that is the efficiency of finding guilt. So essentially, you’re operating on a presumption of guilt. This is not how our justice system works. Our legal systems typically work from the presumption of innocence. So there’s this huge discord right at the beginning.”

A public interest technologist adds,

“The way the rules have defined the originator law means that there will be no end-to-end encryption. Because you would need some form of fingerprinting… If I want to know who this particular person is who first started this meme, I have to know all the recipients this meme was sent to. If I know that this person has sent this meme to another person, then it’s not encrypted anymore.”
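To illustrate the technologist’s point, the sketch below uses the PyNaCl library (an assumed choice; real messengers use richer protocols such as Signal’s double ratchet) to show why a relaying server cannot, on its own, say who authored a message: the keys never leave the two devices, so the server only ever handles ciphertext.

```python
from nacl.public import PrivateKey, Box

# Keys are generated on the users' devices; the server never sees them.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts for Bob using her private key and Bob's public key.
sender_box = Box(alice_key, bob_key.public_key)
ciphertext = sender_box.encrypt(b"hello, this meme is for you")

# The server only relays `ciphertext`; it cannot read the content,
# and therefore cannot fingerprint or trace a "first originator"
# unless the protocol is changed to attach identifying metadata.
receiver_box = Box(bob_key, alice_key.public_key)
print(receiver_box.decrypt(ciphertext))  # b'hello, this meme is for you'
```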

Public interest technologists and social media representatives agree with the stated goal of the Rules – to monitor, filter, and remove Child Sexual Abuse Material (CSAM) and pornography involving minors. Towards this goal, every respondent said that they have robust community guidelines and internal processes. However, breaking encryption and implementing traceability in service of such a rule may have unintended consequences, making visible to the government other, legal and consensual exchanges between individuals, and making the government party to transactions it never sought to oversee. This unintended consequence again violates the assumption of innocence that operates in the country’s legal system, thus violating natural justice. As a public interest technologist puts it,

“So if you have a very wide scope, you can always say, ‘This also falls in my scope. So why not do this?’ They haven’t restricted the scoping in any sense at all.”

As another expert puts it,

“If you do want to do something like trace to the origin, it runs counter to message deniability as part of the protocol itself. Most of those kinds of requirements run counter to, and don’t reflect technological understanding. It comes down to end-to-end encryption being broken. What’s our stance on it as a society? Should it be permitted or should it not be permitted? We do need to prevent CSAM – there’s nobody, I think, who would vouch against doing that. But the way it’s been proposed has all sorts of repercussions from other areas and basically compromises end-to-end encryption.”

Prof. Matthew Green, associate professor of computer science at the Johns Hopkins University, brought up the American example, citing the prevalence of CSAM as the most common rationale offered for real-time content scanning,

“The idea that end-to-end encryption facilitates the transmission of this kind of media is horrifying, and it brings out an emotional reaction in everybody. But from a technical point of view, it’s really important to understand the difference in the requirement of what governments are asking for, between asking for exceptional access and asking for real-time content scanning. The first is an exceptional capability that is not used except in occasional, serious legal instances when a court has granted a warrant. Real-time content scanning is a ubiquitous workflow. Every single message is being checked, maybe not every message is being flagged if the system worked well, but every message is being scanned. And from a technical point of view, that’s a very, very big difference.”

Prof. Green is a co-author of the renowned position paper on encryption titled Keys Under Doormats. In 2020, Prof. Green assisted the technical team at Zoom to implement end-to-end encryption after users in the US raised concerns about the privacy of their conversations on the video conferencing platform.

Prof. Green delivered a keynote on the state of technical and policy debates on encryption at Rootconf and Privacy Mode’s Data Privacy Conference in April 2021. Referring to the US debates and India’s IT Rules, Prof. Green explained the complexities of implementing technical specifications for tracing the first originator on messaging app platforms:

“The government in India has proposed some kind of hash-based tracing. I’ve seen some research that proposes other ways to do this. It’s possible that WhatsApp already has some way to track forwards, it’s a little bit difficult to tell. But it’s not necessarily the case that we know how to do this because the situation is sort of fluid. Some people have proposed tracing approaches that are privacy preserving, and approaches that can only trace messages if they achieve large-scale virality. But the problem is we don’t know when a message is just starting out if an attachment is sent out from one person, and it reaches 10 people and then it goes to 100 people and it goes to a million people. At what point does it become traceable? Is it always traceable? From the point where I send it to my 10 friends? Or is it traceable at the point where it reaches a million? And then somehow when it reaches a million people? Can we go back in time and identify who sent the original one (without breaking end-to-end encryption)?”

A video of this talk - on the state of technical and policy debates about end-to-end encryption - is available at https://hasgeek.com/rootconf/data-privacy-conference/sub/end-to-end-encryption-state-of-the-technical-and-p-X8h4qierQT4Da31x7cRFYe
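The dilemma Prof. Green outlines can be made concrete with a rough sketch of hash-based tracing with a virality threshold. Everything here is hypothetical (the names, the threshold value, the bookkeeping), but it shows the core tension: the platform has to record a fingerprint of every attachment from its very first send, because it cannot know in advance which message will later cross the threshold.

```python
import hashlib
from collections import defaultdict
from typing import Optional

VIRALITY_THRESHOLD = 1_000_000   # hypothetical cut-off for "large-scale virality"

first_sender = {}                 # attachment hash -> first known sender
forward_count = defaultdict(int)  # attachment hash -> number of forwards seen

def record_forward(sender: str, attachment: bytes) -> None:
    # Every attachment's fingerprint must be logged from its very first send,
    # because the platform cannot know in advance which message will go viral.
    h = hashlib.sha256(attachment).hexdigest()
    first_sender.setdefault(h, sender)
    forward_count[h] += 1

def trace_originator(attachment: bytes) -> Optional[str]:
    # The first sender is revealed only once the content crosses the threshold.
    h = hashlib.sha256(attachment).hexdigest()
    if forward_count[h] >= VIRALITY_THRESHOLD:
        return first_sender.get(h)
    return None
```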

Oversight and monitoring

The IT Rules 2021 are open-ended and vague as to which agency or institution of the government will be responsible for monitoring and enforcing the Rules. Further, the Rules themselves are broad in scope and implementation. While there are some provisions to protect freedom of speech or expression, a judicial summons or government order overrides these minimum protections.

Organizations therefore believe that this can potentially result in multiple layers of authorities and governments that they must comply with, significantly impacting business operations. Limiting scope and providing clarity is essential for businesses in order to ensure that their Safe Harbour status isn’t adversely affected. Further, from a user’s perspective, it is not clear how individuals may seek redressal under the new Rules in the case of false information, privacy infringement, or defamatory content.

Future legislation – such as the proposed Personal Data Protection (PDP) Bill, or the Non-Personal Data (NPD) regulation framework – can impose further restrictions on social media platforms and increase bureaucratic hurdles for them.

From both the organizations’ side and the users’ side, this appears to be leading to litigation, with cases against the IT Rules 2021 filed by individuals and organizations in the Delhi, Karnataka and Kerala High Courts, leaving the courts to lay down the final law.

As the policy head of an Indian social media startup said,

“There are processes that MEITY has put into place where it will call the intermediary before any unlawful takedown. But even there, I’m not sure how much power an intermediary has. If there is something that the organization doesn’t agree with or something that doesn’t suit it, they will have to go to the courts. I think for the next one year, we will see the judiciary playing a part in bringing a balance between what the government is looking to correct or take care of, versus an individual’s right to express freely.”

As a senior Supreme Court (SC) lawyer said,

“Right now, there is no grievance redressal mechanism, outside the organization’s. Within the organization if they take the line that, ‘no, you, the user have no grievance, and we will not hear you’, there is no redress whatsoever. In fact, US Senator Bernie Sanders said that the fact that the President of the United States can’t be on a popular social media platform worries him, notwithstanding the fact that he disagrees with Trump. So this is simply too much power in too few hands.”

An independent, quasi-judicial body, the senior lawyer said, can be modelled on the Federal Communications Commission (FCC) of the US or the Office of Communications (Ofcom) of the UK and function as both a monitoring agency and a court of appeal. He said,

“It will have to be a body that gives reasoned judgments. A reasoned order, say, ‘Yes, this is why this takedown, or this de-platforming was correct. That the person has had warnings, was a repeat offender. And if allowed, unchecked, this was the harm that would have ensued. Or is the other way around? No, there were no warnings given. This is not conduct which requires to be de-platformed.’ The basic thing that I would like to see from any regulator would be reasoned orders, which could then if necessary, be further tested in the courts. Right now, what the government offers is a bunch of under secretaries and joint secretaries. I want this process outside the companies and outside the government in a truly independent body. So, once that body comes in, the government procedure would not be required, and ought not to be required.”

Recommendations

While the concerns and recommendations here are not exhaustive, the major themes remain fairly common.

On compliance and reporting

The primary recommendation from the representatives interviewed is that the Rules – which are seen to be vague and open to interpretation – be clarified in the form of SOPs and case studies, and, if required, that specific sections or clauses be redrafted for clarity and consistency. Representatives across the sectors point to a lack of clear definitions in the Rules, and say that this needs to be fixed before the Rules can go some way towards achieving their stated objectives.

While MEITY held some consultations with industry bodies and representatives in 2018 on some aspects of the Intermediary Rules, many feel that the consultations did not achieve much, and thus the Rules, especially around encryption and privacy, are not reflective of industry sentiment. Community representatives therefore recommend further rounds of consultations with social media organizations, public interest technologists, academics, and legal practitioners, to drill down into and clarify definitions.

Further, threshold limits for social media intermediaries, such as user base or revenue, will need to be confirmed and written into the Rules, following extensive consultation for the same.

The second major recommendation from representatives we spoke to covers the operational constraints and costs of compliance. Every social media intermediary recommends that the clause imposing personal liability on the CCO be removed. With multiple Law Enforcement Agencies (LEAs) potentially able to initiate proceedings against the CCO under the current version of the Rules, industry representatives say this will create a culture of fear and self-censorship, stifling free speech. While they strongly recommend removing personal liability entirely, organizations also say that, at the very least, it should be restricted to a role-based responsibility, to ensure that employees are not unfairly held accountable after their term of employment ends.

Some organizations also recommend that the government and the ministry consider imposing financial penalties in place of personal liability. The extent of financial liability will need to be determined after consultation with the industry, and based on the organization’s size, turnover, user base and other parameters.

Representatives believe the Rules will increase the cost of compliance and burden on organizations – especially smaller social media organizations and startups. The cost implications of content filtering technology and of producing periodic reports of compliance will need to be considered before determining any financial liability on organizations.

Therefore, the community strongly recommends that MEITY work with stakeholders to assess the costs of compliance for organizations of different sizes and at various levels of operations.

Representatives also ask that the timelines for compliance be relaxed, and staggered deadlines be implemented. Organizations, especially smaller organizations, will require extensive tool-sets and technological support to implement content filtering solutions. As stated above, hiring for senior compliance and nodal officers is a cost that organizations may not be able to afford immediately. It is thus recommended that MEITY relax timelines by 6-12 months for overall compliance while taking into account good-faith actions by organizations across the country.

Representatives also strongly recommended that ‘stop-the-clock’ timelines be adopted and coded into the Rules for takedown notifications and grievance redressal mechanisms. The current timeframe of 36 hours is not sufficient to gather all the data points required to comply with legal requests, and will significantly impact smaller organizations. Respondents felt that the Rules fail to consider the volume and quality of grievances, and the time taken to create the backend mechanisms required to implement such legislation. A few representatives also asked for clarity on the timeframe, and recommend that the timelines be 36 to 72 business hours for compliance.

Representatives also recommended that MEITY relax the timelines for periodic monthly reports. It is suggested that MEITY hold consultations with organizations and experts to firm up the data points that will need to be covered by the periodic compliance reports. One representative also suggested that these reports be submitted once every business quarter, with monthly details incorporated in the report. This will significantly ease operations for businesses and allow for greater clarity and transparency.

On privacy and content filtering

The single most important recommendation coming out of conversations Privacy Mode has had with industry representatives and experts is to strengthen encryption – especially end-to-end encryption on messaging services – and thus protect privacy. This requires that traceability should not be a part of the new Rules.

SSMI representatives, experts in security and privacy and legal experts all agree that privacy can be preserved while also protecting against the spread of CSAM or other patently illegal and unlawful content. Organizations already have robust internal mechanisms and teams to monitor and review such content and do not need to rely on identifying the originator to remove such content from their systems.

Representatives strongly recommended that the IT Rules explicitly mention encryption and privacy as stated goals, while strengthening and enhancing clauses governing and regulating the spread of CSAM or hate speech in existing legislation such as POCSO.

Further, MEITY can hold workshops and knowledge sharing sessions with industry experts and academics on privacy, content filtering and monitoring, and larger organizations can help and guide smaller organizations in implementing their internal processes.

Representatives believe that strengthening acts such as POCSO, providing clear, precisely worded community guidelines for users of social media services, and strengthening organizations’ internal teams monitoring content will have a greater positive impact on CSAM than breaking encryption and implementing traceability. Additionally, experts suggest that training and resources be given to teams scanning content to identify CSAM in local languages, hate-speech, caste-based hate speech and violence, and have periodic reviews of the same to create more equitable, safe social media spaces.

On reporting, monitoring and oversight

Representatives strongly recommended that the layers of reporting for Social Media Intermediaries and SSMIs be limited and streamlined. This is particularly important when considering the personal liability of CCOs as defined in the Rules.

Representatives suggest that only a state or union government – via a specially appointed Nodal Officer – be the point of contact for SSMIs. Further, all takedown requests and grievance redressals should be directed through the government’s Nodal Officer, helping streamline the process and providing clarity to organizations.

Further, representatives suggest the setting up of a quasi-judicial independent body, modelled on the US FCC or the UK Ofcom, which will eventually monitor for and enforce free speech while providing reasoned judgements on content filtering, grievance redressal and user complaints. The community suggests that the Nodal Officer role be transitioned into the independent body over a period of time.

