Social media networks and platforms, chat and messaging applications, and photo and video sharing services have radically transformed the internet landscape in India and elsewhere over the last decade. User-generated content has allowed diverse voices to create and share views, political opinions, dance videos, and movie and music commentary.
While platforms and networks have encouraged these voices, there is also growing concern[1] over the sharing of potentially offensive material such as pornographic content, child sexual abuse material (CSAM), hate speech, and violent content often not suitable for the wide audiences such platforms cater to.
The Information Technology (Guidelines for Intermediaries and Digital Media Ethics Code) Rules, 2021, notified by the Ministry of Electronics and Information Technology (MEITY) together with the Ministry of Information and Broadcasting (MIB), Government of India, under the IT Act 2000, seek to monitor and control user-generated content and provide firm guidelines for social media intermediaries, digital news publications, and other organizations that host or transfer content on the internet.
The Rules were notified in February 2021 and went into effect in May 2021. Organizations and individuals have challenged the Rules on various counts[2], including their applicability under the parent law. Large platforms and social media networks have expressed concern about implementation and compliance.
Privacy Mode, a hub for conversations around privacy, data security and compliance, conducted a two-part research project seeking to understand the impact of the Rules on organizations, and on employees in the tech ecosystem who might be responsible for implementing the Rules and achieving compliance in tech and media products.
A qualitative study of social media platforms, digital news publications, and cloud computing service providers, examining the possible impact on encryption, traceability, compliance, and applicability of law, among other issues, was conducted in May-June 2021; and a quantitative survey of tech workers across India, looking at awareness, professional and personal impact, workflows and requirements, was conducted in June-July 2021.
This report is a comprehensive analysis of both surveys and presents a rounded picture of the impact of the IT Rules 2021 on organizations and their employees. The report also looks at larger questions and concerns about privacy and freedom of expression and speech, given the discursive debates around responsible tech, digital platforms and ethics, and their impact on society and individuals.
By definition, the ‘Rules’ framed for any law in India are ‘Subordinate Legislation’ or ‘Delegated Legislation’. While laws are made by the Parliament/Legislature, Rules are made by the Executive i.e., the Government of India, to fulfill the requirements of the parent law. In Indian democracy, it is only the Legislature that can make laws. The Executive can only implement them. If the law says ‘XYZ has to be accomplished’, rules can frame the methods in which ‘XYZ’ can be accomplished. However, in the case of the IT Rules 2021, the Rules are seen as overreaching and exceeding the parent law.
Notified under the Information Technology Act, 2000[3], which provides ‘Safe Harbour’ status to digital intermediaries, the Rules are ultra vires of the parent Act and seek to regulate activities that have no mention in it. Further, bringing digital news publishers under the ambit of the Rules is unconstitutional and ultra vires of the IT Act, as news websites do not fit the definition of ‘intermediaries’ given under the Act[4].
Further, the activities of news publishers and media are regulated by the Ministry of Information and Broadcasting (MIB)[5], and are thus excluded from the ambit of the IT Act. Concerns emerged that the Rules, which did not pass through the legislative body, sought to curtail rights and laws that did emerge from due legislative process.
Further, with existing guidelines under the Press Council Act already governing news organizations, the Rules are seen as overreaching and drafted to censor specific media channels and outlets.
The Rules require intermediaries to identify the first originator of messages deemed objectionable. This implies that messaging platforms and social networking sites will have to significantly alter their products (and the technology underlying them) to comply. This is again not governed by the parent Act, and is therefore unconstitutional. The Rules also operate from a position of assumed guilt, where all conversations and communications are expected to be scanned for potentially offensive material and traced back to the original sender. This runs against the presumption of innocence enshrined in the country's legal system.
Breaking encryption and implementing traceability, fundamental requirements of the new Rules, have international legal implications, as messaging services and social media platforms will need to alter the underlying technical architecture of their products or services - or at least offer a different product and user experience for Indian users. Since this cannot be implemented for users in India alone and will affect every user of the services across the world, these social media intermediaries will be in violation of international laws governing user privacy and security, thus inviting legal costs.
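The architectural conflict described above can be illustrated with a toy sketch. This is not any real platform's code: the names (`Server`, `relay`) are hypothetical, and a one-time-pad XOR stands in for a real end-to-end cipher such as the Signal protocol. The point it demonstrates is that in an end-to-end encrypted design, the platform relays only ciphertext, so it has nothing it can scan or trace an "originator" from without redesigning the system itself.

```python
import secrets

def xor(data: bytes, key: bytes) -> bytes:
    """Toy cipher: XOR with a shared key (stand-in for real E2E encryption)."""
    return bytes(a ^ b for a, b in zip(data, key))

class Server:
    """Hypothetical relay: it sees and stores only ciphertext."""
    def __init__(self):
        self.log = []  # everything the platform could possibly inspect

    def relay(self, ciphertext: bytes) -> bytes:
        self.log.append(ciphertext)
        return ciphertext

# Sender and recipient share a key; the server never holds it.
key = secrets.token_bytes(32)
plaintext = b"forwarded message"
server = Server()

delivered = server.relay(xor(plaintext, key))
decrypted = xor(delivered, key)

assert decrypted == plaintext       # the recipient can read the message
assert plaintext not in server.log  # the platform cannot scan content,
                                    # hence cannot identify a "first originator"
                                    # without altering this architecture
```

Because the key material lives only on user devices, any traceability mandate forces a change to this design for all users of the product, which is the cross-border compliance problem the paragraph above describes.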
The Rules are seen as violating the freedom of expression guaranteed in the Indian constitution by implementing traceability, which breaks encryption. Privacy, also a fundamental right as determined by the Supreme Court of India, is increasingly seen as a ‘make-or-break’ feature of all websites, apps, products, and services. Privacy operates from a position of assumption of innocence of the user. The Rules, by enforcing traceability, violate the fundamental rights of Indian citizens by reducing privacy to a conditional service, and not a constitutional guarantee.
When the IT Rules came into effect in May 2021, they were criticized for imposing high costs of compliance, including legal and personal liability attached to employees of social media organizations. In the case of the office of the Chief Compliance Officer (CCO), liability extended even after the CCO retired from office. Every social media and news organization surveyed during this research pointed to the personal liability attached to the role of the CCO, grievance and nodal officers as imposing financial and legal costs on their organizations.
Proactive content filtering requirements will impact human resource needs and demand changes in product and business operations, thereby significantly increasing costs. Traceability clauses under the Rules require an extensive overhaul of the core architecture of messaging services and social networking platforms, requiring significant monetary and human resource investment.
Further, respondents in the Focus Group Discussions (FGDs) believed that ease of doing business will diminish given the stringent compliance regime and employee impact.
The Rules are also framed vaguely and arbitrarily, leading to confusion over operating clauses. Additionally, they have stringent reporting requirements. This will affect all organizations, especially small and medium enterprises, financially, and otherwise.
In addition to the legal and ethical concerns emerging from implementation of the Rules, there are knowledge, awareness, and skill gaps across a representative sample of the IT industry, which may impact the ability of organizations to comply with the IT Rules.
Software developers in junior and mid-level roles in IT organizations believe their workload will increase with the IT Rules. Respondents indicated that their jobs would now require more documentation and reporting, and that their role in achieving compliance for the company's products would add to their workload.
Industry representatives, however, felt that tech workers and product managers will fundamentally need knowledge of, or retraining in, privacy features, content filtering, and user experience in order to fully comply with the Rules. Experts in the industry believe that beyond technical skills or knowledge, what is also missing is perspective and an understanding of how executing the Rules will impact users of media and tech products.
As noted above, the encryption and traceability requirements of the Rules will mean major changes in products, especially to user experience, and an inability to safeguard the privacy of Indian users under the IT Rules. Implementing features such as voluntary verification will need product managers to acquire new skills and knowledge. Tech workers will need to learn how to work in coordination with legal teams. Given the implementation of the IT Rules, each content takedown request will have to be serviced on a case-by-case basis. This will impact scale and standard operating procedures in organizations, or will result in organizations relying more heavily on automation to censor content proactively (and to avoid being served takedown notices). In both cases, users of these products will bear the brunt, with their freedom of speech and expression drastically reduced.
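The automation pathway mentioned above can be sketched minimally. This is an illustrative example, not any platform's actual system: the blocklist and sample content are made up, and real deployments use perceptual hashing and far larger databases. It shows the simplest form of proactive filtering - exact hash matching against a list of known-bad content - and why it pushes platforms toward broader, more error-prone filters.

```python
import hashlib

# Hypothetical blocklist of SHA-256 digests of known illegal files.
BLOCKLIST = {hashlib.sha256(b"known illegal file").hexdigest()}

def should_block(content: bytes) -> bool:
    """Proactively filter content whose hash matches the blocklist."""
    return hashlib.sha256(content).hexdigest() in BLOCKLIST

assert should_block(b"known illegal file") is True
assert should_block(b"ordinary dance video") is False
# Note: any edit to a file changes its hash and evades exact matching,
# which is what drives platforms toward fuzzier, over-broad automated
# filters - the over-censorship risk the report describes.
```

Exact matching only catches identical copies, so closing that gap at scale means either large human review teams or aggressive automated classifiers, both of which carry the cost and free-expression consequences discussed in this section.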
Individual chapters and sections of the report are presented as submissions below.
Nadika Nadja is a researcher at Hasgeek. She has worked across advertising, journalism, TV and film production as a writer, editor and researcher.
Bhavani S is a Research Associate at Hasgeek. She has previously worked for the Centre for Budget and Policy Studies (CBPS), Microsoft Research India, and the University of Michigan, Ann Arbor.
Anish TP illustrated the report. Satyavrat KK provided research and editorial support. David Timethy and Zainab Bawa were project managers for producing this report. Kiran Jonnalagadda and Zainab Bawa advised on research design and execution.
We would like to thank the following individuals who provided feedback during different stages of the research. Their feedback helped the team fine-tune and bring rigour to the research process.
- Suman Kar, founder of security firm Banbreach, for reviewing early drafts of the quantitative research questionnaire, and providing detailed inputs on survey design.
- Prithwiraj Mukherjee, Assistant Professor of Marketing at IIM-Bangalore, for reviewing early drafts of the quantitative research questionnaire, and providing detailed inputs on survey design.
- Chinmayi SK, founder of The Bachchao Project, for reviewing and providing feedback on the final report and conclusions.
While Hasgeek sought funding from organizations, the research itself was conducted – with full disclosure at all stages – independently and objectively. The findings do not reflect any individual organization’s needs.
1. Unicef: Growing concern for well-being of children and young people amid soaring screen time (2021) - https://www.unicef.org/press-releases/growing-concern-well-being-children-and-young-people-amid-soaring-screen-time ↩︎
2. LiveLaw: Supreme Court Lists Centre’s Transfer Petitions, Connected Cases After 6 Weeks ↩︎
3. India Code: The Information Technology Act 2000 - https://www.indiacode.nic.in/bitstream/123456789/1999/3/A2000-21.pdf ↩︎
4. India Code: IT Act Definitions - https://www.indiacode.nic.in/show-data?actid=AC_CEN_45_76_00001_200021_1517807324077&sectionId=13011&sectionno=2&orderno=2 ↩︎
Conclusion and Recommendations
In the final analysis, the IT Rules are seen as having a deep, detrimental impact on the social media and digital news sectors, resulting in increased workloads for employees and increased costs of operations and compliance for organizations. Alongside existing concerns over the potential impact of the Personal Data Protection (PDP) Bill currently under advisement by a parliamentary body, future legislation around Non-Personal Data (NPD), and growing concerns over data governance and data nationalization, there is a very serious gap in awareness and understanding within the IT ecosystem of how policy impacts work.
In the specific case of the IT Rules, Significant Social Media Intermediaries (SSMIs) and digital news organizations have gone to court to challenge the Rules. Sections of the Rules that are ultra vires of the parent IT Act will require adjudication in competent courts, and the development of jurisprudence.
For SSMIs, MEITY may proactively revise and redraft the Rules to ensure they comply with the parent law. In particular, the changes to the definition of intermediaries and the creation of classes within intermediaries, as the IT Rules 2021 envision, will need to be removed entirely. Safe Harbour, a ‘taken for granted’ privilege offered by nations across the world and provided by the IT Act, is threatened under the IT Rules 2021. It will need to be reiterated and written explicitly into the new Rules.
However, if the ministry continues to maintain a distinction between classes of intermediaries for operational or other purposes, it is recommended that threshold limits for social media intermediaries, such as user base or revenue, be clarified and written explicitly into the Rules, following extensive consultation.
At the industry and community level, representatives we spoke to say that the Rules will need to be redrafted to remove the personal liability attached to the role of the Chief Compliance Officer. Representatives also ask for guidance in the form of Standard Operating Procedures from the ministry, with regard to reporting, content takedown, grievance redressal and other obligations. The community strongly recommends that MEITY work with stakeholders to assess the costs of compliance for organizations of different sizes and at various levels of operations.
With multiple Law Enforcement Agencies (LEAs) potentially able to initiate proceedings against the CCO under the current version of the Rules, industry representatives say this will create a culture of fear and self-censorship, stifling free speech. While they strongly recommend removing personal liability entirely, organizations also say that, at the very least, it should be restricted to a role-based responsibility, ensuring that employees are not unfairly held accountable after their term of employment ends.
Some organizations also recommend that the government and the ministry consider imposing financial penalties in place of personal liability. The extent of financial liability will need to be determined after consultation with the industry, and based on the organization’s size, turnover, user base and other parameters.
Representatives suggest that only a state or union government, via a specially appointed Nodal Officer, be the point of contact for SSMIs. Further, all takedown requests and grievance redressals should be directed through the government's Nodal Officer, streamlining the process and providing clarity to organizations.
SSMI representatives, experts in security and privacy, and legal experts all agree that privacy can be preserved while also protecting against the spread of CSAM and other patently illegal and unlawful content. Organizations already have robust internal mechanisms and teams to monitor and review such content, and do not need to rely on identifying the originator to remove it from their systems. Therefore, the recommendation is to strengthen encryption and affirm the protection of user privacy as a stated goal of the Rules.
Digital news organizations require clarity on why their sector has been clubbed under the Rules and how they qualify as intermediaries. Cases challenging this are currently before various Indian courts. But there is a larger concern over dissent, freedom of the press, and freedom of speech.
For organizations and industry bodies, the takeaways are clear: invest in training, resources, and forums through which skill and knowledge gaps are plugged. This is an immediate and critical requirement if the industry is to weather the changing policy climate.
Within organizations, leadership must invest in training to address the gaps in knowledge and skills of entry-level and junior employees, and build resources for the continued learning and development of middle managers and leaders. The shape and nature of these trainings will necessarily vary from organization to organization. MEITY can play a role by facilitating workshops and knowledge-sharing sessions for organizations and industry bodies. However, that is not enough, and the larger concern with policy and governance will need a coordinated, industry-level solution.
This requires a forum in which individuals and groups across sectors and across the country can meet, discuss, and evolve solutions to issues of civil liberties. As we have seen through this report, there is a sorely felt gap: the need for a peer network of technologists, practitioners, and civil society organizations who can help and guide each other on technology and its impact on society.