IT Rules 2021: Impact assessment; legal, ethical and implementation concerns #
Social Media networks and platforms, chat and messaging applications, photo and video sharing services have radically transformed the internet landscape in India and elsewhere in the last decade. User generated content has allowed diverse voices to create and share views, political opinions, dance videos, movie and music commentaries.
While the platforms and networks have encouraged these voices, there is also growing concern1 over the sharing of potentially offensive material such as pornographic content, child sexual abuse material (CSAM), hate speech and violent content, often not suitable for the wide audiences such platforms cater to.
The Information Technology (Guidelines for Intermediaries and Digital Media Ethics Code) Rules, 2021, notified by the Ministry of Electronics and Information Technology (MEITY) together with the Ministry of Information and Broadcasting (MIB), Government of India, under the IT Act 2000, seek to monitor and control user generated content and provide firm guidelines for social media intermediaries, digital news publications and other organizations that host or transfer content on the internet.
The Rules were notified in February 2021, and went into effect in May 2021. Organizations and individuals have challenged the Rules on various counts2 – including their applicability under the parent law. Large platforms and social media networks have expressed concern about implementation and compliance.
Privacy Mode, a hub for conversations around privacy, data security and compliance, conducted a two-part research project seeking to understand the impact of the Rules on organizations and employees in the tech ecosystem who might be responsible for implementing the Rules and achieving compliance in tech and media products.
A qualitative study of social media platforms, digital news publications, and cloud computing service providers, examining the possible impact on encryption, traceability, compliance and the applicability of the law, among other issues, was conducted in May–June 2021; a quantitative survey of tech workers across India, looking at awareness, professional and personal impact, workflows and requirements, was conducted in June–July 2021.
This report is a comprehensive analysis of both studies and presents a rounded picture of the impact of the IT Rules 2021 on organizations and their employees. It also examines larger questions and concerns about privacy and freedom of speech and expression, given the ongoing debates around responsible tech, digital platforms and ethics, and their impact on society and individuals.
Executive Summary and Core Concerns #
Scope of the law #
By definition, the ‘Rules’ framed for any law in India are ‘Subordinate Legislation’ or ‘Delegated Legislation’. While laws are made by the Parliament/Legislature, Rules are made by the Executive, i.e., the Government of India, to fulfil the requirements of the parent law. In Indian democracy, only the Legislature can make laws; the Executive can only implement them. If the law says ‘XYZ has to be accomplished’, rules can frame the methods by which ‘XYZ’ can be accomplished. However, in the case of the IT Rules 2021, the Rules are seen as overreaching and exceeding the parent law.
Notified under the Information Technology Act, 20003, which provides ‘Safe Harbour’ status to digital intermediaries, the Rules are ultra vires the parent Act and seek to regulate activities that find no mention in it. Further, bringing digital news publishers under the ambit of the Rules is unconstitutional and ultra vires the IT Act, as news websites do not fit the definition of ‘intermediaries’ given under the Act4.
Further, the activities of news publishers and media are regulated by the Ministry of Information and Broadcasting (MIB)5, and thus excluded from the ambit of the IT Act. Concerns emerged that the Rules – which did not pass through the legislative body – sought to curtail rights and laws that did emerge from due legislative process.
Further, with existing guidelines under the Press Council Act already governing news organizations, the Rules are seen as overreaching and drafted to censor specific media channels and outlets.
The Rules require intermediaries to identify the first originator of messages deemed objectionable. This implies that messaging platforms and social networking sites will have to significantly alter their products (and the technology underlying them) to comply. This requirement is again not governed by the parent Act, and is therefore unconstitutional. The Rules also operate from a position of assumed guilt, where all conversations and communications are expected to be scanned for potentially offensive material and traced back to the original sender. This runs against the presumption of innocence enshrined in the country’s legal system.
Breaking encryption and implementing traceability, a fundamental requirement of the new Rules, has international legal implications, as messaging services and social media platforms will need to alter the underlying technical architecture of their products or services, or at least offer a different product and user experience to Indian users. Since this cannot be implemented for users in India alone and will affect every user of these services across the world, social media intermediaries will be in violation of international laws governing user privacy and security, thus inviting legal costs.
Freedom of expression and natural justice #
The Rules are seen as violating the freedom of expression guaranteed in the Indian constitution by implementing traceability, which breaks encryption. Privacy, also a fundamental right as determined by the Supreme Court of India, is increasingly seen as a ‘make-or-break’ feature of all websites, apps, products, and services. Privacy operates from a position of assumed innocence of the user. The Rules, by enforcing traceability, violate the fundamental rights of Indian citizens by reducing privacy to a conditional service, not a constitutional guarantee.
Cost of compliance #
When the IT Rules came into effect in May 2021, they were criticized for imposing high costs of compliance, including legal and personal liability attached to employees of social media organizations. In the case of the office of the Chief Compliance Officer (CCO), liability extended even after the CCO retired from office. Every social media and news organization surveyed during this research pointed to the personal liability attached to the role of the CCO, grievance and nodal officers as imposing financial and legal costs on their organizations.
Proactive content filtering requirements will impact human resources requirements, demand changes in product and business operations, thereby significantly increasing costs. Traceability clauses under the Rules require extensive overhaul of messaging services and social networking platforms’ core architecture, requiring significant monetary and human resource investment.
Further, respondents in the Focus Group Discussions (FGDs) believed that ease of doing business will diminish given the stringent compliance regime and employee impact.
The Rules are also framed vaguely and arbitrarily, leading to confusion over operative clauses. Additionally, they impose stringent reporting requirements. This will affect all organizations, especially small and medium enterprises, financially and otherwise.
Skill and competency of Industry #
In addition to the legal and ethical concerns emerging from implementation of the Rules, there are knowledge, awareness, and skill gaps across a representative sample of the IT industry, which may impact the ability of organizations to comply with the IT Rules.
Software developers in junior and mid-level roles in IT organizations believe their workload will increase with the IT Rules. Respondents indicated that their jobs will now require more documentation and reporting, and that their role in achieving compliance for their company’s products will add to their workload.
Industry representatives, however, felt that tech workers and product managers will fundamentally need knowledge of, or retraining in, privacy features, content filtering and user experience in order to fully comply with the Rules. Experts in the industry believe that what is missing, beyond technical skills or knowledge, is perspective and an understanding of how executing the Rules will impact users of media and tech products.
As noted above, the encryption and traceability requirements of the Rules will mean major changes in products, especially in user experience, and an inability to safeguard the privacy of Indian users under the IT Rules. Implementing features such as voluntary verification will require product managers to acquire new skills and knowledge. Tech workers will need to learn to work in coordination with legal teams. Under the IT Rules, each content takedown request will have to be serviced on a case-by-case basis. This will impact scale and standard operating procedures in organizations, or will push organizations to rely more heavily on automation to censor content proactively (and avoid being served takedown notices). In both cases, users of these products will bear the brunt, with their freedom of speech and expression drastically reduced.
Individual chapters and sections of the report are presented as submissions.
About the principal researchers #
Nadika Nadja is a researcher at Hasgeek. She has worked across advertising, journalism, TV and film production as a writer, editor and researcher.
Bhavani S is a Research Associate at Hasgeek. She has previously worked for the Centre for Budget and Policy Studies (CBPS), Microsoft Research India, and the University of Michigan, Ann Arbor.
Support team #
Anish TP illustrated the report. Satyavrat KK provided research and editorial support. David Timethy and Zainab Bawa were project managers for producing this report. Kiran Jonnalagadda and Zainab Bawa advised on research design and execution.
We would like to thank the following individuals who provided feedback during different stages of the research. Their feedback helped the team fine-tune and bring rigour to the research process.
- Suman Kar, founder of security firm Banbreach, for reviewing early drafts of the quantitative research questionnaire, and providing detailed inputs on survey design.
- Prithwiraj Mukherjee, Assistant Professor of Marketing at IIM-Bangalore, for reviewing early drafts of the quantitative research questionnaire, and providing detailed inputs on survey design.
- Chinmayi SK, Founder of The Bachchao Project, for reviewing and providing feedback on the final report and conclusions
While Hasgeek sought funding from organizations, the research itself was conducted – with full disclosure at all stages – independently and objectively. The findings do not reflect any individual organization’s needs.
*[MEITY]: Ministry of Electronics and Information Technology
*[MIB]: Ministry of Information and Broadcasting
*[MSME]: Ministry of Micro, Small, and Medium Enterprises
*[CSAM]: Child Sexual Abuse Material
Unicef: Growing concern for well-being of children and young people amid soaring screen time (2021) - https://www.unicef.org/press-releases/growing-concern-well-being-children-and-young-people-amid-soaring-screen-time ↩︎
LiveLaw: Supreme Court Lists Centre’s Transfer Petitions, Connected Cases After 6 Weeks
India Code: The Information Technology Act 2000 https://www.indiacode.nic.in/bitstream/123456789/1999/3/A2000-21.pdf ↩︎
India Code: IT Act Definitions https://www.indiacode.nic.in/show-data?actid=AC_CEN_45_76_00001_200021_1517807324077§ionId=13011§ionno=2&orderno=2 ↩︎
The Wages Of Fear: A Compendium Of Global and Domestic Encryption Debates #
This submission is part of a compendium on encryption. I would implore readers to first watch a short primer where Matt Green takes us through the specific technological developments that surround the history mentioned in this compendium. Better still, check out Matt Green’s submission End-to-end encryption: State of the Technical and Policy Debate for a deeper dive into the history of global encryption debates.
“Governments of the Industrial World, you weary giants of flesh and steel, I come from Cyberspace, the new home of Mind. On behalf of the future, I ask you of the past to leave us alone. You are not welcome among us. You have no sovereignty where we gather.”
—John Perry Barlow, A Declaration of the Independence of Cyberspace
A prehistory of the crypto wars #
The late 1960s saw a series of wheels set in motion: the Vietnam War entered its extended denouement and sparked widespread anti-war demonstrations in America,1 the Civil Rights movement in the same country took on a militant flavour with the emergence of Huey Newton and the Black Panthers,2 the period known as the Troubles began in Northern Ireland,3 Eastern Europe went through a series of uprisings, and, closer to home, even Pakistan saw mass protests against General Ayub Khan.4 Despite modern notions of ours being a deeply interconnected, globalised world, the ripples of the Cultural Revolution in hermetically sealed (at least to the West) China were felt all the way in distant France, with the French Maoists, and the broader formation of ‘soixante-huitards’, continuing to dominate the cultural sphere through the 1970s via mass-media newspapers like ‘Le Monde’, among many other spheres of influence.5
The great American disillusionment with its government after the fallout of Vietnam received a steroid boost with the Watergate scandal, with Alan J. Pakula’s tagline for the last of the films that came to be known as the ‘Paranoia Trilogy’, “There is no conspiracy. Just twelve people dead.”, adequately evoking the mood of the times.6 A quick look at some of those whom COINTELPRO, the counterintelligence program run by the FBI, sought to target reveals a list of actors who would play crucial roles in conversations about our topic of enquiry in this compendium: feminist organizations, anti–Vietnam War organizers, activists of the civil rights and Black Power movements (e.g., the Nation of Islam and the Black Panther Party), and environmentalist and animal rights organizations.7
It was these actors, part of the broad free speech movement and counterculture, along with its various inheritors and progeny, who sought to keep the old flame alive with the advent of the internet. Many who directly impacted the internet, like John Perry Barlow mentioned right at the beginning of this compendium, were directly involved and informed by the cultural churning of those times, seeking out a world “of the Mind in Cyberspace...more humane and fair than the world your governments have made before.”8
Despite the persistence of these groups, they never came to assume real political power, and were often decimated by the efforts of intelligence agencies (most notably in the case of the Black Panthers).9
However, global leaders like Ronald Reagan and Margaret Thatcher were direct outcomes of these suspicions of traditional political structures, with their eyes towards global finance, the free market, corporations, and a mutual fear of Communist expansionism. The principle underlying their bonhomie was perfectly encapsulated in Thatcher’s dictum: “Who is society? There is no such thing! There are individual men and women and there are families and no government can do anything except through people and people look to themselves first.”10
Encryption through the social tumult #
In Silicon Valley, another transition was underway that would change the world as we knew it. Once a center of blimp production, the region was transformed by William Shockley’s work on transistors in the post-war era; a group of Shockley’s disgruntled and exceptionally brilliant assistants went on to form 85 companies between them, including Intel and AMD.11 As the ‘Keys Under Doormats’ study neatly presents, it was here that conversations around encryption began; “The Crypto Wars actually began in the 1970s, with conflicts over whether computer companies such as IBM and Digital Equipment Corporation could export hardware and software with strong encryption, and over whether academics could publish cryptographic research freely. They continued through the 1980s over whether the NSA or the National Institute of Standards and Technology (NIST) would control the development of cryptographic standards for the non-national security side of the government (NIST was given the authority under the 1987 Computer Security Act). They came to full force during the 1990s, when the US government, largely through the use of export controls, sought to prevent companies such as Microsoft and Netscape from using strong cryptography in web browsers and other software that was at the heart of the growing Internet. The end of the wars — or the apparent end — came because of the Internet boom.”12
Through the 1990s, the American government, its intelligence agencies, and many other industrialised nations sought various ways to undermine encryption through legislation. Claiming that widespread encryption would be disastrous for law enforcement, the US government proposed the Clipper Chip, an encryption device that contained a government master key to give the government access to encrypted communications. Other governments followed suit with proposals for encryption licensing that would require copies of keys to be held in escrow by trusted third parties: companies that would be trusted to hand over keys in response to warrants.13
These overtures evoked impassioned debate among academia, industry, NGOs and other members of civil society. Analyses of key escrow and trusted-third-party encryption laid out the technical difficulties, the added risks, and the likely costs of such escrow systems, all of which were brought up during these debates. The push for key escrow fizzled out in 2000, under pressure from industry during the dotcom boom and political resistance from the European Union, among others.
At Macworld 2007, Steve Jobs offered a prescient preamble for the product he was about to unveil: “Every once in a while a revolutionary product comes along that changes everything.”14 The product was the first iPhone, the first mass-produced smartphone, which “moved us over to a world where we have a portable device with a relatively powerful processor and memory and the ability to hold cryptographic keys. This created a world where suddenly we wanted to communicate with people using keyboards and electronically in a portable way,” as Prof. Matt Green put it in his keynote address at the Hasgeek data privacy conference.12:1 In the years that followed, the advent of iMessage, WhatsApp, and Signal brought encryption to billions of people and counting. This has reinvigorated the fervour of many nations in their bids to legislate around encryption. Green summarises the preferences of various governments in a graded hierarchy very succinctly:
- No Encryption At All: the ideal scenario for governments, where messages are sent in plain text and retained in long-term stores at providers, to which governments have access.
- Key Escrow: a master encryption key, held by the government or, more likely, by a provider such as Facebook or Apple, that would decrypt messages on demand. Governments have also asked for what they call ‘exceptional access’ when they have a warrant.
- Real-Time Content Scanning: looking at every single message a user sends through the system. This has largely been raised in the context of preventing CSAM (Child Sexual Abuse Material).
- Targeted Eavesdropping: governments like the UK have pushed for more targeted wiretapping and eavesdropping, specifically GCHQ’s ‘ghost user’ proposal.
- Traceability: India’s proposal of hash-based tracing, specifically targeting the originators of viral forwards, is unique on the global scale.
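The hash-based tracing idea in the last bullet can be sketched in a few lines. This is a toy illustration of the general mechanism as it is publicly described, not any platform’s or government’s actual design; all names below are my own.

```python
import hashlib
from typing import Optional

def message_hash(content: bytes) -> str:
    """Stable digest of a message body; byte-identical forwards share a hash."""
    return hashlib.sha256(content).hexdigest()

class TraceRegistry:
    """Hypothetical originator registry: maps a message hash to the first
    account seen sending that content."""

    def __init__(self) -> None:
        self._first_sender = {}

    def record(self, sender: str, content: bytes) -> None:
        # setdefault keeps only the first sender of a given content.
        self._first_sender.setdefault(message_hash(content), sender)

    def originator(self, content: bytes) -> Optional[str]:
        """Look up who first sent this exact content, if anyone."""
        return self._first_sender.get(message_hash(content))
```

Even this toy exposes the objections experts raise: appending a single character to the body yields a different hash, so a trivially edited forward escapes tracing, while making lookups possible requires the platform to retain a hash and sender record for every message ever sent.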
India through the age of encryption #
“Finally, the Committee has noted that the 26/11 and other terrorist acts have shown that terrorism kills persons irrespective of religion, caste, community, age or sex, rich and poor alike. This threat must therefore be faced as a peoples’ war. In stating so, the Committee wishes to instill a sense of alertness, and not to create any fear psychosis, because that is what terrorism aims to achieve.”
—Ram Pradhan Inquiry Commission, 2008
In 2008, ten members of Lashkar-e-Taiba carried out twelve coordinated shooting and bombing attacks across Mumbai over four days. The consequences of these attacks would see India charting new courses on questions of surveillance and domestic security. The commission assigned to enquire into the intelligence failures surrounding the attacks took cognizance of the various efforts made by American police and intelligence agencies in tackling terrorism; “The Department of Home Land Security (DHS) created several intelligence Fusion centres where the intelligence producers and the executing wings like State police, Port and Transportation Security (in-charge of Aviation Security) etc take part in constant dialogue on the likely terrorist threats based on available intelligence. This has worked well in that country.”15
In 2019, a convoy of vehicles carrying Indian security personnel on the Jammu–Srinagar National Highway was attacked by a vehicle-borne suicide bomber at Lethapora (near Awantipora) in the Pulwama district of the erstwhile state of Jammu and Kashmir. A notable feature of this attack was the discovery that the attackers had used encrypted messaging via a technique known as YSMS.16
Debates about surveillance and encryption were also raised around the Bhima Koregaon incident, in which Surendra Gadling, Sudhir Dhawale, Rona Wilson, Shoma Sen, Mahesh Raut, Varavara Rao, Sudha Bharadwaj, Arun Ferreira, Gautam Navlakha, Vernon Gonsalves, and Stan Swamy were arrested for their alleged links to Maoists. Writing on the discrepancies in Rona Wilson’s case, national security journalist Praveen Swami noted: “There’s one especially strange feature of the entire case: We don’t know if what the Forensic Science Laboratory in Pune studied is actually a faithful copy of whatever was on Wilson’s computer. Every computer file or collection of files can generate a unique hash-value...for obvious reasons, the hash-value of a disk is crucial for proving that digital information has not been tampered with after it has been seized by law-enforcement.
Globally as well as in India, police protocols mandate not even to turn on a computer after seizing it, since that would change the contents of the disk, and allow defence lawyers to claim it had been manipulated. Instead, the disks seized by law-enforcement are removed by forensic experts, and exact images made of them using tools installed on specialised rigs. The hash value of the originally-seized disks — which takes a few minutes to generate — and these mirror-images must be identical. The images — not the original — are then analysed using software which does not modify the files by opening them. Even if an error is committed during analysis, only the image is affected, not the original. For reasons that have yet to become clear, the Pune Police never provided the hash value of the disk recovered from Wilson. The hash value, defence lawyers say, has still to be made available.”17
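The chain-of-custody procedure Swami describes, hashing the seized disk, imaging it, and confirming the image’s hash matches, can be sketched as below. The function names and paths are illustrative; real forensic labs use dedicated write-blocking hardware and tooling, not an ad hoc script.

```python
import hashlib

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Hash a file in fixed-size chunks so arbitrarily large disk
    images never need to fit in memory at once."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        while True:
            chunk = f.read(chunk_size)
            if not chunk:
                break
            digest.update(chunk)
    return digest.hexdigest()

def image_is_faithful(seized_disk: str, forensic_image: str) -> bool:
    """The image counts as a faithful copy only if both hashes match;
    a single altered byte changes the SHA-256 value entirely."""
    return sha256_of(seized_disk) == sha256_of(forensic_image)
```

This is exactly why the missing hash value matters in the case Swami describes: without the original digest, there is no way to demonstrate that the analysed image matches what was seized.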
In 2019, Prof. V. Kamakoti of IIT Madras filed an affidavit in the Madras High Court that proposed two measures towards traceability:
- Adding originator information to every message.
- A permission-based system that allows users to classify a message as forwardable or not-forwardable.18
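In simplified form, the two measures amount to metadata carried with each message. The field names below are my own, not from the affidavit, and this sketch deliberately ignores the hard part: making such metadata work under end-to-end encryption.

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class Message:
    originator: str    # attached once, at creation, and never rewritten
    body: str
    forwardable: bool  # the sender-chosen classification in the proposal

def forward(message: Message) -> Message:
    """Forwarding copies the message, so the originator field travels
    with the content; a not-forwardable message cannot be forwarded."""
    if not message.forwardable:
        raise PermissionError("sender marked this message not-forwardable")
    return replace(message)
```

The obvious weakness, raised in expert responses to the affidavit, is that the originator field is only as trustworthy as the client that sets it: a modified client can strip or forge it.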
Addressing these concerns, Dr. Prabhakar, a cryptography expert from IIT, filed an expert submission stating, “Including a mechanism for tracing the originator (that can be optionally turned off, for messages not intended for sharing) is a relatively mild modification, and in the short term, could be effective in deterring some individual actors from creating viral messages that the law enforcement authorities find objectionable. But it has very limited effectiveness in the long term, or against determined attackers. We note that more effective alternatives may exist. Viral messages (not the users) could be “outed” and made available publicly so that fact checkers could add comments to them, and the WhatsApp client can display these comments alongside the messages. To implement this, an optional feature could let users identify messages they have received after several forwards (as estimated using a counter maintained within the message), and anonymously communicate those messages to a server for making them available publicly. It may also be possible to design offline or online “spam filters,” which can detect and mark messages as potentially unreliable, to discourage users from sharing them. For a company like Facebook it would be easy to quickly develop such a mechanism (and incrementally improve its effectiveness), with minimal disruption to the user experience.
Finally, there is increasing recognition that a lasting defence against the spread of fake news should be based on education and information literacy. Such efforts should complement technological and legal attempts to regulate the online world.”19
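The “counter maintained within the message” that the submission mentions can also be sketched: each forward increments an in-message count, and clients flag messages above a threshold for optional, anonymous reporting. The threshold and all names here are illustrative; the submission specifies no numbers.

```python
from dataclasses import dataclass, replace

VIRAL_THRESHOLD = 5  # illustrative cutoff; the submission names no figure

@dataclass(frozen=True)
class CountedMessage:
    body: str
    forwards: int = 0  # the counter maintained within the message

def forward_once(message: CountedMessage) -> CountedMessage:
    """Each hop increments the in-message counter."""
    return replace(message, forwards=message.forwards + 1)

def looks_viral(message: CountedMessage) -> bool:
    """A client could surface such messages and let the user anonymously
    report them to a public fact-checking server, as the submission suggests."""
    return message.forwards >= VIRAL_THRESHOLD
```

Unlike originator tracing, this mechanism exposes messages rather than users, which is why the submission argues it could be deployed with minimal disruption to the user experience.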
The difference between the Indian context and places like the USA, the EU, and the UK is that these issues do not command the same interest among civil society groups here, and cannot galvanise the numbers for the kind of robust democratic debate seen in the West. This can be explained by levels of internet penetration in the country, as well as the relative recency of these concerns in the legislative context.
Theories of surveillance #
“At all times, whether I am eating, or am in the women’s apartments, or in my inner apartments, or at the cattle-shed, or in my carriage, or in my gardens - wherever I may be, my informants should keep me in touch with public business. Thus everywhere I transact public business. And whatever I may order by word of mouth, whether it concerns a donation or a proclamation or whatever urgent matter is entrusted to my officers, if there is any dispute or deliberation about it in the Council, it is to be reported to me immediately, at all places and at all times. This I have commanded.”
—Ashoka, 6th Major Rock Edict
“A multiplicity of political units, none powerful enough to defeat all the others, many adhering to contradictory philosophies and internal practices, in search of neutral rules to regulate their conduct and mitigate conflict, a defense based on the necessity to come to an arrangement with each other, not on some sort of superior morality, bringing a balance of power that we are missing today."
—Henry Kissinger, World Order
In the subcontinent, as evidenced by the edict mentioned above, surveillance was an important aspect of statecraft, whether reified in the writings of Kautilya or in the presence of caurōddharaṇikas (‘removers of thieves’), who eventually gained hereditary access to land for carrying out secular functions like surveillance and then solidified into castes such as the Chaudharis, Chaudharys, etc.20 For the purposes of this compendium, we will limit ourselves to the emergence of modern justifications for the push towards global mass surveillance, which are in no small part to do with the chickens Kissinger set off in the Middle East coming home to roost.
In pursuit of his vision of ‘constructive ambiguity’ and a ‘balance of power’ in the world, Kissinger sought to suppress what he perceived could become an all-powerful united Middle East, setting off a chain of events that resulted in wars in every other sovereign state in the region. As Prof. Greg Grandin put it, “We are still reaping the bloody returns of Kissinger’s interventions.”21
It is those interventions that laid the foundation for a wholly new rationale for surveillance, perhaps best expressed in the book ‘An End to Evil: How to Win the War on Terror’ by David Frum (a political commentator and former speechwriter who coined the expression ‘Axis of Evil’) and Richard Perle (a defense strategist deeply embedded in various organs of American national security). Presenting a neoconservative, hawkish view of the world, the book charted new territory in some of its recommendations:
- Requiring all residents to carry a national identity card that includes “biometric data, like fingerprints or retinal scans or DNA,” and empowering all law enforcement officers to enforce immigration laws. The authors admit that such a card “could be used in abusive ways,” but reassure us by saying that victims of “executive branch abuse will be able to sue.”
- Encouraging Americans to “report suspicious activity.” Apparently alone among Americans, the authors lament the demise of the TIPS program, a domestic intelligence-gathering program designed under President George W. Bush to have United States citizens report suspicious activity. The program’s website implied that US workers who had access to private citizens’ homes, such as cable installers and telephone repair workers, would report on what was in people’s homes if it were deemed “suspicious.”
- Changing immigration policy so that the U.S. can bar all would-be visitors who have “terrorist sympathies.”22
While the book generated a fair bit of debate, its guiding logic did not go unheeded. The NSA (National Security Agency) turned the neoconservative parapraxis into a reality far too complex for either author to have grasped. Its programs grew from the humble beginnings of Stellar Wind,23 which allowed the NSA to monitor the call and text metadata of U.S. citizens and tap any international call that included a U.S.-based caller; through PRISM, an internet surveillance tool created to collect the private data of foreign nationals which, in doing so, also swept up the data of U.S. citizens, including emails, files and photos, by accessing user accounts on Gmail, Facebook, Apple, Microsoft and other tech companies; all the way to SOMALGET, part of MYSTIC, a formerly classified NSA program that spied on the phone calls of the entire populations of five countries, covertly tracking the communication records of 250 million people in total.24 The Snowden leaks, however, have not deterred this course of action, and most of the protagonists have been rendered stateless or incognito.
Given the Manichean rationale under which discourse around surveillance operates, it is unsurprising that one of the more common reasons cited for breaking encryption is the prevalence of CSAM (Child Sexual Abuse Material). However, a recent study found that “there is no evidence to suggest that online abuse and exploitation are more serious or pervasive offences than crimes occurring offline.”25
Nonetheless, in April 2021, UK Home Secretary Priti Patel called for tech companies including Facebook to “live up to their moral duty” and do more to safeguard children, in a roundtable discussion about end-to-end message encryption. As the previously mentioned study also suggests, there seems to be a lack of communication between policymakers and the activists and researchers who study the complex terrain of CSAM.13:1
Things as they are now #
“The encryption of data and communications has long been understood as essential. Strong encryption thwarts criminals and preserves privacy for myriad beneficiaries, from vulnerable populations to businesses to governments. At the same time, encryption has complicated law enforcement investigations, leading to law enforcement calls for lawful access capabilities to be required of encryption technologies.”
—The Carnegie Endowment for International Peace, Moving the Encryption Policy Conversation Forward
“The Privacy by Design (PbD) approach is characterized by proactive rather than reactive measures. It anticipates and prevents privacy invasive events before they happen. PbD does not wait for privacy risks to materialize, nor does it offer remedies for resolving privacy infractions once they have occurred — it aims to prevent them from occurring. In short, Privacy by Design comes before-the-fact, not after.”
—Ann Cavoukian, Privacy by Design: The 7 Foundational Principles
Carnegie’s ‘Moving the Encryption Policy Conversation Forward’ paper, which has come to be well regarded by governments across ideological divides, warns of the troubles of this push: “There will be no single approach for requests for lawful access that can be applied to every technology or means of communication. More work is necessary, such as that initiated in this paper, to separate the debate into its component parts, examine risks and benefits in greater granularity, and seek better data to inform the debate. Based on our attempt to do this for one particular area, the working group believes that some forms of access to encrypted information, such as access to data at rest on mobile phones, should be further discussed. If we cannot have a constructive dialogue in that easiest of cases, then there is likely none to be had with respect to any of the other areas. Other forms of access to encrypted information, including encrypted data-in-motion, may not offer an achievable balance of risk vs. benefit, and as such are not worth pursuing and should not be the subject of policy changes, at least for now. We believe that to be productive, any approach must separate the issue into its component parts.”26
At this stage of the debate, we have only broached the realm of cellphones, with questions of a higher order regarding privacy, digital identity and individual liberty being shelved for a better day in the sun. It is imperative that these questions be raised, and this compendium offers a modest insight into a complex and thorny world.
American Experience. “Protests and Backlash.” American Experience | PBS, 9 Aug. 2017, www.pbs.org/wgbh/americanexperience/features/two-days-in-october-student-antiwar-protests-and-backlash. ↩︎
“The Black Panther Party.” National Archives, 22 Mar. 2021, www.archives.gov/research/african-americans/black-power/black-panthers. ↩︎
Wallenfeldt, Jeff. “The Troubles | Summary, Causes, & Facts.” Encyclopedia Britannica, 16 Sept. 2014, www.britannica.com/event/The-Troubles-Northern-Ireland-history. ↩︎
Eisenberg, Aileen. “Pakistani Students, Workers, and Peasants Bring down a Dictator, 1968–1969 | Global Nonviolent Action Database.” Global Non-Violent Action Database, 22 Feb. 2013, nvdatabase.swarthmore.edu/content/pakistani-students-workers-and-peasants-bring-down-dictator-1968-1969. ↩︎
Fields, Belden. “French Maoism.” Social Text, no. 9/10, 1984, pp. 148–177. JSTOR, www.jstor.org/stable/466540. Accessed 19 May 2021. ↩︎
Simon, Art. “In The Parallax View, Conspiracy Goes All the Way to the Top—and Beyond.” Slate Magazine, 21 July 2017, slate.com/culture/2017/07/the-parallax-view-is-a-70s-paranoid-classic-about-evil-corporations-and-political-assassinations.html. ↩︎
Frederique, Nadine. “COINTELPRO | United States Government Program.” Encyclopedia Britannica, 13 Nov. 2014, www.britannica.com/topic/COINTELPRO. ↩︎
Barlow, John Perry. “A Declaration of the Independence of Cyberspace.” Electronic Frontier Foundation, 8 Apr. 1996, www.eff.org/cyberspace-independence. ↩︎
Segal, Rachel. “The Rise and Fall of the Black Panthers | The Perspective.” The Perspective, 6 May 2021, www.theperspective.com/subjective-timeline/politics/the-rise-and-fall-of-the-black-panthers. ↩︎
Thatcher, M. ‘Interview for “Woman’s Own” (“No Such Thing as Society”).’ ↩︎
Moffitt, Mike. “How a Racist Genius Created Silicon Valley by Being a Terrible Boss.” SFGATE, 22 Aug. 2018, www.sfgate.com/tech/article/Silicon-Valley-Shockley-racist-semiconductor-lab-13164228.php. ↩︎
Apple. “Steve Jobs Introducing The IPhone At MacWorld 2007.” YouTube, 3 Dec. 2010, www.youtube.com/watch?v=x7qPAY9JqE4&ab_channel=superapple4ever. ↩︎
Staff, The Wire. “Full Text: What the High Level Inquiry Committee on the 26/11 Attacks Had to Say.” The Wire, 26 June 2019, thewire.in/security/26-11-mumbai-terror-attack-inquiry-committee. ↩︎
Joseph, Josy. “Militants Outsmart Indian Agencies with New Tech Tool.” The Hindu, 16 Aug. 2016, www.thehindu.com/news/national/militants-outsmart-indian-agencies-with-new-tech-tool/article7638397.ece. ↩︎
Swami, Praveen. “Bhima Koregaon Hacking Allegations: What Arsenal Analysis of Rona Wilson’s PC Says.” Firstpost, 15 Feb. 2021, www.firstpost.com/india/an-idiots-guide-to-bhima-koregaon-hacking-allegations-what-arsenal-analysis-of-rona-wilsons-pc-says-9304031.html. ↩︎
Kamakoti, R. “Report Of Prof. Kamakoti In WP Nos. 20214 And 20774 Of 2018.” Internet Archive, 2018, archive.org/details/reportofprof.kamakotiinwpnos.20214and20774of2018. ↩︎
Prabhakaran, Manoj. “Manoj Prabhakaran_Expert Report.Pdf.” Google Docs, 31 Sept. 2018, drive.google.com/file/d/1vivciN8tNSbOrA9eZ8Ej0mCAUBzRWu5N/view. ↩︎
Institute for Public Affairs. “The Disastrous History of Henry Kissinger’s Policies in the Middle East.” In These Times, 15 Aug. 2020, inthesetimes.com/article/henry-kissinger-greg-grandin-us-blowback-middle-east. ↩︎
Jervis R. Review: An end to Evil. International Journal. 2004;59(3):714-716. doi:10.1177/002070200405900316 ↩︎
P/K. “Edward Snowden and the STELLARWIND Report.” Edward Snowden, 19 May 2021, www.electrospaces.net/2020/03/edward-snowden-and-stellarwind-report.html. ↩︎
“United States of Surveillance.” The Privacy Issue, 10 Oct. 2018, theprivacyissue.com/government-surveillance/united-states-of-surveillance-us-history-spying. ↩︎
Encryption Working Group. “Moving the Encryption Policy Conversation Forward.” Carnegie Endowment for International Peace, 10 Sept. 2019, carnegieendowment.org/2019/09/10/moving-encryption-policy-conversation-forward-pub-79573. ↩︎