2020 may well be remembered as the year that heralded the decade of privacy. From leaks in video conferencing software to doubts over COVID-19 apps, privacy concerns have been front and centre. Yet while most commentary focuses on moments when things go wrong, one angle goes largely undiscussed: the attitudes of the people and organizations who make the technology we use every day. Their work today shapes our experience of tomorrow, but we know little of their perspective.
Between April and December 2020, the Privacy Mode team at Hasgeek (henceforth “we” and “us”) surveyed the makers of tech products and services in India. We sought to understand whether privacy is a guiding factor in product development, or an afterthought in response to regulation and public sentiment.
Summary of findings #
We started by conducting interviews with experts, followed by focus group discussions. The learnings from these sessions helped us frame hypotheses and a quantitative questionnaire. See our note on framing and review, and the detailed findings and analysis.
The survey finds that 90% of respondents believe their organization treats privacy as a fundamental right and attempts to realize that intent while building products and services. At the baseline, across all organization sizes, at least three in four respondents in authority roles say their roles require them to think about privacy and compliance issues.
However, converting that intent into practice remains an onerous challenge. Small and medium-sized organizations fare worse than large ones, but all three segments struggle with:
- Not having dedicated roles to manage conflicts arising out of product and engineering decisions on user privacy,
- Not having institutionalized “playbooks” (meaning process workflows, standard operating procedures and cultural values that shape consistent outcomes), and
- Not having peer groups to debate these issues both within and outside the organization.
Some industry sectors are governed by regulatory regimes such as HIPAA, CCPA and GDPR. The responses within these sectors did not deviate greatly from the general responses. This suggests that adding regulatory pressure on small and medium organizations does not automatically change their internal product-development dynamics to make them more privacy-friendly.
*[HIPAA]: Health Insurance Portability and Accountability Act (United States federal statute, 1996)
*[CCPA]: California Consumer Privacy Act (California state statute, 2018)
*[GDPR]: General Data Protection Regulation (European Union regulation, 2016)
Over a fifth of respondents across all organization sizes say there is a lack of a peer group across the tech ecosystem to discuss and find solutions to privacy concerns. This lack was felt both within the organization and outside of it. It points to a larger concern: even those who want to implement privacy-respecting features in products or services do not have adequate support or use-cases to guide them. As organizations get larger, their access to external peer groups reduces, indicating the development of an insular culture.
- Close to 30% of organizations lack learning and development budgets for understanding privacy during product development.
- Small organizations find it particularly hard to get stakeholders interested in privacy concerns.
- Small and medium organizations lack the organizational structure and capability to run specialised departments for risk and compliance. Combined with their lack of training budgets for privacy and the absence of standard procedures for handling privacy concerns or risk, this is a point of great concern with implications across the tech ecosystem.
As one startup founder we interviewed said:
“Companies are doing a poor job communicating this (the need for user privacy and security)... I don’t think they are malicious. They’re lousy in their implementation and security, but there is no mal-intent or ill intent to begin with, they are lousy sometimes.”
The good news is that awareness about privacy and recognition of its importance as a practice – and as an end-user benefit – exists across leadership in large, medium and small organizations. The not-so-good news is that intent by itself is not sufficient to move organizations towards the ideal of protecting user privacy and user data.
As the data in this research shows, specialized personnel, budgets, processes and community resources have to be available in abundance if India wants a smooth transition towards a data protection regime. This is where the gap is.
Here are our recommendations for different stakeholders to move towards the privacy-preserving ideal, one step at a time.
Small and medium organizations struggle to establish a business model with repeatable unit economics, which is a paramount concern in the early stages. While there is intent to embed privacy practices in the product-development cycle, small and medium organizations do not have the skills or budgets to do so.
As our data indicates, adding regulatory pressure does not improve outcomes. On the other hand, regulation can increase the compliance burden, thereby adversely affecting small and medium organizations and turning them into non-viable businesses.
This research recommends that the Joint Parliamentary Committee (JPC) and the Data Protection Authority (DPA) for the Personal Data Protection (PDP) bill ensure the bill's provisions exempt organizations below a certain revenue or fundraising threshold, and that the government invest in the development of community resources to make the law feasible.
Across all the segments, including large organizations, software engineering practices have not matured enough to be ready for PDP compliance from day one. There is also a lack of clarity at the operational level on the implementation of privacy processes such as consent management, data-security audits, data-erasure requests, anonymisation, and purpose limitation.
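To make the operational gap concrete, here is a minimal sketch in Python of two of the processes named above: pseudonymisation (a common building block of anonymisation) and handling a data-erasure request. All names and the in-memory store are hypothetical, for illustration only; a production system would involve databases, key management, retention policies, and audit trails.

```python
import hashlib

def pseudonymise(record: dict, pii_fields: set, salt: str) -> dict:
    """Replace direct identifiers with salted hashes so records can
    still be joined for analytics without exposing raw PII.
    (Illustrative only: a real system would manage the salt as a secret.)"""
    out = {}
    for key, value in record.items():
        if key in pii_fields:
            digest = hashlib.sha256((salt + str(value)).encode()).hexdigest()
            out[key] = digest[:16]  # truncated hash as a pseudonym
        else:
            out[key] = value
    return out

def erase_user(store: dict, user_id: str) -> bool:
    """Honour a data-erasure request by deleting every record keyed to
    the user; returns True if anything was actually removed."""
    return store.pop(user_id, None) is not None
```

Even a toy version like this surfaces the operational questions the survey points to: who holds the salt, which fields count as PII, and how erasure is verified across backups and downstream copies.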
Technical standards have to evolve – both from within and by knowledge-sharing at the community level – to address these issues. The JPC and DPA must provide at least two years’ time for organizations to become compliant with the standards prescribed in the PDP bill’s provisions.
Business organizations, especially early and growth stage startups, generally refrain from sharing internal knowledge and practices with other organizations. Insecurity about growth and organizational secrets, and pressure from investors to grow Intellectual Property (IP), have led to a culture of withholding knowledge, making it hard to develop capacity.
The structural capacity issues that this research refers to – skill gaps and lack of knowledge resources – can be resolved if leadership, legal, and data privacy officers in small and medium organizations become forthcoming about systematic sharing of best practices and knowledge. Journeyman documentation, which maps these practices at the operational and technical levels, will be more beneficial than broad directions.
While simplistic solutions may get widespread adoption because they substitute for missing capacity, they will hamper small and medium organizations from competing on a global scale, mainly because these organizations may have to maintain and invent different sets of practices to comply with standards such as GDPR and CCPA. This will increase the cost of compliance.
For investors, advisors and external stakeholders, as well as community organizations and founders of small and medium organizations, this research makes clear recommendations:
- Build capacity by contributing to a shared knowledge pool beyond your organization, in the form of meetup groups and their knowledge archives.
- Encourage thinking about privacy at the product and engineering levels. Allow your practitioners to gain public recognition for their work.
- Take privacy-preserving ideas and practices in product development cycles to external forums and policy makers, and encourage adoption across the ecosystem.
*[IP]: Intellectual Property
*[JPC]: Joint Parliamentary Committee
*[DPA]: Data Protection Authority
*[PDP]: Personal Data Protection
About the principal researchers #
Nadika Nadja is a researcher at Hasgeek. She has worked across advertising, journalism, TV & film production as a writer, editor and researcher.
Anand Venkatnarayanan has worked as a cybersecurity engineer for two decades. He is currently consulting editor for Privacy Mode and a non-executive director at Hasgeek.
Support team #
We would like to thank the following individuals from tech, legal, and policy backgrounds who provided feedback during different stages of the research. Their feedback helped the team fine-tune and bring rigour to the research process. While Hasgeek sought funding from organizations, the research itself was conducted – with full disclosure at all stages – independently and objectively, and the findings do not reflect any individual organization’s needs.
- Akshat Jain, Consultant at Omidyar Network India
- Alok Prasanna Kumar, Co-founder and senior resident fellow at Vidhi Centre for Legal Policy
- Ashwin Kumar, Director at FMR India
- Chinmayi S K, Founder at The Bachchao Project
- Raahil Rai, Investment Associate | Digital Society at Omidyar Network India
- Subhashish Bhadra, Principal at Omidyar Network India
- Suchana Seth, Founder and CEO at The Mindful AI Lab
The implications of caste and gender on privacy #
In a paper for the Networking and Information Technology Research and Development programme of the US government, authors Baskerville, Alashoor and Zhu say,1 “In the language of privacy discourse, the notion of identity becomes highly pertinent, merely due to the associated threat of privacy invasion to identity reconstruction.”
The authors propose an “Identity Bridge”, in which a threat to the physical self or a physical identity has consequences for a digital or online identity, and similarly, a threat to the digital or online identity has consequences for the physical self.
“if the online identity is a metaphor indicating an individual’s real life and real social interactions, privacy invasions are very consequential. Hence, a threat to the digital identity brings about a threat to online identity; and if true personal identity and social identity were used to construct the online identity, a threat to the online identity generates a threat to the social identity and the personal identity, respectively. The threats to the social and personal identities are hypersensitive as they are directly related to the physical milieu.”
In his 1975 book, “The Environment and Social Behavior”,2 Irwin Altman defines privacy as “selective control of the access to the self, or to one’s group”.
Here, the self is a pointer to one’s physical self and one’s identity. That is, the self is an awareness of who one is, which is both a pointer to a physical embodiment and a social/cultural identity forged by interaction with others.
In the Indian context, identity is both an individual identity and membership of a group identity – a caste identity – further reinforced by social interaction. Caste is a social and ritual hierarchy, and is also intricately linked to one’s economic activity. It imposes a social order and emphasises a social hierarchy over the individual identity. Sukhdeo Thorat and Katherine Newman, in an EPW essay, say,3 “The economic organisation of the caste system is based on the division of the population into a hierarchical order of social groups that determine the economic rights of members, which are determined by birth and are hereditary in the strictest sense of the term.” The authors, paraphrasing Ambedkar, say, “A community-based system of enforcement regulates caste privileges by means of social ostracism, violence, and economic penalties that find their justification in elements of Hindu religion.”
In the caste hierarchy, those higher up the ladder, with more privilege, have greater authority, better economic and social prospects, and more political say. The upper and dominant castes have better access to education, employment, wealth and capital, health, and political office. This provides members of these castes a greater cushion to soften blows that may arise from any threat - be it physical, social, or economic.
The more social and cultural capital or privilege one has, the less a potential invasion of privacy matters. Social privilege offers the individual protection from the “bang” – the moment of breach or exposure – as well as a cushion to weather the effects that follow it. That is, social hierarchy and privilege offer both “before the bang” privacy protection and better “after the bang” ability to recover.
For the upper castes and privileged, this manifests in what personally identifiable information (PII) one is willing to reveal – be it one’s caste surname, age, marital status, educational qualifications, occupation/profession or political affiliation. The lower down the caste hierarchy one is, the less the protection, and the greater the cost of recovery.
For instance, a digital wallet or payments app may gather the user’s payment methods – vital for core functioning – but may also ask for physical location, addresses, age and gender, and collect information regarding other apps used and nature and quantity of purchases made, and use these to infer social locations of the user such as caste and gender, marital status, and more. When a breach happens and this data is leaked, users with lesser privilege are affected more – in that they had good reasons to not reveal such data compared to those with greater privilege.
Product decisions – beyond core functionality – are influenced by the social and economic conditions and locations of the team making the product.
A study by Surinder Jodhka & Katherine Newman in 2007 concludes:4 “The belief in merit is only sometimes accompanied by a truly ‘caste blind’ orientation. Instead, we see the commitment to merit voiced alongside convictions that merit is distributed by caste or region and hence the qualities of individuals fade from view, replaced by stereotypes. Under these circumstances, one must take the profession of deep belief in meritocracy with a heavy grain of salt.”
A study by Krishna & Brihmadesam5 found that a majority of new recruits in Bangalore’s software firms had parents who were both educated at least to the high-school level.
A study by Sonalde Desai and Veena Kulkarni on the educational attainment of youth6 (ages 24–29) showed that in 1983, over 50% of upper-caste men and 32% of upper-caste women in India had access to primary and secondary education and went on to a college degree. The figures for 1999–2000 were around 60% for upper-caste men and 41% for upper-caste women. In contrast, in 1983 only 33% of Dalit men and men of other marginalised caste locations had access to education at the primary or high-school level, and only 11% of Dalit women had access to the same. For 1999–2000, 47% of Dalit men had access to education up to high school, and 41% of Dalit women were educated to the primary or secondary school level.
These studies indicate that a large section of technology professionals holding authority roles are upper-caste men. Thus, one can argue that the decision making when it comes to privacy-enhancing features of products or services – whose core functionality is not privacy protection – is made by those for whom privacy risks are minimal, and may adversely impact those for whom the same risks are significant.
There is also considerable study and anecdotal evidence to suggest that ignorance of one’s own caste identity, or the ability to claim castelessness, is often a preserve of the upper castes. Satish Deshpande, in Caste and Castelessness, says,7 “there is an awareness that ‘in this land of equality and liberty’ the public declaration of upper caste identity has been made voluntary, and that this could be a decisive tactical advantage. Unlike the compulsory marking of lower caste identity which the new republic perpetuates and intensifies, upper caste identity may now be declared or not at will. Most important, the privileges and benefits that accrue to the upper caste identity may now be accessed anonymously, while its political-moral debts and liabilities are written off by the new Constitution.”
A. M. Shah in Mirage of Caste-less Society, says8 that while a new, urbanised section of society claiming to be free of caste does exist, “Often the claim to being caste-less is skin deep, and caste surfaces all of a sudden in mysterious ways.”
Of the 180+ respondents, more than a third (68) self-identified as upper caste. Another 25 responded “Don’t know”. The responses are presented here, classified by whether the respondent holds an authority role in their organization.
*[EPW]: Economic and Political Weekly
Alashoor, T., Baskerville, R. and Zhu, R., 2016, January. Privacy and identity theft recovery planning: an onion skin model. In 2016 49th Hawaii International Conference on System Sciences (HICSS), IEEE, pp. 3696–3705 ↩︎
Altman, I., 1975: The Environment and Social Behavior: Privacy, Personal Space, Territory, Crowding, Page 24, Brooks/Cole Publishing Company ↩︎
Thorat, S. and Newman, K. S., 2007. Caste and economic discrimination: Causes, consequences and remedies. Economic and Political Weekly, pp. 4121–4124. ↩︎
Jodhka, S. S. and Newman, K. S., 2007. In the name of globalisation: Meritocracy, productivity and the hidden language of caste. Economic and Political Weekly, pp. 4125–4132. ↩︎
Krishna, A. and Brihmadesam, V., 2006. What does it take to become a software professional?. Economic and Political Weekly, pp. 3307–3314. ↩︎
Desai, S. and Kulkarni, V., 2008. Changing educational inequalities in India in the context of affirmative action. Demography, 45(2), pp.245–270. ↩︎
Deshpande, S., 2013. Caste and Castelessness. Economic and Political Weekly, vol. 48, no. 15, 2013, pp. 32–39. ↩︎
Shah, A. M., 2017. The mirage of a caste-less society in India. Economic and Political Weekly, vol. 52, no. 9, 2017, pp. 61–66. ↩︎