2020 may well have heralded the decade of privacy. From leaks in video-conferencing software to doubts over COVID-19 apps, privacy concerns have been front and centre. But while most commentary focuses on moments when things go wrong, one angle goes largely undiscussed: the attitudes of the people and organizations who make the technology we use daily. Their work today shapes our experience of tomorrow, yet we know little of their perspective.
Between April and December 2020, the Privacy Mode team at Hasgeek (henceforth “we” and “us”) surveyed the makers of tech products and services in India. We sought to understand whether privacy is a guiding factor in product development, or an afterthought in response to regulation and public sentiment.
We started by conducting interviews with experts, followed by focus group discussions. The learnings from these sessions helped frame hypotheses and a qualitative questionnaire. See our note on framing and review, and the detailed findings and analysis.
The survey finds that 90% of respondents believe their organization treats privacy as a fundamental right and attempts to realize that intent while building products and services. At the baseline, most respondents in authority roles are required to think about privacy and compliance: three in four at small and medium organizations, and two in three at large ones.
However, converting that intent into practice remains an onerous challenge. Small and medium-sized organizations fare worse than large ones, but all three size segments struggle with:
- Not having dedicated roles to manage conflicts arising out of product and engineering decisions on user privacy,
- Not having institutionalized “playbooks” (meaning process workflows, standard operating procedures and cultural values that shape consistent outcomes), and
- Not having peer groups to debate these issues, both within and outside the organization.
Some industry sectors are governed by regulatory regimes such as HIPAA, CCPA and GDPR. The responses within these sectors did not deviate greatly from the general responses. This suggests that adding regulatory pressure on small and medium organizations does not automatically change their internal product-development dynamics to make them more privacy-friendly.
Over a fifth of respondents across all organization sizes say the tech ecosystem lacks a peer group in which to discuss and find solutions to privacy concerns. This lack was felt both within the organization and outside it. It points to a larger concern: even those who want to implement privacy-respecting features in products or services do not have adequate support or use cases to guide them. As organizations get larger, their access to external peer groups reduces, indicating the development of an insular culture.
- Close to 30% of organizations lack learning and development budgets for understanding privacy during product development.
- Small organizations find it particularly hard to get stakeholders interested in privacy concerns.
- Small and medium organizations lack the structure and capability for specialised departments to handle risk and compliance. Combined with their lack of training budgets for privacy and their lack of standard procedures to handle privacy concerns or risk, this is a point of great concern and will have implications across the tech ecosystem.
As one startup founder we interviewed said:
“Companies are doing a poor job communicating this (the need for user privacy and security)... I don’t think they are malicious. They’re lousy in their implementation and security, but there is no mal-intent or ill intent to begin with, they are lousy sometimes.”
The good news is that awareness about privacy and recognition of its importance as a practice – and as an end-user benefit – exists across leadership in large, medium and small organizations. The not-so-good news is that intent by itself is not sufficient to move organizations towards the ideal of protecting user privacy and user data.
As the data in this research shows, specialized personnel, budgets, processes and community resources have to be available in abundance if India wants a smooth transition towards a data protection regime. This is where the gap is.
Here are our recommendations for different stakeholders to move towards the privacy-preserving ideal, one step at a time.
Small and medium organizations struggle to establish a business model with repeatable unit economics. This is a paramount concern in the early stages. While there is intent to embed privacy practices in the product-development cycle, small and medium organizations do not have the skills or budgets.
As our data indicates, adding regulatory pressure does not improve outcomes. On the other hand, regulation can increase the compliance burden, thereby adversely affecting small and medium organizations and turning them into non-viable businesses.
This research recommends to the Joint Parliamentary Committee (JPC) and the Data Protection Authority (DPA) for the Personal Data Protection (PDP) bill that the bill's provisions exempt organizations below a certain revenue or fundraise threshold, and that the government invest in developing the community resources needed to make the law feasible.
Across all the segments, including large organizations, software engineering practices have not matured enough to be ready for PDP compliance from day one. There is also a lack of clarity at the operational level on the implementation of privacy processes such as consent management, data-security audits, data-erasure requests, anonymisation, and purpose limitation.
Technical standards have to evolve – both from within and by knowledge-sharing at the community level – to address these issues. The JPC and DPA must provide at least two years’ time for organizations to become compliant with the standards prescribed in the PDP bill’s provisions.
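At the operational level, several of the processes named above reduce to concrete engineering tasks. As a minimal illustration only (the field names and salt handling here are hypothetical, not drawn from the survey), pseudonymisation and purpose limitation might be sketched like this:

```python
import hashlib
import hmac

# A per-deployment secret. Leaking it would allow re-identification, so in
# practice it would live in a secrets manager, never in source code.
SALT = b"replace-with-a-secret-from-your-secrets-manager"

def pseudonymise(value: str) -> str:
    """Replace a direct identifier with a stable, non-reversible token.

    HMAC-SHA256 keeps the mapping consistent (the same email always yields
    the same token, so aggregate analytics still work) while preventing
    recovery of the original value without the salt.
    """
    return hmac.new(SALT, value.encode("utf-8"), hashlib.sha256).hexdigest()

def minimise(record: dict, allowed_fields: set) -> dict:
    """Purpose limitation: drop every field not needed for this purpose."""
    return {k: v for k, v in record.items() if k in allowed_fields}

# Hypothetical record: the analytics pipeline never sees the sensitive field.
user = {"email": "a@example.com", "city": "Pune", "ailment": "asthma"}
analytics_view = minimise(user, {"email", "city"})
analytics_view["email"] = pseudonymise(analytics_view["email"])
```

This is a sketch of the general technique, not a compliance recipe; real consent management, erasure and audit workflows involve far more than field-level transforms.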
Business organizations, especially early- and growth-stage startups, generally refrain from sharing internal knowledge and practices with other organizations. Insecurity about growth and organizational secrets, and pressure from investors to grow Intellectual Property (IP), have led to a culture of withholding knowledge, making it hard to develop capacity.
The structural capacity issues that this research refers to – skill gaps and lack of knowledge resources – can be resolved if leadership, legal, and data privacy officers in small and medium organizations become forthcoming about systematic best-practice and knowledge sharing. Journeyman documentation, which lays out the map of these practices at operational and technical levels, will be more beneficial than broad directions.
While simplistic solutions may get widespread adoption because they substitute for missing capacity, they will hamper small and medium organizations from competing on a global scale, mainly because these organizations may have to invent and maintain different sets of practices to comply with standards such as GDPR and CCPA. This will increase the cost of compliance.
For investors, advisors and external stakeholders, as well as community organizations and founders of small and medium organizations, this research makes clear recommendations:
- Build capacity by contributing to a shared knowledge pool beyond your organization, in the form of meetup groups and their knowledge archives.
- Encourage thinking about privacy at the product and engineering levels, and allow your practitioners to gain public recognition for their work.
- Take privacy-preserving ideas and practices in product development cycles to external forums and policy makers, and encourage adoption across the ecosystem.
Nadika Nadja is a researcher at Hasgeek. She has worked across advertising, journalism, TV & film production as a writer, editor and researcher.
Anand Venkatnarayanan has worked as a cybersecurity engineer for two decades. He is currently consulting editor for Privacy Mode and a non-executive director at Hasgeek.
We would like to thank the following individuals from tech, legal, and policy backgrounds who provided feedback during different stages of the research. Their feedback helped the team fine-tune and bring rigour to the research process. While Hasgeek sought funding from organizations, the research itself was conducted – with full disclosure at all stages – independently and objectively, and the findings do not reflect any individual organization’s needs.
- Akshat Jain, Consultant at Omidyar Network India
- Alok Prasanna Kumar, Co-founder and senior resident fellow at Vidhi Centre for Legal Policy
- Ashwin Kumar, Director at FMR India
- Chinmayi S K, Founder at The Bachchao Project
- Raahil Rai, Investment Associate | Digital Society at Omidyar Network India
- Subhashish Bhadra, Principal at Omidyar Network India
- Suchana Seth, Founder and CEO at The Mindful AI Lab
Analysis of survey responses
The survey asked questions to test hypotheses. They are presented here along with a breakdown of responses by organization size and segment, and the respondent’s role therein. Questions are numbered by the order in which they appeared in the survey.
- Privacy and security
- Care and concern about user privacy
- Privacy as a business imperative
- Management incentives and resource guides
- Training budgets for continuous learning
- Maturity on intent
- Maturity on people
- Maturity on process
Hypothesis 1: I understand the difference between privacy and security.
Question 12: How does your organization look at the relationship between privacy and security?
- They are two distinct ideas and are not related (R1)
- They are somewhat related (R2)
- They are closely related (R3)
- They are two sides of the same coin and are fully intertwined (R4)
Security is about the safeguarding of data, whereas privacy is about the safeguarding of user identity and their right to self-determination. It is possible to have security without privacy, but not privacy without security. Or, as Paul Dourish and Ken Anderson wrote in 2006 for the journal Human-Computer Interaction,¹ “We read security here as the state of being free from danger; technological ‘security mechanisms’ are deployed as means to ensure this state. Risks to privacy (solitude, confidentiality, autonomy), then, are among the various risks against which we might wish to be secure.”
Privacy and security are related, and R1 represents an incorrect understanding. By bucketing these, we have:
|Authority role||Unrelated (R1)||Related (R2+R3+R4)||% Related|
|Organization size||Unrelated (R1)||Related (R2+R3+R4)||% Related|
We therefore conclude that within our samples, the hypothesis is generally true, and ≈90% of the participants agree that privacy and security are related to each other.
“In terms of differences between privacy and security I think there is a lot of overlap and at times when you talk about privacy you also talk about security but they are still distinct entities in terms of the laws that govern privacy and security. As a company we are very heavy on compliance in general on PII data to being HIPAA compliant and CCPA compliant... there is (also) a team dedicated to make sure that every single application, data flow, and platform that stores data is secure.”
—A senior engineering manager at a multinational e-commerce company, during an FGD
We believe the remaining ≈10% may have chosen their answer based on the boundaries of their specific work responsibilities. For instance, an infrastructure engineer and a product manager may have non-overlapping roles handling security and privacy respectively.
Hypothesis 2: Tech engineers (senior and middle) recognise that the tech industry does not respect user privacy and believe that it needs to be more responsible.
Question 14: Which of the following statements accurately captures the attitude towards privacy in your organization/network?
- Respectful and actively works to protect it (R1)
- Indifferent (R2)
- It is not a concern at all (R3)
- Don’t know / Can’t say (R4)
R2 and R3 indicate not enough is being done. R4 suggests the respondent is not even aware. Putting them together:
|Organization size||Does enough (R1)||Not enough (R2+R3+R4)||% Does enough|
Question 15: Does your organization have Standard Operating Procedures (SOPs) for engaging and handling criticism on privacy issues in your products?
- Yes, we have a fully functional SOP (R1)
- No, we don’t have anything like that (R2)
- We aspire to have one, but could not find the time or resources to build one (merged with R2 as it is not qualitatively different)
|Organization size||R1 (have SOP)||R2 (no SOP)||% Have SOP|
Question 16: I have a peer group within my organization, where I can discuss and freely express my views about privacy and data security.
- Yes and it is a very active peer group (R1)
- Yes, but it is not very active (merged with R1 as it is not sufficiently distinct)
- No, I don’t have one (R2)
|Organization size||R1 (have peer group)||R2 (no peer group)||% Have peer group|
The percentage columns in these tables show an interesting contrast:
|Organization size||% Does enough||% Have SOP||% Have peer group|
Across all sizes, a significant number of participants have said there are no SOPs available for handling privacy concerns, even though they think that their organizations are respectful of privacy concerns and have a peer group. Large organizations stand out for having invested in creating processes compared to other size segments.
In summary, while a large percentage of participants agree that their organization recognizes privacy concerns, they also agree that not enough is being done about it. Small organizations are significantly lagging in having a peer group and established processes.
“A lot of startups work in survival mode for a long time and these conversations often feel like (problems of) the elite in comparison to immediate survival problems you are facing. Not saying it is right but this is the attitude most of the time in these startups, it’s an elite concern. So we get around it rather than think through it, and when it is absolutely necessary is when we come to it.”
—A founder of a product-management startup, during an FGD
Hypothesis 3: Organizations can only build privacy respecting products if they have business imperative, skill, agency and competence.
Hypothesis 4: There are market segments where business imperative overrides privacy concerns.
Hypothesis 5: Tech engineers are dissatisfied with their organization’s stance/policies and efforts on building privacy respecting products.
Question 20: Which of the following best describes the product design and development practices of your organization, when it comes to Privacy?
- Features first, ship fast, privacy comes later (R1)
- Privacy-first design and development cycle (R2)
- Features first, ship fast but we do have privacy and security reviews as part of the process (merged with R2, as both indicate privacy is considered)
- Features first, ship fast, but I really wish that we do privacy and security reviews as part of the process (merged with R1, as both indicate privacy is low priority)
- Don’t know (excluded from analysis)
|Organization size||R1 (Privacy later)||R2 (Privacy fore)||% Privacy fore|
For another perspective, we segment organizations by their respective verticals. Some organizations appear in more than one segment:
|Vertical||R1 (privacy later)||R2 (privacy fore)||% Privacy fore|
In focus group discussions, the American HIPAA, the California-specific CCPA and the European GDPR were repeatedly mentioned as privacy regulations that organizations adhered to. Consequently, we asked survey respondents to select from these.
Question 30: Which of the following laws are you required to comply with? (Select all that apply)
A closer examination of the “Other” responses suggests that some respondents have confused auditing requirements such as SOC 2 and ISO 27001 with privacy regulations. Here we introduce a new column, “None”, for when none of CCPA, GDPR and HIPAA are applicable.
|Organization size||None||Count||% of None|
By cross-referencing responses to these two questions, we see which of these regimes is more effective. As Question 30 is multi-select, the total counts in this table do not match the total counts for Question 20 above.
|Compliance||R1 (Privacy later)||R2 (Privacy fore)||% Privacy fore|
Organization size matters: smaller organizations (9/13 in our sample) show a lower concern for privacy, possibly because of immature processes.
There is a distinct drop in concern in the Fintech, Social Media and Mobile App verticals (< 90%).
23% (42/182) of the organizations are not under the most commonly known privacy regulations, with small and medium organizations constituting 64.29% (27 out of 42) of that group, which perhaps explains why they have immature processes compared to large organizations.
Being subject to regulation (93% privacy fore on average) is not a significant improvement over the overall concern for privacy (92.12%). The survey does not reveal why; a deeper study is required.
“So privacy by design is still perhaps not very well adopted by the industry; it is more privacy by law at this point.”
—A senior engineer and product manager at a software product and services company, during an FGD
“Certain types of information may not be essential for the application. But from a marketing standpoint and what you are selling, demographics such as geography, how the people behave, how the product is utilized - collecting this information helps us draw patterns. Everyone in the market is anyway capturing the data. It is more about whether the data is necessary for our organization, and if we have the compliance to go forward with collecting it.”
—A software engineering manager of a medium-sized fintech organization, during an FGD
“Knowing the customer’s ailment is the highest level of PII we deal with. Can I send a recommendation to the customer based on their health ailment? We are missing out on a potential business opportunity. I don’t know how much of a great business that could be, but we have just stayed away from using any of the PII to target better or up-sell, something that comes very naturally to say an e-commerce business.”
—VP of engineering of a medium-sized health-tech organization, during an FGD
“If you really care about privacy, make your (software) architectural choices robust. Policy will come later, because policy needs reinforcement and that again boils down to intent and who is in power. But if the underlying architecture is distributed, nobody can do anything because this is a foolproof system.”
—Product head at a large-sized paytech organization, during an FGD
Hypothesis 6: Middle and line management lack resources and guides to reduce the decision making overhead of regulatory compliance, and also lack agency to build privacy respecting products.
Hypothesis 7: Senior and middle management need immediate incentives for investing in building privacy respecting products.
Question 22: Which of the following statements best applies when you are building a product and making design decisions on its features:
- We have resources and guides to make decisions, if our implementation will have an impact on user privacy (Have process or people)
- We have a go-to person who will advise us, if our implementation has an impact on user privacy (Have process or people)
- We have both (resources, guides and a go-to person) (Have process or people)
- We have none (neither resources, guides nor a go-to person) (Have none)
Only respondents holding authority roles are considered here (121 of 182).
|Organization size||Have process or people||Have none||% Have none|
Does a regulatory compliance regime (CCPA, GDPR or HIPAA) make a difference? By only considering responses from (a) organizations that said they are regulated and (b) respondents holding positions of authority, we find that small and medium organizations have not improved.
|Regulated organization size||Have process or people||Have none||% Have none|
Question 29: If your organization has compliance standards imposed by regulatory authorities, which of the following best describes your development practices?
- We have periodic audits by a third party to review our standards compliance (Have process)
- We have an internal team that is responsible for our standards compliance (Have people)
- We have both (audits by third parties and an internal team) (Have both)
Only respondents holding authority roles are considered here (count of 121):
|Regulated organization size||Have process||Have people||Have both|
Small organizations have capacity shortcomings that do not disappear when they are subject to a regulatory regime, likely because regulation does not generate business. Larger organizations, however, can absorb the cost of compliance and create better processes.
“Sales and marketing are always aggressive. They love aggressive growth. So they will go to any lengths to collect whatever customer data they can lay their hands on. On the technical side of things, however, engineers and managers have to take responsibility and not leave it to the business to decide about what data can be collected and what data is off-limits. A lot depends on the management’s commitment.”
— Senior engineer working at a medium-sized Fintech startup, during an FGD
Hypothesis 8: Competence, skill and awareness together power an individual within an organization to pitch for and build privacy-respecting products.
Hypothesis 9: Tech companies do not spend enough time in training their employees on privacy and other safety issues.
Hypothesis 10: Senior and middle management need practical resources – including community support and peer reviewed knowledge – that help them to quantify RoI and risks for building privacy respecting products.
Question 21: Which of the following best describes the training and resources in your organization about privacy aspects during product design and development?
- Provides enough information and resources about privacy (R1)
- Provides some information and resources about privacy (R2)
- Does not provide any information (R3)
- Does not apply (merged with R3)
|Organization size||R1||R2||R3||% R3 (No info)|
Question 23: We have a learning and development budget for our product teams to enhance our understanding of privacy and security risks, while making feature and design decisions
- We have a no-questions-asked budget for this purpose (R1)
- On an actuals basis, with approval from management (R2)
- Not available (R3)
- Don’t know (R4)
It is unclear whether R4 means the respondent is unaware of a budget or that no budget exists. It is therefore excluded from this table.
|Organization size||R1||R2||R3||% R3 (Not available)|
- Large organizations have more training resources than small and medium organizations for handling privacy aspects during the product development cycle
- 30% of organizations don’t have baseline learning and development (L&D) budgets for understanding privacy and security risks while creating products
Question 5: Does your role require you to think about privacy or compliance on a regular basis?
- Role needs to think about Privacy and Compliance (Yes)
- Role does not need to think about Privacy and Compliance (No)
Responses are shown here classified by whether the respondent held an authority role in their organization:
|With authority role||Yes||No||% Yes|
|Without authority||Yes||No||% Yes|
Question 13: How does your organization look at privacy respecting aspects of its products or services?
- Privacy is a fundamental right, and is built into our products (R1)
- Privacy is a right, but is only available in premium versions of products (R2)
- Privacy is not a right, our products do not have privacy features (R3)
- Others (open text, excluded from responses table)
|Organization Size||R1||R2||R3||% R1|
There is near universal acknowledgement that privacy is a fundamental right and has to be built into the product development process, across organizations of all sizes.
In small and medium organizations, three out of every four authority roles require thinking about privacy and compliance issues. This number drops to two out of three for large organizations, suggesting they can afford to create specialized roles.
“The first thing we do is create a map of the data, which is called a metadata drive, so the first thing to know is what data is there in your system.”
“Nowadays every organization has something called a Chief Data Officer (CDO), especially in European companies which are GDPR compliant. The CDO lays down the rules for their organization, maps those rules to that metadata [drive], and either automates compliance for those rules, or manually encodes the rules to satisfy compliance so that all these issues will be caught in a certain timeframe. It’s the lineage of the entire data. If something goes wrong at one place, within minutes, teams are able to identify the whole datamap, and identify what went wrong.”
—A startup founder, during an FGD
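The “metadata drive” this founder describes is, at its core, a registry mapping each dataset to the personal data it holds, its purpose, and its upstream sources, so that compliance queries and incident tracing become lookups rather than archaeology. A minimal sketch of that idea follows; all names are hypothetical and this is not a reference to any specific tool:

```python
from dataclasses import dataclass, field

@dataclass
class DatasetEntry:
    """One row in the metadata map: what a dataset holds and why."""
    name: str
    pii_fields: list
    purpose: str
    upstream: list = field(default_factory=list)  # lineage: source datasets

class MetadataMap:
    def __init__(self):
        self.entries = {}

    def register(self, entry: DatasetEntry):
        self.entries[entry.name] = entry

    def datasets_holding(self, pii_field: str):
        """Compliance query: which datasets hold a given identifier?"""
        return [e.name for e in self.entries.values() if pii_field in e.pii_fields]

    def lineage(self, name: str):
        """Walk upstream sources so an incident can be traced end to end."""
        seen, stack = [], [name]
        while stack:
            n = stack.pop()
            if n in seen or n not in self.entries:
                continue
            seen.append(n)
            stack.extend(self.entries[n].upstream)
        return seen

m = MetadataMap()
m.register(DatasetEntry("raw_orders", ["email", "address"], "fulfilment"))
m.register(DatasetEntry("marketing_view", ["email"], "campaigns",
                        upstream=["raw_orders"]))
```

With such a map in place, a rule like “datasets holding `email` must honour erasure requests” can be enforced mechanically, which is the automation the quote alludes to.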
“If you have to set rules for the whole company, it has to be done right at the top.”
—A VP of engineering of a medium-sized health-tech organization, during an FGD
Question 17: Do investors, customers, or other stakeholders of your company care about privacy?
- Investors, Customers, Stakeholders care about Privacy (R1)
- Investors, Customers, Stakeholders ask us to Prioritize Privacy aspects (R2)
- They don’t and we often spend time educating them about privacy (R3)
- Don’t know / Can’t say (R4)
R4 has been excluded as it does not convey any meaningful information.
|Organization size||R1||R2||R3||% R1||% R2||% R3|
While intent is a good starting point, the next step is to hire competent people in authority roles to establish processes, for which stakeholders must be on board.
Question 25: Does your organization have a Chief Data Officer (CDO), or Legal department which looks into risk and compliance?
- Have a Chief Data Officer or Legal department for handling Privacy/Compliance (Yes)
- Don’t have a department (No)
Respondents who answered “Yes” were also asked for the team size as an indirect metric for a privacy and compliance budget:
Question 28: What is the approximate size of the team in your organization that looks into legal/compliance/regulatory aspects?
- 1–5
- 6–50
- More than 50
|Organization size||1–5||6–50||> 50|
Question 24: If you had to find out more about regulations applicable to your business, you can ask someone in your organization.
- Yes
- No
- Don’t know / Can’t say
|Organization size||Yes||No||Don’t know / Can’t say||Count|
Question 18: I have a peer group within my organization, where I can discuss and freely express my views about privacy and data security.
- Yes, and it is a very active peer group (Yes, good)
- Yes, but it is not very active (Yes, okay)
- No, I don’t have one (No)
|Organization size||Yes, good||Yes, okay||No||% No|
Question 19: I have a peer group or safe space outside my organization where I can express my views and learn from other practitioners about privacy and data security.
- Very supportive and helpful peer group (Yes, good)
- Somewhat helpful peer group (Yes, okay)
- No Peer group (No)
|Organization size||Yes, good||Yes, okay||No||% No|
- Small organizations find it harder to get stakeholders to invest in privacy practices
- This directly correlates with not having a specialized department to handle privacy risks and compliance
- This also correlates with not having a go-to person to ask internally about regulatory compliance
Over a fifth of respondents in organizations of all sizes don’t have a peer group to learn about privacy and data security, indicating a capacity gap in the larger ecosystem. Smaller organizations fare better with external support than larger organizations, suggesting that organizations get insular as they grow.
“Why I have this mindset is because we are a healthcare company. The whole company is built on trust. Anything in healthcare, you trust the provider. Our revenues come from clients who trust us with their data. So it is ingrained into how we think about data. We’ve had multiple rounds of product audits, keeping all compliances in mind. We revisited policy, both technically and in implementation and identified legacy issues to upgrade our data governance systems when legislation changes.”
—A VP of engineering of a medium-sized health-tech organization, during an FGD
“Compliance being cosmetic and lack of care for privacy has nothing to do with India. It has to do with the organization and the people you work with. Although, in my company, we do not collect that much data from the user and the customer, but the clients who are sharing their content with us, their security matters. I think the privacy and data, it’s about the clients and the users, and the scale of the company, and the kind of folks you work with.”
—A senior engineer at a large media-tech company, during an FGD
In mature organizations, processes are well defined, and specialized officers have authority over privacy-related decisions.
Question 27: Does your organization’s CDO/Risk and Compliance/Security/Legal team have veto power over the product/engineering team when questions of data security or privacy are raised?
- Yes
- No
- Don’t know / Can’t say
|Organization size||Yes||No||Don’t know / Can’t say||% Yes|
Question 31: What are the data governance practices and policies that your organization has adopted for complying with these regulations? (Multiple choice)
- Current Privacy/Data Retention policy (R1)
- Infrastructure/Cloud storage policy (R2)
- Accuracy in data collection, data minimisation (R3)
- Data localisation (R4)
- Anonymisation (R5)
- Data security audits (R6)
- Purpose and Time limitation (R7)
- Data erasure on request (R8)
Some respondents did not pick any of the options, possibly implying that their organizations have no data governance practices.
Expressed in percentages (based on totals per row):
Question 32: What are the consent management policies and practices that your organization has adopted for complying with regulations?
- Consent management tools and processes (R1)
- Paper trails, transparency, and accountability (R3)
- Privacy, data security audits (R4)
- Social engineering audits (R5)
Some respondents did not pick any of the options, possibly implying that their organizations have no consent management practices.
Expressed in percentages (based on totals per row):
“At least for the fintech organizations, it is quite important privacy is built as part of the culture. If an organization is small, boot-strapped, it can be quite tedious to look into all compliance. Generally, companies move in a self-compliant manner. But as the organization grows bigger, it is important to realize the fact that potentially you will bring in a lot of new people who probably may or may not be aware of the sensitivity of the data. So compliance processes and audits can help ensure that best practices are followed, and that the organization has better vulnerability analysis checks.”
—A software engineering manager of a medium-sized fintech organization, during an FGD
Dourish, P. and Anderson, K., 2006. Collective information practice: Exploring privacy and security as social and cultural phenomena. Human-Computer Interaction, 21(3), pp. 319–342. http://www.douri.sh/publications/2006/DourishAnderson-InfoPractices-HCIJ.pdf ↩︎