Privacy practices in the Indian technology ecosystem

A 2020 survey of the makers of products and services

2020 may well have heralded the decade of privacy. From leaks in video conferencing software to doubts over COVID-19 apps, privacy concerns have been front and centre. Most commentary, however, appears only when things go wrong; an angle that is rarely discussed is the attitude of the people and organizations who make the technology we use every day. Their work today shapes our experience of tomorrow, yet we know little of their perspective.

Between April and December 2020, the Privacy Mode team at Hasgeek (henceforth “we” and “us”) surveyed the makers of tech products and services in India. We sought to understand whether privacy is a guiding factor in product development, or an afterthought in response to regulation and public sentiment.

Summary of findings

We started by conducting interviews with experts, followed by focus group discussions. The learnings from these sessions helped frame the hypotheses and the survey questionnaire. See our note on framing and review, and the detailed findings and analysis.

The survey finds that over 90% of respondents believe their organization treats privacy as a fundamental right and attempts to realize that intent while building products and services. At the baseline, roughly three in four respondents in authority roles are required to think about privacy and compliance issues, across all organization sizes.

However, converting that intent into practice remains an onerous challenge. Small and medium-sized organizations fare worse than large ones, but all three size segments struggle with:

  1. A lack of dedicated roles to manage conflicts arising out of product and engineering decisions on user privacy,
  2. A lack of institutionalized “playbooks” (meaning process workflows, standard operating procedures and cultural values that shape consistent outcomes), and
  3. A lack of peer groups to debate these issues, both within and outside the organization.

Some industry sectors are governed by regulatory regimes such as HIPAA, CCPA and GDPR. The responses within these sectors did not deviate greatly from the general responses. This suggests that adding regulatory pressure on small and medium organizations does not automatically change their internal product development dynamics to make them more privacy-friendly.

Over a fifth of respondents across all organization sizes say there is a lack of a peer group in the tech ecosystem to discuss and find solutions to privacy concerns. This lack was felt both within the organization and outside of it, pointing to a larger concern: even those who want to implement privacy-respecting features in products or services do not have adequate support or use cases to guide them. As organizations get larger, their access to external peer groups shrinks, indicating the development of an insular culture.

  1. Close to 30% of organizations lack learning and development budgets for understanding privacy during product development.
  2. Small organizations find it particularly hard to get stakeholders interested in privacy concerns.
  3. Small and medium organizations do not have the organizational structure or capability to run specialised departments that handle risk and compliance. Combined with their lack of training budgets for privacy and of standard procedures to handle privacy concerns or risk, this is a point of great concern and will have implications across the tech ecosystem.

As one startup founder we interviewed said:

“Companies are doing a poor job communicating this (the need for user privacy and security)... I don’t think they are malicious. They’re lousy in their implementation and security, but there is no mal-intent or ill intent to begin with, they are lousy sometimes.”

Recommendations

The good news is that awareness about privacy and recognition of its importance as a practice – and as an end-user benefit – exists across leadership in large, medium and small organizations. The not-so-good news is that intent by itself is not sufficient to move organizations towards the ideal of protecting user privacy and user data.

As the data in this research shows, specialized personnel, budgets, processes and community resources have to be available in abundance if India wants a smooth transition towards a data protection regime. This is where the gap is.

Here are our recommendations for different stakeholders to move towards the privacy-preserving ideal, one step at a time.

  1. Small and medium organizations struggle to establish a business model with repeatable unit economics. This is a paramount concern in the early stages. While there is intent to embed privacy practices in the product-development cycle, small and medium organizations do not have the skills or budgets.

    As our data indicates, adding regulatory pressure does not improve outcomes. On the other hand, regulation can increase the compliance burden, thereby adversely affecting small and medium organizations and turning them into non-viable businesses.

    This research recommends to the Joint Parliamentary Committee (JPC) and the Data Protection Authority (DPA) for the Personal Data Protection (PDP) bill that the bill’s provisions exempt organizations below a certain revenue or fundraising threshold, and that the government invest in developing community resources to make the law feasible to comply with.

  2. Across all the segments, including large organizations, software engineering practices have not matured enough to be ready for PDP compliance from day one. There is also a lack of clarity at the operational level on the implementation of privacy processes such as consent management, data-security audits, data-erasure requests, anonymisation, and purpose limitation.

    Technical standards have to evolve – both within organizations and through knowledge-sharing at the community level – to address these issues. The JPC and DPA must provide at least two years for organizations to become compliant with the standards prescribed in the PDP bill’s provisions.

  3. Business organizations, especially early- and growth-stage startups, generally refrain from sharing internal knowledge and practices with other organizations. Insecurity about growth and organizational secrets, and pressure from investors to grow Intellectual Property (IP), have led to a culture of withholding knowledge, making it hard to develop capacity.

    The structural capacity issues that this research refers to – skill gaps and a lack of knowledge resources – can be resolved if leadership, legal, and data privacy officers in small and medium organizations become forthcoming about systematically sharing best practices and knowledge. Journeyman documentation, which maps these practices at the operational and technical levels, will be more beneficial than broad directions.

    While simplistic solutions may see widespread adoption because they substitute for missing capacity, they will hamper small and medium organizations’ ability to compete on a global scale, mainly because these organizations may have to invent and maintain different sets of practices to comply with standards such as GDPR and CCPA. This will increase the cost of compliance.

  4. For investors, advisors and external stakeholders, as well as community organizations and founders of small and medium organizations, this research makes clear recommendations:

    1. Build capacity by contributing to a shared knowledge pool beyond your organization, in the form of meetup groups and their knowledge archives.

    2. Encourage thinking about privacy at the product and engineering levels. Allow your practitioners to gain public recognition for their work.

    3. Take privacy-preserving ideas and practices from product development cycles to external forums and policy makers, and encourage adoption across the ecosystem.

Scroll down for the individual chapters.

About the principal researchers

Nadika Nadja is a researcher at Hasgeek. She has worked across advertising, journalism, TV & film production as a writer, editor and researcher.

Anand Venkatanarayanan has worked as a cybersecurity engineer for two decades. He is currently consulting editor for Privacy Mode and a non-executive director at Hasgeek.

Support team

Anish TP illustrated the report. Umesh PN, Zainab Bawa and Kiran Jonnalagadda edited. David Timethy was the project manager.

Acknowledgements

We would like to thank the following individuals from tech, legal, and policy backgrounds who provided feedback during different stages of the research. Their feedback helped the team fine-tune and bring rigour to the research process. While Hasgeek sought funding from organizations, the research itself was conducted – with full disclosure at all stages – independently and objectively, and the findings do not reflect any individual organization’s needs.

  1. Akshat Jain, Consultant at Omidyar Network India
  2. Alok Prasanna Kumar, Co-founder and senior resident fellow at Vidhi Centre for Legal Policy
  3. Ashwin Kumar, Director at FMR India
  4. Chinmayi S K, Founder at The Bachchao Project
  5. Raahil Rai, Investment Associate | Digital Society at Omidyar Network India
  6. Subhashish Bhadra, Principal at Omidyar Network India
  7. Suchana Seth, Founder and CEO at The Mindful AI Lab

Errata

Updates and corrections are posted in the Updates section.


Analysis of survey responses

Submitted Apr 25, 2021

The survey asked questions designed to test our hypotheses. Both are presented here, along with a breakdown of responses by organization size and segment, and by the respondent’s role therein. Questions are numbered in the order in which they appeared in the survey.

Privacy and security

Hypothesis 1: I understand the difference between privacy and security.

Question 12: How does your organization look at the relationship between privacy and security?

  1. They are two distinct ideas and are not related (R1)
  2. They are somewhat related (R2)
  3. They are closely related (R3)
  4. They are two sides of the same coin and are fully intertwined (R4)

| Authority role | R1 | R2 | R3 | R4 | Total |
| --- | --- | --- | --- | --- | --- |
| Yes | 10 | 24 | 51 | 38 | 123 |
| No | 5 | 5 | 28 | 21 | 59 |
| Total | 15 | 29 | 79 | 59 | 182 |

| Organization size | R1 | R2 | R3 | R4 | Total |
| --- | --- | --- | --- | --- | --- |
| Small | 5 | 16 | 30 | 18 | 69 |
| Medium | 5 | 4 | 19 | 13 | 41 |
| Large | 3 | 6 | 26 | 22 | 57 |
| Unidentified | 2 | 3 | 4 | 6 | 15 |
| Total | 15 | 29 | 79 | 59 | 182 |

Security is about the safeguarding of data, whereas privacy is about the safeguarding of user identity and the user’s right to self-determination. It is possible to have security without privacy, but not privacy without security. Or, as Paul Dourish and Ken Anderson wrote in 2006 in the journal Human-Computer Interaction,¹ “We read security here as the state of being free from danger; technological ‘security mechanisms’ are deployed as means to ensure this state. Risks to privacy (solitude, confidentiality, autonomy), then, are among the various risks against which we might wish to be secure.”

Privacy and security are related, and R1 represents an incorrect understanding. Bucketing R2, R3 and R4 together as “related” responses, we have:

| Authority role | Unrelated (R1) | Related (R2+R3+R4) | % Related |
| --- | --- | --- | --- |
| Yes | 10 | 113 | 91.86% |
| No | 5 | 54 | 91.52% |
| Total | 15 | 167 | 91.75% |

| Organization size | Unrelated (R1) | Related (R2+R3+R4) | % Related |
| --- | --- | --- | --- |
| Small | 5 | 64 | 92.75% |
| Medium | 5 | 46 | 90.19% |
| Large | 3 | 54 | 94.73% |
| Unidentified | 2 | 13 | 86.66% |
| Total | 15 | 167 | 91.75% |

We therefore conclude that within our samples, the hypothesis is generally true, and ≈90% of the participants agree that privacy and security are related to each other.
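For readers who want to trace the arithmetic, here is a minimal sketch in Python (not the team’s actual analysis code) of the R1 vs R2+R3+R4 bucketing and the “% Related” figures, using the counts from the authority-role table above. The report’s percentages appear to be truncated rather than rounded, so reproduced values may differ in the last decimal place.

```python
# Reproduce the "% Related" bucketing for Question 12 (authority-role split).
# Counts are copied from the table above; this is an illustrative sketch only.
q12_by_authority = {
    "Yes": {"R1": 10, "R2": 24, "R3": 51, "R4": 38},
    "No":  {"R1": 5,  "R2": 5,  "R3": 28, "R4": 21},
}

for group, counts in q12_by_authority.items():
    unrelated = counts["R1"]                               # "not related" responses
    related = counts["R2"] + counts["R3"] + counts["R4"]   # bucketed as "related"
    pct_related = 100 * related / (related + unrelated)
    print(f"{group}: unrelated={unrelated}, related={related}, "
          f"% related={pct_related:.2f}%")
```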

“In terms of differences between privacy and security I think there is a lot of overlap and at times when you talk about privacy you also talk about security but they are still distinct entities in terms of the laws that govern privacy and security. As a company we are very heavy on compliance in general on PII data to being HIPAA compliant and CCPA compliant... there is (also) a team dedicated to make sure that every single application, data flow, and platform that stores data is secure.”

—A senior engineering manager at a multinational e-commerce company, during an FGD

We believe the remaining ≈10% may have chosen their answer based on the boundaries of their specific work responsibilities. For instance, an infrastructure engineer and a product manager may hold non-overlapping roles handling security and privacy respectively.

Care and concern about user privacy

Hypothesis 2: Tech engineers (senior and middle) recognise that the tech industry does not respect user privacy and believe that it needs to be more responsible.

Question 14: Which of the following statements accurately captures the attitude towards privacy in your organization/network?

  1. Respectful and actively works to protect it (R1)
  2. Indifferent (R2)
  3. It is not a concern at all (R3)
  4. Don’t know / Can’t say (R4)

| Organization size | R1 | R2 | R3 | R4 | Total |
| --- | --- | --- | --- | --- | --- |
| Small | 59 | 6 | 0 | 4 | 69 |
| Medium | 34 | 5 | 1 | 1 | 41 |
| Large | 47 | 10 | 0 | 0 | 57 |
| Unidentified | 8 | 6 | 0 | 1 | 15 |
| Total | 148 | 27 | 1 | 6 | 182 |

R2 and R3 indicate not enough is being done. R4 suggests the respondent is not even aware. Putting them together:

| Organization size | Does enough (R1) | Not enough (R2+R3+R4) | % Does enough |
| --- | --- | --- | --- |
| Small | 59 | 10 | 85.50% |
| Medium | 34 | 7 | 82.92% |
| Large | 47 | 10 | 82.45% |
| Unidentified | 8 | 7 | 53.33% |
| Total | 148 | 34 | 81.31% |

Question 15: Does your organization have Standard Operating Procedures (SOPs) for engaging and handling criticism on privacy issues in your products?

  1. Yes, we have a fully functional SOP (R1)
  2. No, we don’t have anything like that (R2)
  3. We aspire to have one, but could not find the time or resources to build one (merged with R2 as it is not qualitatively different)

| Organization size | R1 (have SOP) | R2 (no SOP) | % Have SOP |
| --- | --- | --- | --- |
| Small | 23 | 46 | 33.33% |
| Medium | 17 | 24 | 41.46% |
| Large | 34 | 23 | 59.64% |
| Unidentified | 4 | 11 | 26.66% |
| Total | 78 | 104 | 42.85% |

Question 16: I have a peer group within my organization, where I can discuss and freely express my views about privacy and data security.

  1. Yes and it is a very active peer group (R1)
  2. Yes, but it is not very active (merged with R1 as it is not sufficiently distinct)
  3. No, I don’t have one (R2)

| Organization size | R1 (have peer group) | R2 (no peer group) | % Have peer group |
| --- | --- | --- | --- |
| Small | 45 | 24 | 65.21% |
| Medium | 35 | 6 | 85.36% |
| Large | 45 | 12 | 78.94% |
| Unidentified | 12 | 3 | 80.00% |
| Total | 137 | 45 | 75.27% |

The percentage columns in these tables show an interesting contrast:

| Organization size | % Does enough | % Have SOP | % Have peer group |
| --- | --- | --- | --- |
| Small | 85.50% | 33.33% | 65.21% |
| Medium | 82.92% | 41.46% | 85.36% |
| Large | 82.45% | 59.64% | 78.94% |
| Unidentified | 53.33% | 26.66% | 80.00% |
| Total | 81.31% | 42.85% | 75.27% |

Across all sizes, a significant number of participants have said there are no SOPs available for handling privacy concerns, even though they think that their organizations are respectful of privacy concerns and have a peer group. Large organizations stand out for having invested in creating processes compared to other size segments.

In summary, while a large percentage of participants agree that their organization recognizes privacy concerns, they also agree that not enough is being done about it. Small organizations are significantly lagging in having a peer group and established processes.

“A lot of startups work in survival mode for a long time and these conversations often feel like (problems of) the elite in comparison to immediate survival problems you are facing. Not saying it is right but this is the attitude most of the time in these startups, it’s an elite concern. So we get around it rather than think through it, and when it is absolutely necessary is when we come to it.”

—A founder of a product-management startup, during an FGD

Privacy as a business imperative

Hypothesis 3: Organizations can only build privacy respecting products if they have business imperative, skill, agency and competence.

Hypothesis 4: There are market segments where business imperative overrides privacy concerns.

Hypothesis 5: Tech engineers are dissatisfied with their organization’s stance/policies and efforts on building privacy respecting products.

Question 20: Which of the following best describes the product design and development practices of your organization, when it comes to Privacy?

  1. Features first, ship fast, privacy comes later (R1)
  2. Privacy-first design and development cycle (R2)
  3. Features first, ship fast but we do have privacy and security reviews as part of the process (merged with R2, as both indicate privacy is considered)
  4. Features first, ship fast, but I really wish that we do privacy and security reviews as part of the process (merged with R1, as both indicate privacy is low priority)
  5. Don’t know (excluded from analysis)

| Organization size | R1 (Privacy later) | R2 (Privacy fore) | % Privacy fore |
| --- | --- | --- | --- |
| Small | 5 | 55 | 91.66% |
| Medium | 4 | 35 | 89.74% |
| Large | 4 | 49 | 92.45% |
| Unidentified | 0 | 13 | 100.00% |
| Total | 13 | 152 | 92.12% |

For another perspective, we segment organizations by their respective verticals. Some organizations appear in more than one segment:

| Vertical | R1 (privacy later) | R2 (privacy fore) | % Privacy fore |
| --- | --- | --- | --- |
| Fintech | 5 | 30 | 85.71% |
| Data Security | 1 | 33 | 97.05% |
| Big Data | 2 | 46 | 95.83% |
| Services/Consultancy | 4 | 54 | 93.10% |
| Social Media | 2 | 17 | 89.47% |
| Mobile Apps | 6 | 32 | 84.21% |

In focus group discussions, the American HIPAA, the California-specific CCPA and the European GDPR were repeatedly mentioned as privacy regulations that organizations adhered to. Consequently, we asked survey respondents to select from these.

Question 30: Which of the following laws are you required to comply with? (Select all that apply)

  1. CCPA
  2. GDPR
  3. HIPAA
  4. Other

| Organization size | CCPA | GDPR | HIPAA | Other |
| --- | --- | --- | --- | --- |
| Small | 14 | 42 | 9 | 11 |
| Medium | 16 | 28 | 13 | 8 |
| Large | 23 | 39 | 23 | 12 |
| Unidentified | 4 | 11 | 1 | 14 |
| Total | 57 | 120 | 46 | 45 |

A closer examination of the “Other” responses suggests that some respondents have confused auditing requirements such as SOC 2 and ISO 27001 with privacy regulations. Here we introduce a new column, “None”, for when none of CCPA, GDPR and HIPAA is applicable.

| Organization size | None | Count | % of None |
| --- | --- | --- | --- |
| Small | 20 | 69 | 28.98% |
| Medium | 7 | 41 | 17.07% |
| Large | 13 | 57 | 22.80% |
| Unidentified | 2 | 15 | 13.33% |
| Total | 42 | 182 | 23.07% |

By cross-referencing responses to these two questions, we can see whether any of these regimes is more effective. As Question 30 is multi-select, the total counts in this table do not match the total counts for Question 20 above.

| Compliance | R1 (Privacy later) | R2 (Privacy fore) | % Privacy fore |
| --- | --- | --- | --- |
| CCPA | 3 | 52 | 94.54% |
| GDPR | 9 | 103 | 91.96% |
| HIPAA | 3 | 39 | 92.85% |
| Count | 15 | 194 | n/a |
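The cross-reference can be reproduced with a simple tally. The sketch below uses an assumed record layout (not the survey’s actual data model) and counts each respondent once per regime they selected, which is why the column totals exceed the Question 20 totals.

```python
# Cross-tabulate Question 30 (multi-select compliance regimes) against
# Question 20 (R1 = privacy later, R2 = privacy fore). Illustrative sketch only.
REGIMES = ("CCPA", "GDPR", "HIPAA")

def crosstab(respondents):
    """respondents: list of {"regimes": set of regime names, "q20": "R1" or "R2"}."""
    table = {regime: {"R1": 0, "R2": 0} for regime in REGIMES}
    for r in respondents:
        for regime in REGIMES:
            if regime in r["regimes"]:   # multi-select: counted once per selected regime
                table[regime][r["q20"]] += 1
    return table

# Example with two made-up responses, not survey data:
demo = [
    {"regimes": {"GDPR", "HIPAA"}, "q20": "R2"},
    {"regimes": {"GDPR"}, "q20": "R1"},
]
for regime, counts in crosstab(demo).items():
    total = counts["R1"] + counts["R2"]
    pct_fore = 100 * counts["R2"] / total if total else 0.0
    print(f"{regime}: privacy later={counts['R1']}, privacy fore={counts['R2']}, "
          f"% privacy fore={pct_fore:.2f}%")
```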

Insights:

  • Organization size matters: small and medium organizations account for 9 of the 13 “privacy later” (R1) responses in our sample, indicating a lower concern for privacy, possibly because of immature processes.

  • There is a distinct drop in concern in the Fintech, Social Media and Mobile App verticals (< 90%).

  • 23% (42/182) of the organizations are not covered by the most commonly known privacy regulations; small and medium organizations constitute 64.28% (27 of 42) of this group, which perhaps explains why they have immature processes compared to large organizations.

  • Being subject to regulation (≈93% privacy fore on average) is not a significant improvement over the overall concern for privacy (92.12%). The survey does not reveal why; a deeper study is required.

“So privacy by design is still perhaps not very well adopted by the industry, it is more privacy by law at this point”

—A senior engineer and product manager at a software product and services company, during an FGD

“Certain types of information may not be essential for the application. But from a marketing standpoint and what you are selling, demographics such as geography, how the people behave, how the product is utilized - collecting this information helps us draw patterns. Everyone in the market is anyway capturing the data. It is more about whether the data is necessary for our organization, and if we have the compliance to go forward with collecting it.”

—A software engineering manager of a medium-sized fintech organization, during an FGD

“Knowing the customer’s ailment is the highest level of PII we deal with. Can I send a recommendation to the customer based on their health ailment? We are missing out on a potential business opportunity. I don’t know how much of a great business that could be, but we have just stayed away from using any of the PII to target better or up-sell, something that comes very naturally to say an e-commerce business.”

—VP of engineering of a medium-sized health-tech organization, during an FGD

“If you really care about privacy, make your (software) architectural choices robust. Policy will come later, because policy needs reinforcement and that again boils down to intent and who is in power. But if the underlying architecture is distributed, nobody can do anything because this is a foolproof system.”

—Product head at a large-sized paytech organization, during an FGD

Management incentives and resource guides

Hypothesis 6: Middle and line management lack resources and guides to reduce the decision making overhead of regulatory compliance, and also lack agency to build privacy respecting products.

Hypothesis 7: Senior and middle management need immediate incentives for investing in building privacy respecting products.

Question 22: Which of the following statements best applies when you are building a product and making design decisions on its features:

  1. We have resources and guides to make decisions, if our implementation will have an impact on user privacy (Have process or people)
  2. We have a go-to person who will advise us, if our implementation has an impact on user privacy (Have process or people)
  3. We have both (resources, guides and a go-to person) (Have process or people)
  4. We have none (neither resources, guides nor a go-to person) (Have none)

Only respondents holding authority roles are considered here (121 of 182).

| Organization size | Have process or people | Have none | % Have none |
| --- | --- | --- | --- |
| Small | 37 | 13 | 26.00% |
| Medium | 28 | 5 | 15.15% |
| Large | 30 | 2 | 6.25% |
| Unidentified | 5 | 1 | 80.00% |
| Total | 100 | 21 | 21.00% |

Does a regulatory compliance regime (CCPA, GDPR or HIPAA) make a difference? Considering only responses from (a) organizations that said they are regulated and (b) respondents holding positions of authority, we find that small and medium organizations fare no better.

| Regulated organization size | Have process or people | Have none | % Have none |
| --- | --- | --- | --- |
| Small | 27 | 9 | 25.00% |
| Medium | 23 | 4 | 14.81% |
| Large | 25 | 0 | 0.00% |
| Unidentified | 4 | 1 | 25.00% |
| Total | 79 | 14 | 15.05% |
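As an illustration of the filtering described above, the sketch below (with assumed field names, not the survey’s data model) keeps only respondents who hold an authority role and whose organization reported at least one of CCPA, GDPR or HIPAA, then computes “% Have none” per organization size.

```python
# Filter: authority roles only, regulated organizations only, then tally
# "% Have none" for Question 22. Field names here are hypothetical.
def pct_have_none(respondents):
    by_size = {}
    for r in respondents:
        if not r["authority_role"]:
            continue                                      # drop non-authority roles
        if not (r["regimes"] & {"CCPA", "GDPR", "HIPAA"}):
            continue                                      # drop unregulated organizations
        bucket = by_size.setdefault(r["org_size"], {"have": 0, "none": 0})
        bucket["none" if r["q22"] == "have_none" else "have"] += 1
    return {size: 100 * b["none"] / (b["have"] + b["none"])
            for size, b in by_size.items()}

# Example with a single made-up response, not survey data:
print(pct_have_none([{"authority_role": True, "org_size": "Small",
                      "regimes": {"GDPR"}, "q22": "have_none"}]))
```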

Question 29: If your organization has compliance standards imposed by regulatory authorities, which of the following best describes your development practices?

  1. We have periodic audits by a third party to review our standards compliance (Have process)
  2. We have an internal team that is responsible for our standards compliance (Have people)
  3. We have both (audits by third parties and an internal team) (Have both)

Only respondents holding authority roles are considered here (count of 121):

| Regulated organization size | Have process | Have people | Have both |
| --- | --- | --- | --- |
| Small | 6 | 23 | 7 |
| Medium | 9 | 6 | 16 |
| Large | 7 | 10 | 12 |
| Unidentified | 1 | 3 | 1 |
| Total | 23 | 42 | 36 |

Small organizations have capacity shortcomings that do not disappear when they are subject to a regulatory regime, likely because regulation does not generate business. Larger organizations, however, can absorb the cost of compliance and create better processes.

“Sales and marketing are always aggressive. They love aggressive growth. So they will go to any lengths to collect whatever customer data they can lay their hands on. On the technical side of things, however, engineers and managers have to take responsibility and not leave it to the business to decide about what data can be collected and what data is off-limits. A lot depends on the management’s commitment.”

— Senior engineer working at a medium-sized Fintech startup, during an FGD

Training budgets for continuous learning

Hypothesis 8: Competence, skill and awareness together power an individual within an organization to pitch for and build privacy-respecting products.

Hypothesis 9: Tech companies do not spend enough time in training their employees on privacy and other safety issues.

Hypothesis 10: Senior and middle management need practical resources – including community support and peer reviewed knowledge – that help them to quantify RoI and risks for building privacy respecting products.

Question 21: Which of the following best describes the training and resources in your organization about privacy aspects during product design and development?

  1. Provides enough information and resources about privacy (R1)
  2. Provides some information and resources about privacy (R2)
  3. Does not provide any information (R3)
  4. Does not apply (merged with R3)

| Organization size | R1 | R2 | R3 | % R3 (No info) |
| --- | --- | --- | --- | --- |
| Small | 21 | 32 | 16 | 23.18% |
| Medium | 16 | 17 | 8 | 19.51% |
| Large | 34 | 16 | 7 | 12.28% |
| Unidentified | 6 | 6 | 3 | 20.00% |
| Total | 77 | 71 | 34 | 18.68% |

Question 23: We have a learning and development budget for our product teams to enhance our understanding of privacy and security risks, while making feature and design decisions

  1. We have a no-questions-asked budget for this purpose (R1)
  2. On actuals basis, on approval from management (R2)
  3. Not available (R3)
  4. Don’t know (R4)

It is unclear whether R4 means the respondent is not aware of an existing budget or that no budget exists. R4 is therefore excluded from this table.

| Organization size | R1 | R2 | R3 | % R3 (Not available) |
| --- | --- | --- | --- | --- |
| Small | 7 | 25 | 19 | 37.25% |
| Medium | 2 | 20 | 10 | 31.25% |
| Large | 11 | 16 | 9 | 25.00% |
| Unidentified | 1 | 8 | 1 | 10.00% |
| Total | 21 | 69 | 39 | 30.23% |

Insights:

  • Large organizations have more training resources than small and medium organizations for handling privacy aspects during the product development cycle
  • 30% of organizations don’t have baseline learning and development (L&D) budgets for understanding privacy and security risks while creating products

Maturity on intent

Question 5: Does your role require you to think about privacy or compliance on a regular basis?

  1. Role needs to think about Privacy and Compliance (Yes)
  2. Role does not need to think about Privacy and Compliance (No)

Responses are shown here classified by whether the respondent held an authority role in their organization:

| With authority role | Yes | No | % Yes |
| --- | --- | --- | --- |
| Small | 38 | 12 | 76.00% |
| Medium | 27 | 6 | 81.81% |
| Large | 22 | 10 | 68.75% |
| Unidentified | 5 | 1 | 83.33% |
| Total | 92 | 29 | 76.03% |

| Without authority role | Yes | No | % Yes |
| --- | --- | --- | --- |
| Small | 13 | 6 | 68.42% |
| Medium | 3 | 5 | 37.50% |
| Large | 20 | 5 | 80.00% |
| Unidentified | 7 | 2 | 77.77% |
| Total | 43 | 18 | 70.49% |

Question 13: How does your organization look at privacy respecting aspects of its products or services?

  1. Privacy is a fundamental right, and is built into our products (R1)
  2. Privacy is a right, but is only available in premium versions of products (R2)
  3. Privacy is not a right, our products do not have privacy features (R3)
  4. Others (open text, excluded from responses table)

| Organization size | R1 | R2 | R3 | % R1 |
| --- | --- | --- | --- | --- |
| Small | 53 | 2 | 2 | 92.98% |
| Medium | 33 | 1 | 2 | 91.66% |
| Large | 47 | 2 | 0 | 95.91% |
| Unidentified | 9 | 1 | 0 | 90.00% |
| Total | 142 | 6 | 4 | 93.42% |

Insights:

  • There is near universal acknowledgement that privacy is a fundamental right and has to be built into the product development process, across organizations of all sizes.

  • In small and medium organizations, at least three out of every four respondents in authority roles need to think about privacy and compliance issues. This number drops to about two out of three for large organizations, suggesting that they can afford to create specialized roles.

“The first thing we do is create a map of the data, which is called a metadata drive, so the first thing to know is what data is there in your system.”

“Nowadays every organization has something called a Chief Data Officer (CDO), especially in European companies which are GDPR compliant. The CDO lays down the rules for their organization, maps those rules to that metadata [drive], and either automates compliance for those rules, or manually encodes the rules to satisfy compliance so that all these issues will be caught in a certain timeframe. It’s the lineage of the entire data. If something goes wrong at one place, within minutes, teams are able to identify the whole datamap, and identify what went wrong.”

—A startup founder, during an FGD

“If you have to set rules for the whole company, it has to be done right at the top.”

—A VP of engineering of a medium-sized health-tech organization, during an FGD

Maturity on people

Question 17: Do investors, customers, or other stakeholders of your company care about privacy?

  1. Investors, Customers, Stakeholders care about Privacy (R1)
  2. Investors, Customers, Stakeholders ask us to Prioritize Privacy aspects (R2)
  3. They don’t and we often spend time educating them about privacy (R3)
  4. Don’t know / Can’t say (R4)

R4 has been excluded as it does not convey any meaningful information.

| Organization size | R1 | R2 | R3 | % R1 | % R2 | % R3 |
| --- | --- | --- | --- | --- | --- | --- |
| Small | 39 | 6 | 12 | 68.42% | 10.52% | 21.05% |
| Medium | 28 | 4 | 5 | 75.67% | 10.81% | 13.51% |
| Large | 37 | 13 | 2 | 71.15% | 25.00% | 3.84% |
| Unknown | 9 | 3 | 2 | 64.28% | 21.42% | 14.28% |
| Total | 113 | 26 | 21 | 70.62% | 16.25% | 13.12% |

While intent is a good starting point, the next step is to hire competent people in authority roles to establish processes, for which stakeholders must be on board.

Question 25: Does your organization have a Chief Data Officer (CDO), or Legal department which looks into risk and compliance?

  1. Have a Chief Data officer, Legal department for handling Privacy/Compliance (Yes)
  2. Don’t have a department (No)

| Organization size | Yes | No | Total |
| --- | --- | --- | --- |
| Small | 23 | 46 | 69 |
| Medium | 28 | 13 | 41 |
| Large | 49 | 8 | 57 |
| Unknown | 11 | 4 | 15 |
| Total | 111 | 71 | 182 |

Respondents who answered “Yes” were also asked for the team size as an indirect metric for a privacy and compliance budget:

Question 28: What is the approximate size of the team in your organization that looks into legal/compliance/regulatory aspects?

  1. 1–5
  2. 6–50
  3. More than 50

| Organization size | 1–5 | 6–50 | > 50 |
| --- | --- | --- | --- |
| Small | 18 | 4 | 1 |
| Medium | 28 | 0 | 0 |
| Large | 6 | 31 | 12 |
| Unknown | 7 | 2 | 2 |
| Total | 59 | 37 | 15 |

Question 24: If you had to find out more about regulations applicable to your business, you can ask someone in your organization.

  1. Yes
  2. No
  3. Don’t know / Can’t say

| Organization size | Yes | No | Don’t know / Can’t say | Count |
| --- | --- | --- | --- | --- |
| Small | 43 | 18 | 8 | 69 |
| Medium | 31 | 6 | 4 | 41 |
| Large | 51 | 3 | 3 | 57 |
| Unknown | 12 | 1 | 2 | 15 |
| Total | 137 | 28 | 17 | 182 |

Question 18: I have a peer group within my organization, where I can discuss and freely express my views about privacy and data security.

  1. Yes, and it is a very active peer group (Yes, good)
  2. Yes, but it is not very active (Yes, okay)
  3. No, I don’t have one (No)

| Organization size | Yes, good | Yes, okay | No | % No |
| --- | --- | --- | --- | --- |
| Small | 29 | 18 | 22 | 31.88% |
| Medium | 22 | 13 | 6 | 14.63% |
| Large | 36 | 10 | 11 | 19.29% |
| Unknown | 5 | 6 | 4 | 26.67% |
| Total | 92 | 47 | 43 | 23.62% |

Question 19: I have a peer group or safe space outside my organization where I can express my views and learn from other practitioners about privacy and data security.

  1. Very supportive and helpful peer group (Yes, good)
  2. Somewhat helpful peer group (Yes, okay)
  3. No Peer group (No)

| Organization size | Yes, good | Yes, okay | No | % No |
| --- | --- | --- | --- | --- |
| Small | 29 | 18 | 22 | 31.88% |
| Medium | 15 | 10 | 16 | 39.02% |
| Large | 21 | 10 | 26 | 45.61% |
| Unknown | 3 | 9 | 3 | 20.00% |
| Total | 68 | 47 | 67 | 36.81% |

Insights:

  • Small organizations find it harder to get stakeholders to invest in privacy practices
  • This directly correlates to not having a specialized department to handle privacy risks and compliance
  • This also correlates to not having a go-to person to ask internally about regulatory compliance

Over a fifth of respondents in organizations of all sizes don’t have a peer group to learn about privacy and data security, indicating a capacity gap in the larger ecosystem. Smaller organizations fare better with external support than larger organizations, suggesting that organizations get insular as they grow.

“Why I have this mindset is because we are a healthcare company. The whole company is built on trust. Anything in healthcare, you trust the provider. Our revenues come from clients who trust us with their data. So it is ingrained into how we think about data. We’ve had multiple rounds of product audits, keeping all compliances in mind. We revisited policy, both technically and in implementation and identified legacy issues to upgrade our data governance systems when legislation changes.”

—A VP of engineering of a medium-sized health-tech organization, during an FGD

“Compliance being cosmetic and lack of care for privacy has nothing to do with India. It has to do with the organization and the people you work with. Although, in my company, we do not collect that much data from the user and the customer, but the clients who are sharing their content with us, their security matters. I think the privacy and data, it’s about the clients and the users, and the scale of the company, and the kind of folks you work with.”

—A senior engineer at a large media-tech company, during an FGD

Maturity on process

In mature organizations, processes are well defined, and specialized officers have authority over privacy-related decisions.

Question 27: Does your organization’s CDO/Risk and Compliance/Security/Legal team have veto power over the product/engineering team when questions of data security or privacy are raised?

  1. Yes
  2. No
  3. Don’t know / Can’t say

| Organization size | Yes | No | Don’t know / Can’t say | % Yes |
| --- | --- | --- | --- | --- |
| Small | 19 | 16 | 34 | 27.53% |
| Medium | 23 | 6 | 12 | 29.26% |
| Large | 26 | 1 | 30 | 52.63% |
| Unidentified | 5 | 3 | 7 | 46.66% |
| Total | 73 | 26 | 83 | 45.60% |

Question 31: What are the data governance practices and policies that your organization has adopted for complying with these regulations? (Multiple choice)

  1. Current Privacy/Data Retention policy (R1)
  2. Infrastructure/Cloud storage policy (R2)
  3. Accuracy in data collection, data minimisation (R3)
  4. Data localisation (R4)
  5. Anonymisation (R5)
  6. Data security audits (R6)
  7. Purpose and Time limitation (R7)
  8. Data erasure on request (R8)

Some respondents did not pick any of the options, possibly implying that their organizations have no data governance practices.

| Organization size | R1 | R2 | R3 | R4 | R5 | R6 | R7 | R8 | Any | None |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Small | 30 | 26 | 26 | 14 | 22 | 17 | 21 | 27 | 59 | 10 |
| Medium | 30 | 32 | 20 | 17 | 21 | 29 | 18 | 24 | 38 | 3 |
| Large | 45 | 48 | 25 | 29 | 28 | 42 | 26 | 29 | 55 | 2 |
| Unidentified | 10 | 9 | 5 | 6 | 9 | 9 | 6 | 7 | 12 | 3 |
| Total | 115 | 115 | 76 | 66 | 80 | 97 | 71 | 87 | 164 | 18 |

Expressed in percentages (based on totals per row):

| Organization size | R1 | R2 | R3 | R4 | R5 | R6 | R7 | R8 | Any | None |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Small | 43% | 38% | 38% | 20% | 32% | 25% | 39% | 46% | 86% | 14% |
| Medium | 73% | 78% | 49% | 41% | 51% | 71% | 44% | 59% | 93% | 7% |
| Large | 79% | 84% | 44% | 51% | 49% | 74% | 46% | 51% | 96% | 4% |
| Unidentified | 67% | 60% | 33% | 40% | 60% | 60% | 40% | 47% | 80% | 20% |
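The “Any”/“None” columns and the per-row percentages for this multi-select question can be derived as in the sketch below (hypothetical data layout): each respondent is recorded with the set of options they selected, an empty set counts as “None”, and percentages use the number of respondents in that size segment as the base.

```python
# Derive option counts, "Any"/"None", and row percentages for Question 31.
# `responses` is a list of sets of selected options (R1..R8), one per respondent;
# this layout is assumed for illustration, not taken from the survey data.
OPTIONS = [f"R{i}" for i in range(1, 9)]

def summarise(size, responses):
    n = len(responses)
    counts = {opt: sum(opt in sel for sel in responses) for opt in OPTIONS}
    any_count = sum(1 for sel in responses if sel)        # picked at least one option
    row = {opt: f"{round(100 * c / n)}%" for opt, c in counts.items()}
    row.update({"Any": f"{round(100 * any_count / n)}%",
                "None": f"{round(100 * (n - any_count) / n)}%"})
    return size, row

# Example with three made-up respondents, not survey data:
print(summarise("Small", [{"R1", "R8"}, set(), {"R2", "R6"}]))
```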

Question 32: What are the consent management policies and practices that your organization has adopted for complying with regulations?

  1. Consent management tools and processes (R1)
  2. More user friendly cookie policy and privacy policy (R2)
  3. Paper trails, transparency, and accountability (R3)
  4. Privacy, data security audits (R4)
  5. Social engineering audits (R5)

Some respondents did not pick any of the options, possibly implying that their organizations have no consent management practices.

| Organization size | R1 | R2 | R3 | R4 | R5 | Any | None |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Small | 23 | 27 | 16 | 24 | 3 | 54 | 15 |
| Medium | 17 | 16 | 12 | 29 | 4 | 38 | 3 |
| Large | 38 | 20 | 17 | 35 | 9 | 51 | 6 |
| Unidentified | 5 | 7 | 6 | 9 | 5 | 13 | 2 |
| Total | 83 | 70 | 51 | 97 | 21 | 156 | 26 |

Expressed in percentages (based on totals per row):

| Organization size | R1 | R2 | R3 | R4 | R5 | Any | None |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Small | 33% | 39% | 23% | 35% | 4% | 78% | 22% |
| Medium | 41% | 39% | 29% | 71% | 10% | 93% | 7% |
| Large | 67% | 35% | 30% | 61% | 16% | 89% | 11% |
| Unidentified | 33% | 47% | 40% | 60% | 33% | 87% | 13% |

“At least for the fintech organizations, it is quite important privacy is built as part of the culture. If an organization is small, boot-strapped, it can be quite tedious to look into all compliance. Generally, companies move in a self-compliant manner. But as the organization grows bigger, it is important to realize the fact that potentially you will bring in a lot of new people who probably may or may not be aware of the sensitivity of the data. So compliance processes and audits can help ensure that best practices are followed, and that the organization has better vulnerability analysis checks.”

—A software engineering manager of a medium-sized fintech organization, during an FGD


  1. Dourish, P. and Anderson, K., 2006. Collective information practice: Exploring privacy and security as social and cultural phenomena. Human-Computer Interaction, 21(3), pp. 319–342. http://www.douri.sh/publications/2006/DourishAnderson-InfoPractices-HCIJ.pdf
