Privacy practices in the Indian technology ecosystem
A 2020 survey of the makers of products and services
2020 may well have heralded the decade of privacy. From leaks in video-conferencing software to doubts over COVID-19 apps, privacy concerns have been front and centre. Yet most of the commentary appears only when things go wrong, leaving one angle largely unexamined: the attitudes of the people and organizations who make the technology we use every day. Their work today shapes our experience of tomorrow, but we know little of their perspective.
Between April and December 2020, the Privacy Mode team at Hasgeek (henceforth “we” and “us”) surveyed the makers of tech products and services in India. We sought to understand whether privacy is a guiding factor in product development, or an afterthought in response to regulation and public sentiment.
We started by conducting interviews with experts, followed by focus group discussions. The learnings from these sessions helped frame hypotheses and a qualitative questionnaire. See our note on framing and review, and the detailed findings and analysis.
The survey finds that 90% of respondents believe their organization treats privacy as a fundamental right and attempts to realize that intent while building products and services. At the baseline, across all organization sizes, at least three in four respondents in authority roles are required to think about privacy and compliance issues.
However, converting that intent into practice remains an onerous challenge. Small and medium-sized organizations fare worse than large organizations, but organizations of all three sizes struggle, as the findings below show.
Some industry sectors are governed by regulatory regimes such as HIPAA, CCPA and GDPR. Responses from within these sectors did not deviate greatly from the general responses. This suggests that adding regulatory pressure on small and medium organizations does not automatically change their internal product-development dynamics to make them more privacy-friendly.
Over a fifth of respondents across all organization sizes say they lack a peer group, within the tech ecosystem, with whom to discuss and find solutions to privacy concerns. This lack was felt both within the organization and outside it. It points to a larger concern: even those who want to implement privacy-respecting features in products or services do not have adequate support or use cases to guide them. As organizations get larger, their access to external peer groups shrinks, indicating the development of an insular culture.
As one startup founder we interviewed said:
“Companies are doing a poor job communicating this (the need for user privacy and security)... I don’t think they are malicious. They’re lousy in their implementation and security, but there is no mal-intent or ill intent to begin with, they are lousy sometimes.”
The good news is that awareness about privacy and recognition of its importance as a practice – and as an end-user benefit – exists across leadership in large, medium and small organizations. The not-so-good news is that intent by itself is not sufficient to move organizations towards the ideal of protecting user privacy and user data.
As the data in this research shows, specialized personnel, budgets, processes and community resources have to be available in abundance if India wants a smooth transition towards a data protection regime. This is where the gap is.
Here are our recommendations for different stakeholders to move towards the privacy-preserving ideal, one step at a time.
Small and medium organizations struggle to establish a business model with repeatable unit economics; this is a paramount concern in their early stages. While there is intent to embed privacy practices in the product-development cycle, these organizations do not have the skills or budgets to do so.
As our data indicates, adding regulatory pressure does not improve outcomes. Regulation can, however, increase the compliance burden, adversely affecting small and medium organizations and potentially rendering them non-viable as businesses.
This research recommends to the Joint Parliamentary Committee (JPC) and the Data Protection Authority (DPA) for the Personal Data Protection (PDP) bill that the bill’s provisions exempt organizations below a certain revenue or fundraising threshold, and that the government invest in developing community resources to make the law workable.
Across all segments, including large organizations, software engineering practices have not matured enough to be ready for PDP compliance from day one. There is also a lack of clarity at the operational level about how to implement privacy processes such as consent management, data-security audits, data-erasure requests, anonymisation, and purpose limitation.
Technical standards have to evolve – both from within and by knowledge-sharing at the community level – to address these issues. The JPC and DPA must provide at least two years’ time for organizations to become compliant with the standards prescribed in the PDP bill’s provisions.
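To illustrate what implementation at the operational level can involve, here is a minimal, hypothetical Python sketch of two of the processes named above: pseudonymising an identifier and honouring a data-erasure request. It assumes a simple in-memory user store, and every name in it is illustrative; a production system would also need audit trails, backups, and propagation of erasure to downstream processors.

```python
import hashlib
import secrets

# Hypothetical in-memory "user store", standing in for a real database.
USERS = {
    "u123": {"email": "asha@example.com", "city": "Pune"},
}

# Per-deployment secret salt; a real system would keep this in a secrets manager.
SALT = secrets.token_hex(16)


def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier with a salted hash, so records can still be
    joined for analytics without exposing the raw value."""
    return hashlib.sha256((SALT + identifier).encode("utf-8")).hexdigest()


def handle_erasure_request(user_id: str) -> bool:
    """Honour a data-erasure request by deleting the user's record.
    Returns True only if a record was actually removed."""
    return USERS.pop(user_id, None) is not None


if __name__ == "__main__":
    print(pseudonymise("asha@example.com"))  # a stable pseudonym, not the raw email
    print(handle_erasure_request("u123"))    # True: record erased
    print(handle_erasure_request("u123"))    # False: nothing left to erase
```

Even this toy version raises the operational questions the survey points to: where the secret salt lives, how erasure is verified and documented, and whether a salted hash counts as adequate anonymisation for a given purpose.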
Business organizations, especially early- and growth-stage startups, generally refrain from sharing internal knowledge and practices with other organizations. Insecurity about growth and organizational secrets, and pressure from investors to grow Intellectual Property (IP), have led to a culture of withholding knowledge, making it hard to develop capacity.
The structural capacity issues that this research refers to – skill gaps and a lack of knowledge resources – can be resolved if leadership, legal teams, and data privacy officers in small and medium organizations become forthcoming about sharing best practices and knowledge systematically. Journeyman documentation, which maps out these practices at the operational and technical levels, will be more beneficial than broad directions.
While simplistic solutions may see widespread adoption because they substitute for missing capacity, they will hamper small and medium organizations in competing globally, mainly because these organizations may then have to invent and maintain different sets of practices to comply with standards such as GDPR and CCPA. This will increase the cost of compliance.
For investors, advisors and external stakeholders, as well as community organizations and founders of small and medium organizations, this research makes clear recommendations:
Build capacity by contributing to a shared knowledge pool beyond your organization, in the form of meetup groups and their knowledge archives.
Encourage thinking about privacy at the product and engineering levels. Allow your practitioners to gain public recognition for their work.
Take privacy-preserving ideas and practices in product development cycles to external forums and policy makers, and encourage adoption across the ecosystem.
Scroll down for the individual chapters.
Nadika Nadja is a researcher at Hasgeek. She has worked across advertising, journalism, TV & film production as a writer, editor and researcher.
Anand Venkatnarayanan has worked as a cybersecurity engineer for two decades. He is currently consulting editor for Privacy Mode and a non-executive director at Hasgeek.
Anish TP illustrated the report. Umesh PN, Zainab Bawa and Kiran Jonnalagadda edited. David Timethy was the project manager.
We would like to thank the following individuals from tech, legal, and policy backgrounds who provided feedback during different stages of the research. Their feedback helped the team fine-tune and bring rigour to the research process. While Hasgeek sought funding from organizations, the research itself was conducted – with full disclosure at all stages – independently and objectively, and the findings do not reflect any individual organization’s needs.
Updates and corrections are posted in the Updates section.