The participants in this panel discussion were:
* Kailash Nadh - CTO at Zerodha
* Uthara Ganesh - Head of Public Policy, India, at Snap Inc
* Sherlyn Stanley - Privacy Engineer at Zeta Suite
Udbhav Tiwari, Senior Manager - Global Public Policy at Mozilla, moderated the discussion.
Emerging regulations like the upcoming Data Protection Bill (DPB) and existing regulations such as the GDPR are the main drivers for adopting more rigor in design, strategy and engineering of products and services. The panelists talked about practical (and tactical) approaches they have adopted in their organizations to implement privacy by design principles.
A robust privacy by design practice considers the safety and privacy implications of new product features at the front end of the development process. Companies should think about this from the outset: it sets a baseline for internal product reviews that every update, feature, and product needs to pass.
However, specific design choices inevitably raise questions about the trade-off between higher user engagement and privacy. It is possible to make intentional choices that tip the balance in favour of privacy.
The following principles, from a design and data management perspective, are central to a private-by-design experience:
1. Privacy and legal compliance by design framework: All features are reviewed by a product attorney and a privacy engineer before launch.
2. Limited retention and use of data: For example, no PII-based ad targeting, and limiting the number of interest categories through which ads can be targeted at a person.
3. User controls and transparency: Allowing users to opt out of third-party data collection for targeting.
4. Incremental protections for children: Not targeting ads at them at all, and not profiling their data.
Here are the 7 foundational principles of Privacy by Design, as articulated by Ann Cavoukian:
1. Proactive, not reactive; preventative, not remedial
2. Privacy as the default setting
3. Privacy embedded into design
4. Full functionality: positive-sum, not zero-sum
5. End-to-end security: full lifecycle protection
6. Visibility and transparency: keep it open
7. Respect for user privacy: keep it user-centric
FinTech companies must comply with very strict regulations governing every piece of information they collect. Because all of this information is mandated, the idea of “lean data” (collecting the minimum data necessary) does not apply to them. Such companies must comply with regulations first, and then build their product to fit within the regulatory compliance framework.
Speaking about Zerodha specifically, Kailash explained that no profiling is done on the data they collect to send recommendations or push certain types of engagement. At Zerodha, this is known as user disengagement: build a product that offers quality services, keep everything transparent, and if people like it, they will use it. No mining or profiling of people’s data is done. Similarly, many businesses have no need to mine and extract value from people’s private and personal data in order to offer quality products and services.
Lately, an unfortunate trend has emerged wherein certain kinds of FinTech apps, especially those that lend money (unsecured, risky, collateral-less loans), violate every common-sense, first-principles idea of privacy by design. These apps mine people’s SMS messages and contacts to profile them in dubious ways and pressure them into taking out loans they cannot afford.
To prevent such misuse of people’s data, it is important that user-facing applications have a transparent and comprehensible policy explaining to the user how the organisation is going to use their data. It is also necessary that organisations maintain an inventory to record where and why various types of data have been stored. This aids in better governance of data, which ultimately leads to easier regulatory compliance.
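The data inventory described above can be sketched as a simple structured record per data type. This is a minimal, illustrative sketch (all field names and entries are hypothetical, not any panelist's actual system): each entry documents what is stored, where, why, and for how long, so governance queries and regulatory requests can be answered quickly.

```python
from dataclasses import dataclass, asdict

@dataclass
class InventoryEntry:
    data_type: str       # e.g. "email address" (illustrative)
    store: str           # system or database where it lives
    purpose: str         # why it was collected
    legal_basis: str     # consent, contract, legal obligation, ...
    retention_days: int  # how long it is kept

# Hypothetical inventory entries for illustration only.
inventory = [
    InventoryEntry("email address", "users_db", "account login", "contract", 365),
    InventoryEntry("PAN number", "kyc_store", "KYC verification", "legal obligation", 3650),
]

# Example governance query: everything kept longer than a year.
long_lived = [asdict(e) for e in inventory if e.retention_days > 365]
```

In practice an inventory like this also feeds deletion schedules and access reviews, which is what makes it useful for compliance rather than just documentation.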
Another privacy by design practice, core to Snapchat’s design, is ephemerality, along with a separation of social content from media content. Effectively, the app does not have a news feed where one’s posts and the news all sit in one place. However, one can also have a public profile that people can react to, enabling a highly public conversation. In contrast, most apps in the social media space have an open architecture, which makes them more compelling for users and hence creates more engagement.
Data minimization is a data management practice wherein apps collect only the data that is absolutely necessary. This practice is key to limiting the impact of data breaches. Similarly, with regard to cookies used by apps, the function and need of specific cookies should be communicated transparently to users via cookie policies and other methods.
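One way to enforce data minimization is at the collection boundary itself: any field not on an explicit allowlist is dropped before storage. The sketch below is a hypothetical illustration of that idea (the field names are assumptions, not from the panel).

```python
# Only fields the feature actually needs are accepted; everything
# else submitted by the client is silently discarded.
REQUIRED_FIELDS = {"name", "email"}

def minimize(payload: dict) -> dict:
    """Keep only allowlisted fields from an incoming payload."""
    return {k: v for k, v in payload.items() if k in REQUIRED_FIELDS}

submitted = {
    "name": "Asha",
    "email": "asha@example.com",
    "contacts": ["..."],     # never needed: dropped
    "device_id": "abc123",   # never needed: dropped
}
stored = minimize(submitted)
```

Because extra fields never reach storage, there is nothing to leak in a breach and nothing to justify in an audit.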
Snapchat’s privacy by design practices also include deletion by default and a short data retention period, wherein media and text shared by users are automatically deleted after 24 hours. Apps must also provide user controls so people can protect themselves and make their own privacy choices. Another effective practice is end-to-end encryption.
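Deletion by default with a short retention window can be sketched as a timestamp plus a periodic sweep. This is a hedged illustration of the general pattern, not Snapchat's actual implementation; the 24-hour window mirrors the practice described above.

```python
import time

RETENTION_SECONDS = 24 * 60 * 60  # 24-hour retention window

def sweep(messages: list[dict], now: float) -> list[dict]:
    """Return only messages still inside the retention window."""
    return [m for m in messages if now - m["created_at"] < RETENTION_SECONDS]

now = time.time()
messages = [
    {"id": 1, "created_at": now - 3600},    # 1 hour old: kept
    {"id": 2, "created_at": now - 90000},   # ~25 hours old: deleted
]
remaining = sweep(messages, now)
```

Real systems typically run the sweep as a scheduled job or use storage-level TTLs, but the design principle is the same: expiry is the default, and nothing persists without an explicit reason.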
More than technical measures, privacy comes down to the culture and philosophies that run a business. For example, no matter what dashboard, reporting system, or customer support system an organisation builds, access should only be given to people who really need it. And within those dashboards, screens, and views, there should be field-level permissions to control access. Similarly, a piece of software should only be able to access what it truly needs, with everything else closed off by design.
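The field-level permissions idea above can be sketched as a per-role projection: each role sees only the fields it genuinely needs, and an unknown role sees nothing by default. The roles and field names here are hypothetical, chosen purely for illustration.

```python
# Map each role to the record fields it may see. Anything not
# listed is invisible to that role: deny by default.
FIELD_PERMISSIONS = {
    "support_agent": {"name", "ticket_history"},
    "billing": {"name", "invoice_total"},
}

def visible_fields(role: str, record: dict) -> dict:
    """Project a record down to the fields the role may access."""
    allowed = FIELD_PERMISSIONS.get(role, set())
    return {k: v for k, v in record.items() if k in allowed}

record = {
    "name": "Asha",
    "ticket_history": ["#42"],
    "invoice_total": 1200,
    "pan": "XXXXX",  # sensitive: visible to no role defined here
}
support_view = visible_fields("support_agent", record)
```

The key design choice is that permissions are expressed as allowlists rather than blocklists, so newly added sensitive fields stay hidden until someone deliberately grants access.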
Privacy has to be intentional and should permeate the entire organisational process. Incorporating privacy preserving practices should be done from the start of the software development process, because there is an increasing consumer demand for it as well. The general awareness in society of some of these issues is expanding, and regulation, in many ways, is playing a similar role.
It was suggested that policies be principle-based rather than prescribing very specific technologies. The implementation of these policies should be left to broad industry discussions among the tech and business community.
Building a bridge between the tech and policy communities is necessary to further ideate on tech policies such as the Data Protection Bill and the IT Rules, and to bring in varied perspectives on how certain policies would affect different verticals. Policy-making also requires hands-on technological expertise, and regulators who lack it should work with people from the tech industry.