Signal in Bangalore

Signal Foundation's President, Meredith Whittaker, and the Signal team talk about AI, encryption, and more.

About the public event

This public event is hosted in collaboration with the Signal Foundation, The Fifth Elephant and Bangalore International Centre (BIC). The event consists of a talk and a panel discussion.
This event is free to attend.

How do organizations like Signal sustain themselves in an environment of mass surveillance - talk by Meredith Whittaker

The tech ecosystem, as it is today, is shaped by concentrated power. The natural monopolies of the telecommunications sector were not, in fact, disrupted when networked computation was commercialized and the internet took shape in the late 1990s. Big Tech simply replicated this monopolistic form, acquiring data and infrastructural monopolies via the surveillance business model, which incentivized mass surveillance and profiling, and the growth-at-all-costs mindset that has brought us to where we are today.

This concentrated tech power resides in the hands of a handful of corporations, primarily Big Tech companies, which are based in the US and China. As they extend their reach and authority on the wings of the current AI hype, they are shaping an ecosystem that is increasingly hostile to new entrants and small players. Indeed, most startups or small tech endeavors must, in order to function, license infrastructures and use frameworks and libraries controlled and shaped by these large companies.

So, how can organizations like Signal confront the mass surveillance of this industry, and sustain and grow in this environment? Join the President of Signal Foundation and renowned AI scholar, Meredith Whittaker, along with members of the Signal team, to discuss the approaches Signal has adopted in pushing back against the conjoined threats of mass surveillance and heightened social control that accompany the AI hype wave.

This talk will be followed by a Q&A with Meredith Whittaker and Joshua Lund (Senior Director), moderated by Kiran Jonnalagadda, CEO at Hasgeek.

Panel discussion - AI: Beyond the hype cycle

In the aftermath of ChatGPT-fueled AI hype, there is an equally charged conversation on how the public and governments should respond to present (and future) harms related to these technologies. It is a crowded space, with AI industry voices and existential risk (x-risk) doomers trying to shape the narrative on regulation alongside civil society advocates and government agencies.

With many combined decades of experience critiquing and working within the tech industry, Meredith Whittaker, Amba Kak, Udbhav Tiwari, and Vidushi Marda will share their insights and perspectives on the current AI hype wave and the related policy landscape. The panel will particularly focus on the threats this poses to privacy, and the ways that the dominant narratives are getting AI wrong.

Vidushi Marda will moderate the panel discussion.

About the speakers

Meredith Whittaker is the President and a member of the Signal Foundation Board of Directors. She is also a scholar of AI and the tech industry responsible for it, and the co-founder and Chief Advisor to the AI Now Institute.

Amba Kak is a technology policy strategist and researcher with over a decade of experience working in multiple regions and roles across government, academia, the tech industry, and the non-profit sector. Amba is currently the Executive Director of the AI Now Institute, a New York-based policy research center focused on artificial intelligence. She is also on the Board of Directors of the Signal Foundation, and on the Program Committee of the Board of Directors of the Mozilla Foundation.

Vidushi Marda is an independent lawyer working on technology regulation, asymmetric power relations, and fundamental rights to advance social justice. She is the co-Executive Director of REAL ML, a non-profit organization that translates algorithmic accountability research into impactful interventions that benefit the public interest.
Vidushi’s work has been cited by the Supreme Court of India in a historic ruling on the Right to Privacy, the United Kingdom House of Lords Select Committee on Artificial Intelligence, and the United Nations Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression, among others.

Kiran Jonnalagadda is a tech enthusiast and community builder, passionate about technology's societal impact. He co-founded Hasgeek in 2010, where he has created a space for technologists to thrive, share knowledge, and network.
Kiran has been actively involved in the Free and Open Source Software (FOSS) movement, and with the SaveTheInternet.in movement for net neutrality in India.

Udbhav Tiwari is the Head of Global Product Policy at Mozilla, where he focuses on cybersecurity, AI, and connectivity. He was previously with the public policy team at Google, a Non-Resident Scholar with the Carnegie Endowment for International Peace, India, and a program manager at the Centre for Internet and Society (CIS) in India. He is also a former member of the Advisory Council of the Digital Equity Accelerator run by the Aspen Institute.

Key takeaways for participants

  1. For startups and small tech endeavors - how to navigate a tech ecosystem that Big Tech continues to shape and control.
  2. Awareness of the harms related to AI technologies, and how the public and governments should respond to them.
  3. Knowledge of the AI policy landscape, and how it is being shaped on a global scale.
  4. Awareness of the dominant narratives around AI hype, and what they are getting wrong.

Who should participate

  1. Public interest technologists.
  2. Engineers and open source developers who care about privacy.
  3. Founding teams from startups and series A companies who have questions about business models.
  4. Technology leaders who want to understand the current context of AI and privacy, and emerging trends.
  5. Policy professionals working on tech/AI policy.
  6. Human rights advocates interested in the intersection of technology and rights.

Contact information

Join the @fifthel Telegram group or follow @fifthel on Twitter. For inquiries, call Hasgeek at +91-7676332020.

Hosted by

All about data science and machine learning

Anwesha Sen (@anwesha25)

Signal's Sustainability in an Environment of Mass Surveillance

Submitted Oct 12, 2023

On 3rd October 2023, a public event was hosted in collaboration with the Signal Foundation, The Fifth Elephant, and Bangalore International Centre (BIC). The event consisted of two sessions. The following is a summary of the first session, a talk by Meredith Whittaker, President of Signal Foundation, moderated by Kiran Jonnalagadda, co-founder of Hasgeek.

What is Signal?

Signal is a private messaging app, built with end-to-end encryption to ensure that only the users who are part of a conversation have access to its contents. The Signal team is working to restore the norm of private communication that existed before the development, and then commercialization, of networked infrastructure in the 90s. That commercialization gave rise to the surveillance business model, in which data is monetized for advertising and AI; this is now the engine of the tech economy. Signal, however, does not participate in this surveillance business model and does not collect, let alone monetize, user data. So, how does Signal ensure encryption and sustain itself in such a tech economy?

Trevor Perrin and Moxie Marlinspike developed the Signal Protocol in 2013, which is used for message encryption not only in Signal but also in other messaging apps such as WhatsApp. The protocol ensures that even those with access to Signal’s servers cannot decipher messages; as a result, Signal itself has no access to user communications. On top of this, the team has developed novel cryptographic techniques to protect users’ metadata, i.e., their profile name, profile photo, and who they are talking to. This is unique to Signal.
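One ingredient of the Signal Protocol, the symmetric-key ratchet inside the Double Ratchet, can be sketched in a few lines. The toy below is illustrative only, assuming a shared secret already established by a key agreement (X3DH in Signal); the constants and KDF here are simplified stand-ins, not Signal's actual derivation:

```python
import hmac
import hashlib

def kdf_chain(chain_key: bytes) -> tuple[bytes, bytes]:
    """Advance the chain: derive a one-time message key and the next chain key.

    Uses HMAC-SHA256 with distinct constants, mirroring the shape of the
    symmetric-key ratchet (a toy sketch, not Signal's real KDF).
    """
    message_key = hmac.new(chain_key, b"\x01", hashlib.sha256).digest()
    next_chain_key = hmac.new(chain_key, b"\x02", hashlib.sha256).digest()
    return message_key, next_chain_key

# Both parties start from a shared secret established by key agreement.
chain_key = hashlib.sha256(b"shared-secret-from-key-agreement").digest()

keys = []
for _ in range(3):
    message_key, chain_key = kdf_chain(chain_key)
    keys.append(message_key)

# Every message gets a fresh key, and old chain keys are discarded, so
# compromising today's state does not reveal keys for earlier messages.
assert len(set(keys)) == 3
```

The one-way KDF chain is what gives forward secrecy: each step is easy to compute forward but infeasible to reverse.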

Why is privacy important?

Privacy is about power and power relations. One of the ways in which those who have power secure it is by developing an advantage in information asymmetry: the more information they have about those they govern or oppress, the more they can control them. Privacy is fundamental to the ability to participate in a meaningful democracy, to organize, to have an independent union, and to converse freely. With more privacy and private communications, there is a better chance of shaping a livable world, rather than a world in which a handful of corporations hold inordinate power over lives and institutions across the globe through the surveillance capabilities and infrastructures they have developed.

How do organizations like Signal sustain themselves in an environment of mass surveillance?

Signal is a not-for-profit organization and is required to have a charitable aim. This obliges them to follow certain transparency protocols. They also cannot receive investment in the classic venture capital sense, and if they were acquired, the executives and the board would have to reinvest the proceeds in charitable causes.

These are some of the key barriers, or key protections, that allow them to continue their mission of providing private communications. This matters particularly in the tech industry because the threat they are protecting against is the industry's primary business model, i.e., monetizing surveillance. These structures ensure they cannot be pushed to compromise on their privacy focus just because surveillance is more profitable.

Signal is donation-based and received a substantial investment from WhatsApp co-founder Brian Acton, who was disappointed by how WhatsApp evolved and became less privacy-friendly after being acquired by Meta. This led him to invest in Signal, which allowed the team to work on developing a more sustainable revenue model outside of the surveillance business model.

Why is Signal a centralized application?

Decentralization can be incredibly insecure and often does not support the kind of collaborative consensus that encryption requires, since encryption needs to operate on every end. Signal takes a very human-centered approach and doesn’t want to leave a section of users vulnerable because the people hosting their servers are unavailable. A decentralized infrastructure would also cost Signal hundreds of millions of dollars a year. Moreover, they cannot rely on volunteer labor and need to ensure the same robust coverage in every part of the world. Using a decentralized protocol that lets anyone join and participate does not ensure decentralized power.

How does one know that Signal’s servers run the same encrypted software they publish as their open source code?

The end-to-end encryption protocols used by Signal are open source, and the Android app has reproducible builds: anyone who wants to verify the published code can build an APK locally and confirm it matches the build downloaded from the Play Store. And since everything is end-to-end encrypted, there is not much a malicious server could do. A malicious server is therefore not a threat model for Signal in the way it is for messaging apps where the server sees everything the client sees and data on the server is plaintext.
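Conceptually, verifying a reproducible build means comparing the locally built APK with the Play Store APK entry by entry, ignoring the signing metadata that only Google can produce. The sketch below illustrates that idea in the spirit of Signal's apkdiff tooling; the helper names and the in-memory demo archives are illustrative, not Signal's actual verification script:

```python
import hashlib
import io
import zipfile

def entry_hashes(apk_bytes: bytes) -> dict[str, str]:
    """Map each archive entry name to the SHA-256 of its contents."""
    with zipfile.ZipFile(io.BytesIO(apk_bytes)) as zf:
        return {name: hashlib.sha256(zf.read(name)).hexdigest()
                for name in zf.namelist()}

def builds_match(local_apk: bytes, store_apk: bytes,
                 ignore_prefixes=("META-INF/",)) -> bool:
    """Compare two APKs entry by entry, skipping signing metadata.

    The store copy carries the store's signature under META-INF/, which a
    local build cannot reproduce, so those entries are excluded.
    """
    def filtered(apk: bytes) -> dict[str, str]:
        return {n: h for n, h in entry_hashes(apk).items()
                if not n.startswith(ignore_prefixes)}
    return filtered(local_apk) == filtered(store_apk)

def make_zip(entries: dict[str, bytes]) -> bytes:
    """Build a tiny in-memory zip standing in for an APK."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w") as zf:
        for name, data in entries.items():
            zf.writestr(name, data)
    return buf.getvalue()

# Demo: two "APKs" that differ only in signing metadata still match.
local = make_zip({"classes.dex": b"code"})
store = make_zip({"classes.dex": b"code", "META-INF/CERT.RSA": b"sig"})
assert builds_match(local, store)
```

Any difference in actual code or resources, by contrast, changes an entry's hash and makes the comparison fail, which is exactly what verification is meant to catch.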

At https://signal.org/bigbrother/ one can view the requests for access to data that Signal has received from governments. The information they have been able to turn over is extraordinarily limited: when somebody registered for the service and the last time they connected to it. Signal, as well as their hosts, can only see encrypted data, which cannot be deciphered.

How do you operate in an environment where almost every government in the world will require a backdoor via some kind of legislation?

Signal’s position has always been to never implement a backdoor. Encryption either works for everyone, or it’s broken for everyone. If it came down to the choice between being forced to implement a backdoor or leave, they would leave after trying alternatives to continue providing their service, such as by using proxy servers and other circumvention techniques.

