MS. ZAKRZEWSKI: Well, thank you so much for joining us, and we have a lot to talk about today. So I want to dive right into an op-ed that ran in The New York Times in December and got a lot of attention online. The op-ed argued that encrypted messaging apps are promoting, quote, “a rather extreme conception of privacy,” and that this is dangerous because such an extreme version of privacy could be exploited by criminals. What’s your response to that op-ed?
MS. WHITTAKER: Look, this is not a new argument. It’s an argument that has been debunked in a number of ways over the past 20 years, as a hotly contested debate has played out around whether the human right to privacy should extend to digital communications. This has been called the “crypto wars,” among other names. There are actors on one side who believe that this human right should extend to any mode of communication or creation, digital or analog, and then there are powerful actors who would like access to the information that would otherwise be shielded if we did extend the right to privacy. So this is not a new debate.
I think what’s interesting about the op-ed is less the contents than the metadata. The contents were, honestly, kind of a mess, and I would point you to folks like Jennifer Granick or Matt Green or Eva Galperin, who offered really compelling takedowns on Twitter and elsewhere. There are folks who have been studying and engaging with these issues for many, many years who could easily dissect the weakness of those arguments.
But I think what is unique is that the op-ed comes at a time when attacks on privacy are accelerating. We see dangerous regulation in the UK and the EU that could go into effect this year, and we see increasing calls for things like age verification and other anti-privacy measures.
We saw a bill passed in California recently that would require certain sites to verify the age of people visiting those websites, necessitating a pretty comprehensive surveillance apparatus to enable that kind of validation.
So this op-ed is being placed at a time when there is a renewed attack on the human right of privacy, and that’s notable. In my view, and I did a thread breaking this down on Twitter for those who are interested in more, this op-ed is weak arguments packaged in the imprimatur of authority, which The New York Times always helps with, that will most likely be used as a kind of hollow citation by those who are set to unveil anti-privacy regulation and political platforms in the next year. I think we really need to watch out for that instrumented use of op-eds like this, particularly because it was written by somebody I’d never heard of. This is not someone who is known in this space. The author owned a fireworks company before turning to ethics consulting for tech. So this is not somebody who seems to have a lot of groundwork in the privacy debates or a deep understanding of the nuances and particularities of this issue. Nonetheless, an op-ed that says privacy is bad can be easily leveraged by opponents of privacy.
MS. ZAKRZEWSKI: I wanted to go back to a point you made at the beginning of your answer, that this is a debate we’ve been having since the early 1990s around the role of encryption. In your view, is there any way to preserve the benefits we get from these privacy-enabling services while allowing government or law enforcement access to encrypted communications?
MS. WHITTAKER: Again, this is magical thinking that you see cropping up like mushrooms, because the will remains even when the facts are inconvenient: law enforcement, governments, large institutions, corporations would like access to this data. However, you cannot break privacy for some people and leave it intact for others. It’s an all-or-nothing proposition, and those are the facts at the technical level. You can’t have a backdoor in encryption for only the good guys, as sophomoric as the framing of good guys and bad guys is in a nuanced and adult world. You can’t have encryption with a back door for only the so-called “good guys” that isn’t then exploited in some way by the bad guys. That back door is a vulnerability open for exploit by everyone else, and it ultimately breaks the privacy promises that all of these institutions also rely on, right?
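To make that argument concrete, here is a minimal sketch in Python of a key-escrow scheme. Everything in it, from the field names to the per-message key design, is a hypothetical illustration rather than any real protocol, and it assumes the third-party cryptography package. The point it demonstrates: once a single escrow key can open every message, anyone who obtains that key, by leak, theft, or insider abuse, can read all traffic.

```python
# Hypothetical key-escrow sketch; NOT any real protocol. It illustrates why
# a backdoor key reserved for the "good guys" is a single point of failure.
from cryptography.fernet import Fernet

escrow_key = Fernet.generate_key()  # the mandated "lawful access" key

def send(plaintext: bytes, recipient_key: bytes) -> dict:
    """Encrypt a message once, then wrap its key for recipient AND escrow."""
    message_key = Fernet.generate_key()  # fresh key per message
    return {
        "ciphertext": Fernet(message_key).encrypt(plaintext),
        "key_for_recipient": Fernet(recipient_key).encrypt(message_key),
        "key_for_escrow": Fernet(escrow_key).encrypt(message_key),  # the backdoor
    }

alice_key = Fernet.generate_key()
envelope = send(b"meet at 5", alice_key)

# The intended recipient can read the message...
mk = Fernet(alice_key).decrypt(envelope["key_for_recipient"])
assert Fernet(mk).decrypt(envelope["ciphertext"]) == b"meet at 5"

# ...but so can anyone who ever obtains the escrow key: an insider, a thief,
# a hostile state. One leaked secret unlocks every message ever sent.
mk = Fernet(escrow_key).decrypt(envelope["key_for_escrow"])
assert Fernet(mk).decrypt(envelope["ciphertext"]) == b"meet at 5"
```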
Most people in government use Signal. I don’t actually know about law enforcement, but I was in government for a minute, and it is very clear: people understand the value of privacy when it applies to themselves. It’s this magical thinking that somehow leads people to hope, to wish, to falsely believe that there is a way to break privacy only so that I can get to those other people’s information, that somehow my information will remain safe.
MS. ZAKRZEWSKI: And I wanted to ask you too. You mentioned the global push for regulation, and we’ve seen a real willingness, especially in Europe and in California, to address online harms and the really severe issues we’re talking about here, like child exploitation online. What are some of the ways regulators could try to address those issues, particularly when it comes to online harm on social media, without damaging people’s right to privacy?
MS. WHITTAKER: Yeah. I mean, there are hosts of methods for addressing real-world, material harm to people, to children, to others. You see law enforcement prevail in a number of cases that don’t require back doors, right? And I want to leave that speculation to the experts who are working on those issues in particular.
What I can say is that we are seeing child endangerment invoked, and all of us agree children should be protected, right? That is not a controversial stance. However, we are seeing this very emotionally charged issue being used now, as it has been in the past, as a pretext for arguing that we need to implement fundamentally unworkable mass surveillance capabilities.
There’s a bill in the UK called the Online Safety Bill, and I would point people to the Open Rights Group and their work if you want a better understanding of this bill and its dangers. It is moving through the UK and includes requirements for private messaging services to intercept and scan all content for “bad” content, right? That content includes extremely ill-defined terms like “grooming.” What is grooming?
I think we can marinate on that question, look at a lot of the anti-LGBT campaigns and the dangerous and exclusionary rhetoric we’re seeing in the U.S. and elsewhere, and understand what happens. This is not even a slippery slope. This is a cavernous drop: we begin to implement this for a seemingly good cause, protecting children, and suddenly these definitions expand and expand and expand and are instrumented so you have mass surveillance of all communications, in a world where people increasingly require these messaging services to go about their daily lives, to participate in commerce, to participate in their workplaces.
And then there is a mass surveillance device sitting between you and the person you want to message, controlled by an entity who can define and redefine the terms of what is acceptable and unacceptable content, right? This is an extraordinarily dangerous precedent, and I think history shows how friendly such policies are to authoritarianism and oppression.
MS. ZAKRZEWSKI: And, on that point, I wanted to ask you a little bit about some of the obstacles Signal has faced around the world. Recently, you led a push to make sure that Signal could still be available to protesters in Iran, despite government efforts to block the app. Can you tell us a little bit about how effective that push was? Do you have any data on how many people were able to access the service from Iran?
MS. WHITTAKER: Yeah. Before I answer that question directly, I just want to highlight that this is almost a crude example that proves the point of how threatening the ability of the public to communicate privately is to oppressors and authoritarians. When social unrest and popular uprising fomented and began manifesting on the streets, one of the first actions of the government was to shut down communications, because that was seen as endangering its authority and its modes of social control.
So, yeah, they blocked Signal. They blocked some other apps, and we put out a call to our community, which is a really robust and wonderful community out there, love you all, to run proxy servers, which basically allow people in Iran to circumvent that block and get to Signal through another means.
Those servers weren’t run by Signal; they were run by community volunteers. But we did hear from people that the traffic stats on those servers were, in many cases, fairly high. We heard reports from people in Iran that they were able to use Signal. We still had issues with registration blocking and other things, but we did see that this contributed to the ability of people in Iran to communicate privately and coordinate with each other at a time when it was particularly critical.
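For readers curious about the mechanics, here is a minimal, hypothetical sketch of the idea behind a volunteer-run proxy; it is not Signal’s published proxy software, and the upstream hostname and port are placeholders. The host sits outside the censored network and blindly relays encrypted bytes, so the operator never sees message contents.

```python
# Hypothetical relay sketch; this is NOT Signal's published proxy software.
# A volunteer host outside the censored network forwards encrypted bytes
# between blocked clients and the real service, seeing only ciphertext.
import socket
import threading

UPSTREAM = ("chat.example.org", 443)  # placeholder for the blocked service

def pipe(src: socket.socket, dst: socket.socket) -> None:
    """Copy bytes one way until either side closes the connection."""
    try:
        while chunk := src.recv(4096):
            dst.sendall(chunk)
    finally:
        src.close()
        dst.close()

def serve(listen_port: int = 8443) -> None:
    srv = socket.create_server(("", listen_port))
    while True:
        client, _ = srv.accept()
        upstream = socket.create_connection(UPSTREAM)
        # Relay in both directions. TLS runs end to end between the client
        # and the service, so the proxy operator cannot read the traffic.
        threading.Thread(target=pipe, args=(client, upstream), daemon=True).start()
        threading.Thread(target=pipe, args=(upstream, client), daemon=True).start()

if __name__ == "__main__":
    serve()
```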
MS. ZAKRZEWSKI: And you mentioned that Signal community, and I think one of the things that’s really unique about Signal is how different it is from a lot of the for-profit companies that make the other messaging services people are familiar with. Because you don’t have advertising as part of your business model, will there ever come a point where people have to pay for Signal?
MS. WHITTAKER: No. Signal will always be free to use. One of the reasons for that is that, per our mission and our ethics, we don’t want to implement a privacy tax. We don’t think the people who can afford it should be the only folks able to avail themselves of privacy.
But, beyond that, there’s a practical reason, right? If I am the only one in my friend group who pays, then Signal doesn’t work for me, because I can’t talk to anyone. So we really do need to leave it open and allow that network effect of encryption to take hold.
Now, that doesn’t mean we might not have paid storage or other add-ons in the future. We’re exploring the possibility of some of those potential revenue streams, but nothing is on the roadmap right now. Signal as a service will always be free to use, and that’s pretty core to our mission.
MS. ZAKRZEWSKI: And I did want to ask you also about some of the recent product changes to Signal. There’s been a lot of attention on the fact that Signal has added ephemeral Stories, somewhat similar to what we might see on Instagram or Snapchat, and that was a bit of a controversial decision. I saw pushback from some heavy Signal users. How do you think that rollout is going? And any stats on how many people are using it?
MS. WHITTAKER: Yeah. I don’t have stats to share, but I can say that the limited information we do collect, and emphasis on limited, because we don’t collect the kind of user analytics a surveillance app company would normally collect, does show Stories being used. And we have heard a lot of good reviews from people who are just enjoying them and feeling a newfound, refreshing feeling of sharing intimate content and trusting the platform, trusting that it’s actually ephemeral, that this isn’t smoke and mirrors that’s going to lead to some crappy targeted ads or what have you. So we are seeing positive feedback.
But I think what’s interesting is where a lot of the pushback came from. We’re a remote team working in U.S. and Canadian time zones, and a lot of the folks who work for Signal are U.S.-based. A lot of the people we hear from are folks speaking English, on Reddit, on Twitter. Many of them are very technically conversant, and some percentage of them are probably in the Bay Area. It’s a very specific demographic, and the choice to add Stories, to prioritize that, was not made to prioritize that Western, U.S.-based, tech-centric demographic. We were really looking at populations of people in the majority world, particularly in South Asia and South America, where Stories have evolved as just a regular, normative form of communication, much different from the use case in the U.S. or other places in the Western world. And we really didn’t want to leave those folks behind. We’d heard repeatedly for a number of years, “I can’t switch to Signal because Stories are the way I communicate with my friends,” and it’s just a non-starter not to have them.
So I think what’s interesting is the kind and quality of pushback you get when you deprioritize a hegemonic, tech-centric, always-has-bandwidth Western population and begin building for different populations in a heterogeneous global world.
MS. ZAKRZEWSKI: And it’s interesting when you talk about that international focus. We’ve certainly seen that WhatsApp, the Meta-owned encrypted messaging app, is quite popular outside of the U.S., and I wanted to ask you, when it comes to privacy protections, how does Signal compare to WhatsApp? What are the biggest differences that consumers should be aware of?
MS. WHITTAKER: Yeah. That is a great question. First, WhatsApp does use the Signal protocol, which is the state of the art, to encrypt the contents of WhatsApp messages, at least for consumer WhatsApp. WhatsApp for business doesn’t do this, but that’s a different use case. So that’s great, and I want to commend them for a visionary choice there, because at the time they implemented it, that wasn’t the norm.
However, there are some major differences that really do matter and that leave me secure in saying that WhatsApp cannot be considered truly private. Those differences lie primarily in the structural factors of the business model and in the attention to metadata.
Now, Signal goes out of its way not to collect metadata. Metadata, for those of you who don’t spend your days mired in technical language, is the information about who is talking to whom: the who, what, where, why, and when that surrounds the contents of the messaging or the other substance.
Anyway, WhatsApp collects metadata. Signal collects almost no metadata. We have no information about you, and you can go to Signal.org/BigBrother and see the vanishingly small amount of information we’re able to provide when we are forced to comply with a subpoena.
In contrast, WhatsApp collects information about who’s in a group. It collects information about who’s talking to whom. It collects profile information. It collects photos, and it collects other really key information that is extremely revealing and often more powerful than the contents of the chat.
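As an aside for the technically inclined, here is a minimal sketch of why metadata is so revealing; the field names and phone numbers are hypothetical, not any service’s actual schema. Even with every message body encrypted, a server that logs envelopes can reconstruct who talked to whom and when.

```python
# Hypothetical envelope log; the field names and numbers are illustrative.
# Even with every message body encrypted, envelope metadata alone lets a
# server reconstruct a social graph: who talks to whom, when, how often.
from dataclasses import dataclass

@dataclass
class Envelope:
    sender: str        # visible to the server unless deliberately hidden
    recipient: str     # visible: needed to route the message
    timestamp: float   # visible: when the message was sent
    ciphertext: bytes  # the only part end-to-end encryption protects

log = [
    Envelope("+15551001", "+15552002", 1674000000.0, b"..."),
    Envelope("+15551001", "+15552002", 1674000090.0, b"..."),
    Envelope("+15551001", "+15553003", 1674003600.0, b"..."),
]

# Without decrypting a single byte, map each sender to their contacts:
contacts: dict[str, set[str]] = {}
for e in log:
    contacts.setdefault(e.sender, set()).add(e.recipient)
print(contacts)  # e.g. {'+15551001': {'+15552002', '+15553003'}}
```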
Moving on from the metadata distinction, which is pretty big and deserves focus, we also need to recognize that WhatsApp is owned by Meta, which is Facebook, right? And their business model is the surveillance business model; they’re one of the big players in it. They have huge amounts of extraordinarily intimate information that they collect and create via Facebook profiles, Instagram, et cetera. On top of that, they can and do buy additional information from data brokers to create staggeringly precise dossiers about us and our communities, who we talk to, what we might be interested in buying, et cetera. So you could argue the metadata WhatsApp has is limited in one sense or another, although it’s already fairly powerful. But it’s owned by Meta, and it can be joined with Facebook data and other data that sits at the heart of that company’s business model.
Now, am I saying that they do this routinely? No. I don’t know. It’s a proprietary company; that information is not made available. I’m saying that this is the engine of their business model, that they have the data available, and that I wouldn’t trust them to keep that promise if some dreary earnings reports and sad revenue growth numbers prompted the board to reexamine it, right?
And the difference there is that Signal is a nonprofit. We don’t have a board pushing us to maximize growth and revenue, and we are governed by completely different incentives.
MS. ZAKRZEWSKI: And so what does that mean in the context of a government order or a subpoena? What information would WhatsApp be able to hand over to law enforcement?
MS. WHITTAKER: You know, I’d actually have to check this. I don’t believe they publish that information with the transparency that Signal does; again, Signal.org/BigBrother will give you everything we’re able to hand over. But WhatsApp collects my profile information, so my name and any other information I put in my bio. It collects photos, so my photo, which you could match to a Clearview database, Clearview being a massive facial recognition company. It collects information about who’s talking to whom and who is in a group, so you can begin to map my network: if you have my name and someone I’m talking to, you begin to map who I’m talking to and when I’m talking to them. That is already a powerful constellation of data points that could obviously be handed over to law enforcement. It could also be used as input to a machine learning classifier to make predictions and determinations, and it could be mapped to Facebook data: once you have my name and profile, run that through DeepFace or something and find all my images on Facebook. I’m sorry, I could extrapolate for a while around the dark potentials here, but the emphasis is that this is powerful data, and when joined with other data or other capabilities, it can do a lot. And it certainly breaks privacy.
MS. ZAKRZEWSKI: And how does that compare if Signal were to receive a government order? What information would you turn over?
MS. WHITTAKER: Almost nothing, and that is by design. Again, we spend a lot of time in rooms together talking about how to limit the collection of data, even though that collection is often the norm. So we can provide a phone number, and that’s the phone number some account signed up with, not necessarily yours. It could be a Google Voice number; we don’t really know, because we don’t have that information. We can provide information about when someone signed up, and we can provide information about when they last accessed Signal.
But, again, I want to backtrack: when a phone number signs up, we don’t have any information about who that is. We don’t have profile information. We don’t have information about who’s in a group. We don’t have information about who’s talking to whom, and we can’t match one to the other. And, again, you can find how little information we’ve been able to give if you go to Signal.org/BigBrother.
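A minimal sketch of what that amounts to in practice, with made-up values and field names; the general shape follows what Signal describes publicly at Signal.org/BigBrother, but this is an illustration, not an actual legal response.

```python
# Hypothetical illustration with made-up values; the general shape follows
# what Signal describes publicly at Signal.org/BigBrother, but this is an
# editor's sketch, not an actual legal response.
subpoena_response = {
    "phone_number": "+15551001",               # number that registered; not a verified identity
    "account_created": "2021-03-14T09:26:53Z", # when the account signed up
    "last_connected": "2023-01-17",            # date of last connection to the service
}

# What is absent is the point: no name, no profile, no contacts, no group
# membership, no message contents, no record of who talked to whom.
print(subpoena_response)
```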
MS. ZAKRZEWSKI: Got it. And I wanted to ask you too, though, about Signal’s own dependence on some of these large tech companies. Do you view yourself as dependent on companies like Apple, Amazon, and Google, and what privacy or security risks does that present?
MS. WHITTAKER: Well, I think “dependent” is a tricky word here. Let’s zoom out and look at the political economy of the tech industry right now. The surveillance business model itself trends toward consolidation. When you have the data, the infrastructure, and the market reach, those things are self-reinforcing, and that’s one of the reasons we went from the primary-colored startups of the early 2000s, the Googles et cetera, to an ecosystem dominated by a handful of surveillance companies. Those are the companies that have the infrastructure and the reach, right?
So every startup, every broadly distributed app that isn’t owned by these companies, is floating on their infrastructure at some level. It’s licensing AWS servers or Azure or Google Cloud. It has to meet the standards of always-available, instant performance that now just define whether tech works at all. You have to have global reach, failover capacity, and the types of infrastructure that would cost us hundreds of millions of dollars a year if we tried to bootstrap our own data centers, our own site reliability engineers, and our own failover capacity, and maintain and care for those infrastructures indefinitely.
So, yeah, there is a big issue in tech with the concentration of power in the hands of the companies that own the hardware, the infrastructure, the data, and the access. And then we can talk about App Stores, the Play Store, all of these dependencies that everyone who operates in tech needs to work around. We could go into a little cave and create an ideologically pure proof of concept that’s fully distributed and federated, et cetera, et cetera. But no one is going to use that, right? We might feel really good and righteous about ourselves, and four cryptographers use it to talk to each other, and then one of them has a kid and says, “I don’t have time.” If we want to actually provide a service that allows human beings who are not, first and foremost, driven by an ideological commitment to actually have privacy, to communicate safely and privately and intimately, then we need to work with the landscape we’re given.
And so, to your added question about the privacy and security considerations of licensing those infrastructures: we do a lot of work to make sure that the encryption means neither Signal nor the owners of those infrastructures can see anything, but nonetheless, we have to work with the world we are in. Our decision is to prioritize the norms and expectations of the humans who use us, not ideological North Stars.
MS. ZAKRZEWSKI: And you mentioned the consolidation in the tech industry, and you worked at the Federal Trade Commission under Lina Khan. There’s been a lot of excitement that Lina Khan would usher in a new era of tech accountability in that role. How do you think that’s been going?
MS. WHITTAKER: Well, it is an uphill battle. More power to her and all the brilliant staffers and lawyers who are working with her. I would say the best thing that people who care about privacy, who care about redistributing the power currently concentrated in the hands of a handful of large tech companies, can do is hold their feet to the fire. Give them as much leverage as you can to push things through an agency that is itself often recalcitrant and constrained, and that contains folks who might not agree, or who might want a job as general counsel somewhere else afterward and don’t want their name on some strong legislation.
I would also look to the lobbying dollars the tech industry is spending as another indication of the kind of uphill battle this is.
But again, it’s going to take folks outside, it’s going to take folks inside, and it’s going to take a concerted push to get meaningful regulation that cognizes the reality of the tech industry over the finish line.
MS. ZAKRZEWSKI: And we have just about a minute left, and so I wanted to ask, what can people expect from Signal in 2023?
MS. WHITTAKER: Well, we’re going to keep doing what we’re doing. As I have said before, we are working on usernames. We don’t have a launch date, but we are expecting those this year, hopefully in the first half. But, again, they will be done when they’re done, because we’re going to do them right, and we’re going to do everything we need to make sure they are robust and ready to launch.
And I don’t have another part of the roadmap to preview here, but we do love hearing from folks who use Signal. There’s a Reddit forum. There are community forums. There’s Twitter. You can join the ranks of the helpful reply guys there, and we’re going to continue with a sole focus on privacy, on building the best, most usable app we can that also protects privacy in a robust way. We hope that we can lead by example and that some of the other companies will start coming along with us, not only encrypting message contents but encrypting metadata. We do have openly available specifications for how we do that, and we are always happy to share with people who are interested in meeting us at that high bar.
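For a flavor of what “encrypting metadata” can mean in practice, here is a conceptual sketch, not Signal’s actual sealed-sender specification, of one such technique: the sender’s identity travels inside the encrypted envelope, so the server learns only the recipient it needs for delivery. The shared key here stands in for a real key-agreement step, and the code assumes the third-party cryptography package.

```python
# Conceptual sketch; NOT Signal's actual sealed-sender specification. One
# way to encrypt metadata: the sender's identity travels inside the
# encrypted envelope, so the server learns only the recipient for delivery.
from cryptography.fernet import Fernet

recipient_key = Fernet.generate_key()  # stands in for a real key agreement

def seal(sender: str, body: str) -> bytes:
    """Hide the sender's identity inside the ciphertext itself."""
    return Fernet(recipient_key).encrypt(f"{sender}|{body}".encode())

def deliver(recipient: str, sealed: bytes) -> None:
    """All the server learns: someone sent this recipient an opaque blob."""
    print(f"server sees: to={recipient}, payload={len(sealed)} bytes")

sealed = seal("+15551001", "hello")
deliver("+15552002", sealed)

# Only the recipient can recover who wrote the message and what it says.
sender, body = Fernet(recipient_key).decrypt(sealed).decode().split("|", 1)
print(sender, body)  # +15551001 hello
```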
MS. ZAKRZEWSKI: Well, unfortunately, that’s all the time that we have left for today. Thank you so much, Meredith, for joining us for such an informative and important discussion about the future of privacy.
MS. WHITTAKER: Wonderful. Thank you so much, Cat. Have a great day, and thank you all.
MS. ZAKRZEWSKI: And thank you all for joining us at home. If you want to join for future Washington Post Live discussions, you can find that information on our website at WashingtonPostLive.com.
I’m Cat Zakrzewski. Thank you.