Holistic privacy: How to remove the target on encryption's back
May 13, 2022 / Alex Linton
Encryption, encryption, encryption. When it comes to the digital world, encryption is the number one tool we've got for protecting people's security and privacy. If you had to pick the single most important thing in cybersecurity, it would be encryption. Without it, the entire internet would catch fire and we'd be left with a pile of ashes containing everything from your texts to your grandma to your bank password to top secret government memos.
Losing (or weakening) encryption would be a catastrophe, and that's why so many good people fight tooth and nail to protect it. However, as the fiery debate over tech (and how to regulate it) rages on, the focus on encryption is becoming ever more intense, and other aspects of security and privacy aren't getting the attention they deserve. Things like metadata, data storage, and digital identities also have enormous privacy and security implications, but they're often left out of the discussion when talking about regulation and technical design.
Instead, regulators focus their efforts on trying to weaken or ban encryption: it's all about backdoors, lawful access, and wanting access to things which are, well, impossible to provide access to (that's the whole point of end-to-end encryption, after all). When it's only about encryption, the conversation about tech—which is often extremely complex—loses nuance. Don't get me wrong, encryption has to be a core part of any discussion about privacy and security, it's just not the only thing that matters. The conversation about whether something is (or isn't) private does not begin and end with 'is it encrypted?'. Sure, if it's not encrypted, that's a big red flag. But if it is encrypted, does that immediately equal privacy?
Is WhatsApp private?
We pick on WhatsApp a lot around here, but the reality is they're just a really good representation of a lot of the issues that matter to us. We've discussed the concept of privacy-washing before: the various ways technology companies try to convince users that their platform is private (because they know people care about privacy) while selling those users' data for profit behind the scenes.
WhatsApp is end-to-end encrypted — they even have a full implementation of the lauded Signal protocol. A huge tick for WhatsApp. WhatsApp is also owned by Meta, the literal supervillain of internet privacy. On its own, that is enough to raise an eyebrow but not necessarily condemn the entire application. However, last year WhatsApp introduced some very controversial privacy policy changes which caused a heap of people to ditch the platform. For a lot of people, those privacy policy changes felt like a smoking gun which proved what they already suspected: being owned by Meta did (or at least, could) compromise WhatsApp and jeopardise their privacy.
This isn't just theoretical, either: according to this ProPublica article, U.S. law enforcement have utilised WhatsApp metadata to put multiple people in jail, including Treasury Department whistleblower Natalie Edwards. There are ways that encrypted platforms can be undermined through things like exploiting metadata, and Meta (heh) is already playing this very dangerous (and very dirty) game.
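To make the metadata point concrete, here's a minimal sketch of what a relaying server can still see even when it cannot read the message itself. It uses Python's `cryptography` library purely as a stand-in for end-to-end encryption; this is not WhatsApp's or Signal's actual wire format, and the identifiers are made up.

```python
# Minimal sketch (not any real app's wire format): even when the message body
# is end-to-end encrypted, the relaying server still handles a plaintext
# "envelope" of metadata that it can log or hand over.
import json
import time
from cryptography.fernet import Fernet  # symmetric stand-in for E2E encryption

# Only the two endpoints hold this key; the server never sees it.
key = Fernet.generate_key()
ciphertext = Fernet(key).encrypt(b"meet at the usual place at 9")

# What the server actually observes for every message it relays
# (hypothetical identifiers, for illustration only):
envelope = {
    "sender": "+1-555-0100",
    "recipient": "+1-555-0199",
    "timestamp": time.time(),
    "size_bytes": len(ciphertext),  # content is opaque, but its size is not
}
print(json.dumps(envelope, indent=2))
# Who talks to whom, when, and how often is enough to map a social graph.
```

None of that requires breaking the encryption, which is exactly why metadata deserves as much scrutiny as the ciphertext it wraps.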
WhatsApp is the perfect example of why there are other things we need to consider when talking about privacy. It uses the exact same encryption as Signal, but is it as private as Signal? Obviously not. But, because of the extreme focus on encryption, a lot of people might think that WhatsApp is as good as it gets when it comes to app privacy (it's end-to-end encrypted, after all). Other people might realise that WhatsApp has issues, but aren't really sure what they are. All of this confusion and misunderstanding makes it harder for people to find and use the apps that actually offer the protections they want, and it leads regulators to believe that encryption is the root cause of all their problems.
Regulators need educators
We cannot compromise on encryption. I was recently asked by a policymaker what I, as someone working on secure tech, thought would be a 'proportional limitation on encryption' given the goals of law enforcement agencies and national security and intelligence agencies. I don't think they liked my answer: no limitation on encryption. Any weakening of encryption is too much; there is no way to compromise encryption 'a little bit'. It's either secure, or it isn't. There simply is no middle ground to be found here, and this is the cause of a lot of angst and tension between technology-makers and technology-regulators.
At the moment, two kinds of maker-regulator discussion seem common. The first involves regulators wanting to find some kind of encryption compromise when none exists; the compromise they're looking for can't involve encryption at all, and will need to focus on other aspects of tech. The other is more hostile: regulators have already decided that encryption is the ultimate enemy and attempt to squish it without even inviting the tech-makers to the discussion.
Obviously both have their problems, and both seem to be rooted in a lack of technological understanding on the part of the people making regulatory decisions. I recently heard an anecdote from a colleague in the digital rights space: a law enforcement official speaking at a conference was lamenting how impossible crime-solving had become due to encrypted messaging. Upon further discussion, it was revealed that the Fort Knox of messaging they were referring to was... Messenger, which isn't even end-to-end encrypted by default. Encryption wasn't the issue at all, but it still copped the blame thanks to years of anti-encryption rhetoric.
Encryption isn't going anywhere
Despite numerous attempts from governments, lawmakers, and regulators around the world, encryption isn't going anywhere. It's here to stay, and no matter how hard they may try, they will never convince us that getting rid of it is in people's best interest.
With that in mind, our conversations around tech need to start assuming that encryption is a fundamental part of safety, security, and privacy instead of trying to figure out how it can be removed, weakened, or banned. With this shift, we can hope to see the merry-go-round of attempted encryption regulation—which is achieving nothing—finally stop.