Physical privacy defenders: Learning from these awesome low-tech privacy tools
August 19, 2022 / Alex Linton
Encryption that's impossible to decipher, complicated routing techniques, convoluted network setups — we have to go to some pretty extreme lengths to preserve privacy in the digital world. But...it wasn't always like this. In the physical world, privacy is something that we inherently recognise and respect — you wouldn't just open someone's letterbox and start thumbing through their mail, and even just overhearing someone else's conversation in a quiet café can feel a little...off.
On the other side of the coin, when people are invading your physical privacy it's normally pretty obvious — you can feel when someone is looking at you, that awkward eye contact when you catch them gazing, them trying to play it off or quickly darting their eyes away. The physical world maintains this state of mutual observation—because you're actually physically there—where people can easily be caught out, and the repercussions (or at least preventative actions) are immediate. Social norms help us defend our privacy, too — people know it's weird and creepy to stare at strangers in public, for example.
There are lots of ways that physical privacy and digital privacy play out differently, which means that the way we approach, protect, and even conceive of privacy in these two contexts is often radically different — but that doesn't mean we can't learn from the many, many ways privacy has been preserved over the years.
Easy access required
One of the most common metaphors that people use when it comes to privacy, security, and trust is the simple locked door. Okay, I trust my neighbours — but I still lock my front door. The locked door is a go-to metaphor for explaining digital privacy because it's something that everyone can relate to really easily. Taking this one step further is one of my favourite contraptions — the fake rock. Leaving a spare key outside is common so that you can get into the house if you forget or lose your keys, someone coming to visit can unlock the door, or any number of reasons, really. When I was growing up, my family had one of these fake rocks, hollowed out on the inside so that it could conceal a key. Put it in the garden and it's just a rock — a secret rock. Our fake rock was actually the only rock in our otherwise leafy garden, and I always thought to myself, 'well, anyone who bothered to look would find this straight away!'
Well, therein lies the answer as to why nobody ever found the hidden key inside our secret rock. Nobody bothered to look! In all likelihood nobody really wanted to break into our house, but there were also some social norms acting as a silent defence against would-be intruders of our home's privacy. Anyone even trying to look would be suss, right? Somebody who wants to find this key is going to have to rummage through a garden to find this hidden rock. Anybody walking by could notice them, ask them what they're doing, or just straight up call the authorities. Because of this, the privacy-invader only has a limited time that they can actually (physically) access the property to attempt to find the key and breach the house (and our privacy).
Because of this limited access, really simple obfuscation techniques (like a fake rock that only kind of looks like a real rock) can be very effective. This is a level of obfuscation which would be laughed out of the room in a digital context, though — because the same guarantees around access just don't exist.
There might be some situations—like a malicious hacker gaining unauthorised access to a company's database—where someone might only have access to your data for a limited time, but even in cases like that it's common for all that juicy data to be transferred before the hacker gets booted — meaning they (effectively) have indefinite access to all that info. Besides, most of the time your biggest concern should be the companies themselves — Meta, Google, and Microsoft have constant access to your personal info. Because there is nobody passing by on the sidewalk spotting global mega-corporations sifting through your personal property, it's hard to call it out sometimes — and it's hard to stop them as well.
After 20 years in prison, Mary, Queen of Scots wrote a letter on the night before she was to be executed. This letter, addressed to her brother-in-law, would contain Mary's last will and testament. Back then, paper was super expensive, so envelopes weren't really a thing. Instead, Mary went through a series of thirty-odd steps to produce something called the spiral lock, a mega-complex form of letterlocking. So complicated and impressive was Mary's lock that it wasn't until 2021 that researchers finally figured out how she actually did it.
This spiral lock gave a simple guarantee — if the seal wasn't broken, nobody had opened the letter. Its contents were safe and secure. To open the letter, you had to tear the paper — causing permanent physical damage that the receiver, Henry III of France, would easily notice. Now, there are cryptographic ways to provide similar guarantees, it's true. But the idea of things being permanently altered or deleted is something which holds true in the physical world much more so than the digital world.
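The closest digital analogue to a spiral lock is a cryptographic "seal" over a message — for instance, a keyed hash (HMAC). The sketch below is purely illustrative (the names and the shared key are invented for this example, and real messaging protocols use full authenticated encryption rather than a bare HMAC), but it shows the same guarantee: if the message is altered, the seal no longer matches.

```python
import hmac
import hashlib

def seal(key: bytes, message: bytes) -> bytes:
    """Produce a tamper-evident 'seal' (HMAC-SHA256 tag) for the message."""
    return hmac.new(key, message, hashlib.sha256).digest()

def is_intact(key: bytes, message: bytes, tag: bytes) -> bool:
    """Verify the seal; any change to the message breaks it."""
    return hmac.compare_digest(seal(key, message), tag)

# Hypothetical sender and receiver sharing a secret key
key = b"shared-secret"
letter = b"To Henry III of France: my last will and testament."
tag = seal(key, letter)

print(is_intact(key, letter, tag))             # unbroken seal -> True
print(is_intact(key, letter + b"!", tag))      # tampered letter -> False
```

Like the spiral lock, the seal doesn't hide the letter's contents — it only proves whether they were changed. Confidentiality needs encryption on top.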
If you need to delete a document in the real world, you can rip it up and throw it in the bin, pop it through the shredder, or straight up set it on fire. Boom, gone. In the digital world, you can just right-click delete...right? Not quite. With the prevalence of SSD storage these days, data deletion guarantees are, well, non-existent. Digital forensics research has shown that data deleted from SSDs is not always purged from the disk, meaning that even 'deleted' data could be later recovered. And that's if you truly have the only copy — in the world of the internet, how can you ever be sure there aren't a million copies of that file floating around on some data centre in Singapore? The answer is you can't, unfortunately — and because it's hard to delete things, and it's hard to prevent people (other than yourself) from accessing specific data, privacy enhancement gets really difficult.
The bottom line
Privacy is built into the human experience. Nobody wants to feel like they're being watched, listened to, or spied on. It's just plain creepy. People want privacy, and they always have. The measures taken in the past are still useful — when they're relevant. The issue is that we are living in an increasingly tech-dominated world, and a lot of the techniques we have been relying on in the real world don't quite translate into ones and zeroes.
Some regulation, like the GDPR, has recognised some of these basic prerequisites for privacy — like the right to access and the right to deletion. The issue is that, as well-meaning as this regulation may be, the way privacy manifests in the tech world means we need to be a bit more imaginative about how we conceive of the problems and the solutions if we want to translate the privacy we've enjoyed for hundreds of years into the modern world.
Part of that means building privacy enhancing technology. Having an ethics-first approach to building technology is critical, and if privacy is an afterthought then (not to be overly pessimistic) we're going to fail. There is light at the end of the tunnel, and we have already started taking steps towards a more private future — but there is work yet to be done.