The anonymity argument: Debunking the real name fallacy
November 03, 2021 / Alex Linton
When the internet began, anonymity (or really, pseudonymity) was built into the experience. For the first time in history, we had mass-scale pseudonymous connection and communication. People were posting on bulletin boards, messaging on mailing lists, and working every day with people that they didn’t really know. It didn’t take long for questions to start bubbling up around whether this way of interacting was really healthy, productive or…human.
If you grew up any time after the early 90s, your parents, teachers, and weird uncles were probably in your ear warning you about how people—friends and strangers included—behave online. The internet has an arcane ability to make strangers feel like friends without ever actually knowing who they really are, which we’ve seen result in after-school special stories about unlikely friends or late-night horror stories about exploitation and abuse.
This is a real effect, and it’s called the online disinhibition effect.
Online disinhibition: Would you say that to my face?
The online disinhibition effect basically just means that people are more willing to say things from behind a screen than in real life. Most people who have grown up with technology are probably familiar with this feeling, and anyone who spends a bit of time interacting with people through an LCD screen has no doubt seen this effect in action.
Online disinhibition is often thought of as a bad thing — something that makes us meaner, angrier, and more compulsive. I remember sitting through multiple online safety seminars throughout my high school years which tried to make sure we all remembered ‘there’s someone—a real person—on the other side of the screen’. Obviously a worthwhile message to send to young teens, but those pleas were drowned out by the cacophony of likes, shares, and DMs that greeted us the second we walked out of the school gymnasium and unlocked our phones. There’s no doubt that the hyper-stimulating nature of social media worsens online disinhibition, and as the almighty algorithms get better at feeding us content that manipulates us, impacts us, and outrages us — the problem is getting worse. There’s just such an overwhelming amount of stuff going on that there’s no space left for calmness, consideration, or mindfulness.
To make matters worse, there are also some pretty valid concerns that technology is having a negative impact on the conversations and interactions we have in-person — especially for young people. One respondent to a 2015 study about the effect of technology on face-to-face communication said they felt “people use technology as a crutch to hide behind,” while another said technology “both enhances what we share online and decreases what we say face to face.” Because of a combination of comfort and convenience, people are leaning more and more on digital tools for communicating.
But in spite of all the negatives, online disinhibition isn’t always a bad thing. Online disinhibition is also the effect that lets people feel safe enough to reveal secret emotions, feelings, or desires that they might not be ready to express in ‘real’ life. The internet has finally given people a place to explore their identities in a safe, comfortable way. Online disinhibition is exactly the reason that vulnerable people have been able to find (or create) safe communities on the internet.
There is a nuance to human connection, and online disinhibition inherits a lot of that nuance — but that nuance is getting lost in the moral panic over the way we treat each other online. Online hate and abuse is a problem, but the online world is also enabling acts of human kindness and generosity which might otherwise have been stifled. Here’s where anonymity comes in: anonymity is being held up as the boogeyman behind toxic online disinhibition. The argument goes that people feel free to spew hate online because their online persona isn’t connected to their ‘real’ identity. The next leap of logic: if we remove anonymity, it will reduce the amount of hate, disinformation, and abuse that we experience online. While I agree that the overall quality of people’s online conduct seems to be trending darker and more dangerous, anonymity is not to blame — and sacrificing anonymity would have catastrophic effects on privacy, especially in the current technology climate.
The case for anonymity: Why it’s worth fighting for
Harassment and discrimination are widespread societal issues that are experienced by our communities both online and offline. In groups or societies where there’s a culture of violence or conflict, we should expect to see that violence manifest in digital spaces as well. It’s a serious problem in need of urgent solutions, and the design of the technology we use every day plays a huge role in determining outcomes for at-risk people. Technology-based interventions to protect people’s safety and rights are absolutely possible. Here’s the problem with the anti-anonymity sentiment: Anonymity protects the most vulnerable people.
Having the choice to remain anonymous gives people more power and agency when they’re navigating their online interactions. Research shows that women—who are far more likely than men to be targeted by online harassment—are also much more likely to use pseudonyms or temporary identities. This is an important frontline defence against online harassment. Take anonymity away and people become easier targets — especially considering a large proportion of perpetrators of online harassment already know the people they target. If we limit people’s capacity to choose anonymity online, that proportion will rise.
Taking away anonymity can have serious, real-world consequences for a lot of the most vulnerable people in the world. The discussion around anonymity deserves nuance and thought, and we need to consider other factors contributing to online violence before we condemn anonymity/pseudonymity.
Dr J. Nathan Matias wrote an excellent overview of the arguments for online anonymity for the Coral Project; it outlines much of the important research and commentary surrounding digital anonymity.
It’s not about anonymity, it’s about the algorithm
It’s true that the internet has a hate problem. Social media is rampant with harassment, and we should be seeking solutions. As technology designers and digital citizens, we have to acknowledge that the current design of social media is doing something to promote toxic behaviour. Anonymity—or in the case of most social media, pseudonymity—is probably not the biggest driving factor, and even if it were, sacrificing anonymity is a massive trade-off, and there’s a real concern that it would result in more harm, not less.
While it might be easy to point the finger at anonymity, we should be paying more attention to the algorithms that are amplifying hate, harassment, and disinformation on social media. The web 2.0 business model has spawned single-minded algorithms that lust for engagement, but somewhere along the line something went very wrong and precipitated a system that funnels hordes of social media users—also known as people—into very harmful pipelines.
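To make that concrete, here is a deliberately simplified sketch (the interface, function names, and weights are all hypothetical, not any real platform’s code) of what a feed optimised purely for engagement looks like: the only objective is predicted interaction, so the content that provokes the strongest reactions, outrage included, is exactly what gets surfaced first.

```typescript
// Toy illustration only — hypothetical names and weights, not any real platform's code.
// A feed ranked purely by predicted engagement: nothing about accuracy, wellbeing,
// or civility enters the objective, so whatever provokes the strongest reaction rises.
interface Post {
  id: string;
  predictedLikes: number;
  predictedShares: number;
  predictedComments: number;
  predictedOutrageReplies: number; // anger and controversy still count as "engagement"
}

// Every interaction is treated as engagement; outrage reliably drives comments
// and shares, so it is weighted at least as heavily as any other reaction.
function engagementScore(post: Post): number {
  return (
    post.predictedLikes +
    2 * post.predictedShares +
    3 * post.predictedComments +
    3 * post.predictedOutrageReplies
  );
}

// Sort the feed by that single metric: the most provocative posts are shown first.
function rankFeed(posts: Post[]): Post[] {
  return [...posts].sort((a, b) => engagementScore(b) - engagementScore(a));
}
```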
Because of algorithm-driven engagement, social media is fuelled by a very alien version of human communication. The current iteration of social media feels like it’s the most distant online communication has ever been from actual face-to-face communication. That increased dissonance has a psychological effect on people. It’s a horribly played out cliché, but social media really is becoming increasingly antisocial.
If we want to create a digital world where people are more respectful and unified — it starts by tearing down the algorithms. Those very algorithms have filled the pockets of a select few at the top of the technocrat tree and made them very powerful, so there will be resistance. But for the good of the global community, the algorithms must go.