Intentional communities in digital spaces: What side of TikTok are you on?

March 21, 2022 / Alex Linton / Privacy

Earlier this week, I was scrolling through TikTok—not the best thing to admit in the opening sentence of a blog about privacy and security, but know thy enemy—when something strange happened. I was happily swiping through videos, being force-fed a three-course meal of surreal jokes, dance videos, and slice-of-life montages when one fateful swipe, a single upwards gesture, had me watching a distressing video of a parent holding a screaming child while explosions rocked their home. It took a second to react. Maybe it was a scene from a movie. Maybe it was a news report and a voiceover would start. Maybe it would evolve into something else completely. The video finished and started again. One more swipe and I was back to watching the latest trend.

The whole situation felt strangely transgressive — it was an extremely unexpected and upsetting thing to see. Although I regularly consume news media, I certainly don’t do it on TikTok. Why did I see this? On TikTok, nothing you see is a mistake; everything you’re shown is deliberately, programmatically chosen. That’s what makes the app such a fascinating phenomenon — the way its algorithm sorts and delivers content is like looking into a (cyber-dystopian) mirror.

There are some huge, HUGE issues with placing so much faith in algorithms; letting machines make important decisions about what we do and don’t see is extremely dangerous. This is an especially big issue for people who rely on TikTok as one of their main sources of information. Anyway, here’s some crazy conjecture about the TikTok algorithm and the consequences it could have for people’s ability to organise themselves into meaningful communities in the future.

The art of manipulation

Algorithms are the great manipulators of the modern age. There are millions of data points all converging on every decision about what content you do and don’t see on social media. What makes you laugh. What makes you cry. What makes you mad. In the end, algorithms are acting with the singular goal of keeping you logged on and paying attention.
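To make that incentive concrete, here is a minimal sketch of what engagement-maximising ranking looks like in principle. Everything in it is assumed and illustrative — the Video fields, the scoring formula, and the function names are hypothetical, and TikTok’s real system is proprietary and far more complex — but the core mechanic is the same: predict attention, then rank by it.

```python
# A minimal, hypothetical sketch of engagement-driven ranking.
# Not TikTok's actual system: the fields, formula, and names are assumptions.
from dataclasses import dataclass

@dataclass
class Video:
    title: str
    predicted_watch_time: float    # hypothetical model output, in seconds
    predicted_rewatch_rate: float  # hypothetical model output, 0.0 to 1.0

def engagement_score(video: Video) -> float:
    # Collapse the predictions into a single number to maximise: attention.
    return video.predicted_watch_time * (1.0 + video.predicted_rewatch_rate)

def build_feed(candidates: list[Video]) -> list[Video]:
    # Serve whatever the model predicts will keep you watching,
    # regardless of whether it is a dance trend or something distressing.
    return sorted(candidates, key=engagement_score, reverse=True)

feed = build_feed([
    Video("dance trend", 12.0, 0.30),
    Video("distressing news clip", 25.0, 0.60),
    Video("slice-of-life montage", 9.0, 0.20),
])
print([v.title for v in feed])  # the most gripping clip ranks first
```

Note what is absent from the sketch: nothing anywhere asks what you *want* to see. The only optimisation target is how long the content holds you.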

In the sequence of three TikToks I referenced earlier, the algorithm knew exactly how other people had interacted with them, the feelings and emotions they (on average) evoked, and how likely they were to make you keep scrolling for more. The specific order of the sequence doesn’t quite feel that innocent — there is an irksome feeling that maybe, just maybe, the algorithm knew that there would be a heightened response to a confronting video if it was sandwiched between two light and casual pieces of content. TikTok gives you minimal control over what you see; there are ways people try to manipulate their ‘For You’ page to show them what they want to see — but it’s mostly guesswork. Generally speaking, TikTok aficionados will advise you to just trust that the algorithm will (eventually) figure it out and start showing you the good stuff.

The more you lean into algorithms, the less you lean into intention. You slowly lose control over what you do and don’t see: everything is algorithm-led recommendation, not human-based curation. TikTok has put all its eggs in the algorithm recommendation basket, and it is paying off. TikTok’s algorithm is so good at recommending content that the average user session is three times longer than on its biggest competitor, Instagram.


Read: Reverse engineering how the TikTok algorithm works

Other apps, like Twitter, leave more of the curation up to the user, and rely less on recommending content algorithmically. On a platform like Twitter, you can be pretty sure that if you only follow users who talk about, say, digital rights, most of the content you see is going to be somehow related. On TikTok, there are no such promises.
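The difference is easy to see side by side. The sketch below is illustrative only — the data shapes and function names are assumptions, not either platform’s real API. A follow-based feed filters by a list the user chose; a recommendation feed ranks by a model’s prediction, with nothing the user chose constraining what can appear.

```python
# Illustrative contrast between the two feed models.
# Assumed data shapes and names, not either platform's real API.

def curated_feed(posts, followed):
    # Twitter-style: the user's follow list is the filter. Only content
    # the user explicitly opted into can appear, newest first.
    return sorted(
        (p for p in posts if p["author"] in followed),
        key=lambda p: p["posted_at"],
        reverse=True,
    )

def recommended_feed(posts, predicted_engagement):
    # TikTok-style: a model's engagement prediction is the filter.
    # No user-chosen list constrains what can appear.
    return sorted(posts, key=lambda p: predicted_engagement[p["id"]], reverse=True)
```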

Intentional communities on social media

Intentional communities are voluntary organisations of people—generally with aligned goals and ideals—designed to have a high degree of social cohesion. They are often useful for things like activism, or simply for giving people a stronger sense of meaning and community.

Having intention and meaning behind who we associate with is important: it allows people to find places where they feel they belong, speak and organise with people who share their beliefs, and feel a stronger sense of meaning in their social interactions.

A huge problem with our increasingly algorithm-driven social interactions (wherein content acts as an intermediary for social interaction through comment sections, sharing, etc.) is that they remove that sense of intention. Within each social media platform is a kind of digital diaspora, with many intersecting communities (as well as the overarching ‘TikTok community’ or ‘Twitter community’). However, platforms like TikTok give people noticeably less agency over which communities they can participate in.

Sides of TikTok

Image: ‘The Sides of TikTok’, via Stayhipp

Amid all the discussion of people’s ‘For You’ pages is the idea of being on a particular ‘side’ of TikTok. A ‘side’ of TikTok is really just a sub-community or sub-culture which has formed around a specific interest. Some examples might include Anime TikTok, Film TikTok, Book TikTok, or Gardening TikTok — each also has more hyper-specific sub-groups, like Carrot-growing TikTok or Noir Film TikTok. While these communities are extremely rich, they’re also (importantly) gated by the algorithm. You often hear people lament how, for one reason or another, TikTok simply won’t serve them content about one of their interests — effectively icing them out of that community.

This has a negative effect when people get pushed into communities they don’t want to be a part of, or are unable to access the communities they do want to be a part of. Whether or not you see TikToks about how to grow carrots might seem inconsequential, but the hypothetical feels different when the topic is something more serious — like activists trying to organise a protest. TikTok is probably the most radical example of this because of its heavy reliance on algorithm-based recommendations. Sometimes virality works in favour of activists trying to spread and share information, but it’s not reliable. If people can’t form intentional communities, activism will not be the same — and it will not be as successful. In the future, do we really want non-human algorithms deciding whether a cause lives or dies?

Agency of association

The solution to all of this? Give people more of a choice about what communities they want to engage with and what kind of content they want to see. Although this may result in lower use of the app (read: lower levels of addiction), this is only really a negative if your goal is to farm audience behaviour metrics that can be sold to advertisers (as opposed to actually making technology which improves the world). 

At Session, we have designed an application which aims to make digital communication feel more human again. There are a lot of reasons this can be beneficial, ranging from the social to your fundamental rights. We often talk about how different technological designs can have huge human impacts — and in the case of Session, anonymous accounts, no content algorithms, and serious metadata protection make it all feel that much more…human. It’s not about selling a product—or about making you the product—it’s about connecting people through technology.

With things like Session’s open groups, people can openly and voluntarily engage with the communities that matter to them as opposed to getting funnelled into communities a machine chose for them.

Join the movement to keep the internet private!

Chat with like-minded individuals in the Session Community.

Friends don’t let friends use compromised messengers.

Sign up to the mailing list and start taking action!