One weird trick to fix TikTok
When it comes to giving users control over seeing harmful content, TikTok falls woefully short.
Embedded is your essential guide to what’s good on the internet, from Kate Lindsay and Nick Catucci.🧩
Okay, fine, I’ll join TikTok’s board. —Kate
You know Britney Spears’s iconic “dump him” t-shirt? Mine would say “mute them.” Mute the people whose content you don’t like, mute the words you don’t want to see. Liberal use of the “mute” button has always been my number one digital wellbeing tip, but as helpful as it’s been for me on places like Instagram and Twitter, the “mute them” mantra is no use on TikTok.
Your only opportunity to let TikTok know you’re not interested in certain content is after you’ve already been exposed to it. If a video you don’t like appears on your feed, you press and hold until the “not interested” button appears, which will supposedly advise the algorithm accordingly (although I’ve heard anecdotal evidence that this doesn’t work very well). This solution makes sense if lack of interest is really your only issue with the content, but I am not alone in avoiding certain topics and videos for reasons well beyond personal preference—many of us avoid them because they’re detrimental to our mental health.
About a year into the pandemic, I realized I had to significantly limit how much content I consumed about the crisis if I didn’t want to spiral into days-long bouts of existential depression. Something as simple as a pessimistic tweet would fill me with a panic and fear that had no guarantee of subsiding in a timely manner. Blanket-muting “COVID” and all related terms is what got me through Delta and Omicron (somewhat) mentally intact, but at the beginning of December’s surge, I had to stay off TikTok entirely. The app was overwhelmed with people testing positive, showing dire headlines, joking about it being March 2020 again. I had no way of knowing if the next video on my feed was going to be something that would upset me, and no way to prevent it, either, so I logged off.
There are countless other fraught topics (diet and weight loss, sex, etc.) that regularly crop up on TikTok, but the options for dealing with them are limited. In addition to hitting “not interested,” you can also tell the app to stop showing you videos that use a certain sound. This is helpful if there’s a triggering trend using that sound, but often the same sound is uploaded in many (slightly) different versions, so for this tactic to really work, you’d have to find and block all of them.
You can also seek out and block creators who post about those topics, in an attempt to teach your algorithm what you don’t want to see. But the safest option may be to go all-out and activate restricted mode, which limits content that may not be appropriate for all viewers.
But the fact that not wanting to see something as specific as COVID content means I have to sacrifice seeing any mature or serious content—or write off entire creators because they made one or two videos about waiting in a testing line—is, frankly, ridiculous. Especially because I know TikTok has the technology to make this process easier.
If you can follow a hashtag on TikTok (which you can), then you should be able to block it. If the app knows to put a fact-checking disclaimer on content that mentions COVID or vaccines, then it should be perfectly capable of rerouting those videos away from my FYP if I check a box telling it to. And if it can do it for COVID, then it can do it for other topics other users may want to avoid.
I’m not alone in this ask. While Googling this issue to make sure I wasn’t missing a secret setting, I came across other users asking TikTok for the same kinds of filters.
“Being able to block hashtags/keywords in the same way we can block TikToks using a certain sound would be a game changer for how I use this app,” a user named @thoughtfully.gracie wrote in a video.
So much of the magic of TikTok is that the algorithm sometimes knows us better than we know ourselves, and morphs accordingly. But for something as serious as mental health, we shouldn’t be expected to gently guide the algorithm away from what’s detrimental to us and hope it sticks. We should be able to clearly set a boundary. Otherwise, I might not see you on TikTok until after the next surge is over.