All social media is inherently addicting, but something about TikTok is more extreme. Whether it's the fact that you can waste hours of your life on that app without realizing it, the way its quick, instant content gradually lowers your attention span, or the fact that we all keep watching despite being shown disturbing videos and trends on a daily basis, there's clearly an issue.
If you are familiar with Cocomelon, then you are aware of the effect it has on children who watch the show. The second they HEAR the theme song, they come running (as shown by numerous frustratingly adorable videos). As cute as these kids and their reactions are, I don't ever remember seeing a child react that way to any other kids' show, or reacting that way myself. The thing about Cocomelon is that it is designed to be addicting. It doesn't really teach any lessons; it consists of short, mind-stimulating clips that rapidly change, which keeps kids engaged but leaves them a little too overstimulated. Similarly, kids' content on YouTube is becoming more and more disturbing, with kids having access to videos made by people with bad intentions who grab kids' attention using popular figures such as Spider-Man or Elsa. This is mainly because while YouTube has a lot of regulations on content, it doesn't regulate who MAKES the content or who WATCHES the content, so you can have kids watching YouTube for hours while the videos, and the sources of those videos, get creepier and shadier. Cocomelon fits this pattern: it is not produced as a traditional TV show, but all episodes are made for YouTube with the intention of getting kids to watch for hours on end, which is why you'll notice how quickly kids become irritable when Cocomelon leaves them even slightly under-stimulated. Not that regular TV is much better in terms of content, but it is better with regulations: Caillou was cancelled and reruns slowly stopped airing because parents complained about it making their kids whiny, which would NEVER happen on YouTube.
This all sounds unrelated, but if you think the way YouTube handles media content targeted toward kids is irresponsible, think about how TikTok handles content targeted toward ANYONE. We are no better than the three-year-olds who can't stop watching Cocomelon, because most people I know spend more time in a day on TikTok than on any other social media app, and TikTok is arguably the least "social" of the apps. By that, I mean you don't really need to interact with the people you follow or the people who follow you; you just watch the videos tailored to your specific interests and scroll for hours on end.
Except people scroll through a lot of TikTok videos without actually watching them. The "For You" page isn't really for you. Every once in a while, you might notice a bizarre video that doesn't align with your beliefs whatsoever, but unless you specifically block the person or report the video, TikTok isn't going to think you hated it, and if you interact with even one of the comments on the video, TikTok will take that as a sign that you DID like the video, even if your comment was criticizing it. This is how people end up in really specific niches of TikTok (my roommate, for example, is on American Girl Doll tok despite never showing any interest in American Girl Dolls). However, these funny, quirky niches can turn dark really quickly.
A couple of months ago, I stumbled upon a duet criticizing a VERY antisemitic video. I wasn't entirely sure what the duet was criticizing, nor did I understand what the comments meant. One comment on the duet (most likely from a troll angry about being called out for their antisemitism) used a Nazi term I was not aware of. I didn't think to look it up, and I didn't think it meant anything because it made no sense whatsoever without context, so when I engaged with it and asked what it meant, I a) was shocked and horrified and b) regretted it immediately. I deleted my comment and blocked the commenter, but the damage had already been done. For the next couple of days, I had to block any videos with dog whistles in them, but I shouldn't have been getting the videos with said dog whistles in the first place. TikTok censors a lot of things but does a poor job of censoring hate speech, instead making it easily accessible to people who want nothing to do with it. There's an excellent TikTok that goes into detail about this, in which misinformation and extremism researcher Abbie Richards highlights how easy it is to be radicalized on TikTok.
In fact, TikTok's level of censorship is honestly backwards. Recently, there was a disturbing trend in which (mostly) men would describe, as a "joke," how they would kill their partners on a date. Though I personally never found it funny, it started out as a ridiculous joke with hypothetical scenarios. For example, the first one to become popular involved a shark jumping up and eating the person's date. Then it turned into men wondering what would happen if they hit their date with a crowbar. Then it turned into violent murder fantasies.
This may seem like dark humor, but glorifying this behavior and brushing aside the actual horrors women go through leads to cases like that of Ana Abulaban, who shared a famous TikTok account with her husband and was murdered at his hands.
At some point, TikTok can't be separated from the real world, but whatever your opinion of this specific type of "humor" is, the point is that TikTok is horrendous at picking and choosing which videos to censor and take down. For the most part, none of these videos glorifying domestic abuse were taken down. However, user divergentredhead (now divergentredhed) would duet these videos with disturbing captions and then link an article about a woman who actually was a victim of the crime the video joked about. Her account and several of her videos have been taken down for violating community guidelines, but the original TikToks never were. And since many of her videos use the song from the trend, if you interact with criticism of these videos, you'll notice the actual trend slowly popping up on your For You page. If you don't catch it and try to remove it from your For You page (which doesn't even always work), the videos will keep showing up.
It's not even necessarily desensitization or the normalization of certain ideas; it is just so easy to scroll away from any content you don't like, which is why it is crucial to pay attention to what you're scrolling past, and to keep your attention span in check so you aren't unknowingly engaging with these videos. The longer you are exposed to these videos (whether antisemitic conspiracies, bigoted jokes, or straight-up violence), the easier it becomes to normalize them in your mind, which is something you should never aim for. Maybe try to minimize your TikTok screen time (something I struggle with myself), but keep yourself in check and, more importantly, keep yourself safe.