Like many social media platforms and apps, TikTok builds each user's feed with a recommendation algorithm that draws on a number of tools and factors to personalize it for that person. Now, TikTok has published a new blog post explaining how its recommendation feed works, including tips for tuning the feed to avoid being served random videos you might not be interested in.
TikTok’s recommendation algorithm weighs input signals in a way somewhat similar to how YouTube measures and monitors engagement. The way people interact with the app affects the recommendations served, including posting a comment or following an account. If someone only follows cute animal accounts, and solely double taps to like or comments on videos about animals, TikTok will serve them more animals. This also helps inform TikTok’s algorithm about videos people might not be interested in — if you’re only interested in Hype House creators, for instance, TikTok may not serve up videos from the “bean side” subgenre on the app.
User interactions are just one part of the equation, though. TikTok states that video information, which “might include details like captions, sounds, and hashtags,” and device or account settings also have an effect on the feed. Language preference, country setting, and device type will factor in to make sure “the system is optimized for performance,” according to the post. The post also states, however, that device and account settings “receive lower weight in the recommendation system relative to other data points we measure since users don’t actively express these as preferences.”
Again, like YouTube, everything comes down to engagement. If someone finishes a video instead of flipping to the next one halfway through, that action is registered as a stronger indication of interest. The post also stresses that its recommendation system is based on the content, not necessarily the creator. Anecdotally, that means unless Charli D’Amelio — TikTok’s most followed creator — suddenly starts making videos about frogs, beans, or self-deprecating jokes, she’s not going to appear in my feed (and she doesn’t!).
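Putting the post's description together, the ranking logic can be sketched as a weighted sum of signals: active interactions and finishing a video count heavily, while device and account settings get lower weight. This is purely an illustrative toy, not TikTok's real model; the signal names and weights here are assumptions.

```python
# Toy scorer mirroring the blog post's description: TikTok has not published
# its actual model, signals, or weights, so everything below is illustrative.

def score_video(signals: dict) -> float:
    """Combine hypothetical engagement signals into a single ranking score."""
    score = 0.0
    # Strong signals: active expressions of interest
    score += 3.0 * signals.get("watched_to_end", 0)  # finishing > flipping away halfway
    score += 2.0 * signals.get("liked", 0)
    score += 2.0 * signals.get("commented", 0)
    score += 1.5 * signals.get("follows_creator", 0)
    # Video information: caption, sound, and hashtag overlap with past interests
    score += 1.0 * signals.get("topic_match", 0)
    # Weak signals: device and account settings, weighted lower because
    # users don't actively express these as preferences
    score += 0.2 * signals.get("language_match", 0)
    score += 0.1 * signals.get("country_match", 0)
    return score

# A finished animal video from a followed creator outranks a half-watched one
engaged = score_video({"watched_to_end": 1, "liked": 1, "follows_creator": 1, "topic_match": 1})
skimmed = score_video({"topic_match": 1, "language_match": 1})
```

The key design point the post emphasizes is the relative ordering: a completed watch is a stronger vote than any passive setting, which is why `watched_to_end` carries the largest weight in this sketch.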
TikTok is often applauded for its recommendation system; once it’s finely tuned, the app becomes one of the best scrolling experiences. My personal theory is that’s why TikTok is so addictive — everything is so perfectly curated to your specific interests that it’s hard to put the phone down once you’re sucked in. But TikTok’s recommendation algorithm still has flaws, which the company brings up in its post.
“One of the inherent challenges with recommendation engines is that they can inadvertently limit your experience — what is sometimes referred to as a ‘filter bubble,’” the post reads. “By optimizing for personalization and relevance, there is a risk of presenting an increasingly homogenous stream of videos. This is a concern we take seriously as we maintain our recommendation system.”
Some of this can be innocuous — people who only like horse videos might only see horse videos. Some of it can also be exclusionary. The app might not surface videos from the Black Lives Matter protests or may not recommend disabled or queer creators, if a user doesn’t specifically go out of their way to tune the algorithm in that direction. TikTok’s post addresses the filter bubble by explaining its goal of interrupting repetitive content. The “For You” feed “generally won’t show two videos in a row made with the same sound or by the same creator,” the post says.
The idea is that more new types of videos will surface on a feed than ones that feel like more of the same. But that doesn’t always work. I’ve scrolled through three or four videos, one after the other, that all used a popular song for a popular trend on the app. How exactly TikTok chooses which videos to surface for every personalized feed is still a bit of a black box, but it’s an area the company is at least highlighting as one in need of improvement.
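The diversification rule the post describes — generally not showing two consecutive videos with the same sound or creator — can be sketched as a simple re-ranking pass over an already-scored list. The field names and logic here are assumptions for illustration, not TikTok's real data model.

```python
# Illustrative re-ranking pass: walk the ranked list and prefer the
# highest-ranked remaining video that doesn't repeat the previous video's
# sound or creator. The fallback reflects the post's "generally won't"
# hedge: if every remaining video repeats, one slips through anyway.

def diversify(ranked_videos: list[dict]) -> list[dict]:
    """Reorder a ranked list so consecutive videos differ in sound and creator."""
    feed = []
    pool = list(ranked_videos)
    while pool:
        prev = feed[-1] if feed else None
        pick = next(
            (v for v in pool
             if prev is None
             or (v["sound"] != prev["sound"] and v["creator"] != prev["creator"])),
            pool[0],  # fallback: rule is "generally," not absolute
        )
        pool.remove(pick)
        feed.append(pick)
    return feed

feed = diversify([
    {"id": 1, "creator": "a", "sound": "s1"},
    {"id": 2, "creator": "a", "sound": "s1"},  # repeats video 1's creator and sound
    {"id": 3, "creator": "b", "sound": "s2"},
])
# video 3 is slotted between 1 and 2 to break up the repeat: [1, 3, 2]
```

This kind of greedy interleaving also shows why repeats still happen in practice: once the candidate pool runs low on variety, the filter has nothing different left to promote.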
Another issue TikTok says it takes seriously is keeping dangerous content out of recommendations, a problem YouTube in particular has faced criticism over for many years. According to TikTok, content with graphic material like medical procedures or “legal consumption of regulated goods,” like alcohol, may not be eligible for recommendation because it could come across as “shocking if surfaced as a recommended video to a general audience” — in other words, young kids. That’s why many creators on TikTok will upload a video more than once or talk openly about feeling shadow banned over particular content.
TikTok has faced criticism from marginalized groups, including members of the LGBTQ+ community, for not recommending their content. It’s an issue YouTube routinely faces, and the Google-owned video site is currently facing a lawsuit after several LGBTQ+ creators claimed YouTube hid their videos in restricted mode and wasn’t surfacing their content in its recommendations. TikTok admitted it had suppressed content from some creators, intending it to be a short-term solution to bullying.
“Early on, in response to an increase in bullying on the app, we implemented a blunt and temporary policy,” a spokesperson told The Verge in December 2019. “While the intention was good, the approach was wrong and we have long since changed the earlier policy in favor of more nuanced anti-bullying policies and in-app protections.”
The full blog has more in-depth instructions about how to personalize your own “For You” page, but it’s refreshing to see the company open up about one of its competitive advantages. TikTok’s algorithm is one of the more fascinating ingredients in its worldwide success — it’s even part of the daily conversation within the app’s fast-growing culture, where TikTok users refer to different growing trends and subgenres as “sides” favored by the algorithm.
Lots of viral-hungry users try to figure out how to game TikTok to get more views and capitalize on new trends — and that comes down to feeding the algorithmic recommendation tool different bits of data to promote videos that might not naturally surface on their own. Now, TikTok is pulling back the curtain a bit more to give people a chance to do it themselves.