Check out this WSJ investigation of the TikTok algorithm




TikTok users often express admiration or dismay at the uncanny accuracy of the app’s recommendation algorithm. The Wall Street Journal today posted a video that explains how TikTok personalizes your feed.

WSJ investigators conducted an experiment in which they created bot accounts with assigned interests. The bots “watched” videos on TikTok, pausing on or replaying those that contained images or hashtags relevant to those interests. The WSJ team reviewed the results with Guillaume Chaslot, an algorithm expert who previously worked at YouTube.

The results match TikTok’s explanation of how its recommendations work. TikTok has previously said that the For You feed is personalized based on the types of videos you interact with, how you interact with them, details about the videos themselves, and account settings such as language and location.

If you linger on a weird video that caught you off guard, the algorithm has no way to distinguish that from content you actually like and want to see more of. This is how some people end up with a bunch of For You recommendations that don’t seem to reflect their interests.
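To make that concrete, here is a minimal, hypothetical sketch (not TikTok’s actual code, and with made-up function names and weights) of watch-time-based interest scoring. The point it illustrates is that lingering on or replaying a video boosts the scores of its hashtags just as much as watching content you genuinely enjoy, which is why a feed can drift toward videos you only paused on.

```python
from collections import defaultdict

# Hypothetical sketch of watch-time-weighted interest scoring.
# All names and weights are illustrative, not TikTok's real algorithm.

def update_interest_scores(scores, video_hashtags, watch_fraction, rewatched=False):
    """Boost interest scores for a video's hashtags based on engagement.

    watch_fraction: portion of the video watched (0.0 to 1.0)
    rewatched: whether the user replayed the video
    """
    weight = watch_fraction + (0.5 if rewatched else 0.0)
    for tag in video_hashtags:
        scores[tag] += weight
    return scores

def rank_candidates(scores, candidates):
    """Rank candidate videos by the summed scores of their hashtags."""
    return sorted(
        candidates,
        key=lambda video: sum(scores.get(tag, 0.0) for tag in video["hashtags"]),
        reverse=True,
    )

scores = defaultdict(float)
# Lingering on an odd video looks identical to liking it:
update_interest_scores(scores, ["#cooking"], watch_fraction=1.0)
update_interest_scores(scores, ["#weirdclip"], watch_fraction=1.0, rewatched=True)

candidates = [
    {"id": 1, "hashtags": ["#cooking"]},
    {"id": 2, "hashtags": ["#weirdclip"]},
]
print([v["id"] for v in rank_candidates(scores, candidates)])  # [2, 1]
```

In this toy model the “weird” video ranks first simply because it was replayed, mirroring the behavior described above.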

Although humans have more varied tastes than bots, the experiment shows how quickly a user can be exposed to potentially harmful content. According to the WSJ, TikTok identified the interests of some of the bots in under 40 minutes. One of the bots fell into a rabbit hole of depression-related videos, while another ended up watching election conspiracy videos. As Will Oremus points out on Twitter, algorithmic rabbit holes can also lead people to positive content.

The video includes detailed visualizations, so it’s a good way to understand the “magic” of how TikTok works. Watch the video above or on the WSJ site, but be warned: it includes clips from TikToks that refer to depression, suicide, and eating disorders.


