New report examines how TikTok’s algorithm recommends harmful content to vulnerable teen users
Researchers from the Center for Countering Digital Hate examined how the algorithmic recommendations in TikTok’s For You feed respond to teen users who express interest in eating disorders, body image, and mental health. Using brand-new accounts registered as teen users, the researchers found that TikTok recommended suicide-related content within 2.6 minutes. Teens on TikTok received recommendations for videos about body image and mental health every 39 seconds, and vulnerable accounts were recommended self-harm and suicide videos 12 times more frequently than standard accounts.
To probe the algorithm, the researchers created brand-new accounts in the US, UK, Australia, and Canada, all registered as 13-year-olds. In each country, one of these accounts had a username suggesting a preoccupation with appearance. On every account, the researchers watched and liked any videos about body image, mental health, or eating disorders, then recorded the first 30 minutes of algorithmically suggested content on that account’s For You feed. The recordings were then examined to see how frequently eating disorder, self-harm, and body image content was recommended (a sketch of this frequency analysis follows below).
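The report’s headline numbers reduce to simple frequency statistics over coded recordings. Here is a minimal sketch, assuming each 30-minute recording has been manually coded into timestamped category labels; the data structure, function names, and example numbers below are hypothetical illustrations, not data or code from the CCDH report:

```python
# Hypothetical sketch of the frequency analysis described above.
# `feed` is a list of (seconds_into_session, category) pairs that a human
# coder might produce from a 30-minute screen recording. All labels and
# numbers are illustrative, not taken from the CCDH report.

SESSION_SECONDS = 30 * 60  # each recording covered 30 minutes

def time_to_first(feed, category):
    """Seconds until the first recommendation in `category`, or None."""
    times = [t for t, c in feed if c == category]
    return min(times) if times else None

def mean_gap(feed, category, session=SESSION_SECONDS):
    """Average seconds between `category` recommendations over the session."""
    count = sum(1 for _, c in feed if c == category)
    return session / count if count else None

def exposure_ratio(vulnerable_feed, standard_feed, category):
    """How many times more often the vulnerable account saw `category`."""
    v = sum(1 for _, c in vulnerable_feed if c == category)
    s = sum(1 for _, c in standard_feed if c == category)
    return v / s if s else None

# Example: 46 body image / mental health videos in one 30-minute session
# works out to one roughly every 1800 / 46 ≈ 39 seconds.
example_feed = [(39 * i, "body_image_or_mental_health") for i in range(1, 47)]
print(time_to_first(example_feed, "body_image_or_mental_health"))          # 39
print(round(mean_gap(example_feed, "body_image_or_mental_health"), 1))     # 39.1
```

As the worked example shows, a video of a given category appearing about 46 times in a 1,800-second session corresponds to one recommendation roughly every 39 seconds, the cadence the report describes for body image and mental health content.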
The study found that TikTok recommended suicide-related content within 2.6 minutes and surfaced eating disorder content within 8 minutes. Teens on TikTok received recommendations for videos on body image and mental health every 39 seconds. Vulnerable accounts, those whose usernames signaled a preoccupation with appearance, were recommended self-harm videos 12 times more frequently than standard accounts.