The Chinese social network would reportedly limit the reach of users with disabilities and other profiles to avoid bullying on the platform. What is the logic behind its recommendation system?
TikTok is a social network for making short lip-sync videos. The app, created by the Chinese company ByteDance, began operating in the United States in 2017 and has been expanding strongly into the West ever since.
In October 2018, TikTok was the most downloaded app in the United States. Today, available in 150 markets, it has over 500 million active users worldwide. The social network is all the rage among centennials (born between 1997 and 2012), who make up a large portion of the app’s users.
Its recommendation algorithm is one of the app’s hallmarks. It is built around the vertical swipe: as users swipe down, they discover different content creators suggested by the social network.
The accusation of discrimination
Weeks ago, The Verge published an article indicating that TikTok would limit the reach of people with disabilities, who would suffer discrimination by the app. These users would be less likely to go viral on the social network.
According to the article, a source inside the Chinese social network spoke to journalists from the German digital rights blog Netzpolitik.org. The source said that this discrimination is platform policy: the algorithm was designed to detect accounts with a high predisposition to being bullied.
To prevent harassment, the platform would then drastically limit which public feeds these accounts appear in. ByteDance acknowledged to Netzpolitik that this was an “early and failed attempt” to prevent conflict on the platform.
It is not only the algorithm that discriminates against certain profiles, but also the application’s moderators. This group of people has begun making “quick decisions”, blocking or limiting videos of people with certain physical features (disfigurements, for example) or even content from users with Down syndrome.
To limit an account’s exposure, moderators apply a section called “Images that represent a subject highly vulnerable to cyberbullying”. It covers users who are “susceptible to cyberbullying or harassment based on their physical or mental condition”, according to documents accessed by the German blog.
Moderators then place people with disabilities in a category called “Risk 4”, which means a video is visible only in the country where it was uploaded. These users’ videos therefore cannot reach a global audience.
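The “Risk 4” rule described by Netzpolitik.org amounts to a simple geo-restriction filter. The sketch below is purely illustrative of that logic; the names (`Video`, `RISK_4`, `is_visible_to`) are hypothetical and not drawn from TikTok’s actual code.

```python
# Illustrative sketch of a "Risk 4"-style visibility rule as described
# in the leaked moderation documents. All identifiers are hypothetical.
from dataclasses import dataclass

RISK_4 = 4  # moderator-assigned level: restrict video to its upload country


@dataclass
class Video:
    uploader_country: str  # ISO country code where the video was uploaded
    risk_level: int = 0    # 0 = no restriction


def is_visible_to(video: Video, viewer_country: str) -> bool:
    """A Risk 4 video only appears in feeds from its upload country."""
    if video.risk_level >= RISK_4:
        return viewer_country == video.uploader_country
    return True
```

Under this rule, a flagged video uploaded in Germany would pass the check for German viewers but fail it for everyone else, which matches the article’s claim that such videos never reach a global audience.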
TikTok would also limit the reach of fat, LGBTI and autistic users. But these categories can be difficult to detect from profiles or videos: how do you recognize an autistic person’s profile in a 15-second video?
The source said that this instruction is one of the social network’s many incomprehensible rules, which confuse moderators.
How the algorithm works
TikTok makes custom recommendations on what to watch based on users’ previous viewing sessions. The vertical swipe lets users discover new videos, which can even be downloaded if the uploader’s settings allow it (private accounts are also possible, as on Instagram).
“TikTok’s algorithm works a little differently from other popular social networks such as Facebook and Instagram, where most contacts are usually people you know and interaction metrics (likes or comments) are key to how content is ranked (what is shown first)”, Melisa Avolio, journalist and trainer at “Oficios y Redes”, explains to Infobae.
Avolio says that, although network algorithms are something of a “mystery box”, it is possible to infer how they work from what they show users. TikTok’s algorithm, she explains, is based on users’ habits and learns which videos to show from the kinds of videos they watch.
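The habit-based learning Avolio describes can be sketched as a simple interest profile: each watched video’s categories are tallied, and candidate videos are ranked by how well they match that tally. This is a minimal illustration of the general idea, not TikTok’s actual algorithm; all names and categories here are invented.

```python
# Hypothetical sketch of habit-based recommendation: rank candidate
# videos by overlap with categories the user has previously watched.
from collections import Counter


def update_interests(interests: Counter, watched_categories: list[str]) -> None:
    """Record the categories of a video the user just watched."""
    interests.update(watched_categories)


def score(video_categories: set[str], interests: Counter) -> int:
    """Sum of how often the user watched each of the video's categories."""
    return sum(interests[c] for c in video_categories)


def next_video(candidates: dict[str, set[str]], interests: Counter) -> str:
    """Pick the candidate whose categories best match viewing habits."""
    return max(candidates, key=lambda vid: score(candidates[vid], interests))
```

A user who mostly watches dance clips would thus be served another dance clip ahead of, say, a cooking video, which mirrors the behavior Avolio infers from the app.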