
After opening an account on TikTok, a user registered as 13 years old began searching for ‘OnlyFans’, a popular subscription site known for adult entertainment content. The account then watched several videos, two of which also sold pornography.
The account next turned to the ‘For You’ feed, which serves up some of the platform’s most popular videos. But the app did not forget the young user’s apparent interest in sex: it soon served dozens of videos containing sexually explicit material, along with videos in which adults role-played relationships with ‘caregivers’. In one, a man tells a young woman in very provocative clothing, ‘Cry freely. You know, it’s Daddy’s favorite.’
As the account lingered on sexually explicit videos, more of that content, along with violent material, soon filled its For You feed.
The account was one of dozens of automated accounts, or bots, created by the American newspaper The Wall Street Journal (WSJ) to understand what TikTok shows young users. These bots, registered as users between the ages of 13 and 15, browsed TikTok’s ‘For You’ feed, the highly personalized, never-ending stream of videos curated by the app’s algorithm.
The app’s algorithm drove the account into a rabbit hole of the kind many users call ‘kinktok’: videos featuring whips, chains, and torture devices, some of which are banned on TikTok.
The Journal shared with TikTok a total of 974 videos about drugs, pornography, and other sexually explicit material that had been served to its minors’ accounts. Of those, 169 had already been removed from the platform before the Journal sent them, though it could not be determined whether TikTok or the uploaders removed them. Another 255 were removed after the Journal sent them to the company.
An analysis of the videos TikTok’s powerful algorithm served to these accounts suggests the app can quickly push minors, one of its biggest groups of users, toward endless streams of videos about sex and drugs.
To one account registered as a 13-year-old boy, TikTok served at least 569 videos about drug use and references to cocaine, including promotional videos about drug production and the online sale of prohibited items.
Hundreds of similar videos also appeared on the other accounts the WSJ registered in the names of minors.
The minors’ accounts were shown videos from more than 100 accounts recommending paid pornography sites and sex shops. Thousands of other videos came from creators who labeled their content as being for ‘adult’ audiences only.
Some videos showed heavy drinking, including footage of drunk driving and drinking games.
Dozens of these videos were adult role-play content in which adults played the part of ‘caregivers’. A young woman who played a caregiver in one such video said it would be better if this kind of adult content were kept off children’s feeds. “My bio mentions 18 plus, but I have no real way to control it,” she wrote in a message.
A TikTok spokeswoman declined to comment on individual videos but said most of them did not violate the platform’s guidelines. She said TikTok removed some of the videos after reviewing the accounts created by the Journal, and that the distribution of others had been restricted so that they would not be recommended to other users, though she did not say how many videos this applied to.
According to the spokeswoman, TikTok does not differentiate between the videos it serves to adults and to minors, but she said the company is considering a content-filtering tool for its youngest users.
TikTok’s policy states that users must be at least 13 years old and that users under 18 must have parental permission.

“The safety of children is very important, and to enhance teens’ age-appropriate experience and safety, TikTok has implemented industry-first steps,” the spokeswoman said in a statement.
She also noted that TikTok lets parents manage screen time and privacy settings for their children.
Addiction machine
Previous reporting by the Journal has shown that to figure out what a user wants, TikTok needs only one key piece of information: how much time you spend on each piece of content. The app tracks whether you pause on a video, hesitate over it, or watch it again.
With this powerful signal, TikTok can learn about a user’s unspoken desires and emotions. It then pulls the user into a ‘rabbit hole’ of such content, a feed often dominated by a particular topic or theme.
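TikTok has not published its ranking system, but a rough, hypothetical Python sketch can illustrate the basic idea of an engagement-based recommender: build a topic-affinity score from nothing more than dwell time and rewatches, then rank candidate videos by it. The topic labels, weights, and function names below are assumptions made for illustration, not TikTok’s actual code.

```python
from collections import defaultdict

# A minimal, hypothetical sketch of an engagement-based recommender: the only
# input signals are how long a user lingered on each video and whether they
# rewatched it. Topics, weights, and names are assumptions, not TikTok's system.

def update_affinity(affinity, topic, watch_seconds, video_seconds, rewatched):
    """Raise the user's affinity for a topic based on watch behavior."""
    completion = min(watch_seconds / max(video_seconds, 1), 1.0)  # fraction watched, 0..1
    score = completion + (0.5 if rewatched else 0.0)              # a rewatch is a strong signal
    affinity[topic] += score

def rank_candidates(affinity, candidates):
    """Order candidate videos so topics the user lingered on come first."""
    return sorted(candidates, key=lambda v: affinity.get(v["topic"], 0.0), reverse=True)

if __name__ == "__main__":
    affinity = defaultdict(float)
    # Simulated session: the user skips a cooking clip but pauses on and
    # rewatches suggestive content.
    session = [
        {"topic": "cooking", "watched": 2,  "length": 30, "rewatched": False},
        {"topic": "kink",    "watched": 30, "length": 30, "rewatched": True},
        {"topic": "kink",    "watched": 25, "length": 28, "rewatched": False},
    ]
    for v in session:
        update_affinity(affinity, v["topic"], v["watched"], v["length"], v["rewatched"])

    candidates = [{"id": 1, "topic": "cooking"}, {"id": 2, "topic": "kink"}, {"id": 3, "topic": "sports"}]
    print(rank_candidates(affinity, candidates))  # the 'kink' video is ranked first
```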
Other social media companies, such as YouTube, struggle with the same problem. “All the problems we see on YouTube are caused by engagement-based algorithms, and TikTok is just like that, but it’s even worse. TikTok’s algorithm can learn much faster,” said Guillaume Chaslot, a former YouTube engineer who worked on YouTube’s algorithms and now advocates for transparency in how such systems operate.
The Journal assigned birth dates and IP addresses to all 31 of its minor accounts.
In most cases, the accounts were also programmed with particular interests. TikTok learned those interests only from the accounts’ behavior: lingering on videos that contained a relevant hashtag or image, and quickly scrolling from one video to the next otherwise.
Most of the accounts never searched for content; they only watched the videos that appeared in their feeds.
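The Journal has not released its bot software; the sketch below is a minimal, assumed version of the dwell-or-scroll behavior just described, in which an account lingers only on videos whose captions or hashtags match its programmed interest keywords. The keyword list, field names, and dwell times are illustrative guesses.

```python
import random

# A hypothetical sketch of a bot's browsing logic (assumed structure and field
# names, not the Journal's actual software): linger on a video only when its
# caption or hashtags match the account's programmed interest keywords,
# otherwise scroll past almost immediately.

INTEREST_KEYWORDS = {"420", "weed", "marijuana"}  # example interest profile

def matches_interest(caption: str, hashtags: list[str]) -> bool:
    text = caption.lower() + " " + " ".join(tag.lower() for tag in hashtags)
    return any(keyword in text for keyword in INTEREST_KEYWORDS)

def plan_session(feed):
    """Decide, for each video in the feed, how long the bot should dwell on it."""
    plan = []
    for video in feed:
        if matches_interest(video["caption"], video["hashtags"]):
            dwell = random.uniform(20, 40)   # linger: signals interest to the app
        else:
            dwell = random.uniform(0.5, 2)   # skip quickly: signals disinterest
        plan.append((video["id"], round(dwell, 1)))
    return plan

if __name__ == "__main__":
    feed = [
        {"id": "v1", "caption": "my morning hike", "hashtags": ["nature"]},
        {"id": "v2", "caption": "420 friendly shop tour", "hashtags": ["weed"]},
    ]
    print(plan_session(feed))  # e.g. [('v1', 1.3), ('v2', 31.7)]
```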
This is how it worked
One bot was programmed with an interest in drug-related videos. On its first day on TikTok, the account lingered on a video of a woman walking in the woods; the caption indicated she was looking for marijuana.
The next day, the account also watched a video of a ganja-themed cake.
Soon after, the account’s feed was flooded with thousands of videos about drugs and drug use, including marijuana.
In one video, a man smoking a cigarette talked about a ‘420 friendly’ website (420 is slang for marijuana) and said he would supply good cannabis to anyone.
The creator of the 420-friendly website did not respond to a request for comment about his video being shown to an account registered as a 13-year-old.
In the end, the feeds of about a dozen of the Journal’s 31 accounts came to be dominated by a single theme in this way.
According to David Anderson, a psychologist at the Child Mind Institute, this can be a problem for children, especially those who cannot turn the videos off or who do not have supportive adults around. Such teens, he said, can be caught in a ‘complete whirlwind’, and the way social media promotes drugs and similar topics can affect them.
TikTok sometimes took the Journal’s accounts that had been programmed with an interest in multiple topics, narrowed them down to a single one, and sent hundreds of videos on that topic alone.
To one account that had expressed interest in many subjects, TikTok showed hundreds of Japanese films and television cartoons. In one run of 150 such videos, nearly all of the Japanese animations, with the exception of four, were based on sexual themes.
According to the TikTok spokeswoman, the bots (automated accounts) created by the Journal “do not represent real-world behavior and viewing experience,” because, she argued, real people’s interests are diverse and changing.
She said TikTok is reviewing how to improve the experience, especially for younger users.
When users see content they don’t want, they can cut down on it by tapping the ‘Not Interested’ button, she said.
The Journal programmed most of the accounts to linger repeatedly on videos with sexually explicit words and images, and those accounts soon fell into a ‘rabbit hole’ of sexual content. Here is what one of them saw.
Dozens of videos appearing in the feeds urged users to follow links to pages such as OnlyFans.com, a subscription-based social platform featuring adult content, including sexually explicit material.
Other posts advertised sex shops and strip clubs. Some of the since-removed videos showed young women advertising ‘European and American’ girlfriends along with phone numbers.
Dozens of the videos promoting paid pornography have since been removed. In some cases, the creators themselves were clear about their intentions: they did not want children watching their videos, so they labeled their videos or accounts ‘for adults only’. Even so, TikTok served these labeled videos to the minors’ accounts as well.
In one run of 200 videos, about 40 were labeled ‘for adults only’. In all, at least 2,800 such videos were served to the Journal’s minor accounts.
TikTok itself has been concerned about videos promoting sexual content. At a meeting in 2020, Vanessa Pappas, the company’s chief operating officer, asked employees why videos urging people to visit OnlyFans had become so popular.
After the meeting, TikTok initially decided to ban content linking to OnlyFans, because staff were convinced that most of the content on the site was sexually explicit, a person familiar with the decision told the Journal.
However, the platform still allows users to link to the site; the decision was reversed after other employees pointed out that not all content on OnlyFans is sexually explicit and that other social networks also allow links to it.
TikTok says it prohibits nudity and sexual content and removes accounts that direct users to sexually explicit content on sites such as OnlyFans.
A spokeswoman for OnlyFans said the site is only for people over the age of 18. She declined to comment on TikTok accounts encouraging people to visit the site.
Moderation policy
TikTok works to moderate videos with a combination of algorithms and more than 10,000 human reviewers. The company said in a recent report that it removed 89 million videos in the last six months of last year. But the bigger TikTok gets, the harder moderation becomes. According to former TikTok executives, the platform now has about 100 million users in the United States, up from 25 million in 2019.
According to the company, users upload tens of thousands of videos every minute. To keep up, moderators often focus on the most popular content; former executives say videos with very few views are rarely reviewed.
TikTok said in July that it would begin using its algorithms in the United States to identify and remove certain types of rule-breaking videos and to enforce its rules more effectively. Previously, TikTok’s algorithms identified rule-breaking videos, but a person had to review each one before it was removed.
The company made the announcement after the Journal sent it hundreds of potentially rule-violating videos that TikTok had served to the Journal’s bot accounts. According to TikTok, it has been using the new system since last year.
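As a rough illustration of that change, the hypothetical sketch below contrasts the older flow, in which every algorithmically flagged video waits for a human reviewer, with the newer one, in which certain flagged categories are removed automatically. The category names, confidence threshold, and function names are assumptions, not TikTok’s actual pipeline.

```python
# A hypothetical sketch of the moderation change described above (assumed
# category names, threshold, and function names, not TikTok's actual pipeline):
# previously every algorithm-flagged video waited for a human reviewer; the
# newer flow removes certain flagged categories automatically.

AUTO_REMOVE_CATEGORIES = {"minor_safety", "adult_nudity", "illegal_activity"}

def classify(video) -> tuple[str, float]:
    """Stand-in for an ML classifier returning (category, confidence)."""
    return video.get("predicted_category", "none"), video.get("confidence", 0.0)

def moderate(video, auto_removal_enabled: bool) -> str:
    category, confidence = classify(video)
    if category == "none" or confidence < 0.9:
        return "keep"                      # not flagged, stays in the feed
    if auto_removal_enabled and category in AUTO_REMOVE_CATEGORIES:
        return "removed_automatically"     # newer flow: no human in the loop
    return "queued_for_human_review"       # older flow: a person decides

if __name__ == "__main__":
    sample = {"predicted_category": "adult_nudity", "confidence": 0.97}
    print(moderate(sample, auto_removal_enabled=True))   # removed_automatically
    print(moderate(sample, auto_removal_enabled=False))  # queued_for_human_review
```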
According to the TikTok spokeswoman, no algorithm will ever be able to moderate content perfectly, because the amount of context needed to understand a video varies. TikTok has previously struggled to remove videos promoting eating disorders.
Even though content promoting eating disorders is banned on the app, TikTok was serving it to the Journal’s accounts.
Former executives and content moderators at the company say such content has become harder to control, especially in the United States, because of the company’s decision last year to relax some restrictions, such as those on showing skin and bikinis.
As a result, many sexually explicit videos have appeared on the platform.
The spokeswoman said the company’s policies evolve with changing user behavior and industry norms. As TikTok’s users become older and more diverse, the company expects new and different kinds of content, she said.
And why were bot accounts registered as minors being steered toward content promoting sexual activity?
These were among the most extreme videos that appeared in the Journal’s accounts: numerous videos depicted fictionalized scenarios of sexual activity, sexual violence, and rape. At one point, more than 90 percent of one account’s video feed consisted of sexually explicit material.
Of the 1,276 such videos shown to the minors’ accounts, 616 have since been removed from the app.
Method
Over a period of several months, The Wall Street Journal created 100 TikTok accounts that browsed the app with little human intervention; 31 of them were registered as users between the ages of 13 and 15. The Journal developed software to guide the accounts and to analyze their behavior. Each account was assigned a date of birth and an IP address. Most of the accounts were given interests in the form of keywords, and the accounts could also classify images. An account would dwell on a video that matched its interests; otherwise, it would immediately move on to the next video. The Journal also collected the video, thumbnail image, description text, and metadata attached to each video, and built internal analysis tools to assist the investigation. In all, the accounts watched nearly 400,000 videos; about 100,000 of those were viewed by the accounts registered to users between the ages of 13 and 15.
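The Journal’s internal tools are not public; the sketch below is a hypothetical illustration of the kind of record-keeping and tallying the method describes, logging each video an account was shown along with its description and metadata, then counting adult-labeled videos per minor account. The field names and the flagging rule are assumptions.

```python
import csv
from collections import Counter
from dataclasses import dataclass, asdict

# A hypothetical sketch of the logging and tallying described in the method
# (assumed field names and 'flagged' rule, not the Journal's actual tools):
# store one record per video an account was shown, then count flagged videos
# per account registered as a minor.

@dataclass
class VideoRecord:
    account_id: str      # which bot account was shown the video
    account_age: int     # registered age of the account (13-15 for minors)
    video_id: str
    description: str     # caption text attached to the video
    thumbnail_path: str  # local path of the saved thumbnail image
    flagged_adult: bool  # e.g. the description says 'adults only'

def save_records(records, path="watched_videos.csv"):
    """Write all collected records to a CSV file for later analysis."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=list(VideoRecord.__dataclass_fields__))
        writer.writeheader()
        for rec in records:
            writer.writerow(asdict(rec))

def flagged_per_minor_account(records):
    """Count flagged videos served to each account registered as 13 to 15."""
    counts = Counter()
    for rec in records:
        if 13 <= rec.account_age <= 15 and rec.flagged_adult:
            counts[rec.account_id] += 1
    return counts

if __name__ == "__main__":
    records = [
        VideoRecord("bot_07", 13, "vid123", "new drop, adults only", "thumbs/vid123.jpg", True),
        VideoRecord("bot_07", 13, "vid124", "cute puppy compilation", "thumbs/vid124.jpg", False),
    ]
    save_records(records)
    print(flagged_per_minor_account(records))  # Counter({'bot_07': 1})
```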