TikTok says it deleted more than 49 million videos that violated its rules between July and December 2019.
About a quarter of these videos were removed because they contained adult nudity or sexual activity, the company said in its latest transparency report.
The video sharing application also revealed that it received about 500 requests for data from governments and police and handled 480 of them.
The US has suggested that it "is looking into" a ban on the Chinese-owned app.
On Monday, US Secretary of State Mike Pompeo suggested that downloading TikTok would put "the private information of citizens in the hands of the Chinese Communist Party".
He added that the US government was considering banning Chinese-owned apps: "We are taking this very seriously. We are certainly looking into it," he said in an interview with Fox News.
The Indian government has already banned the app, citing cyber security concerns.
TikTok is owned by the Chinese company ByteDance. The app itself is not available in China, but ByteDance operates a similar app there, called Douyin.
TikTok said it had received no requests for data from the Chinese government or police, and no requests from the Chinese government to delete content.
On Thursday, The Wall Street Journal published a report suggesting that the company was considering setting up a new headquarters outside of China.
TikTok told the BBC in a statement: "As we consider the best path forward, ByteDance is evaluating changes to the corporate structure of its TikTok business. We remain fully committed to protecting the privacy and security of our users as we build a platform that inspires creativity and brings joy to hundreds of millions of people around the world."
US authorities are also examining whether TikTok has complied with a 2019 agreement intended to protect the privacy of children under the age of 13.
The app says it offers a limited experience, with additional safety and privacy features, for users under 13.
According to TikTok’s transparency report:
- 25.5% of removed videos contained adult nudity or sexual activity
- 24.8% violated its child-safety policies, for example by implicating a minor in criminal activity or depicting harmful imitative behavior
- 21.5% depicted illegal activities or "regulated goods"
- 3% were removed for harassment or bullying
- Less than 1% were removed for hate speech or "inauthentic behavior"
The TikTok transparency report also revealed:
- The 49 million deleted videos represented less than 1% of all videos uploaded between July and December 2019
- 98.2% of deleted videos were detected by machine-learning systems or moderators before being reported by users
TikTok was only launched in 2017, and because it is so new, we know far less about the platform than about Facebook, for example.
This report at least gives us some detail about the type of content being removed.
Recently, much of the attention on platforms like TikTok has focused on hate and extremism, with far fewer headlines about sexual content or child safety.
Yet about half of the videos removed fell into those two categories.
What we don't know, of course, is how much harmful content was missed by its moderators and machines.