TikTok says it deleted more than 49 million videos that broke its rules between July and December 2019.
About a quarter of those videos were removed for nudity or adult sexual activity, the company said in its latest transparency report.
The video-sharing application also revealed that it had received about 500 requests for data from governments and the police and had complied with about 480 of them.
The United States has suggested that it may ban the Chinese-owned app.
On Monday, US Secretary of State Mike Pompeo suggested that downloading TikTok would “put citizens’ private information in the hands of the Chinese Communist Party”.
He added that the US government is considering banning Chinese-owned apps: “We take this very seriously. We are certainly looking into the matter,” he said in an interview with Fox News.
The Indian government has already banned the app, citing cybersecurity concerns.
TikTok belongs to the Chinese company ByteDance. The app is not available in China, but ByteDance operates a similar app, called Douyin, which is available.
TikTok said it had received no requests for data from the government or the police in China, nor any requests from the Chinese government to delete any content.
The Wall Street Journal released a report on Thursday suggesting the company is planning to establish a new head office outside of China.
TikTok told the BBC in a statement: “As we consider the best way forward, ByteDance is evaluating changes to the corporate structure of its TikTok business. We remain fully committed to protecting the privacy and security of our users as we build a platform that inspires creativity and brings joy to hundreds of millions of people around the world.”
U.S. authorities are investigating whether TikTok has complied with a 2019 agreement to protect the privacy of those under the age of 13.
TikTok says it offers a limited experience, with additional safety and privacy features, for users under the age of 13.
According to TikTok’s transparency report:
- 25.5% of the videos deleted contained nudity or adult sexual acts
- 24.8% violated child safety policies, such as showing a minor involved in criminal activity or containing harmful imitative behavior
- 21.5% showed illegal activities or “regulated goods”
- 3% were removed for harassment or bullying
- Less than 1% removed for hate speech or “inauthentic behavior”
TikTok’s transparency report also found:
- The 49 million videos deleted represent less than 1% of the videos uploaded between July and December 2019
- 98.2% of the deleted videos were identified by machine learning or moderators before being reported by users
TikTok only launched in 2017, and because it is so new, we know far less about the platform than we do about Facebook, for example.
This report provides at least a little detail on the type of content it is removing.
Recently, the focus has been on hate speech and extremism on platforms such as TikTok, with far less attention paid to sexual content or the safety of minors.
Yet about half of the videos removed fell into these two categories.
What we don’t know, of course, is how much harmful content has been missed by its moderators and machines.