Graphic video showing apparent suicide circulates on TikTok as users warn others not to watch it


  • Video clips have been circulating on TikTok since Sunday night and appear to show a man committing suicide, The Verge first reported.
  • Creators have posted on social media to alert the TikTok community to the clip and encourage people to scroll past the video if they come across it.
  • In a statement to Business Insider, TikTok said the app’s systems “detect and report” the clips, and that it bans users who “repeatedly” attempt to upload the video to the platform.
  • This isn’t the first time the company has dealt with a recorded suicide: earlier this year, TikTok employees waited hours to notify police after a Brazilian teenager livestreamed his apparent death on the platform.
  • Visit the Business Insider homepage for more stories.

TikTok is struggling to rid its platform of video clips that appear to show a man committing suicide.

Snippets of the video in question have been circulating on TikTok since Sunday night, after the video first appeared in a Facebook livestream, TikTok said in a statement.

Despite TikTok’s efforts to remove uploads of the video, first reported by The Verge, disturbing clips of the apparent suicide can still easily be found on the platform. The design of the app’s “For You” page means the video can potentially appear (and play automatically) in front of users who aren’t specifically looking for it.

As the graphic clips continue to circulate on TikTok, creators and users have made videos and posted on social media to alert their followers to the footage and warn them to scroll past it if it appears on their TikTok feed.

TikTok said in a statement that its “systems automatically detect and report” clips of the apparent suicide, and that it bans users who “repeatedly” attempt to upload those clips to the platform.

“Our systems automatically detect and report these clips for violating our policies against content that displays, praises, glorifies or promotes suicide,” TikTok said in a statement to Business Insider. “We ban accounts that repeatedly try to upload clips, and we appreciate members of our community who have reported content and cautioned others against viewing, engaging with, or sharing such videos on any platform out of respect for the person and their family.”

The video in question has also appeared on other platforms, including Twitter and Facebook.

TikTok’s attempts to rid the platform of the video show the company’s struggle to adequately moderate its platform in real time. The company was criticized earlier this year for its handling of another suicide on its platform: The Intercept reported that TikTok employees in Brazil waited almost three hours to alert authorities to the death of a 19-year-old user who had livestreamed his apparent suicide on TikTok. The video reportedly sat on TikTok for over an hour and a half and racked up 15 user reports before being taken down.

However, this inadequacy in handling suicides and other violent content is not a problem limited to TikTok. Facebook and YouTube are just two of many social platforms that have drawn criticism for their inability to remove graphic and violent content from their sites, despite policies in place explicitly banning such content.

Facebook notoriously struggled to rid its platform of the live video streamed by the suspect in the mass shootings at two mosques in Christchurch, New Zealand.
