Microsoft’s purchase of TikTok could lead to content moderation headaches


Microsoft CEO Satya Nadella leaves the Elysee Palace after a meeting with French President Emmanuel Macron in Paris on May 23, 2018. Aurelien Morissard | IP3 | Getty Images

If Microsoft proceeds with its acquisition of TikTok, it will gain a company with strong potential for ad revenue growth. But with such a purchase, Microsoft would also inherit a whole new set of problems.

Microsoft announced on August 2 that it was in talks to buy TikTok’s business in the United States, Australia and New Zealand, with a deadline to complete the deal by September 15. TikTok is currently owned by Chinese tech company ByteDance and has become a target of the Trump administration and other governments over privacy and security concerns. Trump also signed an executive order last week that would ban U.S. companies from doing business with TikTok, but it’s unclear how the order might affect a possible Microsoft acquisition.

In the United States, TikTok has grown to over 100 million monthly users, many of whom are teens and young adults. These users log into TikTok to watch full-screen videos uploaded to the app by others. The videos often feature lip-syncing to songs, flashy editing, and eye-catching augmented reality effects.

To say that TikTok is a drastically different company from the enterprise software Microsoft specializes in would be an understatement.

For Microsoft, TikTok could become an advertising revenue powerhouse, but this potential is not without risk. Like other social apps, TikTok is a target for all kinds of problematic content that needs to be addressed. This includes basic issues like spam and scams, but more complicated content could become a headache for Microsoft as well.

This could include content such as misinformation, hoaxes, conspiracy theories, violence, prejudice and pornography, said Yuval Ben-Itzhak, CEO of Socialbakers, a social media marketing company.

“Microsoft will have to face all of this and will be blamed and criticized when it fails to do so,” Ben-Itzhak said.

Microsoft declined to comment and TikTok did not respond to a request for comment on this story.

These challenges can be overcome, but they require large investments of capital and technical prowess, two things Microsoft is capable of delivering. Microsoft also already has some experience moderating online communities.

In 2016, Microsoft bought LinkedIn for $26.2 billion, and while the career-focused service doesn’t have the degree of content problems its peers face, it is still a social network. Microsoft has also run Xbox Live, its online gaming service, since launching it in 2002. Online gaming and social media are different beasts, but they share similarities.

“Fighting disinformation will have to be a mission-critical priority. Microsoft will be new to this area because it has no experience running a high-profile social network at this scale,” said Daniel Elman, an analyst at Nucleus Research. “That said, if any company can quickly acquire or develop the skills and capabilities it needs, it is Microsoft.”

But these are not small challenges, and similar issues have become major problems for TikTok’s rivals.

Facebook, for example, was accused of not doing enough to stop fake news and Russian disinformation ahead of the 2016 U.S. election, and four years later the company still faces criticism over whether it is doing enough to prevent this type of content from appearing on its services. In July, hundreds of advertisers boycotted Facebook over its failure to contain the spread of hate speech and disinformation.

Twitter, meanwhile, began to lose key users, such as actress Leslie Jones, after the company let harassment run rampant on its social network. The company has spent the past two years building features to reduce the amount of hateful content that users have to deal with in their mentions.

These types of issues have popped up on TikTok before. Far-right activists, white nationalists and neo-Nazis have been reported on the app, according to Motherboard and the Huffington Post, which found users who had already been banned by Facebook and Twitter.

TikTok’s potential content issues, however, may be more similar to Google-owned YouTube. Both services depend on user-generated videos for content, and they both rely heavily on algorithms that learn a user’s behavior to determine what kind of content to suggest next.

“The problem with algorithm-based content feeds is that they typically degrade into the most salacious content that shows the highest engagement,” said Mike Jones, managing partner at Science, a Los Angeles-based venture capital firm. “There is no doubt that as creators figure out how to get more views and attention by manipulating the algorithms, the content will grow more salacious, and that will be a constant battle every owner will have to fight.”

Another similarity to YouTube is the amount of content on TikTok that involves minors. Although TikTok does not allow users under the age of 13 to post to the app, many of its users are between the ages of 13 and 18, and their content can be easily viewed by others.

For YouTube, the challenge of hosting content involving minors became a major issue in February 2019, when Wired discovered a ring of pedophiles who were using the video service’s recommendation features to find videos of partially clothed minors.

With the number of young users on TikTok, it’s not hard to imagine that Microsoft could end up with a problem similar to Google’s.

YouTube has also become a cesspool of conspiracy theories, like the idea that the Earth is flat. This, too, could become a problem on TikTok, and there is already some evidence of it. The conspiracy theory that Wayfair uses its furniture listings for child trafficking gained particular momentum on TikTok this year.

To deal with these issues, Microsoft would have to invest a significant amount of time and money in content moderation.

Facebook has addressed this problem with a two-pronged strategy. The company continually invests in artificial intelligence technology that can detect bad content, such as pornography, violence or hate speech, and remove it from its services before it is ever viewed by other users.

For more complicated content, Facebook also relies on thousands of human moderators. These moderators often work for Facebook as contractors through third-party vendors, and they are tasked with reviewing thousands of pieces of content per day under difficult working conditions, at the risk of developing PTSD. These working conditions have been criticized many times, creating public relations headaches for Facebook.

If Microsoft were to buy TikTok, it would likely have to develop similar AI technology and build a network of human moderators, while avoiding the negative headlines that come with poor working conditions.

TikTok offers Microsoft immense potential in the digital advertising industry, but with that upside will come many new challenges and responsibilities that the company will have to take on.
