TikTok, the popular video-sharing platform owned by ByteDance, is making a significant shift in its content moderation strategy by laying off hundreds of human moderators and increasing its reliance on artificial intelligence (AI) for content review. This move, which primarily affects employees in Malaysia and the UK, is part of the company's efforts to enhance efficiency and scalability in managing the growing volume and complexity of user-generated content.
The shift towards AI-driven content moderation at TikTok involves the layoff of nearly 500 employees, primarily in Malaysia, with additional cuts reported in the UK. This transition is part of a broader strategy to strengthen the platform's global operating model for content review. Currently, TikTok employs a hybrid approach in which automated technologies handle approximately 80% of content violations, with human moderators managing the remaining 20%. As part of this initiative, ByteDance plans to invest $2 billion globally in trust and safety efforts in 2024, focusing on improving the efficacy of its moderation systems through advanced AI technologies.
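The article does not describe how TikTok's hybrid pipeline works internally, but a common pattern for this kind of 80/20 split is confidence-based triage: an automated classifier acts on clear-cut cases and escalates ambiguous ones to human moderators. The sketch below is purely illustrative; the thresholds, names, and routing labels are assumptions, not TikTok's actual system.

```python
from dataclasses import dataclass

# Hypothetical confidence cutoffs -- real systems tune these per policy area.
AUTO_REMOVE_THRESHOLD = 0.95
HUMAN_REVIEW_THRESHOLD = 0.60


@dataclass
class Post:
    post_id: str
    violation_score: float  # classifier's confidence that the post violates policy


def route(post: Post) -> str:
    """Triage a post based on an automated classifier's confidence score."""
    if post.violation_score >= AUTO_REMOVE_THRESHOLD:
        return "auto_remove"      # high confidence: handled without a human
    if post.violation_score >= HUMAN_REVIEW_THRESHOLD:
        return "human_review"     # ambiguous: escalate to a moderator
    return "allow"                # low confidence: leave the content up


posts = [Post("a", 0.99), Post("b", 0.70), Post("c", 0.10)]
print([route(p) for p in posts])  # → ['auto_remove', 'human_review', 'allow']
```

In a design like this, the roughly 80% of violations "handled automatically" are the cases landing above the top threshold, while the middle band is what keeps human moderators in the loop.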
The transition to AI-driven moderation is motivated by several key factors. Efficiency and cost-effectiveness are primary drivers, as AI systems are expected to handle content review tasks more rapidly and economically than human moderators. Scalability is another crucial consideration, with AI viewed as a more adaptable solution for managing the fluctuating workloads and increasing complexity of user-generated content. Additionally, TikTok aims to achieve greater accuracy and consistency in content moderation through advanced technology, addressing the challenges posed by the platform's global reach and diverse user base.
The shift towards AI-driven content moderation on TikTok and other platforms has raised significant concerns among industry experts and workers' rights advocates. These concerns primarily focus on the effectiveness, accuracy, and potential biases of AI systems in handling complex moderation tasks. Here are the key issues:
Accuracy and context understanding: Experts question whether AI can effectively grasp nuanced cultural contexts and subtle content violations that human moderators are trained to detect.
Job displacement: The layoffs of hundreds of human moderators have sparked concerns about job security in the content moderation industry.
Exploitation in the Global South: There are worries that the transition to AI may lead to increased exploitation of content moderators in developing countries.
Lack of transparency: The absence of third-party assessments of AI moderation systems has raised concerns about their reliability and accountability.
Human oversight: Critics argue that human moderators are still necessary, especially in multilingual and culturally diverse regions like Malaysia.
Worker unionization: In response to the layoffs, hundreds of TikTok employees in London have formed a union to fight for their jobs and improve working conditions.
The shift towards AI-driven content moderation is not unique to TikTok, reflecting a broader industry trend among social media platforms. Instagram and Threads, owned by Meta, have also grappled with moderation challenges, recently facing issues with account locks and content down-ranking. While Instagram head Adam Mosseri initially attributed these problems to human moderator errors, the company later acknowledged that technical glitches in its moderation tools also played a role.
This industry-wide pivot to AI moderation is driven by the need to handle massive volumes of user-generated content efficiently. However, it raises concerns about the balance between automation and human oversight. As platforms invest heavily in AI technologies for content review, they must also address the potential limitations of these systems in understanding complex cultural contexts and nuanced content. The trend underscores the ongoing challenge for social media companies to maintain platform safety while managing operational costs and meeting regulatory demands.