A new briefing paper released on February 4, 2026, highlights significant concerns regarding TikTok’s moderation policies and their implications for political influence and election integrity. The document, produced by the London Social Media Observatory (LSMO) in collaboration with the Westminster Foundation for Democracy (WFD), outlines emerging risks to democratic participation on the platform.
The LSMO and WFD convened a roundtable that brought together 45 policymakers, academics, civil society representatives, and strategic advisors. The discussion was organized into two panels examining the risks associated with TikTok, including its current moderation standards and the potential for highly personalized feeds to deepen polarization among users.
Participants raised concerns about the shortage of digital media skills training available to educators and young users, and emphasized the urgent need for independent researchers to have better access to platform data so that these issues can be studied rigorously. The discussions culminated in a series of recommendations addressed to policymakers, academic institutions, civil society organizations, and content creators.
The briefing paper's recommendations call for fair and transparent moderation practices, closer oversight of gendered and polarizing content, and improved researcher access to platform data, with the aim of informing decisions that strengthen democratic discourse.
The event marked the first public roundtable organized by the LSMO since its launch in December 2025. Dr. Andreu Casas, Associate Professor of Political Communication and Computational Social Science at Royal Holloway, expressed enthusiasm for the collaboration, stating, “Part of LSMO’s mission is to contribute innovation and thought leadership across academia, civil society, and government regarding the authenticity and safety of social media platforms, particularly around politics, democracy, and elections.”
Dr. Casas emphasized the importance of gathering insights from a diverse range of stakeholders to better understand concerns surrounding content moderation and governance on TikTok, which will inform future research on the topic.
Tanja Hollstein, Head of Elections at WFD, underscored what is at stake for democratic participation: “As social media platforms increasingly shape how citizens engage with politics and elections, it’s vital that we understand the risks they may pose to democratic participation.”
The discussions revealed genuine concerns about content moderation practices and their potential impact on electoral integrity. Hollstein said she hopes the briefing paper will serve as a valuable resource for policymakers as they consider measures to ensure social media platforms support healthy democratic debate rather than undermine it.
The full briefing paper, titled “TikTok, Politics, and Power: Implications for Democracy,” is available on WFD’s website. Its findings aim to foster informed dialogue about the role of social media in contemporary political life, emphasizing the need for balanced governance and stakeholder collaboration.
