Labour Organising and Digital Rights

How the first global content moderators summit will help tackle one of the most precarious work environments in tech. By Martha Dark and Julia Kloiber


‘We will know we are winning if content moderators and other precarised tech workers are able to collectively bargain for their rights, no matter where they live and work.’

The Digital Futures Gathering was a springboard for an important discussion connecting digital rights with labour rights. This connection is essential for just and inclusive digital futures. At the gathering, content moderators shared insights into their working conditions and what they thought needed to change to make their work safe and fair. This is where the idea of the content moderator summits was born.

The internet relies on content moderation

Every minute, over 500,000 comments are posted on Facebook and more than 2,500 videos are uploaded to YouTube. Keeping these platforms free from hate and disinformation is a massive undertaking. Mark Zuckerberg would like us to believe that AI will fix content moderation, but in reality these systems are years away from moderating platforms as effectively as humans. Without the essential work done by content moderators, the digital platforms we use every day would look very different – saturated with conspiracy theories, hate and violence.

But while human content moderators play a crucial role, debates about social media often focus on algorithms and automation.

Because the debate centres on AI and on what should or shouldn’t make it into content policies, important discussions about labour rights have been overlooked.

Tens of thousands of human content moderators work worldwide at Facebook, TikTok, YouTube, Google and their outsourcing companies – from Lisbon to Manila to Nairobi and beyond. Their work is indispensable but harmful, involving regular exposure to hate speech, graphic violence towards people and animals, murder, sexual abuse and child exploitation. Many moderators suffer trauma and develop PTSD, in part because these tech giants fail to provide the sustained psychiatric support required to make the work safe. The situation is made worse by the fact that content moderators often work on precarious, outsourced, low-wage contracts, without many of the benefits full-time employees receive.

Content moderators are organising

The vast wealth of many of today’s tech giants stands on a mountain of undervalued labour. But in recent months, tech workers from Amazon, Uber and Google have been coming together to demand better working conditions and recognition. They have brought groundbreaking legal cases, organised, addressed legislators, campaigned and revealed the realities of their workplaces to the world. Tech workers, including content moderators, have made headlines – for example Daniel Motaung, who was fired from his job at Sama when he led more than 100 of his co-workers in an attempt to unionise. He joined the Digital Futures Gathering remotely from Nairobi.

The groundswell of resistance is growing.

In 2023, Superrr, Foxglove, Aspiration Tech and the German labour union ver.di will host a summit series that builds on this momentum. The summits will bring together moderators, union organisers, non-profits, legislators and lawyers from across the world to collaborate and build worker power at Facebook, TikTok, YouTube and beyond.

Martha Dark is a Foxglove founder and director with over a decade of experience building and managing social justice organisations.

Julia Kloiber is Co-Founder and Managing Director of Superrr Lab.