Kenyan workers win High Court appeal to take Meta to trial

In September 2024, the Nairobi Court of Appeal ruled in favor of 185 former Facebook and Instagram content moderators who had filed cases against Meta. The cases, which center on poor working conditions and mass firings, were cleared to proceed to trial after an 18-month legal battle. Despite Meta’s efforts to block the cases from being heard in Kenyan courts, the Court of Appeal rejected the company’s appeals.

One of the key figures in initiating the legal action against Meta was former Facebook content moderator Daniel Motaung. He alleged that Meta had exploited him and his colleagues, damaging their mental health. Motaung and his co-workers formed a union to push back against the exploitation, and he alleges they were then unlawfully fired by Meta in an attempt to dismantle it.

Following the mass firings, it emerged that Meta planned to switch outsourcing companies at its Nairobi hub, effectively preventing the fired workers from returning to work. This prompted the workers to collectively launch a second case over their unfair termination.

The Court of Appeal’s decision to allow both cases to proceed to trial in Kenya was a significant victory for the workers. In addition to seeking compensation from Meta and its contractors, the workers are pushing for improvements in their working conditions, the right to speak out about poor conditions, the ability to join a trade union, and a mental health support system similar to the one available to staff at Meta’s offices in Menlo Park and Dublin.

Computer Weekly contacted Meta for comment on the court decisions but received no response.

A pattern of multinational exploitation

The Court of Appeal’s ruling has given fresh hope to content moderators who say they have been exploited by Meta. Kauna Malgwi, a former Facebook content moderator and chairperson of the Nigeria chapel of the African Content Moderators Union, expressed her elation at the prospect of finally facing Meta in court after years of legal battles.

Malgwi highlighted the mental strain content moderators face, including exposure to disturbing content, and criticized Meta for dragging out the legal process with procedural maneuvers and mediation offers that led nowhere.

Meta has hired lawyers to delay our case as much as they can with dirty legal tricks and bad faith offers of mediation that ultimately went nowhere

Kauna Malgwi, African Content Moderators Union

She emphasized the importance of holding big tech companies accountable for their treatment of workers and noted the historic defiance shown by the African Content Moderators Union in fighting for better working conditions.

The exploitation of content moderators in the Global South, as highlighted by the cases in Kenya, reflects a broader pattern of multinational corporations prioritizing cheap labor at the expense of workers’ well-being.

Nairobi has emerged as a hub for AI outsourcing due to high unemployment rates, a growing youth population, and the prevalence of English speakers. However, reports of low wages and poor working conditions for content moderators and microworkers have raised concerns about the treatment of workers in the tech industry.

‘Essential work’

Advocates argue that content moderation and microwork should be recognized as essential work so that conditions in these roles can be improved. Mary L. Gray, a senior principal researcher at Microsoft Research, emphasized the need for better workplace conditions, recognition of the challenges content moderators face, and the ability to organize and bargain collectively.

Former TikTok content moderator James Oyange echoed the sentiment, highlighting the need for mental health support, recognition, and equitable treatment for content moderators and AI workers.

A potential step change

The Court of Appeal’s ruling in Nairobi could signal a shift in how big tech companies treat their content moderators and microworkers. Martha Dark, co-executive director of Foxglove, noted that the ruling challenges Meta’s attempts to evade accountability and sets a precedent for holding tech companies responsible for their treatment of workers.

Experts emphasize the importance of public awareness, policy changes, and corporate accountability to address systemic issues in the content moderation industry. The ruling in Kenya could lead to improved working conditions, better mental health support, and fairer compensation for content moderators globally.

Despite the challenges faced in taking on tech giants like Meta, advocates like Malgwi stress the importance of collective action and cross-border solidarity in holding companies accountable for their treatment of workers.

As content moderators continue to fight for justice, the ruling in Nairobi serves as a reminder that no company is above the law and that workers’ rights must be upheld.
