A Kenyan court recently ruled that a case filed against Facebook's parent company Meta by Daniel Motaung, a former Facebook moderator, over alleged exploitation and a toxic working environment at its Nairobi office can advance. The ruling sets a precedent for tech companies that have tried to evade legal repercussions in local courts by claiming a lack of jurisdiction. Motaung also sued Meta Platforms Ireland Limited and its local outsourcing agent, Samasource Kenya EPZ Limited.
In the suit, filed with the Kenya Employment and Labour Relations Court last year, Motaung and 12 other former employees claim that they suffered psychological injuries from repeated exposure to extremely disturbing, graphic, and violent content, coupled with a toxic working environment.
Through its lawyer, Meta had argued that Meta Platforms, Inc. and Facebook are foreign corporations that are neither resident nor trading in Kenya, and thus fall outside the country’s jurisdiction. However, the Employment and Labour Relations Court ruled on February 6 that Meta can be sued in Kenya. It will be the first time a lawsuit against a global tech giant proceeds to a hearing not just outside of the West, but in Africa, where the alleged wrongdoing happened.
This court decision could open the floodgates for other technology companies to be sued in Kenya and other countries. This could include OpenAI, which, according to a recent exposé by Time magazine, paid Kenyan workers less than USD 2 per hour to help train its artificial intelligence models, often requiring them to review explicit and traumatic material.
The harms of content moderation
Content moderation, the process of reviewing social media posts, photos, and videos to determine whether the content violates a platform's policies, has come under increased scrutiny in recent years as its mental health impact on workers has become clearer.
Despite content moderation being a core function of any social media site, Facebook chooses not to directly employ the 15,000 people who do its content moderation. Instead, it outsources this critical safety function to third-party contractors such as Genpact in India, Cognizant in the U.S., Covalen and Accenture in Ireland, and, until recently, Samasource in Kenya.
The stories of content moderators hired by these Facebook contractors are all disturbingly similar: they are subjected to prolonged hours of exposure to repulsive content, given little to no support to counter or deal with the damage from that exposure, paid unfairly, and gagged with non-disclosure agreements. Following the May 2020 case in which Meta paid USD 52 million to settle with more than 11,000 content moderators in the U.S. over mental health issues developed on the job, the company improved the working conditions of some moderators in the U.S. It did not extend those improvements to moderators abroad.
According to Nanjira Sambuli, a Kenyan researcher and policy analyst, Africa became the next best destination for the export of content moderation and with it, the side effects.
Odanga Madung, a Mozilla Research Fellow, independent data journalist, and co-founder of Odipo Dev, says that Motaung has a solid case. As he told Global Voices, going purely by the previous content moderation suits, Meta is likely to settle out of court.
If this case does go to a full hearing, it will most likely expose the intricacies within Facebook. The public will get access to their content moderation practices. What makes them allow content to stay up or be taken down, and in so doing, open them up to other suits or more scrutiny.
Odanga was referring to the May 2020 settlement.
‘We have avoided a very dangerous precedent’
But where Meta saw in Nairobi a digital savannah with a young, tech-savvy workforce that could be exploited through unacceptable labour practices, it was blind to the country’s judicial system. In October 2021, Kenya’s High Court issued another landmark ruling, this time against Uber, for violating contracts with drivers in Kenya. Just like Meta, Uber tried and failed with the same argument of not being domiciled in Kenya. For years, its drivers fought to prove that Uber Kenya Limited and Uber BV were indeed one and the same company. Eventually, they won.
This ruling is a big deal for those of us in the platform accountability space, especially given the recent attempt by Facebook and Meta to skip accountability by saying that they are not domiciled in the country. The fact that they could escape responsibility for any sort of harm that they would cause the citizens of another country has been troubling.
As a Mozilla Fellow, Odanga has been using research-based reports to expose the disinformation industry thriving in Kenya and to push social media platforms to be more accountable.
Speaking on the ruling's significance, he said, “I think we have avoided a very dangerous precedent by having such a ruling made within our courts.”
The butterfly effect
Odanga spoke of the butterfly effect this ruling will have for many people in the industry like him, and, more importantly, for Facebook users and digital rights activists who have fallen victim to the platform's opaque content moderation practices.
This includes two Ethiopian researchers who, together with the Kenyan rights group the Katiba Institute, are suing Meta for USD 1.6 billion for allowing hateful content to flourish on its platform and fueling ethnic violence in Ethiopia.
Speaking to journalists in Nairobi via video link following the ruling in Motaung's case against Meta, Cori Crider, a co-founder and director of Foxglove, a tech justice nonprofit involved in both lawsuits, spoke on their significance and on how content moderation (or the lack of it, in the Ethiopian case) lies at the heart of each.
As she noted, both cases arose from what happened in the same Nairobi content moderation hub at Samasource.
Information disclosed to Foxglove by a former Meta employee revealed that 87 percent of Facebook’s misinformation budget is allocated to the English-speaking U.S. The rest of the world shares what remains.
Despite the importance of the Sub-Saharan region, Meta has failed to invest in its millions of users there, she said.
As Crider explained, this failure to allocate a commensurate budget for content moderation created the untenable staffing levels that allowed violent and hateful posts from Ethiopia to flourish, inflaming the country’s bloody civil war. It also led to the exploitation of content moderators and to working conditions that, in the case of Motaung and his colleagues, caused psychological trauma and PTSD, she added.
From human rights to an economic argument
Youth unemployment is a real crisis in Africa. According to Statista, Kenya's youth unemployment rate rose by 0.3 percentage points in 2021 over the previous year, reaching a peak of 13.84 percent.
The lack of policies governing Kenya’s growing gig economy threatens to turn its digitally savvy youth into digital slaves, as cases of big tech exploitation increasingly hit the headlines. Odanga also spoke of a real fear among his fellow researchers and Kenyan human and digital rights advocates: that the political class could quickly turn this from a human rights issue into an economic argument and, in so doing, abdicate its policy-making role of ensuring that labour laws extend into the digital space.
Where Crider sees the two cases as an opportunity to reset the relationship between important democracies such as Kenya (and the region) and some of the largest and most powerful tech companies that the world has ever seen, Nanjira cautions that these opportunities will not be fully realized unless “African policymakers set guardrails on how the gig economy is regulated on the continent.”