How cases against technology giants could shape digital jobs in Kenya

High Court judge Byram Ongaya. On June 2, 2023, he ordered five government bodies to review Kenya’s labour laws and file a report in court on whether they are sufficient to cater for stakeholders in the digital workspace.


When Kenya enacted the Employment Act and the Occupational Safety and Health Act in 2007, Facebook, now rebranded as Meta, was just three years old and gaining popularity across the world.

Access to computers and the internet in Kenya was rising rapidly, and the yearning for convenient communication propelled Meta to stake its place in the country’s digital space.

The social media platform’s uptake in Kenya was strong, but it was far from clear whether the US-based tech giant would ever have a direct impact on the country’s labour policies.

But nearly two decades later, a suit filed against Meta and two firms it hired to assist with content moderation – Samasource Epz Kenya and Majorel Kenya – could shape how employment contracts are enforced.

Meta, Samasource and Majorel are defending a case that is expected to determine how players in the tech industry will pay workers and offer them other forms of welfare support.

But the Meta court case is just one instance of legal action that has opened the debate on whether Kenya’s laws are sufficient to tackle challenges and disputes arising from the complex work in the digital space.

The labour rights disputes come as the government steps up efforts to get more Kenyans working for big tech companies.

Recently, while addressing the World Governments Summit in Dubai, UAE, President William Ruto claimed that Apple, the California-based tech giant, had already hired 23,000 Kenyans who are working out of Nairobi.

“The digital economy is delivering attractive opportunities for young people, to work for employers scattered across the world without having to leave their homes in Kenya,” the President said as he pushed for development of the digital economy.

But that push has become a source of trouble as some workers have filed legal challenges claiming that they are not being offered a fair opportunity to earn dignified pay for dignified work.

Allegations of poor pay, discrimination and a lack of adequate psychological healthcare for workers in the digital workspace, among others, have dragged some of the world’s biggest tech companies into cases that are expected to deliver landmark decisions.

At the National Assembly, owners of the ChatGPT platform are facing a petition that seeks an investigation into the treatment of employees hired to carry out tasks for big tech firms.

Last year, TikTok was served with a demand letter threatening a lawsuit over alleged violations of labour laws and inhumane treatment of Kenyans hired to moderate its content. ByteDance Inc, its parent firm, has denied the allegations and maintained that there is no case against it.

On June 2, 2023, High Court judge Byram Ongaya ordered five government bodies to review Kenya’s labour laws and file a report in court on whether they are sufficient to cater for stakeholders in the digital workspace, following a case filed against Meta.

The Kenya National Commission on Human Rights and Equality, the Central Organisation of Trade Unions (Cotu), the Ministry of Labour and Social Services, the Ministry of Health and the Attorney-General are required to file that report before the case against Meta is determined.

Nearly 200 people hired as content moderators have sued Meta’s parent firms in the US and Ireland, alongside the third parties that offered the employment opportunities – Samasource and Majorel.

The content moderators have sued to quash redundancy notices issued by Samasource, which intends to shut down its Nairobi hub serving Eastern and Southern Africa. Majorel had started hiring content moderators as it sought to take over from Samasource.

The employees claimed that Samasource’s exit was pegged on a suit filed by former content moderator Daniel Motaung seeking damages for violation of Kenya’s labour laws.

“The court considers that pending the hearing of the petition, and in view of the lamentations by the content moderators herein, the KNCHEC, Cotu, Ministry of Labour and Social Services, Ministry of Health and the Attorney-General shall review the status of the law and policy for protection of employees’ occupational safety in the sector of digital work, digital workspaces, and digital workplace and improvement of the applicable policy and law and report to the court in that regard including extent of protection of the applicants in the instant case,” Justice Ongaya ruled.

The judge said his decision was based on affidavits from the content moderators and Samasource, which showed that both sides agreed the job was hazardous to mental health.

Dozens of plaintiffs filed accounts of their experiences, accusing Samasource and Meta of negligence in providing a healthy working environment and appropriate treatment for trauma-triggered illnesses.

In their affidavits, the content moderators claim that they would watch dozens of videos depicting vile actions every work day and were only provided counselling by personnel who are not medically trained to handle illnesses triggered by trauma.

One of the plaintiffs says she was offered a content moderation job despite only applying for a position at a call centre.

Samasource, the plaintiffs argue, did not sufficiently inform the content moderators of the nature of the job.

Justice Ongaya’s ruling also pointed to a gap in Kenya’s labour laws in dealing with the digital workspace, which has become a critical part of everyday life. That gap, the decision indicates, could now be filled by litigation.

Even as Samasource and Majorel were defending the court cases, they found themselves at the centre of similar allegations stemming from two other digital platforms – the social media site TikTok and the free-to-use artificial intelligence system ChatGPT.

Barely a week after Justice Ongaya’s ruling, which also blocked Meta from being struck off the case, another set of content moderators filed a petition before the National Assembly seeking an investigation into the welfare of young Kenyans working for big tech firms that use third parties to hire them.

Richard Mwaura Mathenge, Mophat Ochieng Okinyi, Alex Mwaura Kairu and Bill Kelvin Mulinya are among young Kenyans who were hired by Samasource to train the ChatGPT algorithm and ensure that it does not become a dangerous tool available to the general public.

The four petitioners also accuse Samasource of failing to give them sufficient information on the nature of the job.

They argue that they had to review material depicting sexual violence, defilement, self-harm and murder.

“The outsourcing model has proven to be harmful to tech workers as the outsourced workers are treated poorly and not afforded the same protection as the full-time employees. They are engaged and cast aside at will while simultaneously being required to carry out very harmful work at poor pay. We were sent away without receiving all our dues and without medical care for the harm caused by the job we were required to do,” the four workers state in their petition to the National Assembly.

In the ChatGPT work, the people hired were allegedly not offered psychological support. The contract between Samasource and OpenAI, the ChatGPT proprietor, was terminated abruptly, leaving several people jobless and in need of psychological treatment.

OpenAI is yet to respond to the petition before the National Assembly.

James Oyange Odhiambo, for his part, was hired as a Kiswahili customer care representative following an interview.

But on reporting to work, he was suddenly a content moderator.

Mr Odhiambo has accused TikTok and Majorel Kenya of forced labour, failure to provide copies of employment particulars, wage theft and failure to provide accurate payslips, failure to provide safe equipment, failure to provide adequate mental healthcare, interference with freedom of association, discrimination, unlawful surveillance and unlawful termination of employment.

In a demand letter to TikTok’s parent firm ByteDance Inc, Mr Odhiambo demanded a public admission of fault from the platform’s owners, payment of all dues to content moderators and the hiring of experts to help protect moderators from harmful content.

He also wants the tech giant to give formal contracts at fair pay to all content moderators and to hire trained medical staff to ensure quality treatment for overwhelmed workers.

TikTok, through Los Angeles-based law firm Gibson Dunn & Crutcher LLP, has maintained that it never hired Mr Odhiambo, whether directly or through Majorel Kenya or any other firm, and hence the demand letter was misdirected.