Meta’s East African content moderation hub shuts down

Meta’s East African content moderation hub is shutting down as the social media giant’s third-party contractor moves away from policing harmful content, cutting around 200 staff and leaving some employees’ work permits in doubt.

The owner of Facebook, WhatsApp and Instagram first contracted Sama in 2017 to assist with labelling data and training its artificial intelligence, hiring around 1,500 employees. But within two years, the Nairobi office was moderating some of the most graphic and harmful material on Meta’s platforms, including beheadings and child abuse.

Sama staff were told on Tuesday morning that the company would focus solely on labelling work — also known as “computer vision data annotation” — which includes positioning animations in augmented reality filters, such as bunny ears.

“The current economic climate requires more efficient and streamlined business operations,” Sama said in a statement encouraging employees to apply for vacancies at its offices in Kenya or Uganda. Some Sama staff rely on work permits to remain in the region. Sama’s content moderation services will end in March, allowing for a transition period for Meta’s new third-party contractor.

Sama will continue to employ around 1,500 staff for Meta’s data labelling work. The news comes two months after Meta announced it would be cutting its global headcount by 13 per cent, or around 11,000 employees, as the social media company suffers from falling revenue, a slump in digital advertising and fierce competition from rivals including TikTok.

One person familiar with the operations said Meta did not want a gap in services and used its position to push Sama to offer moderation services for longer than the contractor wanted. The Nairobi office focused on content generated in the region, including posts about the civil conflict in Ethiopia, over which Meta is currently being sued amid claims that such posts incited violence. Meta’s policies ban hate speech and incitement to violence.

The social media group, which employs more than 15,000 content moderators worldwide, said it had a new partner in place and that its moderation capabilities would be unaffected. Luxembourg-based Majorel, which already provides moderation services in Africa for short-form video app TikTok, is taking on the contract, according to two people with knowledge of the changes.

“We respect Sama’s decision to exit the content review services it provides to social media platforms. We’ll work with our partners during this transition to ensure there’s no impact on our ability to review content,” Meta added.

FINANCIAL TIMES