SECTION A [100 MARKS]
Read the case study carefully and then answer the questions that follow.
Inside Facebook's African Sweatshop
By: B. Perrigo, TIME, 17 February 2022
In a drab office building near a slum on the outskirts of Nairobi, Kenya, nearly 200 young men and women from countries
across Africa sit at desks glued to computer monitors, where they must watch videos of hate speech, murders, rapes and
suicides.
These young Africans work for Sama, which calls itself an "ethical AI" outsourcing company and is headquartered in
California.
Sama says its mission is to provide people in places like Nairobi with "dignified digital work". Its executives can often be
heard saying that the best way to help poor countries is to "give work, not aid". The company claims to have helped lift more
than 50,000 people in the developing world out of poverty.
This benevolent public image has won Sama data-labelling contracts with some of the largest companies in the world,
including Google, Microsoft and Walmart. What the company doesn't make public on its website is its relationship with its
client Facebook.
Here in Nairobi, Sama employees who speak at least 11 African languages between them toil day and night, working as
outsourced Facebook content moderators: the emergency first responders of social media. They perform the brutal task of
viewing and removing illegal or banned content from Facebook before it is seen by the average user.
Since 2019, this Nairobi office block has been the epicentre of Facebook's content moderation operation for the whole of
Sub-Saharan Africa. Its remit includes Ethiopia, where Facebook is trying to prevent content on its platform contributing to
incitement to violence in an escalating civil war.
Despite their importance to Facebook, the workers in this Nairobi office are among the lowest-paid workers for the platform
anywhere in the world, with some of them taking home as little as $1.50 per hour. The testimonies of Sama employees
reveal a workplace culture characterized by mental trauma, intimidation, and alleged suppression of the right to unionize.
The revelations raise serious questions about whether Facebook, which periodically sends its own employees to Nairobi to
monitor Sama's operations, is exploiting the very people upon whom it is depending to ensure its platform is safe in
Ethiopia and across the continent. And just as Facebook needs them most, TIME can reveal that content moderators at
Sama are leaving the company in droves due to poor pay and working conditions, with six Ethiopians resigning in a single
week in January 2022.
"The work that we do is a kind of mental torture," one employee, who currently works as a Facebook content moderator for
Sama, told TIME. "Whatever I am living on is hand-to-mouth. I can't save a cent. Sometimes I feel I want to resign. But then I
ask myself: what will my baby eat?"
TIME is aware of at least two Sama content moderators who chose to resign after being diagnosed with mental illnesses including post-traumatic stress disorder (PTSD), anxiety, and depression, which they developed as a result of working at
Sama. Many others described how they had been traumatized by the work but were unable to obtain formal diagnoses due
to their inability to afford access to quality mental healthcare. Some described continuing with work despite trauma because
they had no other options. While Sama employs wellness counsellors to provide workers with on-site care in Nairobi, most
of the content moderators TIME spoke to said they generally distrust the counsellors. One former wellness counsellor says
that Sama managers regularly rejected counsellors' requests to let content moderators take wellness breaks during the
day, because of the impact it would have on productivity.
Workers say Sama has also suppressed their efforts to secure better working conditions. In the summer of 2019, content
moderators threatened to strike within seven days unless they were given better pay and working conditions. Instead of
negotiating, Sama responded by flying two highly-paid executives from San Francisco to Nairobi to deal with the uprising.
Within weeks Daniel Motaung, the attempted strike's leader who was in the process of formally filing trade union papers,
had been fired, accused by Sama of taking action that would put "the relationship between the company and Facebook at
great risk". Sama told other participants in the labour action effort that they were expendable and said they should either
resign or get back to work, several employees told TIME. The workers stood down before the seven days were up, and
there was no pay increase.
"At Sama, it feels like speaking the truth or standing up for your rights is a crime," a second employee tells TIME. "They
made sure by firing some people that this will not happen again. I feel like it's modern slavery, like neo-colonialism."
Foxglove, a legal NGO based in London, says it has informed Sama it is preparing legal action in relation to its alleged
wrongful termination of Motaung. "Firing workers for trying to organize is against the law," says Cori Crider, Foxglove's
director. "Daniel did a brave thing by blowing the whistle here, as was his legal right." The Katiba Institute, a Kenyan public-
interest law firm, is assisting with the case.
When asked for their comment, a Sama management representative stated: "We value our employees and are proud of
the long-standing work we have done to create an ethical AI supply chain. We exist to provide ethical AI to our global
customers and we are proud of the role our employees play in building new online experiences and cleaning up the internet.
It's a tough job and it's why we invest heavily in training, personal development, wellness programs, and competitive
salaries."
Facebook says it spent more than $5 billion on safety measures in 2021. It contracts the services of more than 15,000
content moderators globally, most of whom are employed by third parties like Sama. In response to a detailed set of
questions for this story, a spokesperson for Facebook's parent company Meta said: "We take our responsibility to the
people who review content for Facebook seriously and require our partners to provide industry-leading pay, benefits and
support. We also encourage content reviewers to raise issues when they become aware of them and regularly conduct
independent audits to ensure our partners are meeting the high ethical standards we expect of them."
In an era where Facebook has come under sustained fire for failing to stem the flow of misinformation, hate speech and
incitement to violence on its platforms, the company is often praised when it says it is increasing the number of dollars it
spends on safety.
But hiring content moderators in the U.S. and Europe is expensive compared to the cheap labour available in Kenya and
other countries in the Global South like India and the Philippines. The rise of content moderation centres in these countries
has led some observers to raise concerns that Facebook is profiting from exporting trauma along old colonial axes of
power, away from the U.S. and Europe and toward the developing world.
"Outsourcing is a scam that lets Facebook rake in billions while pretending worker exploitation and union-busting is
somebody else's fault," says Crider, the Foxglove lawyer who is currently preparing a legal case against Sama. "But it's not
just Sama," she added. "Foxglove has been working with Facebook moderators around the world for years and these
people have had it with exploitation, the strain of toxic content, and suppression of their right to unionize."
Almost all of the employees TIME spoke to for this story described being profoundly emotionally affected by the content
they were exposed to at Sama, trauma that they said was often exacerbated by the way they have been treated in their
jobs. Many expressed the opinion that they might be able to handle the trauma of the job, and even take pride that they were
sacrificing their own mental health to keep other people safe on social media, if only Sama and Facebook would treat them
with respect, and pay them a salary that factors in their lasting trauma.
Adapted from: Perrigo, B. (2022) 'Inside Facebook's African Sweatshop'. TIME. [Online]
https://time.com/6147458/facebook-africa-content-moderation-employee-treatment/