Neocolonialism in the Digital Era: Meta’s Bid to Outsource Psychological Trauma to Africa
In 2017, a woman, ‘Hannah B.’, logged on to her Facebook account to check the news and catch up with friends. Instead, she was confronted with a disturbing livestream that depicted a Thai man killing himself and his young daughter. Confused about why such a graphic video would appear on her timeline, and baffled that Facebook could allow such content to be shared, she was left deeply unsettled by the experience. “I couldn’t stop thinking about it,” said Ms. B., a 27-year-old lifelong Boston resident. “Seeing that content was extremely psychologically disturbing.”

After Facebook introduced a new live-streaming product in April 2016, a rising number of streams involving violence, abuse, and assault prompted an outpouring of public criticism. In response, Facebook’s parent company, Meta Platforms Inc., substantially expanded its content moderation capabilities to protect users from content deemed graphic or violent. Previously reliant on users themselves to report harmful content, Facebook revised its moderation strategy and employed over 15,000 “content moderators” to view and remove banned content on Facebook’s platforms before it could reach everyday users.

Lawsuits filed in 2022 over human rights abuses among contracted content moderators in Kenya, together with failed content moderation efforts in Ethiopia, reveal inequities in how Meta treats employees and users on the African continent. Contracted Meta workers and everyday social media users in Africa are demanding the same standard of protection and support as their United States counterparts. Today, Meta confronts a precedent-setting, globally consequential decision: fundamentally change its content moderation strategy in developing economies, or risk reconstructing exploitation and profit extraction along traditional colonial power axes for the digital age.

Despite the importance of content moderation to a technology company, Meta’s reformed strategy relies entirely on outsourced, third-party contractors. In 2019, Meta rolled out a new Nairobi-based content moderation operation in partnership with the self-proclaimed “ethical AI” company Sama EPZ Ltd. Meta was optimistic that the partnership would reduce harmful social media content globally while simultaneously stimulating the local Kenyan economy with new employment opportunities. Four years later, the company’s increasingly publicized legal disputes in Kenya reflect its failure to deliver on either goal. Having badly underestimated the strength of Kenyan political autonomy, Meta now faces legal repercussions over alleged human and labor rights abuses among contracted content moderators in Nairobi.

In early 2019, Daniel Motaung, a university graduate from South Africa, was excited to come across a Sama advertisement recruiting native Zulu speakers. Only after he had relocated his family to Nairobi for the job and signed an obligatory non-disclosure agreement did Sama inform him that his work would involve regularly viewing beheadings, sexual abuse, and murder. The 27-year-old eventually began to experience flashbacks, nightmares, and anxiety, culminating in a diagnosis of post-traumatic stress disorder. Distressed by the psychological toll, his relocation under false pretenses, and pay that was modest by industry standards, Motaung organized a 7-day strike among his coworkers in the summer of 2019. Within two weeks of the unionization attempt, his employment was terminated because, according to his dismissal letter, his actions had put Sama’s relationship with Facebook “at great risk.” A recent TIME investigation found that Sama’s Nairobi salaries were the lowest paid to anyone working for Meta anywhere in the world, and reported numerous accounts of psychological harm, with one employee describing the role as “a kind of psychological torture.”

In May 2022, Motaung filed a lawsuit in Kenya’s Employment and Labour Relations Court, alleging forced labor, human trafficking, exploitation, pay discrimination, and union busting. According to Motaung, these injustices violate Kenyan constitutional protections of freedom of association, expression, dignity, privacy, fair remuneration, and reasonable working conditions. Despite Meta’s insistence that it is not subject to the charges because it is “not a resident, trading or domiciled in Kenya,” presiding judge Kariuki ruled that Meta would not be struck from the lawsuit, as it is considered a “proper party” in the case. And although Meta disclaims any responsibility in the Sama case, in 2021 the company paid an $85 million settlement to content moderators in the U.S. who suffered psychological trauma on the job, yet refuses to extend comparable compensation to moderators abroad.

Further highlighting Meta’s disregard for the psychological safety of African citizens, the company also faces repercussions for underinvesting in content moderation services for Facebook users in Africa. A December 2022 lawsuit filed by researchers at the Katiba Institute alleges that Meta’s algorithms and moderation mechanisms failed to remove hateful content that ultimately fueled ethnic violence in Ethiopia amid the ongoing civil war. Although Meta’s own in-house studies have found correlations between its algorithm design and episodes of political extremism, the company failed to mobilize its capacity to quickly remove provocative content during the Ethiopian crisis. Internal whistle-blowers have further disclosed that the $300 billion company does not employ enough content moderators who speak local languages, and that 87% of Facebook’s misinformation budget is allocated to the United States.

So here we have it: the world’s most prominent technology company is outsourcing psychological trauma to East Africa and refusing to assume responsibility on jurisdictional technicalities. To Mark Zuckerberg and the leaders of Meta Platforms Inc.: you are ethically responsible for the actions of your outsourcing partners, and you are legally and morally obligated to abide by domestic laws wherever you operate. Motaung’s lawsuit exposes the inconsistency between your claim that outsourcing partners must provide “industry-leading pay, benefits and support” and your geographical bias in deciding which employees that provision protects. Likewise, your failure to monitor and remove harmful content in African countries to the same degree as in the United States and other developed countries reflects an unethical, institutionalized prioritization of the psychological well-being of U.S.-based customers. To address its underdeveloped content moderation operation in Africa, Meta must offer contracted content moderators mental health support, compensation, and benefits equivalent to U.S. employee standards, and must invest in content moderation in native African languages, with specific focus on historically conflict-affected populations.

Meta has taken deliberate action to protect Ms. B. from viewing distressing content on Facebook, because content moderation supports customer retention and thus profits. Why should African citizens, among the most historically marginalized people in the world, be denied the same standard of protection?

Emma Parece holds a Master’s in International Relations & International Economics from the Johns Hopkins University School of Advanced International Studies (SAIS).
