Campaigners have accused Meta, Facebook’s parent company, of inflicting severe psychological harm on content moderators in Kenya, where more than 140 individuals have been diagnosed with post-traumatic stress disorder (PTSD) and other mental health conditions. The diagnoses, made by Dr. Ian Kanyanya, head of mental health services at Kenyatta National Hospital in Nairobi, form part of a lawsuit against Meta and its subcontractor, Samasource Kenya, which managed the moderation work. Content moderators, who filter disturbing material from the platform, have long criticized the mental toll of the job, particularly in developing countries where support systems may be inadequate, and similar complaints have been raised by moderators of other platforms. Meta declined to comment on specific medical reports because of the ongoing legal proceedings, but said it is committed to moderators’ well-being and pointed to its established guidelines for counseling and training.
Dr. Kanyanya highlighted the extreme nature of the content moderators were exposed to, including graphic violence and sexual abuse, and found alarming rates of severe PTSD among those assessed: 81% of the 144 individuals examined were suffering from the condition. The legal action followed allegations by a former moderator who said he was wrongfully dismissed after advocating for better working conditions, claims that brought to light systemic problems at Samasource Kenya and reportedly culminated in all of its moderators in Nairobi being made redundant.
Medical evaluations of the moderators described severe consequences, including recurrent nightmares and anxiety linked to the graphic material they reviewed. Martha Dark of Foxglove, the legal non-profit supporting the case, said: “Moderating Facebook is dangerous, even deadly, work that inflicts lifelong PTSD on almost everyone who moderates it,” underscoring the scale of the psychological harm involved. The case fits a broader pattern of legal challenges to social media companies over the mental health of their content moderation workforces; similar lawsuits have been brought in recent years by moderators working on platforms such as TikTok.
The mental health of content moderators has drawn increasing attention, given that the work involves sustained exposure to graphic and distressing content on social media platforms. Criticism has focused on the psychological consequences of this labor, especially when it is outsourced to third-party companies in developing countries, where moderators often lack adequate psychological support and face a high risk of developing serious mental health conditions from the trauma of the material they review. The ongoing legal battles reflect a growing awareness of the responsibility platforms like Meta bear for their workers’ welfare, and of the systemic problems in outsourcing arrangements that perpetuate these risks.
In conclusion, the accusations against Meta over the mental health of Kenyan content moderators raise significant concerns about social media labor practices. With 81% of those assessed diagnosed with severe PTSD, the findings underline the urgent need to reform how content moderation is managed and to provide adequate mental health support. The ongoing legal proceedings could set a precedent for accountability in the tech industry and strengthen the rights of workers exposed to extreme job conditions.
Original Source: www.cnn.com