TikTok moderators claim ‘PTSD’ after allegedly watching rape, mass murder
Former TikTok moderator Candie Frazier has seen the worst the internet has to offer.
Now, she’s suing the social media company for negligence, The Verge has reported, alleging that it ignored the mental well-being of its moderators, who watched “thousands of acts of extreme and graphic violence” on the job.
Frazier is part of a pending class-action lawsuit filed against ByteDance, which owns TikTok, in the US District Court for the Central District of California. While working for a third-party contracting firm, Telus International, Frazier and her colleagues witnessed countless hours of potentially offensive footage, allegedly including rape, animal cruelty, cannibalism and mass murder.
TikTok has not responded to The Post’s request for comment. But TikTok spokesperson Hilary McQuaide said in a statement to The Verge: “Our Safety team partners with third party firms on the critical work of helping to protect the TikTok platform and community, and we continue to expand on a range of wellness services so that moderators feel supported mentally and emotionally.”
In a Stanley Kubrick-esque echo of the aversion therapy depicted in “A Clockwork Orange,” their allegedly punishing review process entailed new video assignments every 25 seconds, forcing moderators to watch between three and 10 videos simultaneously to keep up with the untenable pace. Their 12-hour shifts included one 1-hour break, plus as many as five 15-minute breaks accrued throughout the day, they claim.
Frazier said she’s suffered “severe psychological trauma including depression and symptoms associated with anxiety and PTSD” as a result of her work with TikTok. According to the lawsuit, she has “trouble sleeping and when she does sleep, she has horrific nightmares. She often lays awake at night trying to go to sleep, replaying videos that she has seen in her mind. She has severe and debilitating panic attacks.”
The lawsuit, filed by California’s Joseph Saveri Law Firm, seeks to require TikTok and its partners to offer more frequent breaks, psychological support and visual safeguards, such as blurring.
TikTok is only the latest social media company to face criticism over its content moderation policies. Others have been accused of failing to safeguard users against harmful content, or of harming their own moderators.
Attorneys recently won a $52 million settlement from Facebook in a similar lawsuit launched in 2018 on behalf of moderators, including one plaintiff who said he developed PTSD after being forced to watch child pornography on the platform. The plaintiffs alleged that moderators were tasked with reviewing more than 10 million potentially objectionable posts every week.