If a science fiction writer were to devise a digital-age torture, it might be the one that Reece Young and Ashley Velez experienced in their work for TikTok and for which they are currently seeking class action relief.
Young and Velez prevailed Monday on the first leg of their quest to hold TikTok Inc. and its Mountain View parent company Bytedance Inc. accountable for allegedly failing to protect them from the psychological injuries they suffered from “moderating” gruesome content on the TikTok platform.
U.S. District Judge Vince Chhabria ruled that the plaintiffs had plausibly alleged facts that, if proven to be true, would be sufficient to hold the defendants liable for the injuries the plaintiffs suffered, even though Young and Velez were “contracted employees,” not directly employed by the platform companies.
TikTok operates a social media platform on which users post, view and share short videos.
The platform is massive. According to statistics amassed by the plaintiffs, the TikTok app has been downloaded 1.3 billion times, and as many as 90 million videos are uploaded to the platform every day.
In 2020, Bytedance allegedly generated advertising revenues of $34 billion.
Young and Velez were employed by staffing firms that contracted to supply personnel to TikTok to serve as “content moderators,” screening videos for offensive content.
They worked 12-hour shifts interrupted only by two 15-minute breaks and a one-hour lunch. They performed their work online, logged into a demanding app known as TCS that served them a virtually continuous feed of 25-second video clips that had already been flagged as potentially objectionable. Their job was to view the clips and make immediate decisions on whether the content should be removed or muted.
TCS was allegedly a merciless master, continuously monitoring their work, flagging them for unapproved breaks or even short pauses, and holding them to performance quotas so challenging that at times they would try to meet their required output by watching multiple videos at once.
According to the complaint, the content they reviewed included a steady diet of heinously graphic and objectionable content, including “child sexual abuse, rape, torture, bestiality, beheadings, suicide and murder.”
In addition, they were repeatedly exposed to “conspiracy theories, … genocide deniers, false flag stories and other damaging distortions of present events and historical events, ‘challenges’ that involve high-risk behavior, fringe beliefs, hate speech, and political disinformation.”
As a result of their work, Young and Velez allege that they, and the class of other content moderators that they seek to represent, suffered “immense stress and psychological harm” for which they have had to seek counseling.
The suit contends that the defendants were well aware of the gruesome nature of the material that the content moderators viewed day in and day out, knew the range of harms, including PTSD, that could result from prolonged exposure to the toxic material, and nonetheless failed to implement ameliorative and harm-reduction strategies that could have reduced the psychological injuries that the plaintiffs and the potential class suffered.
The defendants raised a raft of arguments as to why the plaintiffs’ lawsuit should be dismissed without a trial, primary among them that Young and Velez’s employers were the staffing companies, not the defendants, and that the general rule in California is that a company that hires an independent contractor “is ordinarily not liable if that contractor’s workers are injured on the job.”
The rule has two exceptions: the first applies if the hiring company retains virtually complete control over the work, and the second if the “equipment” given to the worker to do the job is unsafe.
Chhabria concluded that both exceptions applied. He found that the work done by Young and Velez was thoroughly controlled by the defendants and that the defendants’ failure to include precautionary safety measures directly contributed to the injuries the plaintiffs allegedly suffered.
He said that the videos could, for example, have been modified by “changing the color or resolution of [a graphic image], superimposing a grid over the image, changing the direction of the image, blurring portions of the image, reducing the size of the image and muting audio,” all as recommended by industry watchdogs.
He also found that the harm could be mitigated “if the software had better sorted the videos into graphic and non-graphic categories, so that moderators could have taken breaks from the graphic content,” again as recommended by industry groups.
On the unsafe equipment exception, Chhabria found that the plaintiffs had plausibly alleged that the TCS software that ran the moderation program was defective because it delivered and displayed the content in a way that prevented workers from employing harm-reducing strategies.
Chhabria’s order does not end the case.
The plaintiffs now may take discovery and advance the litigation toward trial, where they will have to prove the facts they allege. But they have cleared a substantial hurdle on the road to securing relief for themselves and, potentially, for the class of others who endured the same toxicity they experienced.
Neither plaintiffs’ nor defendants’ lawyers immediately responded to requests for comment on the decision.