Initiating a Much-Needed Discussion

With the world constantly caught up in a rush of activity, we often fail to achieve the clarity our decisions demand. That lack of clarity erodes the quality of our judgment and exposes us to serious consequences, some of them severe enough to cause irreversible damage. To guard against such outcomes, dedicated regulatory bodies have been established across industries, tasked with spotting inconsistencies within their areas of concern and empowered to punish those they find. Yet while regulation has pushed the world toward more ethical conduct, it has also narrowed our focus: we now concentrate on behaving ethically only where concrete rules exist, leaving unregulated contexts dangerously unprotected. The consequences of that blind spot were on full display in a recent case involving the social media giant TikTok.

TikTok content moderator Candie Frazier has sued the company, claiming she developed psychological trauma from reviewing disturbing content for the platform. According to the case filings, Frazier and her fellow moderators are required to sit through shifts of up to 12 hours reviewing content that features graphic violence, conspiracy theories, and other harmful imagery. Filed in California federal court, the case initiates a much-needed discussion about the inadequate safety measures in place for content moderators, who are responsible for policing an impossibly wide expanse of social media.

“As a result of constant and unmitigated exposure to highly toxic and extremely disturbing images at the workplace, Ms. Frazier has developed and suffers from significant psychological trauma including anxiety, depression, and post-traumatic stress disorder,” the official complaint stated.

Notably, Frazier is not technically a TikTok or ByteDance employee. She works for Telus International, which contracts out content moderators to TikTok and a host of other social media platforms. By Frazier's account, however, her day-to-day work still falls squarely under TikTok and ByteDance's control.

“We strive to promote a caring working environment for our employees and contractors. Our Safety team partners with third party firms on the critical work of helping to protect the TikTok platform and community, and we continue to expand on a range of wellness services so that moderators feel supported mentally and emotionally,” said a TikTok spokesperson.
