TikTok shows footage of child sexual assault to content moderators: Report

Chinese short-form video app TikTok shows sexually exploitative videos of children as part of its content moderators' training, according to a media report.

According to Forbes, a largely unsecured cache of pictures of children being sexually exploited has been made available to third-party TikTok content moderators as a reference guide.

"These parents don't know that we have this picture, this video, this trauma, this crime saved. If parents knew that, I'm pretty sure they would burn TikTok down," Whitney Turner, former moderator for TikTok, was quoted as saying in the report that came out on Friday.

Turner worked for third-party moderation company Teleperformance's TikTok program in El Paso, Texas.

She was given access to a shared spreadsheet "filled with material determined to be violative of TikTok's community guidelines, including hundreds of images of children who were naked or being abused".

The document, called the Daily Required Reading (DRR), "was widely accessible to employees at Teleperformance and TikTok as recently as this summer".

Sources told Forbes that hundreds of people across both companies had free access to the document.


"The DRR and other training materials were stored in Lark, internal workplace software developed by TikTok's China-based parent company, ByteDance," the report noted.

Turner even reported this to the Federal Bureau of Investigation (FBI), but to no avail.

A TikTok spokesperson said the "training materials have strict access controls and do not include visual examples of CSAM (child sexual abuse material)".

However, the spokesperson said that TikTok works with third-party firms "who may have their processes".

Teleperformance also denied that it showed employees sexually exploitative content.

The report, however, mentioned that Teleperformance showed employees graphic photos and videos as examples of what to tag on TikTok.

"I have a daughter, and I don't think it's right -- just a bunch of strangers watching this," another former Teleperformance employee Nasser was quoted as saying. (AA/IANS)
