TikTok showing child sexual abuse videos to content moderators: Report

IANS

San Francisco, August 6

Chinese short-form video app TikTok reportedly shows sexually exploitative videos of children as part of its content moderators' training, the media reported.

According to Forbes, a largely unsecured cache of pictures of children being sexually exploited has been made available to third-party TikTok content moderators as a reference guide.

“These parents don’t know that we have this picture, this video, this trauma, this crime saved. If parents knew that, I’m pretty sure they would burn TikTok down,” Whitney Turner, former moderator for TikTok, was quoted as saying in the report that came out on Friday.

Turner worked for third-party moderation company Teleperformance’s TikTok programme in El Paso, Texas.

She was given access to a shared spreadsheet “filled with material determined to be violative of TikTok’s community guidelines, including hundreds of images of children who were naked or being abused”.

The document, called the Daily Required Reading (DRR), "was widely accessible to employees at Teleperformance and TikTok as recently as this summer".

Sources told Forbes that hundreds of people across both companies had free access to the document.

“The DRR and other training materials were stored in Lark, internal workplace software developed by TikTok’s China-based parent company, ByteDance,” the report noted.

Turner even reported this to the Federal Bureau of Investigation (FBI), but to no avail.

A TikTok spokesperson said the “training materials have strict access controls and do not include visual examples of CSAM (child sexual abuse material)”.

However, the spokesperson said that the company works with third-party firms "who may have their own processes".

Teleperformance also denied that it showed employees sexually exploitative content.

The report, however, mentioned that Teleperformance showed employees graphic photos and videos as examples of what to tag on TikTok.

"I have a daughter, and I don't think it's right—just a bunch of strangers watching this," another former Teleperformance employee, Nasser, was quoted as saying.
