The U.S. Department of Homeland Security has reportedly launched an investigation into TikTok over how the platform handles content depicting child sexual abuse and the moderation controls it has put in place. The agency is examining the alleged exploitation of a privacy feature called “Only Me,” which was reportedly abused to share problematic content, something the Financial Times claims to have verified in partnership with child safety groups and law enforcement officials.
The Only Me feature lets users save their TikTok videos without posting them online. Once a video’s status has been set to Only Me, it can be viewed only by the account’s owner. In this case, the credentials of accounts that shared child sexual abuse material (CSAM) were passed around among bad actors. Because the abusive videos never reached the public domain, they avoided detection by TikTok’s moderation system.

TikTok is no stranger to the problem
This isn’t the first serious probe into TikTok. The number of Department of Homeland Security investigations covering the spread of child exploitation content on TikTok reportedly increased sevenfold between 2019 and 2021. And despite bold promises of strict policy enforcement and punitive action against abusive content depicting children, bad actors apparently continue to thrive on the platform.
“TikTok talks constantly about the success of their artificial intelligence, but a clearly naked child is slipping through it,” child safety activist Seara Adair was quoted as saying. Interestingly, the federal agency banned TikTok in March this year over data security concerns, covering all systems managed by the department’s information technology division, including phones and computers.
This also isn’t the first time TikTok has drawn attention for the wrong reasons. Last month, two former TikTok content moderators filed a lawsuit against the company, accusing it of failing to provide adequate support while they handled extreme content depicting “child sexual abuse, rape, torture, bestiality, beheadings, suicide, and murder.”
A BBC investigation from 2019 revealed predators targeting children as young as nine with sleazy comments and propositions. Elizabeth Denham, the U.K.’s information commissioner, launched a probe into TikTok the same year over the platform’s handling of personal data belonging to underage users. And given TikTok’s immense popularity among young users, simply deleting the app isn’t as straightforward a choice as it is with Facebook.
The stakes are increasingly high, with media regulator Ofcom claiming that 16% of toddlers aged three to four consume TikTok content. According to the U.K.’s National Society for the Prevention of Cruelty to Children (NSPCC), online grooming crimes reached a record high in 2021, with children at particularly high risk. Even though Instagram and Snapchat are predators’ preferred platforms, reports of horrific child grooming on TikTok have surfaced online on multiple occasions in the past few years.
TikTok has lately enforced measures to keep its young user base safe. Last year, TikTok announced that strangers would no longer be able to contact accounts belonging to children under 16, and that those accounts would default to private. The short-form video sharing platform also tightened restrictions around downloading videos posted by users under the age of 18. TikTok additionally added resources to its platform to help sexual assault survivors, bringing in experts from the Rape, Abuse & Incest National Network (RAINN) and providing quick access to the National Sexual Assault Hotline.