The Popular Video Platform Allegedly Leads Child Accounts to Explicit Material in Just a Few Taps

According to a fresh inquiry, the widely used social media app has been observed directing children's accounts to adult videos after only a few taps.

Testing Approach

Global Witness set up fake accounts using a minor's date of birth and activated the "restricted mode" setting, which is meant to limit exposure to inappropriate content. The study's authors observed that TikTok proposed sexualized and explicit search terms to the simulated accounts, which were created on unused smartphones with no prior browsing data.

Troubling Search Prompts

Phrases offered under the "suggested searches" feature included "provocative attire" and "very rude babes", and later progressed to keywords such as "graphic sexual content". For three of the accounts, the adult-oriented recommendations appeared immediately.

Quick Path to Pornography

After a "small number of clicks", the investigators found pornographic content ranging from women flashing to penetrative sex. The organization stated that the content attempted to bypass filters, typically by embedding itself within a benign image or video. For one profile, the process took two taps after logging on: one on the search feature and one on the suggested search.

Legal Framework

Global Witness, whose remit includes examining big tech's impact on societal welfare, reported running several experimental rounds: one set before child protection rules under the British online safety legislation came into force on 25 July, and further tests after the measures took effect.

Alarming Results

Investigators said that multiple clips featured someone who appeared to be below the age of consent; these were reported to the online safety group that tracks exploitative content.
The campaign group asserted that the social media app was in breach of the Online Safety Act, which requires social media firms to prevent children from viewing inappropriate videos such as adult material.

Official Reaction

A spokesperson for Britain's media watchdog, which is responsible for enforcing the law, said: "We value the research behind this study and will review its results." Ofcom's codes for complying with the act specify that tech companies posing a significant risk of presenting inappropriate videos must "configure their algorithms to block dangerous material from young users' timelines". The platform's rules ban adult videos.

TikTok's Statement

The social media company said that after being contacted by the organization, it had deleted the offending videos and made changes to its suggestion feature. "Immediately after notification of these assertions, we acted promptly to examine the issue, take down videos that breached our guidelines, and launch improvements to our search suggestion feature," a spokesperson said.