TIKTOK was used to spread Covid misinformation videos to children as young as nine, a new probe has revealed.
The study by NewsGuard found that children were exposed to fake news about vaccines and Covid conspiracy theories on TikTok – despite not searching for the information.
In the study, conducted between August and September this year, nine participants aged nine to seventeen were split into two groups and told to create and interact with a TikTok account for 45 minutes.
One group were asked to watch coronavirus videos when they appeared, scroll through videos quickly, click on creator profiles and “like” posts with misinformation.
Other children were instructed to resist “liking” posts and clicking on hashtags.
In the shocking findings, all but one of the children were shown misinformation related to Covid-19 within 35 minutes, regardless of which group they were in.
Among the spew of misinformation, one video allegedly claimed that vaccines kill people and that Covid can be treated with the dangerous drug hydroxychloroquine.
Another video claimed that eating charred orange with brown sugar can restore your sense of taste and smell – a claim that has never been proved.
Only 23 minutes after signing up for TikTok, one 13-year-old was shown a video purporting to be about a man who died three days after being vaccinated.
In the 20 minutes that followed, TikTok’s algorithm served up three more videos implying that vaccines were dangerous or lethal.
Alex Cadier, UK managing director at NewsGuard, said: “TikTok’s failure to stop the spread of dangerous health misinformation on their app is unsustainable bordering on dangerous. Despite claims of taking action against misinformation, the app still allows anti-vaccine content and health hoaxes to spread relatively unimpeded.”
He added: “The difficulty in really knowing the scale of this problem is that TikTok hold all the information and get to mark their own homework. If social media platforms aren’t going to be fully transparent about this, then legislation like the Online Safety Bill might be the only option to force their hand.”
Ofcom, the media regulator, has reported that the app grew rapidly last year amid the Covid-19 pandemic and now reaches 14 million adults in the UK.
Over the last 12 months, the app removed 194 million videos, with child safety concerns commonly cited, Ofcom said.
That figure included more than 30,000 videos containing coronavirus misinformation in the first three months of 2021.
A TikTok spokesperson said: "The safety and wellbeing of our community is our priority, and we work diligently to take action on content and accounts that spread misinformation, while also promoting authoritative content about Covid-19."