Nine-year-olds added to malicious WhatsApp groups

Image caption: Parents have been warned to be aware of children's WhatsApp use (Getty Images)


Children as young as nine have been added to malicious WhatsApp groups promoting self-harm, sexual violence and racism, a BBC investigation has found.

Thousands of parents with children at schools across Tyneside have been sent a warning issued by Northumbria Police.

One parent, who we are calling Mandy to protect her child's identity, said her 12-year-old daughter had seen sexual images, racism and swearing that "no child should be seeing".

WhatsApp owner Meta said all users had "options to control who can add them to groups" and the ability to block and report unknown numbers.

It comes after the minimum age for WhatsApp users in the UK and Europe was lowered from 16 to 13. The government said it had received no advance notice of the change.

Schools said pupils in Years 5 and 6 were being added to the groups, and one head teacher discovered 40 children in one year group were involved.

The BBC has seen screenshots from one chat which included images of mutilated bodies.

Mandy said it took some coaxing, but eventually her daughter showed her some of the messages from a WhatsApp group which had 900 members.

"I immediately removed her from the group, but the damage may already have been done," she said.

"I felt sick to my stomach - I find it absolutely terrifying.

"She's only 12, and now I'm worried about her using her phone."

Image caption: Warnings were sent to parents via schools

The question of who set up the group and for what purpose is the subject of a police investigation.

Northumbria Police said it was investigating a "report of malicious communications" involving inappropriate content aimed at young people.

“We would encourage people to take an interest in their children’s use of social media and report any concerns to police,” a spokesperson said.

'Playing on their mind'

The WhatsApp messaging app has more than two billion users worldwide.

The NSPCC (National Society for the Prevention of Cruelty to Children) has expressed concern that WhatsApp reduced the age limit "before ensuring effective protections were in place".

Richard Collard, the NSPCC’s associate head of child safety online policy, said there were practical steps which platforms could take to stop users being added to harmful groups - but warned against a "blanket ban" which punishes children for platforms "that have been too easy to exploit".

The children's charity said experiences like that of Mandy's daughter were not unusual.

Senior officer for children's safety online, Rani Govender, said content promoting suicide or self-harm could be devastating and exacerbate existing mental health issues.

“It can impact their sleep, their anxiety. It can make them just not feel like themselves and really play on their mind afterwards,” she added.

Image caption: WhatsApp recently reduced its minimum age in the UK and Europe from 16 to 13 (Getty Images)

Groups promoting harmful content on social media have featured in high-profile cases, including the death of Molly Russell in 2017.

An inquest concluded the 14-year-old ended her life while suffering from depression, with the "negative effects of online content" a contributing factor.

Her father, Ian Russell, said it was "really disturbing" there was a WhatsApp group targeting such young children.

He added the platform's end-to-end encryption made the situation more difficult.

"The social media platforms themselves don’t know the kinds of messages they’re conveying and that makes it different from most social media harm," he said.

"If the platforms don’t know and the rest of the world don’t know, how are we going to make it safe?”

Image caption: Molly Russell took her own life after struggling with images of self-harm (PA Media)

A government source said the decision to lower the minimum age for WhatsApp access "flies in the face of growing concerns from parents about children's social media use and rising adolescent mental ill-health".

"Policymakers will rightly ask what evidence Meta has to prove it's safe for 14 year olds to access encrypted WhatsApp groups and what protections they're putting in place to empower parents to keep their kids safe."

Prime Minister Rishi Sunak told the BBC that, as a father of two children, he believed it was “imperative that we keep them safe online”.

He said the Online Safety Act was “one of the first anywhere in the world” and would be a step towards that goal.

“What it does is give the regulator really tough new powers to make sure that the big social media companies are protecting our children from this type of material,” he said.

"They shouldn’t be seeing it - particularly things like self-harm - and if they don’t comply with the guidelines that the regulator puts down, they will be in for very significant fines.

"We want our kids to be growing up safely - out playing in fields, or online.”

  • If you have been affected by any of the issues raised in this story you can visit BBC Action Line.

Mr Russell said he had doubts about whether the Online Safety Act would give the regulator enough powers to intervene to protect children on messaging apps.

It was "particularly concerning that even if children leave the group, they can continue to be contacted by other members of the group, prolonging the potential danger," he said.

He urged parents to talk to even very young children about how to spot danger and to tell a trusted adult if they see something disturbing.

Speaking to BBC Radio 4 on Friday, Pepe Di'Iasio, general secretary of the Association of School and College Leaders, likened WhatsApp to "a gateway drug" which takes children "beyond messaging" and into the realms of potentially harmful content.

"They believe they’re in a closed group, but that closed group is gained access from exterior predators," he told the Today programme.

He added schools were "incredibly worried", with teachers and school leaders battling unsafe use of social media "on a daily basis".

Image caption: Ian Russell, father of Molly Russell, now runs the Molly Rose Foundation

Mandy said her daughter had been contacted online by a stranger even after deleting the chat.

“She also told me a boy had called her, as a result of getting her number from the group, and had invited ‘his cousin’ to talk to her too,” she said.

"Thankfully she was savvy enough to end the call and reply to their text messages saying she was not prepared to give them her surname or tell them where she went to school."

'Profits after safety'

Mr Russell said parents should never underestimate what even young children are capable of sharing online.

“When we first saw the harmful content that Molly had been exposed to before her death we were horrified,” he added.

He said, at the time, he did not believe global platforms would either carry such content or allow their algorithms to recommend it.

"We thought the platforms would take that content down, but they just wrote back to us that it didn’t infringe their community guidelines and therefore the content would be left up," he said.

“It’s well over six years since Molly died and too little has changed.

"The corporate culture at these platforms has to change; profits must come after safety.”

Follow BBC North East on X (formerly Twitter), Facebook and Instagram. Send your story ideas to northeastandcumbria@bbc.co.uk.
