Education & Family

Is the internet safer after April Jones' murder?

The murder of April Jones led to policies that were "groundbreaking" in protecting children, a watchdog says.

April, five, from Machynlleth, Powys, was abducted and killed in 2012 by Mark Bridger, who had a slew of child abuse images on his computer.

Her parents pushed for change and the Internet Watch Foundation was given powers that have since seen it remove millions of images.

The organization said it was a "wake-up call", but it continues to face problems.

It said that, despite its ability to identify and remove abuse images, it now faces a "national crisis" in which children are being enticed into filming themselves at home, leaving them more vulnerable than ever.

  • Relive the dark moments of April Jones’ disappearance
  • Have lockdowns put children at risk of grooming?
  • Online reports of child sexual abuse increase by 93%
  • Instagram ‘biggest for child grooming online’

April Jones went missing near her home on October 1, 2012, sparking the largest search in UK police history.

She was kidnapped and killed by local resident Mark Bridger, a “dangerous fanatic” and pedophile, who was sentenced to life in prison for her murder.

A library of child sexual abuse images was found on his computer, along with search terms such as “naked young five-year-old girls” and images of murder victims, including Soham victims Holly Wells and Jessica Chapman.

He also had Facebook pictures of young local girls, including April and her sisters.

April’s remains have never been found.

After a campaign by her parents, who said the police were not doing enough, then-Prime Minister David Cameron gave the UK’s Internet Watch Foundation (IWF) additional powers to work with law enforcement to find and remove abusive images.

Mr Cameron was vocal after her initial disappearance, branding it “a parent’s worst nightmare”, especially as April, like his own son, had cerebral palsy.

Speaking about reforms to how such material was found, he said he wanted to be able to “look their parents in the eye” and say he had helped.

Ten years after April’s death, IWF chief executive Susie Hargreaves said the powers the organization was granted after her killing were a “game changer” for online child safety.

She said April’s death was a “wake-up call” and she praised the “bravery and courage” of April’s family to “do whatever it takes to make sure it doesn’t happen to any other child.”

Ms Hargreaves said: “It meant that we were able to massively increase the amount of child sexual abuse images and videos that we were able to remove from the internet, and I don’t think that would have been possible without the tragic events that happened around April’s death.”

At the IWF, 50 analysts work to remove the images and trace their source and where they are hosted, liaising with prosecutors and UK police forces.

Ms Hargreaves said: “In the year of April’s death we removed 13,000 websites from the internet – last year we removed 252,000 websites.

“Each single web page we remove may contain thousands of images — that equates to millions of images of children being sexually abused.”

The IWF is one of the few non-law enforcement organizations in the world authorized to search for and remove this type of material.

However, Ms Hargreaves said online child abuse is “at an all-time high” as UK police forces are recording more online child sex offenses than ever before and across more platforms.

“I have to face the fact that we’re probably never going to eliminate child sexual abuse online because it’s a global problem,” she said.

She added that the IWF has seen a huge increase in this type of crime during lockdown.

Before the pandemic, there were 300,000 people classified as ‘posing a risk to children’ in the UK.

In 2021, that number rose to 850,000.

The charity said there were eight million attempts to access child sexual abuse images in the UK via three internet service providers in the first three weeks of the pandemic.

Ms Hargreaves said it was also more common now for young people to be “tricked, encouraged and coerced” into filming the content themselves.

“These children are in their homes, their parents often think they are safe because we can hear domestic chatter in the background, and yet the parents seem totally unaware that their children are being exploited by pedophiles,” said Ms Hargreaves.

She added that the images are often of girls between the ages of 11 and 13, but said the victims are getting younger: in the first six months of 2022, the IWF removed 20,000 reports of self-generated content from children aged seven to 10.

“We have a national crisis and we must do everything we can to stop it,” Ms Hargreaves said.

Dai Davies, a former Metropolitan Police superintendent, assisted in the Madeleine McCann case and also advised April Jones’ family.

He said that although changes were made at the time, they did not go far enough, and that both “coordination” and “education” were needed to address the issue.

“It was 10 years ago now and what was shocking was the discovery that the defendant had so many pictures of children on his computer. That was indicative of the problem of child pornography on the internet even then,” Mr Davies said.

He said it was especially important now that more and more children are being enticed to create the images themselves.

“Every image is a crime scene and a crime against children and we have to deal with that as a society.”

He added that crimes must be prioritized and big companies held criminally accountable if they are found to have acted negligently.

“Ten years later, where do we stand? We are still dealing with this issue and the scale is horrendous.”

He said a windfall tax on these companies could be a way to help police and organizations protect children.

Companies such as Google and Microsoft have blocked thousands of search terms online; some of these searches are code words known only to perpetrators.

Claire Lilley, Google’s head of child safety, said they are “deeply committed to protecting children and our users from harmful content” and take “an aggressive approach” to targeting and blocking child sexual abuse material.

She added that the company has “invested heavily in teams and technology” to take down material and to help other organizations do the same, such as the IWF and the National Center for Missing & Exploited Children in the US.

The UK government recently said that online safety legislation, which will oblige tech companies to protect users from child abuse images, is to be re-submitted to Parliament. The government has been asked to comment.
