THE number of online grooming crimes has soared by nearly 60% in Scotland while children have been waiting for online safety laws, it can be revealed.

The NSPCC, the British child protection charity, first called for social media regulation to protect children from sexual abuse in 2017 and has been campaigning for robust legislation ever since.

In the three years before it made that call, the number of "communicating indecently with a child" offences averaged 399 a year.

Police Scotland data shows that over the past three years the annual average has risen to 637.

The charity said the number of offences is likely to be far higher than those known to police.

The new analysis of the scale of child sexual abuse taking place on social media comes ahead of MPs and Lords making final decisions next month on the Online Safety Bill - which aims to protect children by making social media platforms more responsible for their content.

In response, the charity is urging politicians on all sides to support the bill in its final stages and pass what it calls "vital legislation".


The charity has been fighting for changes to the law so that senior managers of social media sites are held criminally liable if children are exposed to preventable abuse.

It has been calling for a statutory child safety advocate to give voice to children and sexual abuse victims.

It says that unregulated social media is fuelling online child sexual abuse, and that behind every offence there could be multiple child victims who are continually re-victimised as images are shared.


A draft Online Safety Bill was published over two years ago, but the charity says regulation was first promised by the government in 2018 following the NSPCC's call for action.

The charity has been campaigning for strong legislation ever since, working closely with survivors, government, parliamentarians, and other civil society groups to ensure it effectively tackles the way social media and gaming sites contribute to child sexual abuse.

NSPCC analysis of Police Scotland data showed that more than 3,500 online grooming crimes have been recorded in Scotland while children have been waiting for the bill to become law.

And some 1,873 of the offences took place against primary school children, with under-13s making up more than half of victims.

The legislation will mean tech companies have a legal duty of care for young users and must assess their products for child abuse risks and put mitigations in place to protect children.

The charity says it will give the regulator Ofcom powers to address significant abuse taking place in private messaging and require companies to put safeguards in place to identify and disrupt abuse in end-to-end encrypted environments.

The NSPCC said these measures were "vital" to effectively protect children from "this most insidious abuse".

It is still seeking assurances that the legislation will effectively regulate Artificial Intelligence (AI) and immersive technology and wants an online child safety advocacy body specifically to speak with and for children as part of the day-to-day regulatory regime.

It argues that this would help spot emerging risks and fight for the interests and safety of children before tragedies arise.


An NSPCC source said that crimes are likely to be prevented if companies abide by the planned regulations and "get their house in order".

The charity says companies should not wait any longer to act: they should accept that regulation is coming and work with Ofcom and child safety groups to finally provide the duty of care children deserve online.

Sir Peter Wanless, NSPCC chief executive, said: “The number of offences must serve as a reminder of why the Online Safety Bill is so important and why the ground-breaking protections it will give children are desperately needed."

He said he was pleased that the Government has looked to strengthen the legislation.

The Online Safety Bill aims to introduce major regulation to social media for the first time.

Rules would be brought in for social media sites and user-generated platforms to compel them to remove illegal material, with emphasis on protecting children from seeing harmful content.

Companies that break these rules would face large fines from Ofcom.

The NSPCC has been seeking amendments to improve the bill's response to child sexual abuse.

It believes the creation of a child safety advocate would mirror statutory user advocacy arrangements that have proved effective in other regulated sectors.

The amendment would give Ofcom access to children’s voices and experiences in real time through an expert child safety advocate, similar to Citizens Advice acting for energy and postal consumers.

And after the UK Government committed to holding senior managers liable if their products contribute to serious harm to children, the charity says this must also cover cases where sites put children at risk of sexual abuse.

The move would mean bosses responsible for child safety could be held criminally liable if their sites continue to expose children to preventable abuse – a measure the charity says is backed by an overwhelming majority of the public.

The charity wants girls to be given specific protections, as Ofcom will produce guidance on tackling violence against women and girls for companies to follow.

It also wants companies to be made to crack down on so-called tribute pages and "breadcrumbing" – practices that use legal but often stolen images of children, and child accounts, to form networks of offenders and facilitate child sexual abuse.

Among those supporting a call to give children a voice in the online safety regulation is nurse Ruth Moss, whose 13-year-old daughter Sophie Parkinson took her own life in March 2014. The teenager had accessed websites about self-harm and suicide before her death.

YouGov polling carried out at the end of April showed that nine out of ten of the 153 Scots surveyed wanted an independent advocacy body to be created through an amendment to the bill.

A vast majority also said it was necessary that Ofcom listens to the opinions and experiences of children in its role as a social media regulator.