By Joanna Barrett, NSPCC Scotland policy and public affairs manager

A COUPLE of decades ago, when the internet was still in its infancy and our mobile devices were used primarily for making phone calls, few could have predicted the rise of social media: platforms so powerful they would come to shape our mood, our relationships, even our politics.

And while Facebook, Twitter, Instagram and the like have much to offer, social media can also be an online “wild west”: unregulated and unsafe, especially for children.

The dark side of social media is never far from the headlines. In the past two weeks alone, a Scottish Parliamentary Committee has called for research on the impact of social media on children’s mental health, and Police Scotland has launched a campaign targeting those who seek to groom children online.

At the NSPCC, we are clear that online safety is the biggest child protection challenge of this generation. And social media platforms must play their part in facing that challenge.

Inaction from social media providers has fuelled the scale and extent of the risks of online abuse that children face: the production and distribution of child abuse images, harmful and inappropriate content on suicide and self-harm, and the growing scale of online grooming.

Most of the levers for change in this area are reserved to Westminster. During a recent session of First Minister’s Questions, Nicola Sturgeon called on the Scottish Parliament to unite to call on the UK Government to review online regulation without delay.

But with Scotland home to a growing gaming industry, the Scottish Parliament should also consider whether more could be done here to ensure tech companies build child protection in from the outset.

The issue, however, is a global one, with children in Scotland just as at risk from online harm as their counterparts elsewhere.

The UK Government is due to publish an Online Harms White Paper, which presents us with an unprecedented opportunity to protect children from abuse. Social networks can no longer be given the benefit of the doubt. Since 2005 there have been 13 self-regulatory initiatives, but every one of them has ultimately failed to keep children safe from the threat of online abuse.

The NSPCC’s Wild West Web campaign is calling for social media and online platforms to be subject to a legally enforceable duty of care. This means that sites must proactively identify foreseeable risks on their platforms and take steps to mitigate them through the design and function of their technology. This would ensure existing sites become safe and those of the future are built to be safe from the start. Other things available to our children (food, toys and clothes, for example) all meet standards ensuring they are safe for children to have and to use. Social networks must be the same.

The regulator should be given information disclosure powers to enable it to assess the scale and extent of the risks that children face. Platforms should be legally required to proactively disclose safety breaches to the regulator and risk-assess new products offered to children.

A regulator is the right solution because a body with the right tech expertise will give certainty to the industry. However, the regulator must have the powers necessary to do its job. It must be able to apply measures that encourage compliance, including the ability to levy fines and refer platforms and named directors for potential prosecutions.

For too long children have paid the price for social networks failing to tackle inappropriate content and abuse on their sites. It’s time to ensure every child is finally kept safe online.