LIKE most people, I felt immediate relief when Big Tech pulled the plug on Donald Trump amid the Capitol riot. Silencing him would save lives, I felt, would save democracy.
However, once that momentary relief subsided, it was clear that the way in which Mr Trump was silenced should trouble us. We’re leaving it to the likes of Facebook’s Mark Zuckerberg to decide issues of freedom of speech. Big Tech, in the shape of Twitter and Facebook, is the cause of political havoc in the world. We must make no mistake about that – blame needs to be laid squarely at the door of these companies. It was Big Tech which handed Mr Trump the weapon that he, and those like him in Scotland, Britain and across the rest of the western world, used to divide us and spread hate.
Are we now going to leave it to Big Tech to police the chaos it created? We all know what happens when the fox runs the hen house. As America teeters on the brink of violence, with the inauguration of Joe Biden tomorrow, there’s never been a better time to recalibrate our dysfunctional relationship with social media.
At the heart of the problem is this: we’ve allowed social media to exist outside democracy – when, like any arm of the media, it’s central to how democracy functions. However, we can reform social media, and bring it back in under the umbrella of democracy. We can even make social media a positive force – something which strengthens democracy, rather than a tool for weakening it. There’s still time, though the window of opportunity is closing. The answer is quite simply old-fashioned regulation.
We need to think of social media as we think of other industries, like the food industry for example. We don’t allow the food industry to poison our bodies. Why let social media companies poison our minds?
Big Tech needs to be brought sharply to heel, and we won’t do that through self-regulation, such as Facebook and Twitter unilaterally deciding to censor Mr Trump. Only strong independent regulation can act in the public interest.
First, send regulators into Big Tech companies. We send inspectors into farms, abattoirs, supermarkets and restaurants, so do the same to digital firms. Regulators must be independent from government and have the power to investigate and scrutinise how Big Tech operates. Are they using technology to deliberately sow unrest? Do their algorithms (as we know they do) give prominence to lies, hate and misinformation? If so, tame the algorithm and bring the firms into line.
Regulators also need the power to find out exactly what Big Tech is doing with our data. What information are they gathering and how, and who are they selling it to and what are they doing with it? Any behaviour that is contrary to public interest would be deemed illegal and subject to punitive fines. License firms, and use the removal of the licence to ensure good conduct. If a restaurant poisons me, it gets shut down, after all.
The second level of regulation would force a complete reorganisation of Big Tech. Rather than allowing Mark Zuckerberg to decide if and when and who is censored, the regulatory framework would obligate Big Tech to remove posts which incite violence, for example. Failure would mean punitive fines. If posts peddled lies or misinformation, companies would be legally obliged to attach disclaimers and fact-checked statements. Defamatory posts would have to be removed. Companies would be legally required to cull fraudulent posts, such as states or political parties masquerading as ordinary users.
Although social media also needs to be subjected to real-world laws, we cannot simply treat Facebook or Twitter, for example, exactly as we treat the BBC in regulatory terms. Social media pages are user-generated; the content comes from members of the public, not trained journalists. However, that doesn’t exempt those users from the full force of real-world laws. Incitement and defamation can and should be punished in the civil and criminal courts, and social media companies should be under a regulatory obligation not just to remove or add disclaimers and factual clarifications to offending posts, but to pass any criminal posts to police, and to alert anyone seemingly defamed on their platform.
So regulation is double-edged: independent regulation by the state, and legally imposed self-regulation.
Next, we need to break up the Big Tech companies. They’re the railway and coal barons of the 21st century. Such a concentration of power – particularly in the realm of mass media – is absolutely contrary to the interests of functioning democracies, and fair competition in the market place. Democracies and monopolies are incompatible.
Importantly, none of this will happen unless we compel our politicians to act. Politicians need to be ordered – by us, the electorate – to legislate for regulation. The greatest beneficiary of the chaos sown by social media in the last decade has been the political class. It’s politicians who’ve fomented social media chaos for their own personal gain: for proof look to both the Yes and No camps in Scotland, the Leavers and Remainers of Brexit, and of course the political division pushed by both sides of the divide in America. The division politicians foster on social media clearly distracts us from issues which matter, like food in children’s bellies, and aids and abets their own, often inimical, agendas.
If a sensible system of regulation had been in place Donald Trump would never have reached the extremes he did. He’d have been checked and counter-checked so often that the regulation surrounding social media would have reined him in – and that’s exactly what lies at the heart of this debate: checks and balances in a healthy democracy. If someone like Mr Trump continued to ramp up violence, say, under a proper regulatory system, then suspending them from social media would feel compatible with democracy and freedom of speech.
Today, though, it’s Mark Zuckerberg who decides what freedom of speech means. The internet is a public space – like a street. Private companies don’t police our streets. In a democracy, the people consent to what form of policing we want – and instruct politicians to act accordingly. It’s time we did just that with social media.
Our columns are a platform for writers to express their opinions. They do not necessarily represent the views of The Herald