LAST week, during the MacTaggart lecture at the Edinburgh International Television Festival, Jon Snow attacked Facebook for the “dark, cancerous” way it spreads fake news. During the US elections, he complained, the “same algorithm that prioritised many amazing reports of ours, also prioritised fakery on a massive scale”. The platform, he said, had a moral duty “to prioritise veracity over virality” and it was failing in that.
But isn’t it really us who are failing? There are many reasons to be angry at Facebook and, mostly for privacy and data-collection reasons, I am not its biggest fan. But even if the platform were to perfect its algorithms and create complex fact-checking systems, the problem of fake news would not go away. High-profile bogus stories like the infamous “Pope endorses Trump for president” might be buried or delegitimised, but many of us would still be passing round, liking and sharing tales that were either untrue or distortions of fact. Because it is us, as much as Facebook, who are the problem.
Rather than point the finger at Facebook, we need to look at why those stories thrive and trend in the current climate; why there is such an appetite for news from alternative, non-mainstream sources. Facebook, for instance, is not to blame for the decline in trust in the media, which has taken place over decades and has created the atmosphere in which fake news thrives.
Bogus news and sensationalism have been around since Johannes Gutenberg invented the printing press, and even before. Fake stories of child-murdering, blood-drinking Jews, which have existed since the 12th century, formed part of the foundation of anti-Semitism. The problem is the way the world is speeding up, the rumour-mill now rippling outwards at lightning speed, and the way our liking for extraordinary or alarming stories has become exaggerated. Meanwhile, the sheer quantity of news, and its complexity, is leading to a kind of fatigue of the critical faculties, in which people are so exhausted that they fall back on their emotions and prejudices.
Fake news also thrives in a climate of fear and anxiety. In an interview I did with the influential neurosurgeon James Doty last year, he observed how headlines and news stories draw attention by creating alarm and igniting our fight-or-flight response. “The media create a narrative of threat,” he said. “It immediately grabs your attention. You turn your head towards the danger. Threat is what keeps you interested. People in power know this, political parties know this, and manipulate it.”
Doty described this reaction as “evolutionary baggage”, adding that “the sad thing is that, whether it’s Brexit or Donald Trump, it results in the stimulation of this system. You’re fearful, you’re anxious, you’re scared. Then you can be manipulated”.
I doubt, therefore, that any new algorithm will solve the fake news problem. For it is not exactly that Facebook is to blame, but that something in the mechanism of social media naturally leads to a situation in which lies can be passed on swiftly and indiscriminately. It is like an out-of-control rumour-mill, in which a scurrilous story that once wouldn’t have got out of the pub and down the street without being corrected is spread around the globe within hours.
Nor do I think the solution is, as is often suggested, to recategorise social media platforms as publishers and subject their content to libel laws, thus reversing the provisions made in the English Defamation Act of 2013. Naturally, that’s not what Facebook wants, since its model would most likely become financially unviable. But the real question is whether it’s really what we want. After all, for most of us, Facebook is not just a news provider but a means of communication. For a great many families and friends, it’s how they keep in touch, share enthusiasms, debate issues.
More often, after all, it’s what Facebook does censor, not what it doesn’t, that bothers me. It’s odd, therefore, to hear Snow suggest that the “moral duty” for rooting out lies and fakery should lie with Facebook and Google, two parties he has already described as having monopolies over information. Do we really want that duty to be put in the hands of these monopolies?
What we need is to teach our children – all children and not just those going into further education – critical thinking. We need to prime our own, and our children’s, bullsh** detectors. And we need to nurture a sense of responsibility over what we share and like, to make that a moral issue in the way that the spreading of false gossip has always been. Here in Scotland, the home of David Hume, the birthplace of Enlightenment sceptical thinking, can’t we start a movement for a new Enlightenment, designed for the digital age?