THERE are few things as subjective, or rather relative, as time, something demonstrated by one of the last century’s greatest theories. Since Arthur Eddington confirmed Einstein’s General Theory of Relativity during the solar eclipse of 1919, we’ve been moving at breakneck speed; the past 100 years have seen unimaginable change, as did the century before them. And yet for most of human history not much changed at all, and when it did, it did so very slowly.

As it happens, this week saw the 50th anniversary of a seismic change, in the form of a press release sent out by the University of California at Los Angeles, announcing the creation of the thing that we now call the internet. In those days, it was called ARPANET, though it was that October before they managed to send a message on it from UCLA to the Stanford Research Institute. The message was meant to be just the word “login”, but the system, running on two computers each the size of a house, crashed after the first two letters. It’s been 20 years since Homer Simpson asked “The internet? Is that thing still around?” and the answer is that it is literally all around, and within, almost every aspect of life.

For something which has been going for less time than several of this year’s Glastonbury headliners – including Kylie Minogue and The Cure – it’s been remarkably transformative, creating and destroying whole industries, and making some aspects of life before it almost incomprehensible to those who have grown up with it. To take a trivial example, think of how many film plots, even those from just a decade or two ago, would be impossible to write if set in the present: anything which relies on people being unable to find something out at once, to communicate immediately, or to record evidence is now just about unthinkable.

But then, as LP Hartley pointed out, the past is a foreign country; they do things differently there. One of the things they did most differently was the future. It’s also 50 years this month since a man first walked on the Moon; a year earlier, Stanley Kubrick and Arthur C Clarke had predicted we’d be living there by 2001. Of course, science fiction has an extremely poor prediction rate (it’s a genre designed to elucidate the concerns of the present, not to second-guess the future); John Brunner’s The Shockwave Rider (1975) is the only book I can think of that anticipated anything like the internet before the cyberpunk revolution of Neuromancer (1984, a notably science-fictional year) and the film Blade Runner (1982, though set, as chance would have it, in 2019).

That’s because dystopian futures make for more compelling dramatic backdrops than ones in which things improve. But it’s not a failing confined to fiction; in fact, someone clanging a bell and declaring that the end is nigh seems always to attract a ready audience of the gullible, even when all the evidence is demonstrably against them.

We’re certainly in a moment when optimism is the minority view, and when technology, liberal economics, democracy and practically every other post-Enlightenment Western notion, from the Industrial Revolution to basic property rights and freedom of speech, are viewed as at the very least “problematic” and, in the more apocalyptic visions, as a recipe for the planet’s destruction within decades.

So it’s perhaps worth having a look at the track record of the predictions of disaster made around the time of the internet’s creation and the Moon landing, and at what has actually happened.

One of the great concerns of the late 1960s and early 70s was population growth, because the doomsayers never considered that every human being is an asset as well as a drain on resources. So Paul Ehrlich, the consistently wrong author of The Population Bomb, claimed that 100-200 million people would be starving to death by 1979 because food production would be outstripped by population growth, and that four billion would die in the 1980s. Instead, the global population is now 7.5 billion, and average daily food intake has risen from 2,300 calories to 2,800.

Another was extreme poverty, bound to get worse under the capitalist system which oppresses the third world. In 1969, around 60 per cent of the world’s population subsisted on less than $2 a day, and 36 per cent on less than $1. Today, the proportion living on less than $1.90 a day (the World Bank changed the way it calculated poverty a few years ago) is under nine per cent, and falling. In fact, of those in extreme poverty in any given year, 90 per cent have moved out of the category by the following year.

But surely we’re depleting our natural resources? Oil was going to run out by 2000, according to the ecologist Kenneth Watt (who also thought, in 1969, that the temperature would drop by 11 degrees by then); the geochemist Harrison Brown confidently predicted that lead, zinc, gold, tin and silver would be unavailable by 1990, and copper by 2000. Apparently not.

Pollution, which so many now assure us is worse than ever and killing hundreds of thousands, must surely be one prediction these sages got right, though. Except that between 1970 and 2017, emissions of the six most common air pollutants in the US fell by 73 per cent.

Now, naturally, some of this improvement comes of paying attention to the warnings of the alarmists and altering our behaviour. The current shift in public opinion on single-use plastics may be such an instance. But it also shows how spectacularly wrong the doomsayers have been.
