At 9.37am exactly 21 years ago today, a researcher at CERN – the big hole in the ground in Switzerland that brought you the Higgs boson – posted an entry on alt.hypertext.

His name was Tim Berners-Lee, and he had invented the world wide web.

It has changed the world so dramatically that younger readers may need to be reminded of what happened. The internet (which already existed) is not the same thing as the web, but the web – a series of linked hypertext documents accessed through browsers – is what made it accessible to most people. If one thinks of the internet as writing, what Sir Tim, as he now is, created was like the invention of the postal system.

It seems like an impossibly distant past. Sir Tim's post, offering an alpha-test download of the software he had been developing since 1989 – a simple line-mode browser, a hypertext editor and a skeleton server daemon – appeared on a Usenet newsgroup (now absorbed, not for the better, into Google Groups), then still chiefly used by university academics and computer buffs and accessible only with newsreader software. For more evidence that this is ancient history, at the bottom of his message he included his fax number.

Thanks to the web, the internet became available to everyone: more than two billion people are connected, more than 25 billion pages can be indexed, and there are more than a trillion unique URLs. The predictions of science fiction writers such as Arthur C Clarke and John Brunner, who foresaw all human knowledge becoming instantly accessible through networked computers, have become reality.

One result of this change – certainly comparable to the invention of railways or the telegraph, and perhaps even more momentous – is that it is now very difficult to remember what life was like before. A contemporary science fiction novelist, Charles Stross, wrote a blog post a few years ago pointing out just how difficult it would be to describe aspects of cyberspace (his example was multi-user online games being ingeniously spammed by gold farmers) to someone from an earlier point in history – even if it were as recently as the 1970s.

But the globalisation that technology has brought has changed dozens of everyday transactions. To take a couple of trivial examples, it now seems almost inconceivable that, only 15 or 20 years ago, large chunks of overseas holidays were devoted to queuing up to change money, or that you had to head off to a reference library if you really wanted to check which record kept Ultravox's Vienna off the number one slot.

Although most of us regard the digital revolution as almost entirely beneficial, some have also wondered whether something has been lost amidst the many gains. The downside of the many things computers can do for us is our vulnerability when they fail, as customers of RBS and NatWest discovered recently when their system crashed. A few years ago Estonia – one of the best-connected countries in the world – was brought to a near halt when Russian hackers attacked its banking system and prevented newspapers from being printed.

Even the ready availability of almost all human knowledge is not necessarily an unalloyed advantage. People have access to information, but it may be of limited use if they have no context in which to frame it and no sense of how it is connected. You can look up anything, but that doesn't help you to know what it is that you don't know, any more than cutting and pasting sections of text implies that you have understood the arguments behind them.

Of course this idea of technological progress being a mixed blessing, though thrown into sharp relief by the information revolution, is not unique to computing. The ancient Greeks complained about the invention of written history, on the grounds that it undermined both memory and mythology. The inventions of the industrial revolution, and plenty since, destroyed as many trades as they created, and drove people from the country to the town.

The march of all technology tends to do two apparently contradictory things. It provides more and more opportunities for individuals to increase their wealth, liberty and autonomy and yet at the same time it creates an ever-growing interdependence.

There are, particularly during times of recession, periodic fads for self-sufficiency drawn from Voltaire's Candide, or Tom and Barbara in The Good Life, as people feel the urge to cultivate their gardens. But it is an illusory ambition. We were exiled from the garden long ago – and the reason for our expulsion, as the Bible tells us, was the acquisition of knowledge.

Barack Obama has recently been attacked by conservative Americans for telling the owners of small businesses that they didn't build their enterprises by themselves. This was certainly a very stupid thing to say, politically. You wouldn't need to be a survivalist gun-nut in Idaho to take a dim view of someone telling you that your own hard work counted for nothing beside what the state provided.

All the same, there's something in the President's claim. It was Adam Smith who pointed out how much more could be done when individuals specialised, and their work was brought together in projects that no one person could have completed on his own. No one, after all, would know how to make a watch if it involved everything from digging up and smelting the metals to tanning leather for the strap.

We all owe a great deal to other people. But politicians are rather too ready to claim that our debt is therefore to the state. It is true that CERN was funded by European taxpayers – it cost about the same as the Olympics, and offers incomparably better value – and that it is a shining example of public funds well-spent. But Sir Tim dreamt up the web off his own bat (and gave it away to the wider world), and its content is largely created by its users.

A web brings together disparate threads to weave a new, ingenious structure, but it may also be used as a trap. As with any other technology, whether it proves a boon or causes us to lose valuable, more ancient wisdom is largely up to us.