LIKE a Uri Geller fever dream, the movie Minority Report envisioned psychics being forcibly plugged into computers to predict crimes before they are even committed. If you don’t remember it, that’s because you’re a Sunday Herald reader. The only Tom Cruise movie you’ve ever watched was Rain Man on Channel 4 one night, too full of organic gin and quinoa flan to turn the channel.

Apart from accurately predicting voice-controlled homes, touch-motion interfaces, facial recognition, personalised advertising and the algorithmic fog that stops us escaping our personalised echo chamber cells, you never missed much. The film was yet another example of Hollywood cashing in by dragging a dead sci-fi author’s satirical fantasy world kicking and screaming into reality. You can clearly see why Cruise signed on.

Nearly two decades later, however, paranoid android Philip K Dick’s pessimistic dystopian vision is eerily prescient. It now seems the war on crime can be won after all – with technology turning Earth into a perpetually monitored prison planet where no-one is free from surveillance and algorithmic assessment.

Where once the future of policing was encapsulated by Inspector Gadget and the bobby-on-the-beat fascist brutality of Judge Dredd, it’s clear that law enforcement is quickly advancing towards an even more judgmental, uncompromising marriage of authority and technology.

Mirroring the premise of Minority Report, a paradigm-shifting focus is now being placed upon predicting a crime before it can even take place. With no-one to arrest, the police of the future will likely get home in time for Emmerdale. Certainly the half-eight Corrie.

It’s important to note that no psychics are harmed in the complex new process of “predictive crime mapping”. Taking much of the strain are algorithms, which highlight recurrent patterns in the vast mountains of data on people, places and pilferage held by the police.
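For the curious, the core trick is less mystical than it sounds. Below is a minimal Python sketch of the hotspot-counting principle – with entirely invented incident data and grid size, and no resemblance to any force’s actual system:

```python
# A minimal sketch of the hotspot-counting idea behind "predictive crime
# mapping": bucket historical incidents by map grid cell and hour of day,
# then flag the buckets whose counts cross a threshold. Incident data and
# grid size here are invented for illustration.
from collections import Counter

GRID = 0.01  # degrees; hypothetical cell size

def cell(lat, lon):
    """Snap a coordinate to its grid cell."""
    return (round(lat / GRID), round(lon / GRID))

# (lat, lon, hour-of-day) for past incidents -- made-up sample data
incidents = [
    (55.8642, -4.2518, 3), (55.8640, -4.2520, 3),
    (55.8641, -4.2519, 2), (55.9533, -3.1883, 14),
]

counts = Counter((cell(lat, lon), hour) for lat, lon, hour in incidents)

# "Predict": any cell/hour bucket with repeated incidents gets patrolled
hotspots = [bucket for bucket, n in counts.items() if n >= 2]
print(hotspots)  # [((5586, -425), 3)] -- same spot, same wee small hour
```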

Many forces around the world – including the UK – are adopting variations of this highly sophisticated software, programmed to predict exactly where and when crime will happen. And, more importantly, who will be committing it. Police in Glasgow’s Sauchiehall Street at 3am will have no need for algorithms, however. They have eyes.

Some of this shiny new tech has somehow landed in the hands of South Wales Police, which was last week forced to defend the controversial facial recognition software that debuted during last year’s Champions League final. Apparently it is sticking with it – despite just eight per cent of identifications proving accurate. Only Juventus had a worse result that night. No wonder missing Manic Street Preacher Richey Edwards hasn’t been found yet.

THAT'S THE CHICAGO WAY...

Over in Chicago, a policing algorithm exists that creates something George Orwell may have satirically dubbed a “Strategic Subject List” – if it wasn’t already called exactly that. This is an extensively detailed prediction of every citizen’s potential future involvement in criminal offences.

Even if the only crime you’ve committed is walking hurriedly away from your dog’s poo, an algorithm might send out an armed response unit – or an airborne drone – to gun down Fido before his next bowel movement. It’s been nicknamed the “heat list” – and it simply studies everyone the police wish to keep an eye on from afar. No-one is untouchable in Chicago – and remember, if you put someone in hospital there, Irish ex-cops with Scottish accents will put you in the morgue. That’s the Chicago way.

Like Sir Sean Connery’s accent, this technology remains much the same no matter where in the world it goes. Greater Manchester Police created its own “predictive mapping” software around five years ago and Kent Police introduced a system called PredPol around the same time. PredPol’s algorithm was originally built to predict earthquake aftershocks but now forecasts crime in targeted locations during specific windows of time. It pins troubled locales on a map for officers to patrol when it predicts a high likelihood of violence or theft.
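PredPol’s roots are in so-called self-exciting models of earthquake aftershocks, in which each event temporarily raises the odds of another one striking nearby. A toy sketch of that idea – every parameter here invented for illustration:

```python
# A toy sketch of the aftershock-style ("self-exciting") model that
# earthquake-derived systems like PredPol are reported to use: each past
# crime temporarily boosts the expected rate of crime in that spot, and
# the boost decays with time. All parameters are invented.
import math

MU = 0.1      # hypothetical background crime rate per cell per day
ALPHA = 0.5   # boost each past incident adds
DECAY = 0.3   # per-day exponential decay of that boost

def intensity(now, past_times):
    """Expected incident rate in one cell, given past incident times (days)."""
    return MU + sum(ALPHA * math.exp(-DECAY * (now - t))
                    for t in past_times if t <= now)

# Two recent break-ins in a cell make tomorrow look riskier than usual
print(intensity(now=10.0, past_times=[8.5, 9.2]))   # ~0.81
print(intensity(now=10.0, past_times=[]))           # 0.10 background rate
```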

On the down side, recent research has shown that predictive algorithms may be systematically discriminating against offenders from ethnic minority backgrounds. Lacking human qualities such as common sense and decency, the program had apparently “learned” statistical bias from the skewed data it was fed, with no knowledge of the ethnic diversity of the populations it was judging.
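How does a mere program pick up prejudice? A crude illustration: patrol one postcode ten times harder than another and it racks up ten times the arrests, even when actual offending is identical – and any model scoring risk from arrest counts swallows the difference whole. All numbers below are invented:

```python
# A crude illustration of how an algorithm "learns" bias from skewed data:
# true offending is identical in both postcodes, but postcode "A" is
# patrolled ten times harder, so its residents dominate the arrest records.
# A model scoring risk by recorded arrests then brands "A" high-risk.
OFFENCE_RATE = 0.05            # same true offending rate everywhere
PATROL = {"A": 1.0, "B": 0.1}  # chance an offence is actually caught
POPULATION = 10_000

def recorded_crimes(postcode):
    """Arrests recorded = true offences x chance of being caught."""
    return POPULATION * OFFENCE_RATE * PATROL[postcode]

# The "risk score" the algorithm sees is arrests per head -- which only
# reflects patrol intensity, not behaviour
for p in ("A", "B"):
    print(p, recorded_crimes(p) / POPULATION)
# A 0.05  <- looks ten times riskier...
# B 0.005    ...despite identical offending
```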

Martin Luther King once had a dream, but never in his most comatose of REM states did he ever envision that computers would be racist too.

'THE MAN' HAS A HART AFTER ALL

UNLIKE Minority Report, the similarly dystopian blockbuster Demolition Man got short shrift from critics. Maybe 1993’s phoneless, internet-free world wasn’t ready for a tale of hidden “underclasses” being electronically tagged and monitored to maintain a world completely without crime. A bit like Kilmacolm – if you count ridiculous house prices as a backdoor application of social eugenics.

In the movie, Sylvester Stallone is at first sold on this utopian ideal, but comes to understand it only exists on an aesthetic surface level. The film soon deconstructs this societal perfection by subtly highlighting how technological “advances” have actually set the human race back centuries emotionally, socially and intellectually. In Demolition Man, our search for control, peace and autonomy led to the loss of what makes us human – our hearts.

HART – entirely coincidentally I’m sure – is the cosy-sounding acronym for a real-life police monitoring tool which is currently proving more than a wee bit controversial down ye olde England way. The Harm Assessment Risk Tool – again, an almost satirically Orwellian moniker – is Durham Constabulary’s experimental partnership with computer science academics.

Picture the white-coated herd with clipboards that crowded RoboCop when he first woke up inside his steel prison. That’s them.

As you’ve likely guessed, this AI system uses algorithms to predict whether local criminals are at a low, moderate or high risk of committing further crimes.
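HART is reported to run on a random forest – a committee of decision trees voting on each suspect. A minimal sketch of that approach, using invented stand-in features rather than HART’s 34 real predictors; an illustration, not the actual model:

```python
# A minimal sketch of HART's reported approach: a random forest trained on
# past custody records, classifying suspects as low/moderate/high risk of
# reoffending. Features, records and labels here are invented stand-ins.
from sklearn.ensemble import RandomForestClassifier

# Hypothetical features: [age, prior offences, years since last offence]
X_train = [
    [19, 6, 0], [34, 1, 8], [27, 3, 1],
    [45, 0, 20], [22, 9, 0], [38, 2, 5],
]
y_train = ["high", "low", "moderate", "low", "high", "moderate"]

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

# Classify a new suspect -- in Durham, a score like this feeds the
# decision on whether to offer the Checkpoint programme
print(model.predict([[24, 4, 1]]))  # e.g. ['moderate'] or ['high']
```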

The constabulary’s high head yins, however, are at pains to stress that HART does not decide whether suspects should be locked up. Yes, this is a friendly, compassionate algorithm. One that simply helps old-fashioned biological police officers decide if offenders should enter a rehab programme called Checkpoint. Cocaine addicts are perhaps referred to Checkpoint Charlie instead.

MOSAIC IS NOT A PRETTY PICTURE

LIKE Paul O’Grady’s real heart, Durham Constabulary’s crime-predicting HART algorithm is powered by many man-made ventricles – 34, in fact – and one of these is currently proving particularly contentious down south.

To help power HART, the force recently handed out £47,000 of taxpayers’ money to credit check firm Experian for data held by its information service Mosaic UK. This controversial payout has now drawn the ire of campaign group Big Brother Watch. In the Mosaic UK brochure (yes, the all-seeing eye of Sauron has a brochure), the firm boasts about its ability to provide a “wealth of richly detailed information on all individuals in the UK and the neighbourhoods they reside”.

However, far from receiving incisive individual psychological profiles on 66 million people, Mosaic UK seems to have simply handed Durham plod a fag-packet scrawl of crude stereotypes – a small-minded postcode lottery with Al Murray’s pub landlord picking the balls. In the document, folk are herded into highly generalised, meaningless categories like the enviable “Penthouse Chic” yahs, the skint “Bus Route Renters” and those parasitical “Dependent Greys”.

Mosaic UK actually delves further into the realm of deep offence, however. If you fall under its “Renting A Room” branding, its suggested accompanying “person names” are Lukasz or Monika. If you’re lucky enough to be classed among the “Uptown Elite”, however, you’re likely to be called either Benedict or Susannah. With such invaluable intel, let’s be glad that £47,000 wasn’t spent on buying each Durham resident a Curly Wurly and a packet of Space Raiders instead.

One academic barking his concern over police use of Mosaic UK’s generalisations is the wonderfully surnamed Andrew Wooff, a criminology lecturer at Edinburgh Napier University, who said: “I have concern about the primary postcode predictor being in there. You could see a situation where you are amplifying existing patterns of offending, if the police are responding to forecasts of high-risk postcode areas.”

Wooff adds that algorithmic analysis of such data could reinforce existing biases in policing decisions – and in the judicial system. Archie Macpherson adds “wooff”.
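To see Wooff’s amplification in action, here is a toy simulation of the loop – forecasts send patrols, patrols generate arrests, arrests inflate the next forecast. The numbers are invented, but watch a near-nonexistent gap compound:

```python
# A toy simulation of the feedback loop: patrols are sent to whichever
# postcode the forecast scores highest, extra patrols find extra arrests,
# and those arrests feed straight back into the next forecast.
scores = {"DH1": 11, "DH2": 10}  # near-identical starting arrest counts

for week in range(5):
    target = max(scores, key=scores.get)  # patrol the "riskiest" postcode
    scores[target] += 3                   # extra patrols, extra arrests
    print(week, scores)

# DH1's tiny head start compounds: {'DH1': 26, 'DH2': 10} after 5 weeks
```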

No alarm bells are ringing within the HART operation, however. In response to criticism, the team have simply stuck two fingers up by announcing an increase in the scope of their ambitions. They will soon “expand beyond the current Checkpoint treatment programme, with the (algorithmic) forecasts influencing many other decisions that are made in the wake of bringing a suspected offender into police custody”. Likely meaning no, no, no to any more cosy rehab sessions for hardened crims.

Such an expansion of the HART project will not please Big Brother Watch, which brands the scheme’s reliance on postcode generalisations as “truly dystopian”.

These campaigners raise concerns over this software being used as Judge Dredd-style police, prosecution, judge and jury. And when robot bodies are developed, they may become executioner too.