The latest round of PISA results has now been published and, as expected, much of the response has been characterised by panic, puffed-up rhetoric and a somewhat tenuous relationship with the concept of accuracy.

So first, some housekeeping.

Looking at the changes in PISA scores only between the latest results and the previous set (in this case, comparing 2022 data only to 2018) is always a complete waste of time, but is particularly meaningless when that four year period included the small matter of a global pandemic.

Scotland’s declining PISA performance is a long-term trend. It predates Nicola Sturgeon’s (destructive) ‘I want to be judged on this’ speech. It began to emerge before the introduction of Curriculum for Excellence. It was a feature prior to the first election victory of the SNP.

I am well aware that this is thoroughly inconvenient for some people, but that’s just too bad I’m afraid.

There are things that PISA can tell us about recent policy changes in Scotland, or the potential problems with aspects of our curriculum, but none of them come in broad brush strokes.

Pretty much everything about PISA, just like pretty much everything about education, is spectacularly complicated – people selling you a simple analysis of this data, or short-term solutions to the problems they claim it reveals, are acting out of ignorance at best and malice at worst.

This data should provoke some soul-searching in Scotland – it should not become a megaphone through which politicians, pundits, columnists and lobbyists scream that ‘something must be done!’

Panic is good for clicks but it is bad for making policy, and would be particularly ill-advised given the broader patterns we can see in the latest PISA scores.

At an international level the most obvious outcome from these results is that there has been a significant and widespread decline in the performance of 15-year-olds. This isn’t a Scotland problem – it is happening around the world and across a range of different education systems.

It would, of course, be tempting to attribute this whole phenomenon to the impact of the Covid pandemic but – as ever – things just aren’t that clear-cut. While there is a link, for example, between schools being closed and pupil performance declining in a general sense (you’d certainly hope so, otherwise what are we spending all that money on schools for?) there doesn’t seem to be a direct link between the length of closures and the testing outcomes now recorded by the OECD.

What’s more, some countries have seen performances improve in the latest tests and, while there are always going to be outliers in any data set, this does rather suggest that other factors are playing a part.

But there’s another reason to look beyond the pandemic – the declines we’re seeing, both in Scotland and around the world, are actually part of a long-term trend. Yes, we have seen an 'unprecedented' fall in performance this time, but the OECD themselves have admitted that 'falling scores in reading, science and maths' were 'already apparent prior to 2018.' And even that is underselling the reality.

In fact, the long-term declines in PISA scores might raise more questions for the OECD than for any of the individual countries involved. The whole point of this exercise is to give countries the data and insight they need in order to 'pursue reforms to education systems for a brighter, more prosperous future.'

But it isn’t at all clear that PISA has actually done that.

To illustrate this problem, we can look at the experiences of Germany and Finland. Both have been held up as examples of PISA success stories – but the contexts are very different.

When the first PISA results were published, Finland became an overnight educational superstar by unexpectedly topping these new international education league tables. But since then, scores have fallen – sometimes dramatically.

This wasn’t a huge surprise in Finland, because their own national assessments and academic studies had already told them that pupils’ knowledge and skills had slipped over the preceding years. Various interpretations have been offered for the declining performance, including the need for, and subsequent implementation of, curricular changes and even the amount of time and energy spent on “explaining the past to thousands of education tourists.”

What is undoubtedly clear, however, is that being a participant in PISA, and indeed being a bit of a poster-child for international educational comparisons, doesn’t seem to have benefited Finland when it comes to those PISA measurements.

But the issue is even more striking when we look a little further south.

Germany is sometimes presented as the ultimate PISA success story. Indeed, the OECD themselves like to highlight the fact that the country did far worse than expected in 2000, accepted that it did not have the world-leading education system that many had assumed, and set about making changes to improve the state of schooling in the country.

And it worked. Between 2000 and 2012 Germany recorded huge improvements: PISA scores went up by 24 points in both reading and maths, while in science, they climbed by a quite incredible 37 points. To put that in perspective, the OECD says that 20 points is equivalent to a normal year of learning.

The OECD is very keen to claim the credit for all that, arguing that PISA 'helped guide Germany’s government with key education reforms.'

The trouble is that it didn’t last: in the most recent PISA data, Germany’s reading scores are four points lower than they were in 2000; science scores are five points higher, but have still seen a huge drop since 2012; and maths scores are now 15 points lower than they were at the start of this whole process.

On the OECD’s own terms, Germany’s participation in PISA doesn’t seem to have done them many favours in the long run. The same seems to be true for many other countries.

To be clear, this is not to say that PISA data is entirely useless, or automatically harmful. Being able to compare international data on areas like reading and maths to our own national systems can be valuable, and the work that is done to analyse additional factors such as gender gaps, immigrant backgrounds, students’ sense of belonging, and levels of parental engagement provides some genuinely useful insights.

So it is not the data that is the problem but rather the interaction with, and application of, that data. Used carefully, it could be a positive force, but given that the default response is so often the opposite of that approach, it is worth asking whether PISA really is a force for good at all.