It is certainly true that Scotland’s PISA scores have fallen to their lowest ever level, and that the drops seen in the most recent results are large enough to be significant.

Reading scores have fallen by 11 points, or 2.2%, since the 2018 tests were administered, but that follows an equivalent increase from 2015, which means the reading score is now the same as it was back in 2015. Science scores have declined by just 7 points, or 1.4%, in the same period.

In maths, however, the drop is more severe, with average scores having fallen by 18 points since 2018. For context, the OECD regards 20 points as roughly equivalent to one full school year of learning, so the implications of this change in maths performance are, potentially, extremely serious.

However, we should always be cautious about drawing conclusions from a single change in any data set, and that is even more important now given that the most recent round of PISA testing came just after a devastating global pandemic that has very obviously affected not just young people's learning, but our societies as a whole.

We need to take a longer view and, if we look back to 2006, we get a better sense of the changes that have taken place in Scottish education, at least in terms of those measured by systems such as PISA.

Back then, performance in science was stronger than in maths, and reading scores were lower than both of them, but reading outcomes have remained pretty stable over time while science and maths performance have both slipped with each new testing cycle. 

Between 2006 and 2022 Scotland’s reading score fell by just 6 points, which is a few more than England (3 points) but a lot less than Wales (15 points). If we look further afield, we can see that Scotland has also witnessed a smaller drop in reading scores than Australia (15 points), Slovakia (19 points), New Zealand (20 points), Iceland (48 points) or Finland (57 points).


But things look worse when the focus shifts to science and maths. 

In science, Scotland’s score has fallen by 32 points since 2006 – the same as Wales, but more than double the change seen in England. New Zealand and Slovakia recorded slightly smaller decreases in that time, but Finland and Iceland witnessed even more severe declines in their national scores. 

Maths scores have fallen by even more – 35 points – since the 2006 round of PISA testing. That means the decline has been about twice as steep as that seen in Wales, more than ten times as severe as the drop witnessed in England, and more than double the overall OECD average decline in that time.

And yet, Scotland’s actual maths score of 471 points is just one short of the 2022 OECD average.

Of course, not every country has seen results fall, but look across all of the data and it seems clear that, for the most part, countries are seeing their performance either slightly slip or significantly decline. Understanding Scotland’s place in those trends is important, but it is also extremely complicated.

The OECD data shows us, for example, that there is greater variation within schools than between them, and that’s a bit inconvenient when so much of the narrative around education in Scotland (and the UK as a whole) centres on the idea of improving individual or particularly weak schools.

We can also see that pupils with an immigrant background seem to have an advantage and can, therefore, boost a country’s results, so perhaps the best route to improving our education system is to attract more families from around the world? It would certainly make the stats look better.

We know that headteachers have raised concerns about insufficient staff levels in Scotland’s schools, and the OECD data highlights an international connection between this problem and declining performance in the PISA tests.

Even something that might seem obvious, like time spent in school, doesn’t actually seem to sync up very neatly with outcomes like test scores. This points to questions of educational efficiency that could, and should, be addressed, but which are also extraordinarily complex.

Whether it is test scores, and how they change over time, or the insights into school policies and learning conditions, PISA provides information that is undoubtedly useful, but in order to use it properly it must be viewed in its full context, no matter how much more difficult that might be.

Having the data is one thing, but it’s how we use it that really makes a difference.