International comparisons in education are in vogue and a bit like a huge Advent calendar. We like our league tables of performance and seeing how well we do in relation to other nations and states. It is compelling for those of us with a quiet competitive streak to see whether “our” country is rising, dropping, holding its place or frankly just disappearing from view.
Certainly the media like to present the rankings as disasters or successes compared with previous results. Suppose you are placed 15th: you are either delighted or appalled, or perhaps just resigned. But, like the Advent calendar, the number 15 is just a door which hides a much bigger package of surprises and rich detail. The number on the door doesn’t fully explain the gift hidden behind it, or the distance travelled to get there. These comparisons were initially developed to describe the impact of various educational initiatives and investments on the economic and social health of nations. They were descriptive. Not any more: they are increasingly seen as policy levers.
So are international comparisons useful or just interesting? And are they disproportionately influential now? Do we tend to look simply at the number on the door and ignore the richness behind it?
The ‘score on the door’ rankings mask a high level of detail on a huge range of indicators. They can act as a stimulus for action – for example, does one country invest anything like the same levels of resources as others in additional classroom support for children and what impact does this have on eventual outcomes? Should we aim to ‘be more like those at the top’ in this regard?
Recently, I wrote about the various useful tensions that hold up the suspension bridge of mathematics education and prevent it from collapsing dramatically into the waters below. It was intended as a light-hearted piece to help publicize the importance of a conference being organized by the British Council. The tensions were things like ‘traditional v. new mathematics’ or ‘pretty mathematics v. functional employability skills’ and so on, but the main point was that the tensions in the debate are healthy and will, I hope, never be resolved.
There will always be a similarly healthy debate about the methodology behind the global PISA tests, about their limitations, and about how much account should be taken of cultural variables, which do not always translate well across nations. Should we have bands of performance, or a table of ‘significant’ movers to go alongside the main ones? I’m quite keen on a contextual value added table. Or perhaps ‘divisions’ within the rankings, so we could be ‘top’ of Division 2 if need be, which sounds far better than being bottom of Division 1.
Of course, if I were to push the Advent calendar imagery to destruction, I would say that being No. 25 is the best: it’s usually the largest door and holds the biggest prize, so maybe we should all aim to be there. Or perhaps each country should set itself a target number, and not always No. 1. We all know that, methodologically, there is little of statistical significance between broad bands of performance: coming in at No. 17 is not so very different in real terms from coming in at No. 27. It just feels different.
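The point about statistical significance can be sketched in a few lines of Python. The scores and standard errors below are hypothetical, purely for illustration (not real PISA figures): each published mean score carries a standard error, and when the 95% confidence intervals of two countries overlap, their difference in rank is not statistically meaningful.

```python
# Illustrative sketch, using made-up numbers, of why a ten-place gap in a
# league table can be statistically meaningless: each country's mean score
# has a standard error, and overlapping confidence intervals mean the ranks
# cannot be reliably distinguished.

def ci(mean, se, z=1.96):
    """95% confidence interval for a mean score with standard error se."""
    return (mean - z * se, mean + z * se)

def overlaps(a, b):
    """True if two intervals overlap."""
    return a[0] <= b[1] and b[0] <= a[1]

# Hypothetical mean scores and standard errors for the countries
# ranked 17th and 27th.
rank_17 = ci(mean=494, se=3.0)
rank_27 = ci(mean=489, se=3.2)

print(overlaps(rank_17, rank_27))  # True: the intervals overlap
```

On these (invented) figures the five-point gap in mean scores is smaller than the uncertainty around either score, so the ten-place gap in the rankings tells us very little.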
The latest PISA rankings will shortly be published. I’m hoping my country will be a ‘key mover mathematically’ – upward, naturally – but I’m not that concerned about being in the top five or six, really; that’s a closed shop. If we come 25th, I think that would be a good result in many ways.
(Only kidding – I’m a secret competitor – we really ought to be No. 1 whatever the methodological debate might say. It is children’s future lives at stake here.)
However, given how influential such rankings now are in shaping policy as global systems converge, perhaps the time is ripe for simpler, more visual and more diverse ways of explaining the data. That would let us begin to peek at the detail behind the numbers on the doors more than we do at present.
Ceri Morgan is one of Her Majesty's school inspectors at Ofsted, the U.K.’s Office for Standards in Education, Children’s Services and Skills.