The OECD’s Adult Skills Survey has been hitting headlines across Europe and beyond. Newspapers and magazines in France, Germany, Sweden, Ireland, Australia, Korea and Canada have been full of it – as has much of the British press. But there is a curious silence north of the border, where the Scottish Government decided that it wanted no part of this particular piece of comparative research.
For all I know, the Scottish Government has extremely good reasons. A senior civil servant told me some time ago that the budget for social research had been cut back to the bone. As a result, the Government had decided to withdraw from some existing international surveys (the 2011 wave of the PIRLS survey of schools literacy, for example), and not to take part in the OECD survey of adult skills.
Further, I would expect the Government, if anyone asks, to point out that it published its own study of adult skills in 2009. But this survey used different instruments from the OECD’s (it adopted the same instruments as those used for the previous OECD survey in 1996). Useful though this survey was, it took a different approach from the later survey, covered a more limited range of skills, and analysed them in less depth. And it was confined to one country, though this did not stop the authors of the report from expressing satisfaction at Scotland’s ‘creditable placement’ against other countries’ performance in 1996.
Whatever the reason, Scotland did not form part of the 2011-12 Survey, which has now been published. On the plus side, the taxpayer has saved some money – or, more accurately, the citizens will enjoy the benefits of spending being allocated elsewhere. But there is a pretty massive downside as well.
Taking part provides a huge volume of data, collected using internationally agreed instruments that have been developed and tested over four years. This allows policy-makers, researchers and the wider public to undertake an informed benchmarking of their own country’s performance and to see how it stacks up against others.
This in turn shines a spotlight on adult learning. Berni Brady, director of the Irish adult education organisation AONTAS, appeared on prime-time television explaining what the results meant for Ireland, and calling for the government to recognise the needs of adult learners in its new strategy for further education and training. In Britain, the BBC’s chief business editor, Robert Peston, wrote and spoke about competitiveness and adult skills.
The Survey has also shed light on some discrepancies in national performance levels. In England, media attention quickly seized on the literacy and numeracy scores of young adults, who did notably worse than older generations. Matthew Hancock, the Coalition Minister for Skills, promptly blamed the previous government’s schools policies, neatly side-stepping the fact that, whoever is to blame, these 16-24-year-olds are already of working age.
Incidentally, Hancock’s claim doesn’t say much about his own numeracy skills. Someone who was 24 when the survey took place in 2011 would have entered school in 1991 or 1992, well before Labour came to power. However, there is enough basis in his claim to pose a few uncomfortable questions for Labour education ministers, along with those academics and others who advised them. But at least we have the data. In Scotland, where there would be huge interest in knowing how schoolchildren fared under devolution, we simply lack comparable information.
Of course, the OECD Survey can easily become a flash in the pan. Having bowed and danced in the spotlight, adult learning could soon find itself in the familiar gloom of the margins, as all the fuss and debate moves back to schools and universities. But that is partly up to those who are interested in adult learners and the institutions that support them. The OECD’s results provide us with plenty of material to nourish debate for some time to come – if we want it.