What are the best ways of concluding a conference?

How should you mark the end of a conference? We’ve all attended those gruelling plenaries at the end, where tired delegates report in a round on the deliberations of their workshops. Or sat in depleted rows, trying to concentrate on a final keynote. So can you suggest any creative and energising alternatives?

We recently held a conference on changing professional responsibilities and knowledges. We decided to experiment with social media during the event, and asked our research students to blog about it as it went along. We also identified a hashtag, and set up a Twitter feed on the conference website for people to follow. And in the final session, my colleague Patrick Carmichael put up a couple of slides, showing Wordle images of the most frequently used words in the Twitter feed and in the conference papers. It worked astonishingly well, partly perhaps because people were standing rather than sitting, and drinking water or fruit juice before their final lunch.
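For anyone tempted to try something similar, here is a minimal sketch, in Python, of how word-frequency images of this kind might be produced. It is an illustration rather than a record of what Patrick actually did: the filename tweets.txt is hypothetical, and it uses the third-party wordcloud library rather than the Wordle website itself.

```python
# A minimal sketch of a Wordle-style image from conference tweets.
# Assumes the tweets have already been exported, one per line, to a
# plain-text file; "tweets.txt" is a hypothetical filename.
from collections import Counter
import re

from wordcloud import WordCloud, STOPWORDS  # pip install wordcloud

with open("tweets.txt", encoding="utf-8") as f:
    text = f.read().lower()

# Strip URLs, @mentions and hashtags before counting, so the cloud
# shows the vocabulary of the discussion itself.
text = re.sub(r"https?://\S+|@\w+|#\w+", " ", text)
words = [w for w in re.findall(r"[a-z']+", text) if w not in STOPWORDS]

# Print the twenty most frequent words as a quick sanity check...
for word, count in Counter(words).most_common(20):
    print(f"{count:4d}  {word}")

# ...then render the word cloud itself for the closing slide.
cloud = WordCloud(width=800, height=400, background_color="white")
cloud.generate(" ".join(words))
cloud.to_file("conference_cloud.png")
```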

But there must be lots of other ways to end a conference. I’ve heard of brass bands, magicians, and of course the award of prizes and honours. When I asked one colleague about this, he said simply “With applause”. If I’d been quicker off the mark, I’d have asked what would have got his hands clapping most enthusiastically. What has done it for you, and what would you recommend?

For Patrick’s “Tweedles”, see: http://www.propel.stir.ac.uk/conference2012/tweedles.php


Kids today: young people and the labour market (with PS)

One car trader claims that over 80% of applicants for apprenticeships are unsuitable for any employment. A major survey of nearly 88,000 businesses finds that about two-thirds of employers who recruit school-leavers report that most are well-prepared for work. Both reports appeared on the same day. Guess which hit the headlines?

Car sales and servicing company Arnold Clark was widely reported as saying that young people had wholly unrealistic expectations of work. The claims, which came in a submission to the Scottish Parliament’s Finance Committee, were prominently reported in the Telegraph, while the Scotsman added a leader comment and a front-page report.

Such claims are, of course, familiar. In this case, they commanded attention for three main reasons. First, Arnold Clark is a respected firm, widely known to invest in training. Second, the claims were precise and factual, referring to the 81% of young people whose applications for apprenticeship places were rejected. Third, the company attacked colleges, describing them as “babysitting” youngsters, which gave journalists an obvious hook for their reports.

Let me start by saying that Arnold Clark strikes me as a decent company by UK business standards. Okay, perhaps not the highest bar of moral probity, but I’d buy – have bought – a used car from them. The firm takes apprentices, both in its core trading arm and in its wholly-owned training subsidiary. The parent company employs its apprentices, a tenth of whom are recruited from seriously disadvantaged youngsters, with support from the Prince’s Trust.

While I would love to know what proportion of its turnover the company spends on training, this is not a whinging Dickensian boss who hates spending money on training new staff. Nevertheless, the story merits a closer look.

First, the submission to MSPs came not from Arnold Clark, but from its wholly-owned subsidiary, GTG Training. As well as internal staff training, GTG sells business training and support services externally, and this includes a large programme of modern apprenticeships. It is therefore a direct competitor of the college sector – something the press ignored.  

Second, GTG’s figures covered solely those who applied for apprenticeships with Arnold Clark, not young people in general. The submission accepts that this is a biased sample, suggesting that “we may well be recruiting at the lower end of the achievement spectrum”. So we cannot treat this as representative of all young people – yet the press did just that.

Third, GTG’s criticism of colleges turns mainly on the question of study hours. The submission claims that college students typically study for at most 18 hours a week, with few or no extra-curricular activities. The result, according to the evidence submitted to MSPs, is that those who go to college “re-emerge into the economy … with a further deterioration in concept of working week”.

This sounds like pretty crude logic. In my experience, youngsters realise that school and college are not work, nor are they meant to be. But it is true that most full-time national qualifications require around 20 hours’ attendance a week, while higher nationals require 15. On top of that, students undertake self-directed study and assignments (also known as homework). Many also have a part-time job to help fund their study.

Fourth, most of the young people’s supposed weaknesses don’t sound like the result of education. Rather, they are attitudinal or personal. The submission lists eight recurring themes, only two of which – communication skills and understanding of citizenship – sound like responsibilities of the education system. The others, such as the inability to make a decision based on anything other than “I want”, seem to me typical of a more general consumerist view, which is pretty pervasive, and no doubt helps sell cars.

Fifth, the evidence of these weaknesses comes from round-table discussions with recruiters. Really? Well, I have just had a round-table discussion with a youth worker from Fife, who tells me that most of the disadvantaged youngsters he works with would love a steady job, and indeed many jump at the chance when it is offered. Yes, they need training, but who doesn’t?

MSPs might pose a few sharp questions when they meet GTG. They could do worse than look at the latest employer skills survey from the UK Commission for Employment and Skills. There is a neat chart on page 30, showing that 68% of Scottish employers who hired school-leavers reported that they were either well-prepared or very well-prepared for work, while 82% of those who recruited Scottish college-leavers took the same view.

UKCES also reported on employers’ criticisms of young people. In Scotland, the most frequent complaint concerned lack of work experience among school-leavers, and lack of specific skills or competences among college recruits. Attitude and personality (including punctuality) were the second most frequent complaint about school-leavers, but few employers thought these a problem for college recruits.

Comparing across the home nations, Scottish employers were the most likely to think young people well prepared for work. But the responses were broadly similar across the UK, with most employers taking a pretty positive view. In fact, employers appear so satisfied that you’re more likely to be knocked over by a spaceship than to see this part of the story make media headlines.

Either Arnold Clark is in a tiny minority of employers who are flummoxed by the challenges of today’s teenagers, or GTG Training is using the submission to have a go at its competitors in the college sector. This fits a particular press narrative, in which young people are invariably stigmatised, their skills derided and their personalities lampooned.

 

Postscript, added 24 May

David Scott, head of the firm that made this critical submission to the Scottish Parliament, failed to attend the committee meeting on 23 May to discuss his paper. According to the BBC, Mr Scott claimed that he had an unforeseen business engagement. Needless to say, politicians of all parties were unamused – or more accurately, amused themselves at Mr Scott’s expense. The BBC report is at: http://www.bbc.co.uk/news/uk-scotland-scotland-politics-18180095

Results of the 2011 UKCES employer skill survey are available at: http://www.ukces.org.uk/assets/ukces/docs/publications/ukces-employer-skills-survey-11.pdf

Bolognese sauce: dropping out in Germany

For many academics in these islands, the Bologna process sits low on the horizon. This is not just stereotypical British/Irish insularity. On the whole, the decision by most European countries to move towards a standard system of Bachelor’s and Master’s degrees, involving similar periods of study across all the participating national systems, sounded to us like minimal change.

Bologna meant that it would become much easier to compare degrees from different countries and universities, because the titles and periods of study would be broadly similar. In many countries it was also intended to promote efficiency and improve completion rates. If the average time to graduation was around seven or eight years, then the idea of a three-year Bachelor’s programme sounded very attractive to governments, as well as to universities and presumably to many parents.

Has it worked out? A recent report in Germany suggests that hopes of improved completion are not yet being realised. Of those German students who embarked on a Bachelor’s degree in 2006/7, 28% left without a degree. This is three percentage points higher than for the cohort who started in 2004/5, before the Bologna reforms were fully implemented. Unfortunately, beyond noting that one in every two foreign students left without a degree, the German data do not distinguish between different types of student.

Of course, we can and should attach all sorts of qualifications to these figures. We do not know how many of the ‘leavers’ will return to complete their studies at a later stage; we do not know why they left, nor whether they are better or worse off as a result. All the same, the trend is upwards, at a time when policy makers and academic managers expected the number of early leavers to decline. Why?

One possible answer lies in the institutional pattern. The withdrawal rate for the 2006/7 cohort was 19% in the Fachhochschulen (FHS, loosely translated as vocational polytechnics) and 35% in the universities. The FHS were faster to move to a Bachelor’s/Master’s structure, and went through a similar period of high withdrawal before the reforms bedded in. There is also a better prospect in an FHS of transferring to another subject rather than withdrawing altogether.

Meanwhile, many in the universities struggled with the very idea of someone completing their higher education in a mere three years. In some subjects, such as engineering, barely half of university students graduated. Many university academics lament the disappearance of the old Diplom, which supposedly took five years (in practice, usually longer), and criticise the tendency towards modularity inherent in the three-plus-two-year Bachelor’s/Master’s structure.

It also remains to be seen whether German employers will accept job applicants who have a Bachelor’s degree alone, or whether they will still prefer those who have studied for five years. The early signs are that Bachelor’s degrees are holding their own in the labour market, or at least are proving more attractive to employers than many – myself included – had expected.

Germany is the largest country in the European Union (although not in the Bologna process, which also involves Russia and Turkey, among others). While some of the implementation difficulties can be laid at the door of Germany’s universities, and the famously rigid mindsets of the German professoriate, this is still an important signal of greater challenges to come. It is also a signal that institutional type still matters: the FHS are clearly better equipped to handle innovations like Bologna, while the universities are better at defending a traditional view of knowledge and study – though this report suggests that even as they disrupt the planned changes, they still have to appear to implement them. This looks to me like a risky strategy.

 

U. Heublein, J. Richter, R. Schmelzer & D. Sommer, Die Entwicklung der Schwund- und Studienabbruchquoten an den deutschen Hochschulen: Statistische Berechnungen auf der Basis des Absolventenjahrgangs 2010, Hochschul-Informations-System, Hannover, 2012.

Inspecting education – value for money?

Educational inspection has undergone a number of changes since it was introduced in the early nineteenth century, but it has always been controversial. When Ann Walker of the Workers’ Educational Association recently tweeted a link to a report on the cost of OFSTED, the responses confirmed that the current inspection regime arouses strong feelings. Many people also expressed surprise at the cost of the system, estimated in the report at £207m a year, or 0.27 per cent of all education spending.

The report appeared in 2009. It is a brief document, and its main focus is not on OFSTED but on the broader issue of how governments attempt to ‘manage by numbers’. The authors did not give a date for their figures, reasonably enough as the report was a summary of an ESRC research programme. Their calculation, though, is clearly based on figures for inspection under the Labour government, and probably comes from 2007.

How does this compare with the cost of inspection under the Coalition? I’ve looked at the data for 2010-11, the most recent year for which financial statements are available. I have also looked at the financial statements of the Scottish and Welsh inspectorates, and compared them with total education spending for each country as reported by the Treasury in its public expenditure statistical analysis for the same year.

The first point to note is that OFSTED consumes 0.278% of all education spending in England – slightly higher, as a share of the total, than under Labour. The amount spent on OFSTED has fallen, standing at £196.5m, but so has the education budget.

The Welsh Assembly spent £11.7m on Estyn, which is equivalent to 0.271% of the total education expenditure for Wales. While this is slightly less than in England, the difference is not huge.  

The Scottish Government devoted £17.5m of its education budget to inspection. At 0.217% of the total education spending, this does come out rather cheaper than OFSTED.
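These percentages are easy to check. Here is a minimal sketch, in Python, that derives the implied total education budget for each nation from the inspection costs and shares quoted above; the implied totals are my own back-calculation, not figures taken from the financial statements.

```python
# Back-of-the-envelope check on the figures quoted above.
# For each nation: inspection cost (in £m) and its share of
# total education spending (in %), as given in the text.
inspection = {
    "England (OFSTED)": (196.5, 0.278),
    "Wales (Estyn)":    (11.7,  0.271),
    "Scotland":         (17.5,  0.217),
}

for nation, (cost_m, share_pct) in inspection.items():
    # Implied total education budget, found by reversing the
    # percentage calculation: total = cost / (share / 100).
    implied_total_bn = cost_m / (share_pct / 100) / 1000
    print(f"{nation}: £{cost_m:.1f}m = {share_pct}% "
          f"of roughly £{implied_total_bn:.1f}bn")
```

Running this gives implied education budgets of roughly £70.7bn for England, £4.3bn for Wales and £8.1bn for Scotland, which is the scale one would expect for 2010-11.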

Admittedly, public spending on education per head of population is much higher in Scotland, and public spending per capita on inspection is accordingly higher too. Even so, Scotland seems to have the most cost-effective inspection regime of the three British nations – and I am not aware of a shred of evidence that this has damaged the quality of teaching.

In all three British nations, the inspectorate accounts for around a quarter of one per cent of the education budget. This is not the total cost, of course, as it ignores the time spent by teachers and others producing reports and preparing for the inspection process.

Nor does it tell us whether the inspection systems offer good value for money. Every penny spent on inspection is money that could have been spent on front-line staff, and the differences between England and Scotland suggest that OFSTED might have a few questions to answer.

Christopher Hood, Ruth Dixon and Deborah Wilson (2009), Managing by Numbers: The way to make public services better? Available at http://www.publicservices.ac.uk/wp-content/uploads/policy-briefing-nov2009.pdf

How far should universities go to avoid engaging with their local communities?

 

A senior member of a major British adult education provider told me last week that he was disappointed by the higher education sector, finding it aloof and unresponsive. This had not always been his experience, so he wondered whether I thought the universities were now out of the adult learning field altogether. His view was that the change was largely caused by research assessment regimes, which have rewarded academics who impress other academics, while discouraging any wider engagement with the community.

This is probably a reasonable indictment of the old Research Assessment Exercise. Or, more accurately, it is a fair description of how many academics and their managers chose to respond to the old RAE. Nor is this simply a British phenomenon. In many countries, academics’ research is measured either by the number of times their work is cited by other academics, or by the number of papers they publish in journals that are highly regarded by other academics.

This is even worse than the old RAE. It leads to entirely predictable games-playing, as academics are clever folk who will devise the most effective ways of achieving high citations, or of getting into those highly-rated journals. Governments appear to be satisfied with this, as they invariably either boast about the number of “our scientists” who perform well on this measure, or berate their nation’s scholars for failing to measure up. But whichever system is used, the result has been to turn academics inwards, encouraging them to speak above all to their own peers, and to ignore the wider community (with the obvious exception of those organisations that pay for and commission various commercially driven projects).

This seems to me entirely counter-productive. If we cannot explain our research to the wider community, and justify it to the public, then we cannot expect our research to command public support. Rather, we should expect much of the public to mistrust academics, viewing them either as self-indulgent and wildly out of touch, or as in the pay of large vested interests. Over time, this is bound to undermine the political consensus in favour of publicly funded academic research.

I am therefore moderately encouraged by a number of recent developments. The first is the inclusion of ‘impact’ in the new research assessment system. This requires academics to show that high-quality research has in some way influenced the wider public, and has had benefits for them. It will explicitly include the measurement of impact on civil society and third-sector organisations as well as on the public and private sectors. This is certainly not without its problems – not the least of which is that the sector has limited experience of assessing the impact of research on people who are not other academics. But it is a step in the right direction.

The second is the decision of several universities to appoint professors specialising in the public understanding of science. Marcus du Sautoy is probably the best known of these, thanks to his broadcasting collaboration with the comedian Dara Ó Briain. Sheffield has gone a step further, appointing Angie Hobbs as professor in the public understanding of philosophy. Again, this seems to me a sensible decision by those universities that are far-sighted enough to recognise that an informed public opinion is in their long-term interests as much as anybody’s.

The third is the growing willingness of academic researchers to engage with those who criticise and protest against their work. In the most recent case, scientists at Rothamsted Research, the arable crops research institute, offered to meet a direct action group of anti-GM protesters to discuss their concerns. The protesters in turn called for an open debate, which the two sides are now arranging. It is unlikely that this dialogue will resolve all the differences, which run deep, but it is a world away from the violent police-led responses of the past.

These are welcome developments, though it probably goes without saying that I’d like to see them become the norm rather than the exception. If universities are public institutions, then why would they not expect all of their researchers to promote public understanding of their work? Perhaps it should be a requirement of all public research funding that researchers be willing to communicate their findings to the local community, and indeed to listen to what the community thinks of them.