Unattractive journal author services from “International Research Promotion”

Another day, and another crop of emails inviting me to give someone money to help me publish. This latest one is signed by a Dr P. Saha of International Research Promotion. As you can see, Dr Saha is offering to check my papers for plagiarism (thanks for that!), arrange peer review reports (authored by whom exactly?), format my papers to journal requirements, and translate my papers.

 

You won’t be surprised to hear that I’ve tried to check International Research Promotion out. They say that they have a head office in London, in premises that are also apparently used by 53 other companies, and which are said to be run as a commercial mail-drop service. They also claim offices in Toronto (again, in premises advertised as a mail-drop address) and in “Hooghly, West Bengal, India”, which is an administrative district and not an address.

I won’t be calling on their services, and I wouldn’t advise any other academic to do so either. They are offering nothing that you cannot organise for yourself, and probably a lot more effectively – just ask colleagues, or your university research office, if in doubt. And although I do not know whether or not the two bodies are connected, they share most of their name with the International Research Promotion Council, which Jeffrey Beall thinks is a ‘scam’.

Can we trust the Eurobarometer surveys?

From Eurobarometer 75, one of the reports analysed by Höpner & Jurczyk

I’ve always treated the Eurobarometer surveys as something to dip into occasionally. They regularly cover public opinion in the member states of the EU, with candidate nations like Serbia and Turkey also taking part. Several have dealt with various aspects of education and training, or with other issues in which I’m interested, such as civic participation, and I’ve cited their results.

Now, though, I wish I’d checked the technical details a bit more thoroughly before quoting the findings. Two German social scientists have gone over the methods used in the surveys, and their findings make uncomfortable reading. Martin Höpner and Bojan Jurczyk set out what they call ten ‘good rules of public opinion survey research’, all of which seem to me broadly aligned with good practice in survey design. They then check selected examples of Eurobarometer surveys in detail, and conclude that the surveys are so poorly designed as to blur the line between research and propaganda.

More specifically, they accuse Eurobarometer of using

incomprehensible, hypothetical, and knowledge-inadequate questions, unbalanced response options, insinuation and leading questions, context effects, and the strategic removal of questions that led to critical responses in previous Eurobarometer waves.

I find their analysis pretty compelling. They give detailed examples of questions that seem to lead respondents directly to express views that are favourable to the European Union. Note that we are not talking here about the way that others – the media, for instance – report the findings, but rather about the very design of the survey questionnaires themselves.
From Höpner & Jurczyk 2015

Has this bias been unintended, a simple result of accident or drift? The authors of this study believe not, and conclude with a stark warning that ‘survey manipulation’ simply intensifies the gap between citizens and elites. Eurobarometer is an arm of the European Commission, and if Höpner and Jurczyk are even half right, then its value to the research community, as well as to the wider public, has been compromised.

Evaluating proposals for European research

Funding for research is tight and getting tighter, at least if you listen to researchers. Mind you, academics’ complaints are not much of an indicator: many of my colleagues moaned endlessly during the early years of this century, when public funding for academic research reached unprecedented heights.

Budgets have been tightened or cut since then, yet the institutional and system-wide pressures for external funding are greater than ever. Competition for research funding is particularly fierce at the European level. I’m currently in Brussels, along with a couple of hundred other social scientists who are helping to evaluate proposals under the European Commission’s Horizon 2020 programme.

Overall, the Commission is making €80 billion of funding available over 7 years (2014 to 2020), with a strong focus on research that will promote technological, economic or social innovation. Like all EC programmes, the funding is drawn from member states’ budgets, in this case the budgets for publicly funded science and research.

Of the total, just under €10 million was set aside for 2015 to support research and innovation on the theme Young Generation in an Innovative, Inclusive & Sustainable Europe. We are evaluating the 145 proposals that were submitted, of which it’s likely that only five will be selected.

Putting together a proposal is a lengthy and painstaking business. It involves bringing together partners of different kinds and from different European countries, securing the formal commitment of each of the participants, and getting them all to agree on a detailed plan of work. At the end of all that, the odds of getting funded are roughly one in thirty.

We’re going to have a tough week making the decisions, but the process is much, much tougher on those who have produced the proposals.