Few UK universities have adopted rules against impact-factor abuse

Posted in Research on February 18th, 2018 by steve

“A survey of British institutions reveals that few have taken concrete steps to stop the much-criticized misuse of research metrics in the evaluation of academics’ work. The results offer an early insight into global efforts to clamp down on such practices …” (more)

[Nisha Gaind, Nature, 12 February]


Academia is fucked-up. So why isn’t anyone doing something about it?

Posted in Research on March 23rd, 2017 by steve

“A week or so ago, a list of perverse incentives in academia made rounds. It offers examples like ‘rewarding an increased number of citations’ that – instead of encouraging work of high quality and impact – results in inflated citation lists, an academic tit-for-tat which has become standard practice. Likewise, rewarding a high number of publications doesn’t produce more good science, but merely finer slices of the same science …” (more)

[Sabine Hossenfelder, Backreaction, 22 March]


Every attempt to manage academia makes it worse

Posted in Research on March 19th, 2017 by steve

“I’ve been on Twitter since April 2011 — nearly six years. A few weeks ago, for the first time, something I tweeted broke the thousand-retweets barrier. And I am really unhappy about it. For two reasons. First, it’s not my own content …” (more)

[Sauropod Vertebra Picture of the Week, 17 March]


New Research Initiative Criticised After Concerns Over Ranking Academics

Posted in Research on February 2nd, 2017 by steve

Ireland – “A new initiative to measure the research output of Trinity’s staff, currently in its pilot phase, has raised concerns that it might potentially rank and compare individual academics, already drawing criticism from a trade union. The new initiative, Principal Investigator Quantitative Analytics, is proposing to rank academics based on their research output …” (more)

[Róisín Power and Sinéad Baker, University Times, 1 February]


The Stern Review on REF 2014 – a review of recommendations

Posted in Research on August 23rd, 2016 by steve

UK – “Following the recent publishing of the Stern Review of REF 2014, Dr Sergey Popov looks at some of the recommendations contained in the report …” (more)

[QPOL, 22 August]


Why I had to quit the research excellence framework panel

Posted in Research on November 19th, 2015 by steve

UK – “Despite some whispers that the research excellence framework (REF) might be scrapped, the government’s higher education Green Paper, published earlier this month, indicates that it will remain – possibly subject to a metrics-based interim ‘refreshment’. There is even a proposal to introduce a version for teaching. That is a pity …” (more)

[Times Higher Education, 19 November]


Metrics-based mini REF ‘won’t be credible’

Posted in Research on November 10th, 2015 by steve

UK – “A proposed additional assessment of research quality between research excellence frameworks based on metrics such as citations rather than peer review would not be seen as credible, according to one of the authors of a major government-commissioned report on the subject …” (more)

[David Matthews, Times Higher Education, 10 November]


Journal impact factors ‘no longer credible’

Posted in Research on November 5th, 2015 by steve

UK – “Trickery by editors to boost their journal impact factor means that the widely used metric ‘has now lost most of its credibility’, according to Research Policy journal. With many editors now engaged in ‘ingenious ways’ of boosting their impact factor, ‘one of the main bastions holding back the growing scourge of research misconduct’ has been ‘breached’, the publication warns in an editorial …” (more)

[David Matthews, Times Higher Education, 5 November]


Was the REF a waste of time? Strong relationship between grant income and quality-related funding allocation

Posted in Research on August 25th, 2015 by steve

UK – “If the funding allocated to universities on the basis of the REF is correlated to the grant funding universities already receive, what is the point of the output assessment process? …” (more)

[Jon Clayden, Impact of Social Sciences, 25 August]


Why did REF2014 cost three times as much as the RAE? Hint: It’s not just because of the added impact element

Posted in Research on August 4th, 2015 by steve

UK – “The benefits of any research assessment framework should ideally outweigh the costs and burden incurred by universities and staff. Derek Sayer argues there should be cause for concern now that recent analysis shows the 2014 REF bill was three times as much as the last UK assessment exercise …” (more)

[Impact of Social Sciences, 3 August]


Science, values and the limits of measurement

Posted in Research on July 14th, 2015 by steve

UK – “Metrics play a growing role in managing research. But to understand their limitations, we need to draw on the humanities. Last week, the independent review of metrics in research assessment published its final report The Metric Tide …” (more)

[Cameron Neylon, Guardian, 14 July]


Can the research excellence framework run on metrics?

Posted in Research on June 18th, 2015 by steve

UK – “The current research excellence framework is ‘a bloated boondoggle’ that ‘steals years, and possibly centuries, of staff time that could be put to better use, and includes so many outcome measures that every university can cherry-pick its way to appearing “top-ranking”’ …” (more)

[Paul Jump, Times Higher Education, 18 June]


Why are UK universities still relying on journal impact factors?

Posted in Research on April 30th, 2015 by steve

UK – “If you work in the sciences, you will be all too aware of the journal impact factor (JIF). The requirement for ‘publications in high impact journals’ has become a staple of job advertisements, and the achievement of this goal is emblazoned across research group websites as evidence of gloriousness …” (more)

[CDBU, 30 April]


Bibliometrics: The Leiden Manifesto for research metrics

Posted in Research on April 22nd, 2015 by steve

International – “Data are increasingly used to govern science. Research evaluations that were once bespoke and performed by peers are now routine and reliant on metrics. The problem is that evaluation is now led by the data rather than by judgement. Metrics have proliferated …” (more)

[Diana Hicks and others, Nature, 22 April]


Death in academia and the mis-measurement of science

Posted in Life, Research on February 11th, 2015 by steve

UK – “Universities are increasingly run like businesses hungry for performance benchmarks, disconnected from the way scientists themselves would like their research evaluated …” (more)

[Arran Frood, EuroScientist, 9 February]


In this game is the REF to blame?

Posted in Research on December 23rd, 2014 by steve

UK – “Anyone working in and around higher education in the United Kingdom will have been obsessing about the ‘Research Excellence Framework’ (REF) over the past week …” (more)

[Ferdinand von Prondzynski, University Blog, 23 December]


Game-playing of the REF makes it an incomplete census

Posted in Research on December 19th, 2014 by steve

UK – “Research assessment is only partly reliable as an indicator of the real quality of the work going on in higher education. It has a dual character. On one hand it is rooted in material facts and objective methods …” (more)

[Simon Marginson, The Conversation, 19 December]


Why evaluating scientists by grant income is stupid

Posted in Research on December 12th, 2014 by steve

UK – “As Fergus Millar noted in a letter to the Times last year, ‘in the modern British university, it is not that funding is sought in order to carry out research, but that research projects are formulated in order to get funding’. This topsy-turvy logic has become evident in some universities …” (more)

[Dorothy Bishop, CDBU, 11 December]


Assess the real cost of research assessment

Posted in Research on December 10th, 2014 by steve

UK – “The Research Excellence Framework keeps UK science sharp, but the process is overly burdensome for institutions, says Peter M Atkinson …” (more)

[Nature, 10 December]


Quality control in research: the mysterious case of the bouncing impact factor

Posted in Research on December 3rd, 2014 by steve

Norway – “Research must be reliable and publication is part of our quality control system. Scientific articles get reviewed by peers and they get screened by editors. Reviewers ideally help improve the project and its presentation, and editors ideally select the best papers to publish …” (more)

[Curt Rice, 3 December]
