Love DORA, Hate Rankings?

Posted in Research on May 10th, 2021 by steve

“Lizzie Gadd argues that any commitment to responsible research assessment as outlined in DORA (Declaration on Research Assessment) and other such manifestos needs to include action on global university rankings. Highlighting four fundamental critiques of the way in which journal metrics and university rankings have been deployed in higher education, she proposes universities could unite around the principle of being ‘much more than their rank’ …” (more)

[LSE Impact Blog, 10 May]


Law, Metrics, and the Scholarly Economy

Posted in Research on April 22nd, 2021 by steve

“As markets began to usurp other forms of social regulation throughout the 20th century, metrics became increasingly central to the coordination of new spheres of market-mediated relations. More recently, digital metrics have been operationalized to facilitate the platformization of those domains. Platforms use automated scoring systems to rank content and actors across the markets they mediate …” (more)

[Jake Goldenfein, LPE Project, 22 April]


Row erupts over university’s use of research metrics in job-cut decisions

Posted in Research on March 27th, 2021 by steve

“A university in the United Kingdom is facing criticism over the responsible use of research metrics, after it used information about scientists’ research income and publication records to identify dozens of jobs that are ‘at risk’. Critics say that using metrics in such a decision is inappropriate because they tend to focus on a small part of an academic’s job. They add that the institution at the centre of the row – the University of Liverpool – used a metric based on citations that is designed to evaluate large groups of researchers, rather than individuals …” (more)

[Holly Else, Nature, 25 March]


Why the h-index is a bogus measure of academic impact

Posted in Research on July 8th, 2020 by steve

“Earlier this year, French physician and microbiologist Didier Raoult generated a media uproar over his controversial promotion of hydroxychloroquine to treat COVID-19. The researcher has long pointed to his growing list of publications and high number of citations as an indication of his contribution to science, all summarized in his ‘h-index’ …” (more)

[Yves Gingras and Mahdi Khelfaoui, The Conversation, 8 July]


Must academic evaluation be so citation data driven?

Posted in Research on September 30th, 2018 by steve

“For the past quarter-century, I have reviewed cases for academic tenure and promotion in many disciplines in many countries. Usually what is required is an evaluation of the candidate’s research record. Teaching and, increasingly, public engagement are also mentioned as factors to weigh …” (more)

[Steve Fuller, University World News, 28 September]


World-Class Universities: what are they, and why are they relevant?

Posted in Research on August 31st, 2018 by steve

“In this article, published in Ireland’s Yearbook of Education 2017-2018, Professor Deeks discusses the characteristics of world-class universities, how these are measured, the challenges facing universities, their core role and their contribution to the economy and society …” (more)

[Education Matters, 31 August]


Despite becoming increasingly institutionalised, there remains a lack of discourse about research metrics among much of academia

Posted in Research on August 15th, 2018 by steve

“The active use of metrics in everyday research activities suggests academics have accepted them as standards of evaluation, that they are ‘thinking with indicators’. Yet when asked, many academics profess concern about the limitations of evaluative metrics and the extent of their use. Why is there such a discrepancy between principle and practices pertaining to metrics? …” (more)

[Lai Ma, LSE Impact Blog, 15 August]


Few UK universities have adopted rules against impact-factor abuse

Posted in Research on February 18th, 2018 by steve

“A survey of British institutions reveals that few have taken concrete steps to stop the much-criticized misuse of research metrics in the evaluation of academics’ work. The results offer an early insight into global efforts to clamp down on such practices …” (more)

[Nisha Gaind, Nature, 12 February]


Academia is fucked-up. So why isn’t anyone doing something about it?

Posted in Research on March 23rd, 2017 by steve

“A week or so ago, a list of perverse incentives in academia made the rounds. It offers examples like ‘rewarding an increased number of citations’ that – instead of encouraging work of high quality and impact – results in inflated citation lists, an academic tit-for-tat which has become standard practice. Likewise, rewarding a high number of publications doesn’t produce more good science, but merely finer slices of the same science …” (more)

[Sabine Hossenfelder, Backreaction, 22 March]


Every attempt to manage academia makes it worse

Posted in Research on March 19th, 2017 by steve

“I’ve been on Twitter since April 2011 — nearly six years. A few weeks ago, for the first time, something I tweeted broke the thousand-retweets barrier. And I am really unhappy about it. For two reasons. First, it’s not my own content …” (more)

[Sauropod Vertebra Picture of the Week, 17 March]


New Research Initiative Criticised After Concerns Over Ranking Academics

Posted in Research on February 2nd, 2017 by steve

“A new initiative to measure the research output of Trinity’s staff, currently in its pilot phase, has raised concerns that it might potentially rank and compare individual academics, already drawing criticism from a trade union. The new initiative, Principal Investigator Quantitative Analytics, is proposing to rank academics based on their research output …” (more)

[Róisín Power and Sinéad Baker, University Times, 1 February]


The Stern Review on REF 2014 – a review of recommendations

Posted in Research on August 23rd, 2016 by steve

“Following the recent publishing of the Stern Review of REF 2014, Dr Sergey Popov looks at some of the recommendations contained in the report …” (more)

[QPOL, 22 August]


Why I had to quit the research excellence framework panel

Posted in Research on November 19th, 2015 by steve

“Despite some whispers that the research excellence framework (REF) might be scrapped, the government’s higher education Green Paper, published earlier this month, indicates that it will remain – possibly subject to a metrics-based interim ‘refreshment’. There is even a proposal to introduce a version for teaching. That is a pity …” (more)

[Times Higher Education, 19 November]


Metrics-based mini REF ‘won’t be credible’

Posted in Research on November 10th, 2015 by steve

“A proposed additional assessment of research quality between research excellence frameworks based on metrics such as citations rather than peer review would not be seen as credible, according to one of the authors of a major government-commissioned report on the subject …” (more)

[David Matthews, Times Higher Education, 10 November]


Journal impact factors ‘no longer credible’

Posted in Research on November 5th, 2015 by steve

“Trickery by editors to boost their journal impact factor means that the widely used metric ‘has now lost most of its credibility’, according to Research Policy journal. With many editors now engaged in ‘ingenious ways’ of boosting their impact factor, ‘one of the main bastions holding back the growing scourge of research misconduct’ has been ‘breached’, the publication warns in an editorial …” (more)

[David Matthews, Times Higher Education, 5 November]


Was the REF a waste of time? Strong relationship between grant income and quality-related funding allocation

Posted in Research on August 25th, 2015 by steve

“If the funding allocated to universities on the basis of the REF is correlated to the grant funding universities already receive, what is the point of the output assessment process? …” (more)

[Jon Clayden, Impact of Social Sciences, 25 August]


Why did REF2014 cost three times as much as the RAE? Hint: It’s not just because of the added impact element

Posted in Research on August 4th, 2015 by steve

“The benefits of any research assessment framework should ideally outweigh the costs and burden incurred by universities and staff. Derek Sayer argues there should be cause for concern now that recent analysis shows the 2014 REF bill was three times as much as the last UK assessment exercise …” (more)

[Impact of Social Sciences, 3 August]


Science, values and the limits of measurement

Posted in Research on July 14th, 2015 by steve

“Metrics play a growing role in managing research. But to understand their limitations, we need to draw on the humanities. Last week, the independent review of metrics in research assessment published its final report The Metric Tide …” (more)

[Cameron Neylon, Guardian, 14 July]


Can the research excellence framework run on metrics?

Posted in Research on June 18th, 2015 by steve

“The current research excellence framework is ‘a bloated boondoggle’ that ‘steals years, and possibly centuries, of staff time that could be put to better use’, and includes so many outcome measures that every university can cherry-pick its way to appearing ‘top-ranking’ …” (more)

[Paul Jump, Times Higher Education, 18 June]


Why are UK universities still relying on journal impact factors?

Posted in Research on April 30th, 2015 by steve

“If you work in the sciences, you will be all too aware of the journal impact factor (JIF). The requirement for ‘publications in high impact journals’ has become a staple of job advertisements, and the achievement of this goal is emblazoned across research group websites as evidence of gloriousness …” (more)

[CDBU, 30 April]
