Ranking colleges and universities doesn’t serve a real purpose

Posted in Governance and administration on July 31st, 2013 by steve

“As a soon-to-be college graduate, my school has given me countless opportunities to do a little bit of ‘career soul-searching’. One of the most recent opportunities was to work with Forbes Media’s social media and product development team on its annual Top Colleges list …” (more)

[Kasey Varner, Guardian, 31 July]


Competition and controversy in global rankings

Posted in Governance and administration on June 29th, 2013 by steve

“Higher education is becoming more competitive by the day. Universities are scrambling for scarce research funds and public support. They are trying to recruit from increasingly suspicious and cynical students. The spectre of online education is haunting all but the most confident institutions …” (more)

[Richard Holmes, University World News, 29 June]


Citation Cartels

Posted in Governance and administration on June 22nd, 2013 by steve

“… Surely it is now time for Thomson Reuters to stop counting self-citations for the Research Influence indicator in the THE World University Rankings …” (more)

[University Ranking Watch, 22 June]


Elite journals are losing their position of privilege

Posted in Research on May 16th, 2013 by steve

“Having first documented the large-scale demise of the impact factor as a predictor of quality research, George Lozano and team examined whether this pattern also applies to the handful of elite journals …” (more)

[Impact of Social Sciences, 16 May]


Citation Cartel Journals Denied 2011 Impact Factor

Posted in Research on June 30th, 2012 by steve

“Thomson Reuters released the 2011 edition of the Journal Citation Report (JCR) on Thursday, promoting increased coverage of regional journals and listing 526 new journals receiving their first journal impact factor. Far less conspicuous was a list of 51 journals that were suspended from this year’s report due to ‘anomalous citation patterns’ …” (more)

[Phil Davis, The Scholarly Kitchen, 29 June]


Record number of journals banned for boosting impact factor with self-citations

Posted in Research on June 29th, 2012 by steve

“More research journals than ever are boosting their impact factors by self-citation. Every year, Thomson Reuters, the firm that publishes the impact factor rankings, takes action against the most extreme offenders by banning them from appearing in the latest lists. It lets them in again, suitably chastened, a couple of years later …” (more)

[Richard Van Noorden, Nature News Blog, 29 June]


Experts question rankings of journals

Posted in Research on October 5th, 2011 by steve

“Peer review may be a good way to assess research papers, but it can fall short in ranking the journals themselves. That’s the reaction of some metrics experts to the first such journal rankings, launched this week by the Faculty of 1000 (F1000) in London …” (more)

[Declan Butler, Nature News, 5 October]


The Futility of Ranking Academic Journals

Posted in Research on August 18th, 2011 by steve

“… Quality can be a subjective measurement; just because the ranking exercise is conducted by groups of noteworthy academics, usually in private, doesn’t make it otherwise. Then there is the problem of the databases which hold only a proportion of the over 1.3 million articles published annually. The main beneficiaries are the physical, life, and medical sciences, due to their publishing habits …” (more)

[Ellen Hazelkorn, WorldWise (Chronicle of Higher Education), 16 August]


Dropping ERA rankings ‘correct decision’: Ellen Hazelkorn

Posted in Research on July 5th, 2011 by steve

“Dropping rankings from journals for the next round of the Excellence in Research for Australia audit was the correct decision, according to a leading thinker on metrics in higher education. Ellen Hazelkorn, Dublin Institute of Technology’s director of research and enterprise in the higher education policy unit, says the Australian Research Council was right to drop the designated A* to C rankings …” (more)

[Jill Rowbotham, The Australian, 6 July]


It’s impact factor time!

Posted in Teaching on June 29th, 2011 by steve

“Once a year, information company Thomson Reuters publishes updates to a measure of popularity that every science journal displays in lights: its ‘impact factor’. This event, which happened again yesterday, always produces a slightly embarrassed buzz among science journal editors …” (more)

[Richard Van Noorden, Nature News Blog, 29 June]


Outraged European academics resent ‘rankings’

Posted in Research on June 27th, 2011 by steve

“When new lists categorising European arts and humanities journals were first published in 2007, UK academics were – to put it politely – incensed. We want ‘no part’ in such a ‘dangerous and misguided exercise’, said a plethora of journal editors. A special arts and humanities user group was even formed by UK subject associations to provide a co‑ordinated opposition …” (more)

[Zoë Corbyn, Guardian, 27 June]


ERA journal rankings are dead – hurrah, hurrah!

Posted in Governance and administration on May 31st, 2011 by steve

“I haven’t heard of an academic yet who is sad about the end of the ERA journal rankings, announced yesterday by Minister Kim Carr. Carr said in his press release: ‘There is clear and consistent evidence that the [ERA journal] rankings were being deployed inappropriately within some quarters of the sector, in ways that could produce harmful outcomes …'” (more)

[Skepticlawyer, 31 May]


Simplistic approach ‘distorts university rankings’

Posted in Governance and administration on March 31st, 2011 by steve

“There is no such thing as an objective international university ranking and they undermine the mission of education, says leading DIT researcher …” (more)

[Dick Ahlstrom, Irish Times, 31 March]


‘Me’ focus skews journal ratings

Posted in Research on March 4th, 2010 by steve

“Journal lists that rank scholarly journals by their perceived quality may be fundamentally flawed because of the bias among researchers who are keen to label their own work as top drawer. Research into the compilation of journal lists concludes that the peer-review system used is open to personal bias. It raises questions about the role of the lists as a measure of academic excellence …” (more)

[Hannah Fearn, Times Higher Education, 4 March]