Lies, damned lies… and metrics

Two contradictory things are happening side by side in discussions of scholarly publishing right now. On the one hand, the discourse of open access – seeking to remedy the failures of the current system – bases itself overwhelmingly on the value of the journal article as the artefact to be made open; on the other, ever stronger criticisms are being levelled against journals as an effective mode of scientific communication. Questions are also being asked about the appropriateness of the metrics used to judge the quality of published articles, metrics that determine the reputation of authors and their institutions. It is well known that this system consigns developing-country research to the periphery of a ‘global’ system, marginalising very important research issues – such as the ‘neglected diseases’ that affect large percentages of the world’s population. These concerns now appear to have a strong echo in the mainstream, even if the perspective of the global South is not clearly articulated in the discussion.

In a scathing critique of the current journal system on the LSE Impact of Social Sciences blog, Björn Brembs, a neurobiologist at the Freie Universität Berlin, lays into the ineffectual communication system provided by journal publishing in its current bloated state, a failure compounded by the distortions that result from the commonly accepted journal hierarchy and its supporting metrics. Given the vast number of journals, this is no longer a functional space for dialogue between scholars, he argues. Trying to establish what is worth reading is skewed further by the use of inaccurate and misleading metrics as a proxy for quality.

The most commonly accepted metric, Thomson Reuters’ Journal Impact Factor, is shown to lack transparency, to be irreproducible and to be statistically unsound. Backing up this claim with a number of analytical articles from PLOS Medicine, the BMJ and the International Mathematical Union, Brembs concludes that ‘[T]he dominant metric by which this journal rank is established, Thomson Reuters’ “Impact Factor” (IF) is so embarrassingly flawed, it boggles the mind that any scientist can utter these two words without blushing.’

As Brembs quite rightly argues, there is little correlation between a journal's impact factor – an average of the citations received by the journal as a whole – and the citations received, or not, by the individual articles published in it. Extending a journal-level citation count to article metrics and author evaluation therefore constitutes a serious distortion: a blind and misplaced belief in statistics as magic.
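For readers unfamiliar with how the figure is computed, the standard two-year impact factor for a given year $y$ takes, in simplified form (the details of what Thomson Reuters counts as a ‘citable item’ are themselves contested), the following shape:

\[
\mathrm{IF}_{y} \;=\; \frac{C_{y}}{N_{y-1} + N_{y-2}}
\]

where $C_{y}$ is the number of citations received in year $y$ by items the journal published in the two preceding years, and $N_{y-1}$ and $N_{y-2}$ are the counts of citable items published in those years. Because citation distributions are heavily skewed, a journal in which, to take a hypothetical round-number example, ten papers attract fifty citations each while ninety attract none would post an impact factor of 5 even though the typical article in it was never cited at all – which is precisely why a journal-level average tells us so little about any individual article.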

Brembs’s critique of the current journal system – and that of the sources that he draws on – also highlights subject and language bias in the citation system and journal rankings, but does not draw attention to the way the system functions to marginalise an overwhelming proportion of the world’s scientists – those in the developing world.

This critique comes hot on the heels of another diatribe, from George Monbiot in the Guardian on 29 August, which lashed out at the paywalls and profiteering of the leading journals and their culture of greed, an article that trended on Twitter, obviously striking a nerve. Brembs endorses and reinforces Monbiot’s rejection of the profit system that drives current journal publishing.

It was therefore good to see a few hundred years of the original English-language journal, the Philosophical Transactions of the Royal Society, made available online by the Royal Society. Going back to the first edition, one rapidly encounters what has been lost in the commercialisation of our journals over the last half-century. In his Introduction, Henry Oldenburg gives us insight into the spirit of collaboration and experimentation, and the openness of communication, that the journal aimed for at the time.

Scientific knowledge in this early journal is seen as a conversation, so that ‘those addicted to and conversant in such matters may be invited and encouraged to search, try, and find out new things, impart their knowledge to one another, and contribute what they can to the Grand design of improving Natural knowledge, and perfecting all Philosophical Arts and Sciences. All for the Glory of God, the Honour and Advantage of these Kingdoms, and the Universal Good of Mankind.’

This sounds much closer to what could be an African vision of research as collaboration and participation, contributing to the public good. Modern journals are very closed and arcane artefacts compared to this vision. In fact this first journal looks and sounds very much like a blog – with leading scientists like Boyle, Hooke and Huygens contributing – with the serious and the trivial side by side, short and longer pieces, explanations of experiments and stories of odd and ingenious things, from how to kill a rattlesnake to an anecdote of old people growing new teeth.

It would be good to see some serious discussion of the tendency for southern African universities and researchers to buy blindly into dysfunctional systems like the ISI Journal Impact Factor, rather than determining what our own values are and what research publication systems would best suit our goals. Saleem Badat, Vice-Chancellor of Rhodes University, takes apart the university ranking system in the UNESCO World Social Science Report 2010 and finds the same kinds of distortions and inadequacies that Brembs complains of. Badat warns against the ‘perverse and dangerous effects’ that can result from ‘uncritical mimicry of and “catching up” with the so-called world-class university’. Instead, he suggests that the diverse goals of different institutions and countries should be reflected in a horizontal continuum that ‘makes provision for universities to pursue different missions.’

We would do well to listen – a matter of playing catch-up with the future instead of the past.

One thought on “Lies, damned lies… and metrics”

  1. Björn Brembs

    Indeed, I did not point out how the current system marginalizes those in the developing world, and I should have. I’m currently contemplating writing a review article about some of these aspects and will try and incorporate a reference to this extent into the article. Thanks for alerting me to my oversight.
