Beyond the repository? The CERN Innovation in Scholarly Publishing Workshop (OAI7). June 22-24 2011

I was in a very expensive and sultry Geneva in late June to attend the CERN workshop on innovations in scholarly publishing, among a record attendance of over 260 delegates. Perhaps this level of attendance is a sign that Open Access is maturing and becoming mainstream as it moves on from an emphasis on access alone to the exploration of how openness enhances the effectiveness of science and increases the impact of the contribution that it can make. The programme also reflected a level of maturity in the system, a second-generation approach that took it for granted that we were talking about a well-established system, with repositories already set up and functioning and open access journals well established (and growing fast). The focus was less the setting up and management of scholarly repositories or the creation of digital publications than the semantics of an integrated research communication system. In fact, a key perception at the conference was William Nixon’s suggestion that the ‘repository’ will disappear into the wider workflow of research communication (an ironic statement from someone who is the Service Development Manager of the University of Glasgow repository).

The overall focus was therefore on how to get extra mileage from repositories, interlinking data, publishing effectively and garnering government support for Open Access and Open Science. Cameron Neylon, Senior Scientist in Bio-molecular Sciences at the ISIS Neutron Scattering Facility at the Science and Technology Facilities Council (STFC), argued in his talk on the Technical, Cultural and Legal Infrastructure to Support Open Scientific Communication that repositories are a ‘temporary scaffolding’ awaiting the time that we have ‘reasserted the traditional values of research and built the pillars and foundations that will make openness an embedded part of what we do’. Neylon’s core argument was that, while we can resolve the technological issues to build a viable architecture for data analysis, reuse and discovery, and while the necessary legal infrastructure exists, what is not there yet is the cultural infrastructure – the commitment, the communities, the assumptions and the practices that could make open science work. The ‘real values’ that he articulated were those of reproducibility, making a difference to the community, getting process, data and narrative to relate to one another, and ensuring accuracy and validity.

Related to these perceptions, there was a very useful session on advocacy. Monica Hammes from the University of Pretoria spoke on the Open Access Conversation, a cogent and detailed account of the mind-changing process that is needed and the partnerships that need to be developed to get a university to adopt and mandate open access, arguing that one has to anticipate the emotional responses of the people one is trying to persuade, recognising where their interests lie. Heather Joseph of SPARC in Washington, speaking on advocacy at the national and international level, demonstrated how the wording and the logic of arguments have to be distilled and clarified in order to reach government. Given the powerful lobbying capacity of the big publishing companies in their push for enclosure, she argued that any advocacy initiative has to be well argued, supported by persuasive data, highly strategic, and built on alliances and communities.

When it came to journals, Mark Patterson, Director of Publishing at the Public Library of Science (PLoS) journals, gave a compelling account of the rising success of open access journals and the new models that are emerging in this context. PLoS, he reported, now well established as a journal publisher, ‘is now exploring new ways to enhance scholarly communication through online publications that publish new findings more rapidly, and new products that facilitate the evaluation and organization of content after publication.’ Journals, he argued, are ‘giant sorting mechanisms’, and content can be enhanced and organised after publication. PLoS ONE, the revolutionary journal that pioneered this approach, is built on the separation of scientific rigour and impact: the former is reviewed before publication, the latter dealt with only after publication. PLoS ONE is growing exponentially – projected to publish 12,000 articles in 2011 – and is being emulated by a number of the big journal publishers. Patterson predicted that this ‘megajournal’ model could account for 50% of the literature in five years. A variety of new impact measures – beyond the ‘citation count’ – are being explored, and the value of the content is being enhanced through the creation of social networking hubs.

Other notable speeches included one by Barend Mons, a professor of Biosemantics at the Universities of Rotterdam and Leiden and Scientific Director of the Netherlands Bioinformatics Centre (NBIC), who spoke on Nanopublications – an intricate and virtuoso mapping of how the narrative contribution of conventional scholarly publication needs to be embedded in a more complex semantic network of data for effective mining and citation. There is also a Nature article by Mons and other authors on this topic. Quite how a technical process of this intricacy would fit into our under-resourced universities in Africa, particularly in the smaller southern African countries, is something to reflect upon, but it is good to have a roadmap of where we could be heading, in bioinformatics in particular.

On the archival front, Jonathan Deering, software developer from the Centre for Digital Theology at Saint Louis University, described a system that has been developed for the annotation of historical manuscripts – something that I suspect would appeal to African institutions working on archival records and lost histories.

The useful lessons for African institutions arising from this conference are to be found in the plotting of a road map of where we should be heading and what benefits could accrue if we get it right. As always, capacity and infrastructure levels will be a challenge for African institutions.

The starting point would need to be the acceptance that research communication lies at the heart of the university enterprise and must be supported. The creation of a repository is a good first step, followed by a review of the technology supporting the communication of research processes and data.

The road map needs to include technology solutions for linking wider data sets to scholarly publications; the formulation of the arguments needed to get support for emerging models of scholarly publication; and expanded metrics for measuring the reach and real impact of research. Most of all, though, the question is how we can link and integrate the different research processes and their outputs in an open and collaborative system, to deliver the development impact our governments keep asking for.

All of the plenary addresses delivered at CERN OAI7 are available online on the workshop website, each offering the relevant slide presentation as well as an audio/slideshow file. These can be reached via the link to the workshop programme. (It is a little obscure – if one clicks on a session title, the speeches – but not the name of the relevant speaker – come up in a pop-up box with hyperlinks.)
