Open Access News

News from the open access movement

Saturday, June 16, 2007

New fees at Royal Society hybrid journals

On June 22, the Royal Society will revise the publication fees for its EXiS Open Choice (hybrid OA) journals.  The new prices:

These prices apply from June 22, 2007

Proceedings A
Phil Trans A
Notes and Records

First 10 pages: £150 ($278) per journal page. Additional pages above this: £50 ($93) per journal page.

Biology Letters
Phil Trans B
Proceedings B

First 6 pages: £225 ($416) per journal page. Additional pages above this: £50 ($93) per journal page.

Note that VAT is applicable for this service.
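The tiered schedule above is simple arithmetic; here is a quick sketch (the function name and example page counts are invented for illustration, and VAT is ignored):

```python
def exis_fee(pages, tier_limit, tier_rate, extra_rate=50):
    """Tiered page charge in GBP: tier_rate per page up to tier_limit,
    then extra_rate per page beyond it (VAT not included)."""
    if pages <= tier_limit:
        return pages * tier_rate
    return tier_limit * tier_rate + (pages - tier_limit) * extra_rate

# A 12-page Proceedings A article: 10 x 150 + 2 x 50 = 1600 GBP
print(exis_fee(12, tier_limit=10, tier_rate=150))
# An 8-page Proceedings B article: 6 x 225 + 2 x 50 = 1450 GBP
print(exis_fee(8, tier_limit=6, tier_rate=225))
```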

Correction (June 21, 2007). A colleague points out that the new fees represent a significant decrease from last year. For example, compare the prices above with the prices reported in this article from June 2006. My apologies to the RS for not noticing or mentioning that the new fees are lower than the old fees. On the other hand, the RS price announcement doesn't mention this fact either.

Promoting OA journals from developing countries

The Summer issue of the INASP Newsletter is now online.  Excerpt:

Widening access to Open Access Journals published by developing and emerging countries

INASP and the Library Head Office, Lund University have formed a partnership agreement to support widening access to Open Access Journals published by developing and emerging countries.

In order to increase the visibility of journals in the Directory of Open Access Journals (DOAJ), information about the country of origin of journals will be added to the Directory and made searchable.

The DOAJ will be promoted and marketed to journals produced in the PERI network countries and materials will be developed which underline the advantages for the Open Access journals from these countries. Once developed, the materials will be sent to the journal editors.

The DOAJ staff will then undertake special follow-up, counselling and advice to journals/journal editors in order to assist appropriate journals to be able to deliver article level metadata for inclusion in the DOAJ service, thus making the journals more visible, more used and able to gain more impact. Furthermore the DOAJ staff will engage in communication with the journals editors in order to develop preservation policies, preservation arrangements etc.

The activities described above will be evaluated in December 2007 in order to determine their impact by measuring the number of Open Access journals from PERI countries and the number of these journals which are searchable on article level. A short report will be written to communicate the findings....

Hamburg University Press becomes an all-OA publisher

Starting on July 1, all scientific publications of the Hamburg University Press will be OA.  The press will also sell print-on-demand editions from its own web site and through traditional booksellers. 


  • Kudos to the leadership at the press.  This step is bold and welcome.  Although a growing number of university presses have OA titles, series, or imprints, I believe that only Rice University Press is in the same league as an all-OA press.  (If there are others, I'd be glad to hear about them.)
  • The announcement, on the front page of the press site, doesn't give further details.  For example, while it explicitly mentions OA in the sciences, it doesn't say anything about the arts and humanities.  However, it lists five books soon to appear in dual (OA/TA) editions, and one is Tilmann Altenberg and Klaus Meyer-Minnemann (eds.), Europäische Dimensionen des "Don Quijote" in Literatur, Kunst, Film und Musik.  So perhaps the new policy applies to all fields in which the press publishes work. 
  • The announcement doesn't mention books or journals, but all its examples are books.  Nor does it mention the licensing or permissions that will accompany the OA or free online editions.

More on the feasibility of a Usage Factor

The UKSG has released its final report on the feasibility of developing a Usage Factor for journals and journal articles.  From the text (dated May 2007 but announced and released June 15, 2007):

...Based on [surveys and interviews] it appears that it would not only be feasible to develop a meaningful journal Usage Factor, but that there is broad support for its implementation. Detailed conclusions and recommendations are provided in Section 4 of this report. Principal among these are: 

  • the COUNTER usage statistics are not yet seen as a solid enough foundation on which to build a new global measure such as Usage Factor, but confidence in them is growing and they are seen as the only viable basis for UF
  • the majority of publishers are supportive of the UF concept, appear to be willing, in principle, to participate in the calculation and publication of UFs, and are prepared to see their journals ranked according to UF
  • there is a diversity of opinion on the way in which UF should be calculated....
  • the great majority of authors in all fields of academic research would welcome a new, usage-based measure of the value of journals 
  • UF, were it available, would be a highly ranked factor by librarians, not only in the evaluation of journals for potential purchase, but also in the evaluation of journals for retention or cancellation 
  • publishers are, on the whole, unwilling to provide their usage data to a third party for consolidation and for calculation of UF. The majority appear to be willing to calculate UFs for their own journals and to have this process audited. This is generally perceived as a natural extension of the work already being done for COUNTER. While it may have implications for systems, these are not seen as being problematic.... 
  • there are several structural problems with online usage data that would have to be addressed for UFs to be credible. Notable among these is the perception that online usage data is much more easily manipulated than is citation data. 
  • should UKSG wish to take this project further there is a strong likelihood that other agencies would be interested in contributing financial support

From elsewhere in the body of the report:

...6 of the 7 authors are interested in knowing how frequently their articles are accessed online.  One author currently monitors the Web of Science to assess how frequently his articles are being cited; he would find the usage equivalent of this very valuable. Other authors mentioned that they are also very interested in where and by whom their articles are being used. The majority of the authors were not familiar with COUNTER....

[In response to the question whether usage data should cover articles from the previous 2 years, the previous 5 years, or some other period, one interview subject suggested:]  Go for 2 years. UF should be more immediate than IF. Given the trend towards free availability of research articles after a period, paid access is going to be increasingly regarded as being for a shorter period after publication. Librarians will want metrics to cover the period for which they are paying. Five years would be too long....

[In response to a question about benefits for participating publishers, one suggested:]  By participating in this process, publishers will influence it, helping to develop useful measures in which they can have confidence. Currently journal publishers are under a lot of pressure to demonstrate the value they provide. The challenge from open access has further stimulated this....

[T]he number of sites on which the full text of a particular article will be available is likely to grow in the future, as a result of an increase in open access publishing and institutional repositories. This will increase the difficulty in obtaining a 100% global figure for journal usage. This need not be an insurmountable obstacle to the calculation of comparable UFs, but it is a potential problem....

Librarians indicated that, if UF were available, it would become the second most important factor (after ‘feedback from library users’) in decisions on the purchase of new journals, while it would be the third most important factor (after ‘feedback from library users’ and ‘usage’) in retention/cancellation decisions....


  1. I'm all for anything that will help authors make better decisions on where to submit their work and help librarians make better decisions on what to buy, renew, and cancel.  But we have to be careful going in. Impact Factors (IFs), for example, measure something but it isn't quality, and yet the IF has come to be used as a quality measurement, or a crude substitute for a quality measurement, by promotion and tenure committees, funding agencies, and librarians.  If we can avoid the same mistake with the UF, then let's go for it.  The more good data we have, the better.  But what are the odds that we'll avoid the same mistake?
  2. If the UF materializes, it will be important for OA journals to participate in COUNTER, important for COUNTER to reach out to more OA journals, and important for the usage of articles on deposit in OA repositories to count toward the UF of the articles and the journals in which they were published.
  3. Here's what worries me most:  There are good reasons to think that the UF will systematically undercount the usage of OA articles.  As I argued in an article from January 2004:
    As soon as we provide open access to an article, we should expect copies to proliferate around the world....Copies interfere with the measurement of traffic and usage.  A given archive or journal might measure usage very well.  But if there is an unknown number of copies elsewhere on the net, and an unknown percentage of readers are using those other copies, then the local measurements will be inaccurate to an unknown degree.  We might know that all verified counts are undercounts, but we won't know by how much.

    The report itself (in the second-to-last paragraph excerpted above) acknowledges this problem but says it is "not...insurmountable".  As the UF moves forward, let's watch closely how the UKSG and other stakeholders address this problem --and remember that we are stakeholders ourselves.

Do we need a Repositories Plan B?

Andy Powell, Repository Plan B? eFoundations, June 15, 2007.  Excerpt:

"The most successful people are those who are good at Plan B."
-- James Yorke, mathematician

My post yesterday about real vs. fake sharing in the context of services like Facebook, coupled with my ongoing thinking about what is or isn't happening in the repositories space has made me begin to wonder.

Pete said to me yesterday...that he feels very reluctant these days to put content into any service that doesn't spit it back out at you as an RSS or Atom feed.

I completely concur....

But what does this mean for repositories?

Imagine a world in which we talked about 'research blogs' or 'research feeds' rather than 'repositories', in which the 'open access' policy rhetoric used phrases like 'research outputs should be made freely available on the Web' rather than 'research outputs should be deposited into institutional or other repositories', and in which accepted 'good practice' for researchers was simply to make research output freely available on the Web with an associated RSS or Atom feed.

Wouldn't that be a more intuitive and productive scholarly communication environment than what we have currently? ...

Since [arXiv], we have largely attempted to position repositories as institutional services, for institutional reasons, in the belief that metadata harvesting will allow us to aggregate stuff back together in meaningful ways.

Is it working?  I'm not convinced.  Yes, we can acknowledge our failure to put services in place that people find intuitively compelling to use by trying to force their use thru institutional or national mandates?  But wouldn't it be nicer to build services that people actually came to willingly?

In short, do we need a repositories plan B?
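To make Powell's 'research feed' idea concrete, here is a minimal sketch of generating an Atom entry for a research output using Python's standard library (the title, URL, and id below are invented for illustration, not taken from any real feed):

```python
from xml.etree import ElementTree as ET

ATOM = "http://www.w3.org/2005/Atom"
ET.register_namespace("", ATOM)  # serialize Atom as the default namespace

def atom_entry(title, link, entry_id, updated, summary):
    """Build a minimal Atom 1.0 <entry> for one research output."""
    entry = ET.Element(f"{{{ATOM}}}entry")
    ET.SubElement(entry, f"{{{ATOM}}}title").text = title
    ET.SubElement(entry, f"{{{ATOM}}}link", href=link)
    ET.SubElement(entry, f"{{{ATOM}}}id").text = entry_id
    ET.SubElement(entry, f"{{{ATOM}}}updated").text = updated
    ET.SubElement(entry, f"{{{ATOM}}}summary").text = summary
    return entry

# Hypothetical research output, freely available on the web
e = atom_entry(
    "Preprint: On open research feeds",
    "http://example.org/papers/feeds.pdf",
    "urn:uuid:00000000-0000-0000-0000-000000000001",
    "2007-06-15T00:00:00Z",
    "Full text freely available on the web.",
)
print(ET.tostring(e, encoding="unicode"))
```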

Vienna OA presentations

The presentations from ElPub 2007, Openness in Digital Publishing: Awareness, Discovery and Access (Vienna, June 13-15, 2007), are now online.

Chemical data in OA repositories

JISC's 18-month SPECTRa project (Submission, Preservation and Exposure of Chemistry Teaching and Research Data) ended in March and the final report is now available.  From the report:

Project SPECTRa's principal aim was to facilitate the high-volume ingest and subsequent reuse of experimental data via institutional repositories, using the DSpace platform, by developing Open Source software tools which could easily be incorporated within chemists' workflows....

SPECTRa was funded by JISC's Digital Repositories Programme as a joint project between the libraries and chemistry departments of the University of Cambridge and Imperial College London, in collaboration with the eBank UK project.

Surveys of chemists at Imperial and Cambridge investigated their current use of computers and the Internet and identified specific data needs. The survey's main conclusions were:

  • Much data is not stored electronically (e.g. lab books, paper copies of spectra)
  • A complex range of data file formats (particularly proprietary binary formats) is in use
  • A significant ignorance of digital repositories
  • A requirement for restricted access to deposited experimental data

Distributable software tool development using Open Source code was undertaken to facilitate deposition into a repository, guided by interviews with key researchers. The project has provided tools which allow for the preservation aspects of data reuse. All legacy chemical file formats are converted to the appropriate Chemical Markup Language scheme to enable automatic data validation, metadata creation and long-term preservation needs. Additional tools would be required to add value to any large-scale data aggregates.

The deposition process adopted the concept of an "embargo repository" allowing unpublished or commercially sensitive material, identified through metadata, to be retained in a closed access environment until the data owner approved its release. The resultant repository architecture envisages a federated framework in which data will first be deposited into an intermediate departmental repository and may later be pushed into a central Open Access repository.
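The embargo workflow described above can be sketched in a few lines (a toy model; the record fields and function name are invented, not taken from the SPECTRa tools):

```python
def release_approved(departmental, central):
    """Move owner-approved records from the closed departmental store
    into the central Open Access repository; embargoed records stay put."""
    for record in list(departmental):
        if record.get("owner_approved"):
            departmental.remove(record)
            central.append(record)

departmental = [
    {"id": "spectra:0001", "owner_approved": True},
    {"id": "spectra:0002", "owner_approved": False},  # still embargoed
]
central = []
release_approved(departmental, central)
print([r["id"] for r in central])        # released to open access
print([r["id"] for r in departmental])   # retained under embargo
```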

The interviews with researchers also identified the concept of a "golden moment" - a point at which the researcher best understands the process, possesses a comprehensive package of information to describe it, and is motivated to submit it to a data management process.

Among the project's findings were the following:

  • it has integrated the need for long-term management of experimental chemistry data with the maturing technology and organisational capability of digital repositories;
  • scientific data repositories are more complex to build and maintain than are those designed primarily for text-based materials;
  • the specific needs of individual scientific disciplines are best met by discipline-specific tools, though this is a resource-intensive process;
  • institutional repository managers need to understand the working practices of researchers in order to develop repository services that meet their requirements;
  • IPR issues relating to the ownership and reuse of scientific data are complex, and would benefit from authoritative guidance based on UK and EU law....

Friday, June 15, 2007

Re-opening access to presidential papers

The right to know, Austin American-Statesman, June 15, 2007.  An editorial.  Excerpt:

With the strong support of U.S. Sen. John Cornyn, a Texas Republican, Congress may repeal one of the worst abuses of power in the Bush administration.

In his first year in the White House, Bush issued an executive order giving presidents, vice presidents and their heirs the power to block the release of White House records. That order sealed White House records that have been public dating back to the Watergate scandal that brought down the Nixon administration in the 1970s.

Acting on the public's right to know what government is doing, Cornyn, a Senate committee and an overwhelming majority in the House of Representatives support a repeal of Bush's order making White House records secret. The repeal act passed the Senate Homeland Security and Government Affairs Committee this week and goes before the full Senate next.

Bush, whose administration embraces secrecy at every turn, has vowed to veto the repeal act. But the House version of the law, HR 1255, was approved by a veto-proof 333-93 vote in March. Senate Homeland Security Chairman Joseph Lieberman, I-Conn., said the bill may have to be amended in the Senate but it could pass by a veto-proof margin as well....

Bush over-reached when he ordered White House records sealed from public view indefinitely. Congress is right to vacate that order with an act making those records public after 12 years, as they have been since 1981.

The Senate should make that position clear to Bush with a vote as resounding as the House vote, one that assures a Bush veto will be overridden.

PS.  Hear, hear.  For background, see my blog posts from 2002 on Bush's executive order.  Sen. John Cornyn, of course, is the Senator from Bush's state and party who introduced FRPAA last year.

PLoS' annual price review

The Public Library of Science recently added this paragraph to its FAQ on publication fees:

We raised our prices for the first time in 2006, and now have an annual price review. In line with our commitment to minimize the burden on authors, in 2007 our price increases will be: no change for PLoS ONE; 5% for PLoS Computational Biology, PLoS Genetics, and PLoS Pathogens; and 10% for PLoS Biology and PLoS Medicine. The variation in price increases reflects the different costs associated with these publications. These prices will come into effect after July 15. For current pricing information see Publication Fees for PLoS Journals.

And it added this paragraph to its page on publication fees:

Please note that the publication fees for newly submitted articles will be increased for the following journals after July 15, 2007: PLoS Biology (US$2750); PLoS Medicine (US$2750); PLoS Computational Biology (US$2100); PLoS Genetics (US$2100); PLoS Pathogens (US$2100).
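Since the percentage increases are stated alongside the new fees, the implied 2006 fees can be backed out with one line of arithmetic (the "old" figures below are computed from PLoS's stated percentages, not quoted from PLoS):

```python
# (new fee in USD, announced increase) for each journal raising its fee
new_fees = {
    "PLoS Biology": (2750, 0.10),
    "PLoS Medicine": (2750, 0.10),
    "PLoS Computational Biology": (2100, 0.05),
    "PLoS Genetics": (2100, 0.05),
    "PLoS Pathogens": (2100, 0.05),
}

for journal, (fee, pct) in new_fees.items():
    old = round(fee / (1 + pct))  # back out the pre-increase fee
    print(f"{journal}: ${fee} (up {pct:.0%} from an implied ${old})")
```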

Presentations on OA in Latin America

The presentations from the CRIA meeting, Strategies for Open and Permanent Access to Scientific Information in Latin America (Atibaia, Brazil, May 8-10, 2007), are now online.  (Thanks to EPT.)

DRIVER is looking for Belgian OA repositories

From the DRIVER news page:

As the first public release of the DRIVER testbed (end of July) approaches, we would like to make a call for Belgian content. If you want to participate and be part of the European infrastructure, take a look at the DRIVER guidelines (and annex) for repositories, and let us know about the status of your repository. We will be glad to integrate you in the European content layer.

BMC workshop on OA

Charlotte Webber, Summary of BioMed Central’s Workshop at MLA ’07 conference, BioMed Central blog, June 15, 2007.

The feasibility of collaboration between institutions and funding bodies in respect to financing open access publications (May 21st, 2007 Philadelphia, USA)

The workshop began with an introduction from Natasha Robshaw, BioMed Central, who gave an overview on:

  • BioMed Central article-processing charges
  • cost comparison of open access publishing options
  • payment mechanisms for open access publications
  • central funds for open access publishing...

By working with research administrators to set up central Open Access publishing funds, paid for as an indirect cost by research funders, librarians can make it much easier for authors to publish in open access journals, and so can accelerate the transition to a fully open access future.

[Download Presentation]

Ellen Finne Duranceau, Scholarly Publishing and Licensing Consultant, MIT Libraries gave a presentation on the role of a librarian in an open access world. Ellen highlighted the redefined role of a librarian in an open access world and discussed MIT’s role in the open access world....Essentially MIT’s mission is focussed on the widest dissemination of information in an efficient and reliable environment, which is consistent with the open access world....

Ellen also discussed the changing positions of librarians since the conception of open access in 2002. Over the last five years MIT Libraries have identified new librarian roles to include information services librarian for engineering and science, DSpace product manager and scholarly publishing consultant....A faculty member from MIT Libraries sums up the role of the librarian in the open access world: “I thought the faculty committee on the library system would be three years of dry drudgery. But it turns out librarians in their new role are now located at the centre of the most contentious and important issues of the day.”

[Download Presentation]

The workshop continued with three discussions: a) collaborating with research administration and funders to set up central university funds for open access publication charges [led by Dominic Tate], b) administering central funds for open access publication charges: memberships; individual payments; reporting and accountability [led by Maria Romano], and c) how BioMed Central can adapt open access payment models to meet the needs of the community [led by Natalia Timiraos] ...

The delegates discussed researchers’ awareness of open access as an option for publishing research and highlighted their concerns with the current situation in the US. Two major concerns included the funders’ need to cover article processing charges and the lack of education about open access....

NIH contributes funds towards most of the research and this goes directly to the researchers rather than the institutions. The standard ‘page charges’ are paid by the grants made to these researchers. However the NIH is stretched and could not possibly provide appropriate funding for all processing charges. The delegates recommended the funders should be responsible for such charges. For example, in Ohio, the OhioLINK consortium has a BioMed Central prepay membership for all members. The board of regents of the state of Ohio funds OhioLINK. They also have an agreement with PLoS for half of the article processing charges to be paid for by them....

Delegates recommended more education be provided about open access in order to encourage researchers to embrace the option. It was mentioned that one librarian has been teaching undergraduates about the benefits of this movement in the hopes that these undergraduates will recognise the value of open access when they reach the stage of publishing. Penn State was provided as an example of an institution actively supporting open access, where the Dean of Libraries has focussed the university’s efforts on open access activities such as the DSpace project....

Delegates from the University of Kansas Medical Centre noted that from their experience the key to making open access a success is education and information. A representative from the University of California Davis suggested that BioMed Central increase its visibility further so that researchers and funders are continually exposed to open access. The delegates agreed that more librarians do need to get behind open access-related activity in order to increase awareness and ongoing support from researchers and funders alike.

Texas A&M University is a rare example of an institution that currently receives some central funding from the university for their research. As a result, researchers at this institution have the benefit of knowing that open access publishing will always be an available option to them....

OA mandates for ETDs growing in Australasia

Australasian Digital Theses Program: Membership survey 2006, February 2007.  (Thanks to CAUL.)  Excerpt:

The key findings of the survey are:

  1. The average percentage of records for digital theses added to ADT is 95% when digital submission is mandatory and 17% when it is not mandatory. This implies that less than 22% of students submit a digital thesis when it is not mandatory, once you take into account the small percentage that are not added to ADT [Australasian Digital Theses] for copyright or embargo reasons (up to 5%).
  2. 59% of respondents will have mandatory digital submission in place in 2007.
  3. With this level of mandatory submission it is predicted that 60% of all theses produced in Australia and New Zealand in 2007 will have a digital copy recorded in ADT.
  4. Two respondents have retrospectively digitised all or nearly all of their theses, and nine have projects underway....
  5. Over half the respondents have a repository already and most are using it to manage digital theses.
  6. 87% will have a repository by the end of this year, and the rest are in the initial planning stage.

Comment.  This is very good news.  For ETDs, mandatory digital submission is essentially equivalent to OA.  Here's how I put it last year:

In principle, universities could require electronic submission of the dissertation without requiring deposit in the institutional repository.  They could also require deposit in the repository without requiring OA.  But in practice, most universities don't draw these distinctions.  Most universities that encourage or require electronic submission also encourage or require OA.  What's remarkable is that for theses and dissertations, OA is not the hard step.  The hard step is encouraging or requiring electronic submission.  For dissertations that are born digital and submitted in digital form, OA is pretty much the default.  I needn't tell you that this is not at all the case with journal literature....

Update. Also see Arthur Sale's comments on the report:

It is apparent from the report (and indeed highlighted by the authors) that a mandatory deposit policy results in a submission rate of 95% of all theses accepted, while its absence results in a submission rate of 17-22% (in other words, a pitifully empty repository). While this should not be news to anyone, the report has hard quotable facts on the success of an institutional mandatory policy over a substantial population of universities.

59% (ie 33) of Australian and New Zealand universities have mandatory deposit policies in place in 2007, so the technological change has gone well beyond the tipping point. I expect the remaining 41% of universities to follow suit in the very near future; the report suggests that 24% had already started planning to this end in 2006.

In another interesting fact, three universities have provision for a thesis to be lodged electronically only (in other words no paper copy) and one is considering it. It is not clear how much this provision is used for hypermedia theses, or if it will spread.

The case for OA for consumer magazines

Adam Hodgkin, Open Archives (4): Citeability and Moving Walls, Exact Editions Blog, June 15, 2007.  Excerpt:

There is a strong case for Open Access to scientific research and scholarship published in article form. This was crystallised in the Budapest Open Access Initiative in 2001....The Budapest programme specifically limited its recommendations to:

The literature that should be freely accessible online is that which scholars give to the world without expectation of payment. [my emphasis].

This is an important limitation....After all most consumer magazines pay their contributors, often very handsomely. Yet it may well be in the interests of a successful consumer magazine to make a substantial portion of its archive freely accessible as a web resource. Why should this be?

One reason -- is that Open Access to an archive enhances the authority and renown of a magazine. Consumer magazines are often quite specialist, quite limited, in their appeal. But this tight focus is part of their strength and gives them potentially authoritative status. The reputation of a magazine or a periodical is immediately enhanced if its articles can be effectively cited, referenced, commented upon, by others. The prevalence and searchability of the web has enormously increased the extent to which magazines can build a reputation through links and citations. Citeability / referenceability / linkability is the strongest reason for making some portions of a consumer magazine archive available as a digital resource.

This way bloggers, enthusiasts, journalists, emailers, advertisers, and reviewers will pile in to amplify the reputation of the publication. An obvious way of gaining the advantages of a citeable archive, whilst not giving away the baby with the bath water, is for the publisher to make the archive freely available through the web, outside of a 'moving wall', so that issues become available after a period of some months (6 months, 12 months -- whatever is judged necessary to maintain the perceived value of the personal subscription)....

So making portions of a consumer magazine archive Openly Accessible makes sense if this significantly enhances the reputation and the authoritative quality of the publication, and if it does so without damaging the commercial prospects of the magazine....

Exact Editions is a distribution partner whose reward is a small commission on the digital subscriptions sold. So we are keenly interested in having more subscribers....

Furthermore, the way our deal works with the publishers we absorb the distribution and maintenance costs of the digital edition. So it costs Exact Editions, not the publisher, a bit more to maintain an Open Archive. We think these costs are easily containable within the parameters of the small commission we obtain from selling additional digital subscriptions, so we encourage our publishing partners to offer Open Archives with a moving wall. The marginal costs of maintaining Open Access are marginal. So you don't need to feel sorry for us!

On the other hand, if you enjoy the open archives and never buy a subscription you can thank us as well as the publisher for making this service available....

A2K enters the WIPO mandate

Tove Iren S. Gerhardsen, Negotiators Agree To Add Access To Knowledge To WIPO Mandate, IP Watch, June 14, 2007.  Excerpt:

World Intellectual Property Organization members negotiating development-related proposals for WIPO’s future mandate reached preliminary agreement on several more key issues Thursday, including access to knowledge and exceptions and limitations.

Agreement came after lengthy talks on what officials described as the most difficult area of the negotiations....A new draft emerging on 14 June showed that compromises had been reached on more points that had previously seemed intractable, leading some sources to commend the positive spirit at the meeting....

One of the main disagreements from 13 June in the cluster entitled “Norm-setting, flexibilities, public policy and public domain” was a paragraph related to access to knowledge, which Group B first opposed. It now reads: “To initiate discussions on how, within WIPO’s mandate, to further facilitate access to knowledge and technology for developing countries and LDCs [least developed countries] to foster creativity and innovation and to strengthen such existing activities within WIPO.”

One developing country official told Intellectual Property Watch that the Group B of developed countries had added the “within WIPO’s mandate” and the word “further” had also been added to reach agreement. The original proposal from 13 June said, “to discuss possible new initiatives and strengthen existing mechanism within WIPO.”

Agreement was also reached on the public domain issue, which now starts with: “To promote norm-setting activities related to IP that support a robust public domain in WIPO’s member states.” ...

A proposal on exchanging experiences on open collaborative projects for the development of public goods such as the Human Genome Project and open source software was moved to another cluster, together with a proposal related to counterfeiting and piracy and one on best practices for economic growth, sources said. At press-time, these issues were being heatedly debated, according to sources.

Update. Also see James Love on the KEI Policy Blog:

I think this is a very good outcome, and gives WIPO the mandate that it needs to move forward in this area.

Some of the Group B countries had difficulty explaining why they were opposed to WIPO discussing “access to knowledge.” Even more important, the developing country delegations were very strong on this issue. Now it will be necessary to build the case for specific A2K initiatives at WIPO, in an environment where WIPO has agreed that the topic is relevant and appropriate.

Google tightens its terms for the CIC libraries deal

Questions Emerge as Terms of the CIC/Google Deal Become Public, Library Journal Academic Newswire, June 14, 2007.  Excerpt:

A copy of Google's most recent library book scanning deal with the 12 libraries of the Committee on Institutional Cooperation hit the web this week and, as it goes with all things Google, generated no small amount of controversy. The most intriguing aspect of the deal: CIC libraries may not see digital copies of in-copyright books scanned by Google for a very long time, if ever.

Section 4.11 of the agreement states that Google will hold the "University Copy" of these works "in escrow," releasing them to the contributing libraries if the "in-copyright Work becomes public domain;" if the "library party has obtained permission through contractual agreements with copyright holders that includes the right to make a copy of the In-Copyright Work and to provide it to the CIC or Source CIC University;" if "well established case law exists that in-copyright works can be copied and held" by the libraries without infringing on the rights of a copyright holder; if Google is in "material breach of its obligations" under other sections of the agreement; or if the CIC and Google "agree in writing" that the release of particular in-copyright works is "legally supported and appropriate." The exceptions, however, are the pre-existing terms in the agreements that two CIC libraries, Michigan and Wisconsin, signed with Google, which call for the company to provide copies of scanned works.

On his personal blog yesterday, Digital Library Federation executive director Peter Brantley suggested he wasn't about to hold his breath waiting for Google's escrowed copies to be released. "Pretty much, unless Google ceases business operations, or there is a legal ruling or agreement with publishers," he wrote, "in-copyright material...will be held in escrow until such time as it becomes public domain." Further, should an agreement with publishers resolve the issue of Google's scanning of in-copyright works, which many experts believe is a much more likely outcome than a court ruling, it is highly probable that any agreement would forbid the transfer of these copies to libraries. "I find it hard (not impossible, but hard) to imagine why publishers, as a community, would permit the CIC to obtain such copies," Brantley noted. "The 'library copy' is something that has deeply irritated them since the Google Book Search program started."

The deal also includes a clause (4.9a) that has libraries, the majority of which in the CIC are public institutions, indemnify Google, a multi-billion-dollar private company, should disagreements arise over copyright status....

No trade-off between OA data and OA postprints

Stevan Harnad, On Patience, and Letting (Human) Nature Take Its Course, Open Access Archivangelism, June 14, 2007.

Summary:  Peter Murray-Rust is anxious to ensure that all research data should be harvestable and data-mineable, by man and machine alike. He worries that authors might instead agree to transfer copyright to their publishers for their data (as many already transfer it for their article texts) in exchange for the publisher's green light to self-archive. Not to worry. If authors don't self-archive their data at all today, when they hold all the rights, nor do 85% of them self-archive their articles (not even the 62% for which they already have their publisher's green light), then why on earth would they transfer copyright for their data in exchange for a green light to self-archive both? So first things first: Focus on ensuring OA for all article texts (postprints) by first mandating immediate deposit of all postprints as soon as they are accepted for publication (without necessarily insisting that access to those deposits be immediately set to OA). All else will follow from that simple step, as surely as day follows night. OA is just a matter of keystrokes.

Implications for OA from Pierre Bourdieu

Ulrich Herb, Open Access: Soziologische Aspekte, Information Wissenschaft & Praxis 58, 4 (2007).  Self-archived June 14, 2007.  In German but with this English-language abstract:

Claims for Open Access are mostly underpinned by science-related (Open Access accelerates scientific communication), financial (Open Access alleviates the serials crisis), social (Open Access reduces the Digital Divide), democracy-related (Open Access facilitates participation) and socio-political (Open Access levels disparities) arguments. Using sociological concepts and notions, this contribution analyses some of the presumptions mentioned. Naiveties such as the assumption that access to information and knowledge would be sufficient to even out disparities are not considered, as they are widely disproved by findings from the Sociology of Education and Social Psychology. This contribution focuses strongly on Pierre Bourdieu's theory of (scientific) capital and its implications for the acceptance of Open Access, and on Michel Foucault's discourse analysis and the implications of Open Access for the Digital Divide concept. Bourdieu's theory of capital implies that the acceptance of Open Access depends on the logic of power and the accumulation of scientific capital. It does not depend on slogans derived from hagiographic self-perceptions of science (e.g. the acceleration of scientific communication). According to Bourdieu's theory, it is crucial for Open Access (and associated concepts like alternative impact metrics) how scientists perceive its potential influence on existing processes of capital accumulation. Considering the Digital Divide concept, Foucault's discourse analysis suggests that Open Access may intensify disparities, scientocentrisms and ethnocentrisms.

OA archiving at six Quebec universities

Karen Herland, Librarian poster forum opportunity to share research, Concordia Journal, June 14, 2007.  Excerpt:

...Kumiko Vézina presented her research on professors’ attitudes about open access publishing and archiving [at Concordia University's 5th Annual Poster Forum, June 1, 2007]. She sent a survey to researchers in the six major Quebec universities asking how interested they were in making their research papers completely accessible in online repositories and journals.

She revealed that more than a quarter of the researchers had already published in open access journals, but very few were consistently self-archiving their work in repositories. Their hesitation usually hinged on misunderstanding about the legality and impact of such repositories and not on disagreement with the underlying principles.

Vézina concluded that librarians had an important role to play in countering some of these apprehensions. She also urged the librarians present to ensure that internal repositories were not only available, but actively promoted within their institutions.

Concordia is going ahead with plans to establish an internal repository of our own. Equipment has been purchased and the project will be introduced over the next academic year....

EC committee discussion of OA policy

Back on April 18, I blogged an EC press release on a meeting of the EC's High Level Expert Group on Digital Libraries.  The meeting was supposed to discuss (among other topics) "how to ensure more open access to scientific research...."  But I never saw anything about the meeting's outcome.

Here, finally, are the summary minutes from that meeting.  (Thanks to Gary Price.)  Excerpt:

Important steps have been taken in the scientific information area. The recently adopted Communication on "Scientific Information in the Digital Age: Access, Dissemination and Preservation" examines how new digital technologies can be better used to increase access to research publications and data. The Communication was presented at a major European conference on Scientific Publishing in the European Research Area on 15-16 February 2007. The discussion at the conference, in particular the debate around the petition for open access to publicly funded research results, confirmed that views between stakeholders are still far apart on this crucial issue. The Communication will be discussed in the coming months by the European Parliament and the Council, and with funding bodies and other interested parties. The high level group can play a key role in this debate, in order to find well balanced and reasonable solutions....

Mrs Reding stated the Commission’s objective on scientific information: to improve access to scientific information, since faster and wider access to research results enhances their use and impact. The ongoing debate between scientists, funding bodies and scientific publishers on access to scientific information has been intense and often heated. The Commission is acting as facilitator in this discussion, taking due account of the arguments of both sides. The Subgroup on scientific information has produced some first results in terms of agreed principles on access and preservation of scientific information. However, key points of disagreement persist, in particular concerning the option of mandatory deposit in open access repositories after a so-called "embargo period". The key question is the duration of this embargo period, to guarantee access to publicly funded research while allowing a fair remuneration for the work of the publishers. The Commission, as an important funding body for scientific research in Europe, intends to take direct action as far as the 7th Framework Programme is concerned: in specific programmes, there will be experiments with the deposit of articles resulting from Community funded research in an open repository, after an embargo period of a duration to be defined. The Commission encourages bottom-up solutions agreed between stakeholders and will take steps to facilitate such solutions, but in case this is not achieved, a "top-down" intervention may be considered.

Ms Niggemann reported on the discussion within the subgroup, recalling the special features of scientific and scholarly information and presenting the meeting document "Principles on scientific publications agreed by representatives of publishers, the scientific community and libraries". This document highlights areas of agreement and disagreement between the stakeholders. One of the main points of disagreement concerns the option of mandatory deposit in open access repositories after an embargo period. Scientists propose that it should be mandatory to put the "author's-final" or "final-published" version of articles stemming from publicly funded research in open access repositories after a certain period for access and preservation purposes, ideally not later than 6 months after publication. On the other side, publishers claim that free availability of any post peer-review version of an article would undermine its saleability. Another area of disagreement concerns the possible duration of the embargo period, which would vary between the following extreme values: from "3 months after publication" to "70 years after the authors' death (copyright expiry)". Another area where agreement is difficult is the affordability of publications, including the question of a possible decrease of price of articles over time. Ms Niggemann concluded by indicating that, even in the presence of these differences of views, the discussion in the subgroup also had its promising side, as all parties involved showed interest in performing practical tests with the different options.

Publishers (Ms Dutton, Mr Mabe and Mr Cowhig) confirmed that the main outstanding issue concerned embargo periods. In order to break the current impasse, they tabled a proposal for a large scale pilot study to measure the impact of embargo periods, differentiated by subject areas. The experiment would concern, on a voluntary basis, about 200 journals from different publishing houses and from different disciplines (from hard sciences to humanities). The pilot would be conducted by publishers with the involvement of qualified and independent European researchers in bibliometrics and it would be supported by the European Commission. An independent monitoring and assessment would be conducted concerning the impact on research productivity, on access, on cost effectiveness, on submission of articles to journals, on journal subscriptions. The publishers are also working on a service charter to increase transparency, and they recommended not to destabilise the current system with ill-conceived attempts to change it.

Library representatives (Ms Niggemann and Ms Brindley) welcomed the proposal for an experiment and suggested involving The European Digital Library's infrastructure and national libraries.

The representatives of the scientific community (Prof. Kroo, Dr Romary, Prof. Noorda) stated that the expectations of the scientific community towards open access are high, and these have been clearly expressed by several organisations such as the European Union Research Advisory Board (EURAB), the European Research Council (ERC), and the European Science Foundation (ESF). The subgroup made some steps forward, but many unresolved questions are still to be tackled, including possibly more transparent business models and cost issues (prices of publications). The increasing speed of obsolescence of knowledge calls for its quicker and wider availability. If this is not achieved, Europe will inevitably lag behind. The proposal for a pilot study tabled in the meeting by the publishers was seen as a positive step. However, it needs to be further considered in the light of the expectations of the scientific community on access to articles resulting from publicly funded research (as for example expressed in the ERC and EURAB statements). In any case, researchers and research institutes should be closely involved in the definition and implementation of the experiment in order to achieve a balanced design....

[T]he next plenary meeting [of the Expert Group is] envisaged for October/November 2007....

PS:  For more information on members of the Expert Group named in the minutes, see the group's membership page.

SPARC partners with OA repository for agricultural research

AgEcon Search is an OA repository for agriculture and applied economics at the University of Minnesota, and now it's also a SPARC Partner.  From SPARC's announcement:

...This SPARC Scientific Communities partnership recognizes how the creators of AgEcon Search have developed a model subject-specific repository that is innovative, collaborative, and successful as a focal resource for studies in the field.

The AgEcon Search collection includes current and archival working papers, journal articles, and conference papers that focus on agricultural economics and sub-disciplines such as agribusiness, food supply, natural resource economics, environmental economics, policy issues, agricultural trade, and economic development. The project is a collaboration of the University of Minnesota Libraries, the university Department of Applied Economics, and the American Agricultural Economics Association. Special projects have been funded by grants from the Farm Foundation, the USDA Economics Research Service, the American Agricultural Economics Association Foundation, and the National Agricultural Library.

Although launched 10 years ago as a repository for current working papers, AgEcon Search now includes 13 journals - and that number will grow in the coming year. Journal participants are diverse. Some are e-only journals that have their own Web sites but are part of AgEcon Search because it enhances their visibility and use. Others are print journals for which AgEcon Search serves as the only electronic distribution channel. A few have undertaken digitization projects and have contributed material back to the 1950s. AgEcon Search will serve as the permanent archive for this literature and encourages authors and organizations to use the electronic library as the storehouse for additional appropriate scholarly electronic works....

Ten years since its launch, AgEcon Search has become an important tool for academe and industry. The collection contains over 24,000 papers from 140 institutions and professional associations, and over 1.25 million downloads have been recorded since 2001....

Thursday, June 14, 2007

Can the UN discuss facilitating, or only blocking, access to knowledge?

James Love, U.S. Government Opposition to Term "Access to Knowledge" in Key WIPO Negotiation, The Huffington Post, June 13, 2007.  Excerpt:

I'm in Geneva at a meeting on the World Intellectual Property Organization (WIPO), in a negotiation on something called the WIPO "Development Agenda." As Thiru Balasubramaniam has written in this blog entry, the U.S. government, as well as other members of a rich country negotiating bloc called "Group B," have opposed the use of the term "access to knowledge," in the context of topics that should be discussed by the UN agency responsible for setting global norms on intellectual property policy. Other Group B countries also have taken this position.

Technically, we are discussing the draft text on "issues related to norm-setting, flexibilities, public policy and the public domain," where the controversial paragraph 3 now has the following brackets:

3. To discuss possible new initiatives and strengthen existing mechanisms within WIPO to facilitate [access to knowledge] and technology for developing countries and LDCs and to foster creativity and innovation within WIPO's mandate.

Canada said it "didn't understand" what "access to knowledge" meant. The UK indicated that there was a sentiment by many countries that while WIPO could discuss measures that would make access to knowledge hard, such as tough new digital copyright laws, it shouldn't discuss proposals, like a treaty to provide minimum access to works by libraries, teachers and the blind, which would expand access.

Here "access to knowledge" is referred to by many simply as A2K, a term that is apparently terrorizing the many lobbyists for publishers. I'm hoping the U.S. will come around, and agree that yes, the U.N. can actually "discuss possible new initiatives" to facilitate "access to knowledge." It is rather amazing that this is even controversial.

OA to Anthropologica backfile

The Canadian Anthropology Society has provided OA to the back issues (2002-2005) of its journal, Anthropologica.  (Thanks to

Notes on the Ghana OA and IR workshop

Yao Mereku has blogged some notes on the Open Access and Institutional repository sensitization workshop in Ghana (Accra, June 12-13, 2007).  Excerpt:

...Open Access activists from all over the world were there to share their work and achievements, and also to inform us about what is going on in other countries....

In all I have learned that the way forward for our literature archiving bodies is digitization....

Wednesday, June 13, 2007

Questionnaire for repository managers

Poornima Narayana is the Deputy Head of the Information Centre at India's National Aerospace Laboratories and is writing a doctoral dissertation on institutional repositories in India and policies to fill them.  To collect data, she has created an online questionnaire for repository managers.  If you can speak for an IR, in any country, I hope you can find the time to fill out her questionnaire.

Beyond declarations to policies

Stevan Harnad, The Gap between OA Precept and OA Practice, Open Access Archivangelism, June 13, 2007.

Nowhere is the gap between precept and practice more apparent than in the disparity between the number of institutions that have signed the vague and pious Berlin Declaration in support of the abstract principle of Open Access, and the number of institutions that have actually registered a concrete Open Access policy in ROARMAP.

But that gap is closing...

DINI presentations for OA repository managers

Presentations from the DINI meeting Technische und rechtliche Rahmenbedingungen für die Betreiber von Open-Access Repositories (Technical and Legal Frameworks for Operators of Open Access Repositories; Berlin, June 5-6, 2007) are now online.

"Scholarly publishers should start expert Web 2.0 projects"

Larry Sanger, What Strong Collaboration Means for Scholarly Publishing, Citizendium Blog, June 12, 2007.  Excerpt:

I gave the keynote last Thursday at the Annual Meeting of the Society for Scholarly Publishing, “Imagining the Future: Scholarly Communication 2.0,” in San Francisco.  The speech was called “What Strong Collaboration Means for Scholarly Publishing.” ...Here are the opening paragraphs:

When I was asked to speak to you, the Society for Scholarly Publishing, I have to admit that I found this puzzling, because I don’t know anything about scholarly publishing. Why should someone who knows so little about scholarly publishing be asked to give a speech to the Society for Scholarly Publishing? That’s a paradox.

I found a similar paradox in an article by John Thompson in the Chronicle of Higher Education from 2005. Thompson wrote: “academic publishers can survive today only if they become something other than academic publishers” (June 17, 2005).

The quote actually explains why I’m here. I’m here because I can tell you about a way to become something other than academic publishers. I suppose this is a little absurd, but as a philosopher, I am trained to take joy in life’s little absurdities.

So I’m going to try to make the case that scholarly publishers should start expert Web 2.0 projects. Here’s my plan for the talk.

  •  I’m going to begin by painting a picture, a vision of what information online could look like in ten or twenty years. In short, I’m going to build a castle in the air. But then I will try to put a foundation underneath it.
  • I’ll go over a number of examples of free encyclopedia projects from which we can learn.
  • Then I’ll draw out some general principles.
  • I’ll consider various business models for projects started by scholarly publishers.
  • Finally, I’ll give you some ideas for projects you might start.

More here.

Nature launches another OA resource

Nature has launched Scintilla.  From its about page:

Scintilla collects data from hundreds of news outlets, scientific blogs, journals and databases and then makes it easy for you to organise, share and discover exactly the type of information that you're interested in.

For example, you can keep track of life science podcasts, or the latest papers on schizophrenia, DNA methylation or immunology. Interested in physics blogs? Scintilla can help.

You can rate items and recommend them to any colleagues who've also signed up to the site. You can also create or join groups centered around particular areas of interest (like bioinformatics or open science).

A short tutorial is available if you'd like a little help with getting started....

From the page for content producers:

Scintilla is a new information filtering and personalisation site for scientists. To provide readers with useful, interesting and stimulating information we're including freely-accessible content from as many of the best science bloggers and news sources as possible.

To do this we're aggregating content from RSS/Atom feeds. We would like to assure you that items shown on Scintilla will always be credited to the original site, there'll always be a link back to the item on the original site and if you ever want us to remove your content from our servers then we'll be happy to oblige, though we hope you'll be pleased by the increased readership....

In case anyone's worried about losing advertising revenue, there's a solution to that too (more details will be available soon). In your user profile on Scintilla, you can claim a blog as your own and enter a Google AdSense ID. From then on, whenever an item from your site is displayed on Scintilla, AdSense ads displayed alongside the item will use your AdSense ID, so click-throughs will be credited to your account, just as if they were displayed on your own site....
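The feed aggregation with per-item source crediting described above can be sketched roughly as follows. This is a hypothetical illustration using only Python's standard library, not Nature's actual code; the feed contents, site names, and function names are invented for the example:

```python
# A minimal sketch of an RSS aggregator that, like Scintilla, tags every
# item with its originating site so each item can link back to its source.
import xml.etree.ElementTree as ET

def parse_rss(xml_text, source_name):
    """Extract items from a simple RSS 2.0 string, tagging each with its source."""
    root = ET.fromstring(xml_text)
    items = []
    for item in root.iter("item"):
        items.append({
            "title": item.findtext("title", default="(untitled)"),
            # Keep the original link so every aggregated item credits
            # and points back to the originating site.
            "link": item.findtext("link", default=""),
            "source": source_name,
        })
    return items

def aggregate(feeds):
    """Merge (xml_text, source_name) pairs into one credited item list."""
    merged = []
    for xml_text, source in feeds:
        merged.extend(parse_rss(xml_text, source))
    return merged

# Hypothetical feeds standing in for the weblogs an aggregator would poll.
FEED_A = """<rss version="2.0"><channel><title>Physics Blog</title>
<item><title>New arXiv preprint</title><link>http://example.org/p1</link></item>
</channel></rss>"""

FEED_B = """<rss version="2.0"><channel><title>Bio News</title>
<item><title>DNA methylation review</title><link>http://example.org/b1</link></item>
</channel></rss>"""

if __name__ == "__main__":
    for entry in aggregate([(FEED_A, "Physics Blog"), (FEED_B, "Bio News")]):
        print(f"{entry['title']} [{entry['source']}] ({entry['link']})")
```

A production aggregator would of course fetch feeds over HTTP on a schedule, handle Atom as well as RSS, and deduplicate items, but the crediting idea is the same: the source name and original link travel with each item.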

From Alf Eaton's post about it on Nature's Nascent blog:

Scintilla is an aggregator (of science weblogs, news stories and publication databases), but it works in a slightly different way from the existing online RSS readers that cover the whole internet. For a start, the sources are manually selected, and only related to science, so there shouldn't be any trouble with spam when searching for stories. Also there's no 'unread items' count, so you don't have to feel like you have lots of reading to catch up on. Browse the site, add sources to your collection, and visit your 'Read' page on Scintilla whenever you're looking for some juicy science stories to read.

The other important feature of Scintilla is ratings and recommendations....These ratings will be analysed alongside everyone else's and used to recommend stories that you might like....

You can also manually recommend stories to other people: either to individual members of your social network, or to groups that you've joined....[I]f your speciality isn't covered then feel free to start up a group and invite your colleagues to join....

A quick note: we sent emails about Scintilla to a selection of bloggers, but it's hard to find contact details for 600 or so weblogs. For everyone we didn't manage to contact directly, please join the mailing list/discussion group and/or send feedback to

U of Victoria facilitates electronic submission and OA for ETDs

Canada's University of Victoria has made it easier for grad students to make their theses and dissertations OA.  From yesterday's announcement:

University of Victoria graduate students can now submit their theses and dissertations electronically and make them immediately available on the Web. This kind of open access gives UVic graduate student research global exposure.

Approximately 350 to 400 theses and dissertations will be posted in PDF format annually in UVic’s digital archive, UVicDSpace. As UVicDSpace grows, we plan to include faculty publications and research as well.

The benefits include:
  • easy submission process for graduate students
  • instant availability online through UVicDSpace
  • instant availability online through Google Scholar, Google and the CARL Harvester
  • instant availability online coming soon through Library and Archives Canada Theses Portal
  • microfiche format as a preservation copy
  • no need to print multiple copies for the Library
  • no need for physical storage space
  • wider sharing of research; thesis more frequently cited

For more information see the ETD Website...or Faculty of Graduate Studies website.... 

Comment.  Good move.  This process should be as easy as possible.  Next steps:  (1) don't require PDFs, (2) when students choose OA, encourage them to use CC licenses or the equivalent, and ultimately (3) mandate OA for ETDs even if the university also accepts or requires a printout.

Peter Cochrane calls for OA

Peter Cochrane, Are we destined to repeat history?, June 13, 2007.  Cochrane is the former CTO and Head of Research at BT.  Excerpt:

Throughout my professional career I have relied on professional journals and publications as my primary sources of refereed information for the reporting of scientific and engineering results. But this old world of open corroboration and reference seems to be retreating real fast....

I have to confess I...tend to increasingly look upon most professional journals as the irrelevant preserve of the academic and technologically introverted communities.

How did this happen? The professional institutions have been the laggards in a world where they preached change. They have stuck with and protected the old paper publications and refereed system. They have also promoted closed and exclusive websites. But the world has moved on with everything biased toward the open and fast moving.

The only way these professional groups can reclaim the high ground is to digitise their extensive libraries from the past, provide open access to everyone and migrate to a faster publishing and review regime. Will they do it? I hope so but I don't see it happening! They are mostly made up of old minds with a vested interest in preserving the past.

So what will happen? I think the bifurcation of the old and new will accelerate. And for sure the old doesn't have much time. The good news is that repeating some of the work from the past using the latest technology and techniques might just reveal things that were missed or misunderstood the first time around.

Training for HINARI and AGORA

Siân Harris, Training increases HINARI and AGORA benefits, Research Information, June/July 2007.  Excerpt:

...In 2002, medical staff and researchers in the world’s poorest countries began to have access to the latest cutting-edge research thanks to the World Health Organization’s (WHO’s) HINARI (Health InterNetwork Access to Research Initiative) programme. Through this, around 100 biomedical publishers – from the giants with hundreds of journals to small society publishers with one or two titles – opened up electronic access to their products to those who cannot afford to subscribe to any journals. This resource currently gives access to around 3,750 biomedical journals, which would be equivalent to around £1.5 m per year in subscription costs. This gives trainee doctors in Sierra Leone, for example, access to the same range of resources as their counterparts in Oxford University or Harvard. Such resources have the potential to transform clinical practice, medical training and research in such countries, and 2,500 institutions in the developing world have so far signed up to access these resources.

However, there is a challenge: until recently many institutions in the developing world have been relying on 20- or 30-year-old books for their research and teaching and have little or no experience of using the internet. This is where [Lenny Rhine, retired librarian from the University of Florida] comes in. He runs training courses to teach medical librarians and health workers how to use the massive array of resources that have now become available to them. ‘They have gone from nothing to almost too much, so we spend a lot of time teaching search skills,’ he commented....

OA publishing in physics

Siân Harris, Is physics the new biomedicine? Research Information, June/July 2007.  Excerpt:

...[BioMed Central] has begun to move beyond the boundaries of biomedical sciences. Chemistry Central was launched last year and now gives access to seven open-access titles. This has been followed by PhysMath Central.

This isn't the first time that the publishing community has applied the open-access model to physics or maths. The OSA has published Optics Express under this model for 10 years and the open-access New Journal of Physics has been jointly published by the Institute of Physics Publishing and Deutsche Physikalische Gesellschaft, the German physical society, since 1998. Meanwhile physics, maths and engineering take centre stage in the portfolio of the already-profitable Egyptian open-access publisher Hindawi Publishing. And, like biomedical researchers, physicists have had their own open-access archive, arXiv, for many years.

Despite these significant moves, however, there is still caution about the role of open access in physics, according to Chris Leonard, publisher of the new PhysMath Central website. He described many editors of physics journals as 'quite lukewarm' on the topic of open access.

So why has BioMed Central been attracted to physics and maths now? According to Leonard, several factors combined to make this a good time. Having been going for seven years and being near to profitability, the open-access publisher wanted to broaden its scope. 'We found out that there was quite an appetite for similar services in other disciplines,' he explained. 'It was partly a push from our side and partly a pull from the scientists' side. They've seen their colleagues benefiting from open-access exposure to their research.'

This can particularly be seen in another significant factor that has been emerging in tandem with the publisher's own ambitions. CERN, the huge international particle-physics facility in Switzerland,...[has] a new plan: to make all particle-physics research results open access....

For this reason, particle physics seemed like a good starting point for the new PhysMath Central....But this is not the end of the story, according to Leonard. 'We will cover the whole scope of physics and maths,' he promised. He anticipates that PhysMath Central will launch about seven titles this year....

PhysMath Central is capitalising on researchers' experience with [arXiv] by enabling them to submit papers to the PhysMath Central journals directly from arXiv or to submit papers to both arXiv and the new journals at the same time. 'We hope that this will help cement arXiv as the repository for physics. We are also beginning to speak about ways to avoid duplication of effort. This might include automatic deposit in arXiv of the final, peer-reviewed versions of papers,' he said. Another area of interest for PhysMath Central is the raw data: 'We want to host the raw data accompanying the articles wherever we can.'

BMJ publishing director on OA

Siân Harris, Physicians and researchers have different needs, Research Information, June/July 2007.  An interview with Alex Williamson, publishing director at the BMJ Group.  Excerpt:

Do clinicians want open access?

There is a tendency to generalise with open access. We tend to be lumped into the category of biomedical research but doctors don’t treat rats. Biomedical research tends to be too far removed from the day-to-day activities in a hospital to be of practical interest to many practicing clinicians.

What’s more, clinical journals aren’t quite like pure science research publications, where the readers and authors are the same group of people. Journals like ours do appeal to researchers but, in the main, they are read by practicing clinicians, many of whom will never write a journal paper. And, if they do, they are often simply writing up work that they think will interest colleagues, without receiving any funding to do so.

We have a hybrid open-access model, which started in September last year. Uptake has been around two per cent of authors. I would be surprised if we moved to the open-access model. I think we’ll stay with the hybrid model. All our material is free after 12 months anyway....

IRs and permissions

Stevan Harnad, Get the Institutional Repository Managers Out of the Decision Loop, Open Access Archivangelism, June 12, 2007. 

Summary:  Many Institutional Repositories (IRs) are not run by researchers but by "permissions professionals," accustomed to being mired in institutional author IP protection issues and institutional library 3rd-party usage rights rather than institutional author research give-aways. The solution is to adopt a sensible institutional (or departmental) deposit mandate and then to automatize the deposit procedure so as to take Repository Managers out of the decision loop, completely.

The optimal deposit mandate is to require Open Access deposit of the refereed final draft, immediately upon acceptance for publication, but there is a compromise for the faint-hearted, and that is the Immediate-Deposit/Optional-Access (ID/OA) Mandate:

The only thing standing between us and 100% OA today is keystrokes. It is in order to get those keystrokes done, at long last, that we need OA mandates, and ID/OA is a viable interim compromise: It gets all N keystrokes done for 62% of current research, and N-1 of the keystrokes done for the remaining 38%. For that 38%, the "Fair Use Button" will take care of all immediate researcher usage needs for the time being. The robots will have their day once 100% deposit mandates prevail and the research community tastes what it is like to have 62% OA and 38% almost-OA world, at long last. For then those Nth keys will inevitably get stroked, setting everything to Open Access, as it should (and could) have been all along....


  • Funding agencies can bypass permission problems because they are upstream from publishers.  Researchers sign their funding contracts before they sign their copyright transfer agreements with publishers.  Hence, if funders are firm (the Wellcome Trust is the model here), they can mandate OA and publishers will have to choose between accommodating their policies and refusing to publish work by funded researchers.  But institutional repositories are downstream from publishers and cannot bypass permission problems in the same way.  I don't care much who runs an IR.  But we shouldn't assume that a university-level OA mandate will make permission problems disappear.  (I'm not saying that Stevan makes this assumption.) 
  • Most journals already permit postprint archiving, so no permission-seeking is necessary.  But in the minority of cases for which it is necessary, it's a great boon to have some permission-seekers on staff to assist authors.
  • When the relevant permission problem is an embargo, Stevan's remedy is exactly right.  Authors should deposit their work immediately upon publication, but only flip the access switch from closed to open when the embargo expires.  During the period of closed access, the repository can provide OA to the metadata and the author can email copies of the text to readers who request them.

Nottingham creates an OA publishing fund

Stephen Pinfield announced at the June 6 ARMA conference that the University of Nottingham had set up an OA publishing fund to help cover publication fees charged by fee-based OA journals.  See his slide presentation, Setting up central funds and processes for open-access publishing and dissemination.

From Natasha Robshaw's summary of Pinfield's presentation:

The final presentation was given by Stephen Pinfield, Chief Information Officer, University of Nottingham, who discussed how libraries and research administration can and should work together to set up central funds and processes for open access publishing.  The presentation started with an overview of the funders' policies and the routes to open access via open access journals and repositories.  It is hoped that, soon, all UK institutions will have repositories in place, but as an interim measure JISC has launched the Depot, a national repository to which any UK author can submit their research articles.  Stephen went on to discuss the need for institutions to put in place arrangements to manage Wellcome funding and similar allocations from other funders to cover OA charges. He noted that clear policies, publicity and support need to be associated with these funds.  Stephen announced that Nottingham University has set up a central fund for any author to apply for when publishing in open access journals.  This fund covers all Nottingham University authors, no matter who they are funded by, and in its first year is set at £20,000 from FEC income. The fund  is expected to grow significantly in future years.  Their library’s periodicals budget is currently £2.7 million so, as open access publishing grows to account for a larger fraction of all publications, it is likely that the money from this subscription budget will also be transferred to supplement the open access publishing fund.


  • Universities need to join funding agencies in helping to pay publication fees at fee-based OA journals.  Kudos to Nottingham for its willingness to do so (as I once put it) "today as an investment in a superior scholarly communication system, tomorrow from the savings on canceled subscriptions." 
  • If there are other universities with similar funds, I'd like to hear about them.
  • Of course, any university willing to pay these fees should also be willing to adopt a strong policy encouraging or requiring OA archiving for the research output of the institution.  The two strategies are compatible and complementary.

ARMA presentations on OA publishing

The presentations from the ARMA conference, Open Access Publishing: Funding Mechanisms and Institutional Collaboration (Cardiff, June 6, 2007) are now online via the BioMed Central blog.  BMC's Natasha Robshaw summarizes each presentation and links to the PPT slides.

Tuesday, June 12, 2007

New study on data access and dissemination

The Research Information Network (RIN) has announced a new study, Publication and quality assurance of research data outputs.  From the description:

The RIN, supported by JISC and the Natural Environment Research Council (NERC), has commissioned a project to investigate, with reference to selected areas of the scholarly spectrum, (i) the nature and range of arrangements for making research data as widely available as possible (‘data publishing’) and (ii) the role that data outputs currently play alongside or as an alternative to conventional publications in the research communication process....

The study, initiated in June 2007, is being undertaken on behalf of the joint funders by the Key Perspectives consultancy....

[T]he objectives are to add to the current state of knowledge by gathering evidence on: ...

Access and use

  • Access arrangements for the data, taking account of any legal, ethical and other constraints; associated with this, IPR related to the collection and presentation of data.
  • The needs of the communities using data, especially in the context of rapid technological change and the methodologies stemming from e-research and the development of the semantic web.
  • To meet those needs, the different ways and environments in which the data are disseminated and published (important note: ‘publishing’ here is defined in the broadest possible sense, to cover all the mechanisms and channels used to make data accessible to users).
  • The extent to which data outputs complement or replace the reporting of research results in scholarly journals or books; relationships and linkages between data and such information outputs....

The project will serve as a basis for subsequent work that will seek to assess the evidence gathered through the present study and present conclusions and recommendations regarding the evolution of the data publication environment. This second project will be commissioned as soon as possible following the completion of the present study....

The project is due to be completed at the end of 2007.

Public-domain ransom supermarket from Public Resource

Last month, if you remember, a new non-profit called Public Resource found that the Smithsonian Institution was selling public-domain photographs, bought copies, and posted them to Flickr.

Now it's generalizing the practice.  (Thanks to Carl Malamud via Cory Doctorow via Rob Styles.)

Whenever a US government web site sells public domain content, Public Resource asks citizens to buy one copy and donate it to Public Resource for OA distribution.  To make this work, it has set up an online store stocked with content from 54 federal agencies.  From the store's about page:

When you buy content, we get the material from the U.S. government and then upload the data to places like the Internet Archive, Google Video, and other fine content sources. Because this data is public domain, anybody can use the material without restriction!

The Public Resource store is based on the US National Technical Information Service (NTIS), which sells government information to the public.  For example, see the Gov.Research Center, the NTIS division devoted to science and technology research.

More on the launch of Open Medicine

Dean Giustini, Doing Medical Journals Differently: Open Medicine, UBC Academic Search - Google Scholar Blog, June 11, 2007.  Excerpt:

Here, some of my Open Medicine colleagues discuss the circumstances that led to the launch of the Journal, and its existence as an example of open access's contribution to academic freedom.

Willinsky J., Murray S., Kendall C., Palepu A. Doing Medical Journals Differently: Open Medicine, Open Access, and Academic Freedom. - Working paper.

Notable, quotable quotes:

1. "Open Medicine was born of an editorial interference incident in the field of medical publishing, a field which is distinguished by its own professional and commercial influences." (You can say that again. DG)

2. "For all of the attention spent on finding the perfect economic model for increasing access to knowledge, it is important not to lose sight of scholarly communication’s other basic principles, beyond dissemination, namely editorial independence, intellectual integrity, and academic freedom." (What models will survive?) ...

4. ..."opening science to a larger world has always been a motivating force in scholarly publishing, [but] this openness is not just a matter of journals. Today, it includes initiatives focused on open data, open source biology, open encyclopedias, and a number of different “open science” projects." (Don't forget open search.)

5. "Open Medicine has raised the stakes for open access by demonstrating how this new approach can be used today to reassert editorial independence, intellectual integrity, and academic freedom." (And raised the stakes for librarians, I say.)

Four new OA journals coming from Libertas Academica

Libertas Academica has added four new titles to its list of OA journals:

The first journal has an editor (Erich Bornberg-Bauer), but the other three are still seeking editors.  All four are still recruiting members for their editorial boards.

Update (June 20, 2007). Add a fifth title to this list: Clinical Medicine: Arthritis and Musculoskeletal Disorders. The editor-in-chief will be Tariq Haqqi, but the journal is still recruiting the editorial board.

EBSCO proud to boost visibility of OA journals

KnowledgeSpeak has published an interview with Tim Collins, President of EBSCO.  Excerpt, quoting Collins:

...Another challenge we successfully tackled [in 2006] was to expand access to open access journals. EBSCO's databases are academia's most used for-fee online research tools, so it is our responsibility to help give exposure to open access journals. Our products now include many of these sources, offering them more exposure than was ever before possible. It was a time consuming project, but now we offer indexing for the best of these journals in our subject-specific research databases and we are set up to add good journals as they become available in the future....

Milestone for BMC Bioinformatics

The BioMed Central blog reports a milestone for BMC Bioinformatics:

BMC Bioinformatics reached an impressive milestone in May 2007, receiving 105 manuscript submissions - a record number for the journal. This is the first time that monthly submissions for a single BioMed Central journal have reached triple digits, and demonstrates the continuing success and growth of BioMed Central's journals.

We are especially pleased to see BMC Bioinformatics maintaining and improving its position, as bioinformatics is now a very competitive field of publishing, and authors have many open access journals to choose from....

To see all the latest research from BioMed Central in bioinformatics and related fields, check out our Bioinformatics and Genomics Gateway.

PS:  Rising submissions help a journal improve its quality and impact, which in turn help it improve its submissions even further in a virtuous circle.  Congratulations to the whole Bioinformatics team. 

DRIVER wiki for sharing info on European repositories

DRIVER has launched a DRIVER wiki.  From the site:

The DRIVER Wiki is a key avenue for the sharing of information, current events and developments across Europe. The Wiki will not be an authoritative source of information but instead will represent latest developments across Europe and provide an avenue for local events and developments to be made visible to the larger European community. For authoritative information on the state of development of institutional repositories in your country please see the Countries page.

Mentor is a service for developers and managers of institutional repositories across Europe. The purpose of this service is to introduce developers and managers of institutional repositories to their peers on a one-to-one basis in the belief that the sharing of experiences can assist those that follow.

APLA converts its journal to OA

Jennifer Richards reports that Canada's Atlantic Provinces Library Association has decided to convert the APLA Bulletin to OA.  From her announcement:

I am pleased to report that the Atlantic Provinces Library Association passed the following motions at their Ordinary General Meeting during the CLA/APLA/NLLA National Conference held in St. John’s NL May 23-28, 2007:

Motion: It is moved that APLA support open access publishing, and begin with Volume 71 of the APLA bulletin, to be open access.

Motion: It is moved that APLA adopt a creative commons copyright statement with the first online issue, to replace the current copyright statement.

Both were passed unanimously.

I believe this makes APLA the second library association in Canada to fully support open access publishing.

PS:  Kudos to APLA.

A proposal for redirecting subscription funds to support OA journals

Heather Morrison, A potential positive cycle: more access, more funds, Imaginary Journal of Poetic Economics, June 11, 2007.

Abstract:   Hypothesis: a process of transitioning to open access can unleash funds, creating a positive cycle of increasing access and freed funds to create more open access; the very opposite of the negative serials pricing spiral of recent decades, which featured increasing prices and decreasing access.

As support for this hypothesis, this post looks at the potential for open access if libraries were to focus on high-priced journals (US $1,000 or more for an institutional subscription), and succeed in working with their faculty to convert just 10% to a volunteer / in-kind support model.

It is estimated with such a scenario, that individual libraries could save up to $450,000 US from their budgets after spending on open access journal support is factored in. The cumulative savings for libraries are potentially huge; for example, if the ARL libraries subscribed to just a quarter of these journals each, the annual savings for ARL would be in the order of $13.8 million annually. This would only be a fraction of the savings for libraries, as ARL is only a subset of libraries, albeit large ones. The true collective savings for libraries would have to factor in libraries around the globe, including libraries in Europe and the somewhat smaller libraries in North America. If these savings were invested in further open access initiatives, libraries would save even more, freeing up more funds to create more access....
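For a sense of how the headline figure can arise, here is a rough back-of-the-envelope sketch of the arithmetic in Morrison's abstract. The $450,000 per-library maximum and the "quarter of these journals each" fraction come from the post; the ARL membership count of 123 is an assumption added here for illustration, not a figure from the excerpt.

```python
# Rough reconstruction of the ARL savings estimate quoted above.
# max_saving_per_library and fraction_subscribed come from the post;
# arl_members is an assumed membership count for illustration.

max_saving_per_library = 450_000  # USD, if a library subscribed to all converted journals
fraction_subscribed = 0.25        # "just a quarter of these journals each"
arl_members = 123                 # assumed ARL membership count

per_library = max_saving_per_library * fraction_subscribed
arl_total = per_library * arl_members
print(f"${arl_total / 1e6:.1f} million per year")  # $13.8 million per year
```

On these assumptions the total lands at about $13.8 million annually, matching the figure in the abstract; as Morrison notes, ARL is only a subset of the libraries that would share in such savings.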

Milestone for RoMEO

SHERPA issued this announcement earlier today:

SHERPA is delighted to announce that its service RoMEO has doubled its entries in the past year and now lists 300 publisher policies on self-archiving.

Why is RoMEO important?

If an academic author wants to put their research article on-line, they are faced with an increasingly complex situation....[S]ome publishers prohibit authors using their own articles in this way. Others allow it, but only under certain conditions, while others are quite happy for authors to show their work in this way....

The RoMEO service...offers users the ability to:

  • View summaries of publishers' copyright policies in relation to self-archiving
  • View if publisher policies comply with funding regulations, as some publishers are too restrictive and cannot be used to publish funded research
  • To search journal and publisher information by Journal Title, Publisher Name and ISSN ...

PS:  Thanks and congratulations to SHERPA.  RoMEO is indispensable for the OA movement; and as it covers more publisher policies, it becomes even more indispensable.

More on removing permission barriers

Stevan Harnad, Open Access: What Comes With the Territory, Open Access Archivangelism, June 12, 2007.

Summary: Downloading, printing, saving and data-crunching come with the territory if you make your paper freely accessible online (Open Access). You may not, however, create derivative works out of the words of that text. It is the author's own writing, not an audio for remix. And that is as it should be. Its contents (meaning) are yours to data-mine and reuse, with attribution. The words themselves, however, are the author's (apart from attributed fair-use quotes). The frequent misunderstanding that what comes with the OA territory is somehow not enough seems to be based on conflating (1) the text of research articles with (2a) the raw research data on which the text is based, or with (2b) software, or with (2c) multimedia -- all the wrong stuff and irrelevant to OA.


  • Stevan is responding to Peter Murray-Rust's blog post from June 10.  But since I agreed with most of what Peter MR wrote, I'll jump in.
  • Stevan isn't saying that OA doesn't or shouldn't remove permission barriers.  He's saying that removing price barriers (making work accessible online free of charge) already does most or all of the work of removing permission barriers and therefore that no extra steps are needed.
  • The chief problem with this view is the law.  If a work is online without a special license or permission statement, then either it stands or appears to stand under an all-rights-reserved copyright.  The only assured rights for users are those collected under fair use or fair dealing.  These rights are far fewer and less adequate than OA contemplates, and in any case the boundaries of fair use and fair dealing are vague and contestable.
  • This legal problem leads to a practical problem:  conscientious users will feel obliged to err on the side of asking permission and sometimes even paying permission fees (hurdles that OA is designed to remove) or to err on the side of non-use (further damaging research and scholarship).  Either that, or conscientious users will feel pressure to become less conscientious.  This may be happening, but it cannot be a strategy for a movement which claims that its central practices are lawful.
  • This doesn't mean that articles in OA repositories without special licenses or permission statements may not be read or used.  It means that users have access free of charge (a significant breakthrough) but are limited to fair use. 

June issue of RRT Newsletter

More on the bill to strengthen the NIH policy

Lila Guterman, Open-Access Policy Would Be Strengthened in House Panel's NIH Spending Bill, Chronicle of Higher Education blog, June 12, 2007.

The House Appropriations Committee will decide this week whether to strengthen the open-access publications policy of the National Institutes of Health. The shift, which appears in a 2008 spending bill for the Department of Health and Human Services, would make mandatory a policy that until now has been voluntary, according to Heather Joseph, executive director of the Scholarly Publishing and Academic Resources Coalition, which advocates for open access. The bill was approved last week by a House Appropriations subcommittee.

The two-year-old policy requests but does not require that NIH grantees upload their research papers to an online repository no later than 12 months after publication. But not many scientists have made use of the repository. “Mandatory is a crucial element just because compliance is so low without it,” said Ms. Joseph.

PS:  There's still time for US citizens to ask their representatives to support this bill.  See the details from the ACRL and the frequent updates from the ATA.  The NIH is the world's largest funder of medical research, and its $28 billion budget results in about 65,000 peer-reviewed articles per year.  Strengthening its public access policy from a request to a requirement is a critical goal for OA and medical research. 

Monday, June 11, 2007

AGU launches hybrid OA program for most of its journals

In April, the American Geophysical Union launched a hybrid OA program for most of its 19 journals.  Here's the policy in its entirety:

AGU journals now offer authors the opportunity to make their articles open for others to read for free. Authors choosing this option pay a fee based on article length and number of figures; these charges are designed to offset the potential loss of subscription income. This new option, called Author Choice, provides

  • Unlimited access to the article for all readers from the moment of publication,
  • Permission to deposit the PDF version in institutional repositories so long as the repository accepts AGU copyright permissions, and
  • Continued copyright protection to prevent unauthorized uses of the author’s work.

Author Choice is available to all authors who honor full publications charges. Cost is an additional $142.50 per publication unit for Geophysical Research Letters and $75 per publication unit for all other AGU journals. Authors will be able to use the AGU online calculator to determine their fees. The Author Choice option is not available for Reviews of Geophysics, Earth Interactions, and Nonlinear Geophysics.

Any article published in 2007 or currently in production is eligible for Author Choice. For articles already published, access restrictions will be removed upon payment of the Author Choice fee. If there is sufficient interest, Author Choice may be extended to articles published in 2002–2006.

Comments.

  • It appears that a "publication unit" is a page or figure.
  • What does it mean for a repository to "accept AGU copyright permissions"?  A repository will accept anything that the publisher allows it to accept.
  • When authors pay publication fees, they should retain copyright.  If AGU really wants the copyright in order to protect authors, then it should let authors decide.
  • Note that AGU doesn't promise to reduce subscription prices in proportion to author uptake.
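Reading a "publication unit" as a page or figure, as guessed above, the Author Choice fee schedule could be sketched as follows. The journal names, per-unit rates, and exclusions come from the AGU announcement; the helper function itself is illustrative, not an AGU tool (AGU provides its own online calculator).

```python
# Hypothetical sketch of the AGU Author Choice fee described above,
# assuming a "publication unit" is a page or a figure.

GRL_RATE = 142.50   # USD per publication unit, Geophysical Research Letters
OTHER_RATE = 75.00  # USD per publication unit, other eligible AGU journals
EXCLUDED = {"Reviews of Geophysics", "Earth Interactions", "Nonlinear Geophysics"}

def author_choice_fee(journal: str, pages: int, figures: int) -> float:
    """Estimate the Author Choice fee for an article, in USD."""
    if journal in EXCLUDED:
        raise ValueError(f"{journal} is not eligible for Author Choice")
    units = pages + figures
    rate = GRL_RATE if journal == "Geophysical Research Letters" else OTHER_RATE
    return units * rate

# A 10-page article with 4 figures in Geophysical Research Letters:
print(author_choice_fee("Geophysical Research Letters", 10, 4))  # 1995.0
```

Under this reading, the same 14-unit article in any other eligible AGU journal would cost $1,050, which shows how sharply the GRL premium bites.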

JISC report on Web 2.0 in higher education

Tom Franklin and Mark van Harmelen, Web 2.0 for Content for Learning and Teaching in Higher Education, JISC, May 28, 2007.  Excerpt:

This report is the result of a study into the use of Web 2.0 technologies for content creation for learning and teaching in Higher Education, funded by the JISC, and carried out between March and May 2007....

This study has focussed on the content sharing aspects of Web 2.0....

Recommendation 2: JISC should consider funding projects investigating how institutional repositories can be made more accessible for learning and teaching through the use of Web 2.0 technologies, including tagging, folksonomies and social software....

Recommendation 3: JISC should consider funding work looking at the legal aspects of ownership and IPR, including responsibility for infringements in terms of IPR, with the aim of developing good practice guides to support open creation and re-use of material....

Recommendation 6: JISC should consider funding a study to look at how repositories can be used to provide end-user (i.e. referrer) archiving services for material that is referenced in academic published material, including internet journal papers. Part of this consideration should extend to copyright issues.

Update (8/28/07). JISC has published its response to the recommendations. Excerpt:

[Recommendation 2] JISC endorses this ambition, has included relevant sections in recent calls for proposals, as a result has funded projects in this area (for example, SPIRE, Rich Tags, PROWE), and will continue to do so.

[Recommendation 3] There is a good basis already built for this work, such as proposed, current and recent projects undertaken by JISC Legal and the JISC IPR Consultancy, for example focusing on student-created content. Further work has now been commissioned looking specifically at IPR and copyright issues.

[Recommendation 6] This is an ambitious goal bearing in mind, for example, the technical and legal challenges. JISC has funded some work that may lay the foundations for this goal, such as a project to identify the significant properties of elearning materials (what needs to be preserved for them to remain useful), and has a stake in the UK Web Archiving Consortium. JISC Legal can offer advice on the copyright issues. As the report notes, wikis offer reasonable versioning and roll-back functionality.

More on removing permission barriers

Peter Murray-Rust, “open access” is not good enough, A Scientist and the Web, June 10, 2007.  Excerpt:

I have ranted at regular intervals about the use of “Open Access” or often “open access” as a term implying more than it delivers. My current concern is that although there are tens of thousands of theses described as “open access” I have only discovered 3 (and possibly another 15 today) which actually comply with the BOAI definition of Open Access.

The key point is that unless a thesis (or any publication) explicitly carries a license (or possibly a site meta-license) actually stating that it is BOAI compliant, then I cannot re-use it. I shall use “OpenAccess” to denote BOAI-compliant in this post and “open access” to mean some undefined access which may only allow humans to read but not re-use the information...

By contrast...the term “Open Source” is completely self-explanatory within a large community....

So I believe that “open access” should be recast as “toll-free” - i.e. you do not have to pay for it but there are no other guarantees. We should restrict the use of “Open Access” to documents which explicitly carry licenses compliant with BOAI. (A weaker (and much more fragile approach) is that a site license applies to all content. The problem here is that documents then get decoupled from the site and their OpenAccess position is unknown.)

If the community wishes to continue to use “open access” to describe documents which do not comply with BOAI then I suggest the use of suffixes/qualifiers to clarify. For example:

  • “open access (CC-BY)” - explicitly carries CC-BY license
  • “open access (BOAI)” - author/site wishes to assert BOAI-nature of document(s) without specific license
  • “open access (FUZZY)” - fuzzy licence (or more commonly absence of licence) for document or site without any guarantee of anything other than human visibility at current time. Note that “Green” open access falls into this category. It might even be that we replace the word FUZZY by GREEN, though the first is more descriptive.


  • I agree with much but not all of what Peter MR says.  I'm responding at length because I've often had many of the same thoughts.
  • I'm the principal author of the BOAI definition of OA, and I still support it in full.  Whenever the occasion arises, I emphasize that OA removes both price and permission barriers, not just price barriers.  I also emphasize that the other major public definitions of OA (from Bethesda and Berlin) have similar requirements.
  • I don't agree that the term "open access" on its own, or apart from its public definitions, highlights the removal of price barriers and neglects the removal of permission barriers.  There are many ways to make content more widely accessible, or many digital freedoms, and the term "open access" on its own doesn't favor or disfavor any of them.  Even at the BOAI meeting we realized that the term was not self-explanatory and would need to be accompanied by a clear definition and education campaign. 
  • The same, BTW, is true for terms like "open content", "open source", and "free software".  If "open source" is better understood than "open access", it's because its precise definition has spread further, not because the term by itself is self-explanatory or because "open access" lacks a precise definition.
  • I do agree that many projects which remove price barriers alone, and not permission barriers, now call themselves OA.  I often call them OA myself.  This is only to say that the common use of the term has moved beyond the strict definitions.  But this is not always regrettable.  For most users, removing price barriers alone solves the largest part of the problem with non-OA content, and projects that do so are significant successes worth celebrating.  By going beyond the BBB definition, the common use of the term has marked out a spectrum of free online content, ranging from that which removes no permission barriers (beyond those already removed by fair use) to that which removes all the permission barriers that might interfere with scholarship.   This is useful, for we often want to refer to that whole category, not just to the upper end.  When the context requires precision we can, and should, distinguish OA content from content which is merely free of charge.  But we don't always need this extra precision.
  • In other words:  Yes, most of us are now using the term "OA" in at least two ways, one strict and one loose, and yes, this can be confusing.  But first, this is the case with most technical terms (compare "evolution" and "momentum").  Second, when it's confusing, there are ways to speak more precisely.  Third, it would be at least as confusing to speak with this extra level of precision --distinguishing different ways of removing permission barriers from content that was already free of charge-- in every context.  (I'm not saying that Peter MR thought we should do the latter.)
  • One good way to be precise without introducing terms that might baffle our audience is to use a license.  Each of the CC licenses, for example, is clear in its own right and each removes a different set of permission barriers.  The same is true for the other OA-friendly licenses.  Like Peter MR, I encourage providers to remove permission barriers and to formalize this freedom with a license.  Even if we multiplied our technical terms, it would usually be more effective to point to a license than to a technical term when someone wonders exactly what we mean by OA for a given piece of work.

Update. See Peter MR's response to my response. We agree on every substantive issue here.

The European Library is hiring

The OA European Library is looking for a general manager and seven other staffers.

Tom Cochrane on OA at IATUL

Richard Akerman has also blogged some notes on Tom Cochrane's talk earlier today at the IATUL 2007 meeting Global Access to Science: Scientific Publishing for the Future (Stockholm, June 11-14, 2007).  Excerpt:

see Microsoft 2020 report

Global access to science – meeting the revolution...

QUT has mandated research deposit...

QUT shows top 10 authors / top 10 papers

focus on collecting research

Rise of Mandates for Open Access

Gold or Green

Gold - pay publisher
Green - more controlled by researcher...

OAK Law Project ...

"Those who focus on open access, far from being radical, are not being nearly radical enough"
- Timo Hannay ...

* understand changing scholarly publishing ("keep up")
* readiness to lead in enhancing accessibility
* clarify role of repositories
* repudiate view that problems are too big
* knowledge of changes to science
* lead, or share the lead, in dealing with the data deluge...

Update.  Richard has now blogged on Håkan Carlsson's talk as well, and I expect many more over the next few days.  For the time being, I'll just point you to Richard's blog and urge you to read his posts directly.

Update on CERN's SCOAP3 initiative

Richard Akerman has blogged some notes on Rüdiger Voss' talk earlier today on CERN's SCOAP3 initiative at the IATUL 2007 meeting Global Access to Science: Scientific Publishing for the Future (Stockholm, June 11-14, 2007).  Excerpt:

Open Access Publishing in Particle Physics

convert entire discipline to open access journal publishing

approximately 10,000 scientists worldwide

5016 articles published in 2005 in peer-reviewed journals
83% of all papers in 6 journals
87% of all papers published by 4 different publishers

CERN Convention (1953) is an early OA manifesto

embrace OA movement

* today particle physics is almost entirely green
* without mandates, without debate

peer-reviewed journals remain important as version-of-record archives and as key instruments of merit recognition and career promotion

OA landscape in 2007

* most particle physics journals offer OA options
- hybrid model, authors buy OA to articles
- reluctant take-up by authors

* gold OA to journals is there, but variety of options bewildering

in 2005: 72.6% of journals offered no OA option
in 2007: 86.8% offer an OA option

time is ripe for a full transition to OA

OA issues
* grant universal access to peer-reviewed results of publicly funded research
* in a green environment authors benefit from peer review and journal prestige
* bring subscription costs under control
* raise researcher awareness of economics of scientific publishing
* inject competition into scientific publishing by linking price to quality
* stabilize the diversity and future of journals which have served particle physics well - but leave
room for new players

SCOAP3 model ...

* estimated annual budget: 10 million euros
* contributions on a "fair share" basis by nationality (affiliation) of articles/authors ...

LHC [Large Hadron Collider] is a much bigger project: 40 funding agencies, $550 million

* online journals free to read for anybody
* preserve high-quality peer review process
* generate medium and long-term savings for libraries and funding agencies
* free to read and to publish for developing countries

SCOAP3 Status ...
* potential funding partners to be invited soon to sign Expressions of Interest
* once partners commit to sizeable fraction of budget, invite publishers to tender in autumn
* Goal: have SCOAP3 operational for the first LHC papers

New review of Chanier's 2004 book on open archives

Jean-Claude Bertin reviews Thierry Chanier's book, Archives ouvertes et publication scientifique. Comment mettre en place l'accès libre aux résultats de la recherche? (Paris: L'Harmattan, December 2004) in the May issue of ReCALL (accessible only to subscribers).  (Thanks to Stevan Harnad.)  Excerpt:

...What is at stake here is the dissemination of scientific information among researchers and the new opportunities open to them to make knowledge available to more people, more rapidly and more efficiently, as well as the opening of new horizons for scientific evaluation, especially in the social sciences and humanities. In this book, Thierry Chanier campaigns to ensure that these social upheavals, which are underway, primarily benefit researchers and not only commercial publishers.

The central topic is that of “open archives” ....

The book is structured along seven chapters that gradually build up the case:

  • an overview of the present situation of scientific publication, featuring a market mainly controlled by commercial publishers who derive high profits from an activity based on monopolistic strategies. The main point made is the reluctance of these actors to abandon their privileged situation in favour of better working conditions for researchers (more rapid exchange of knowledge by reduction of costs and publication delays);
  • a study of the specific case of social sciences and the humanities, best described as a fragmented community (numerous small publishers, heavier influence of university press companies, diversity of readership) which makes it difficult to generalise any model;
  • a review of research communities and publication uses: Thierry Chanier describes a model of the publication process, the place of the various actors and shows how ICT influences the model. He advocates a new set of relationships between authors, middlemen (editors and publishers) and readers that better answers the needs of the scientific community, which should take advantage of new technologies to take control of the editorial / publication process;
  • in order to defend his thesis, the author analyses the financial structure of the editorial / publication process. His rigorous identification of the various costs involved, as well as of copyright issues, enables him to discuss the new “paying author” philosophy which he thinks should make open archives workable; ...
  • the last chapter moves on to a more global description of the place of open archives within the field of scientific information creation, storage, indexation and dissemination, as well as the place of the various actors (publishers and university libraries) in the new model....

New OA journal on game studies

The Canadian Game Studies Association (apparently no web site yet) is launching a new peer-reviewed OA journal, Loading....  It's still in the early stages and is looking for an editor, an editorial board, and referees.  (Thanks to Population of One.)

Scholarly authority in a world of scholarly abundance and OA

Michael Jensen, The New Metrics of Scholarly Authority, Chronicle of Higher Education, June 15, 2007 (accessible only to subscribers).  Michael Jensen is the director of strategic Web communications for the US National Academies.  Excerpt:

When the system of scholarly communications was dependent on the physical movement of information goods, we did business in an era of information scarcity. As we become dependent on the digital movement of information goods, we find ourselves entering an era of information abundance. In the process, we are witnessing a radical shift in how we establish authority, significance, and even scholarly validity. That has major implications for, in particular, the humanities and social sciences....

In Web 1.0, roughly 1992 to 2002, authoritative, quality information was still cherished: Content was king. Presumed scarce, it was intrinsically valuable....

Web 2.0, roughly 2002 through today, takes more for granted: It presumes the majority of users will have broadband, with unlimited, always-on access, and few barriers to participation. Indeed, it encourages participation, what O'Reilly calls "harnessing collective intelligence." Its fundamental presumption is one of endless information abundance....

Imagine you're a member of a prehistoric hunter-gatherer tribe on the Serengeti. It's a dry, flat ecosystem with small pockets of richness distributed here and there. Food is available, but it requires active pursuit — the running down of game, and long periodic hikes to where the various roots and vegetables grow. The shaman knows the medicinal plants, and where they grow. That is part of how shamanic authority is retained: specialized knowledge of available resources, and the skill to pursue those resources and use them. Hunting and gathering are expensive in terms of the energy they take, and require both skill and knowledge. The members of the tribe who are admired, and have authority, are those who are best at gathering, returning, and providing for the benefit of the tribe. That is an authority model based on scarcity.

Contrast that with the world now: For most of us, acquiring food is hardly the issue. We use food as fuel, mostly finding whatever is least objectionable to have for lunch, and coming home and making a quick dinner. Some of us take the time to creatively combine flavors, textures, and colors to make food more than just raw materials. They are the cooks, and...among cooks, the best are chefs, the most admired authorities on food around. Chefs simply couldn't exist in a world of universal scarcity.

I think we're speeding — yes, speeding — toward a time when scholarship, and how we make it available, will be affected by information abundance just as powerfully as food preparation has been....

But right now we're still living with the habits of information scarcity because that's what we have had for hundreds of years. Scholarly communication before the Internet required the intermediation of publishers. The costliness of publishing became an invisible constraint that drove nearly all of our decisions. It became the scholar's job to be a selector and interpreter of difficult-to-find primary and secondary sources; it was the scholarly publisher's job to identify the best scholars with the best perspective and the best access to scarce resources....Fundamentally, scholarly authority was about exclusivity in a world of scarce resources....

[PS:  Here omitting much fascinating detail.]

What are the implications for the future of scholarly communications and scholarly authority? First, consider the preconditions for scholarly success in Authority 3.0. They include the digital availability of a text for indexing (but not necessarily individual access — see Google for examples of journals that are indexed, but not otherwise available); the digital availability of the full text for referencing, quoting, linking, tagging; and the existence of metadata of some kind that identifies the document, categorizes it, contextualizes it, summarizes it, and perhaps provides key phrases from it, while also allowing others to enrich it with their own comments, tags, and contextualizing elements.

In the very near future, if we're talking about a universe of hundreds of billions of documents, there will routinely be thousands, if not tens of thousands, if not hundreds of thousands, of documents that are very similar to any new document published on the Web. If you are writing a scholarly article about the trope of smallpox in Shakespearean drama, how do you ensure you'll be read? By competing in computability.

Encourage your friends and colleagues to link to your online document. Encourage online back-and-forth with interested readers. Encourage free access to much or all of your scholarly work. Record and digitally archive all your scholarly activities. Recognize others' works via links, quotes, and other online tips of the hat. Take advantage of institutional repositories, as well as open-access publishers. The list could go on....

The thornier question is what Web 3.0 bodes for those scholarly publishers. It's entirely possible that, in the not-so-distant future, academic publishing as we know it will disappear. It's also possible that, to survive, publishers will discover new business models we haven't thought of yet. But it's past time that scholarly publishers started talking seriously about new models, whatever they turn out to be — instead of putting their heads in the sand and fighting copyright-infringement battles of yesteryear.

I also don't know whether many, or most, scholarly publishers will be able to adapt to the challenge. But I think that those who completely lock their material behind subscription walls risk marginalizing themselves over the long term. They simply won't be counted in the new authority measures. They need to cooperate with some of the new search sites and online repositories, share their data with outside computing systems....

I hope it's clear that I'm not saying we're just around the corner from a revolutionary Web in which universities, scholarship, scholarly publishing, and even expertise are merely a function of swarm intelligences....

But make no mistake: The new metrics of authority will be on the rise. And 10 to 15 years isn't so very long in a scholarly career. Perhaps most important, if scholarly output is locked away behind fire walls, or on hard drives, or in print only, it risks becoming invisible to the automated Web crawlers, indexers, and authority-interpreters that are being developed. Scholarly invisibility is rarely the path to scholarly authority.

Sunday, June 10, 2007

Bethesda statement in Catalan and Spanish

Ismael Peña López has translated the Bethesda Statement on Open Access Publishing into Catalan and Spanish.  (Thanks, Ismael.)

Central v. distributed OA archiving in Canada

Stevan Harnad, No Need for Canadian PubMed Central: CIHR Should Mandate IR Deposit, Open Access Archivangelism, June 10, 2007. 

Summary:  What is needed for Canadian biomedical research output isn't yet another (this time Canadian) PubMed Central: What is needed is that all Canadian (and US and UK) biomedical research output (as well as all the output of all the other scientific and scholarly disciplines, worldwide) should be made Open Access (OA) for all users, webwide. And the way to accomplish that is for the institutions and funders of the researchers to mandate that they deposit each article, immediately upon acceptance for publication, into each author's own OA Institutional Repository (IR). That is the solution that will systematically scale up to cover all research, from all institutions, across all fields, across all countries -- not the creation, willy-nilly, of central repositories like PubMed Central to deposit it into directly. PubMed Central should instead be a central OAI harvester, harvesting the biomedical research output of all the (distributed, local) IRs.

Canadian Institutes of Health Research (CIHR) should think twice, and then lead, clear-headedly, instead of following, blindly, in this.
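For readers curious about the mechanics, the harvesting architecture Harnad describes rests on the OAI-PMH protocol: a repository exposes a simple HTTP endpoint, and a harvester such as the one he envisions for PubMed Central issues a ListRecords request and reads back Dublin Core metadata. The sketch below is illustrative only; the repository URL and the sample response are hypothetical, though they follow the shape of real OAI-PMH exchanges.

```python
# Minimal sketch of OAI-PMH harvesting, the mechanism behind Harnad's
# proposal that PubMed Central aggregate distributed institutional
# repositories rather than hold direct deposits.
# The endpoint URL and sample response below are hypothetical.
import xml.etree.ElementTree as ET
from urllib.parse import urlencode

BASE_URL = "https://ir.example.edu/oai"  # hypothetical IR endpoint

def list_records_url(base, metadata_prefix="oai_dc"):
    """Build an OAI-PMH ListRecords request URL."""
    return base + "?" + urlencode({"verb": "ListRecords",
                                   "metadataPrefix": metadata_prefix})

# A tiny sample response, trimmed to the elements a harvester reads.
SAMPLE_RESPONSE = """<?xml version="1.0"?>
<OAI-PMH xmlns="http://www.openarchives.org/OAI/2.0/">
  <ListRecords>
    <record>
      <header><identifier>oai:ir.example.edu:123</identifier></header>
      <metadata>
        <oai_dc:dc xmlns:oai_dc="http://www.openarchives.org/OAI/2.0/oai_dc/"
                   xmlns:dc="http://purl.org/dc/elements/1.1/">
          <dc:title>An Example Biomedical Article</dc:title>
        </oai_dc:dc>
      </metadata>
    </record>
  </ListRecords>
</OAI-PMH>"""

NS = {"oai": "http://www.openarchives.org/OAI/2.0/",
      "dc": "http://purl.org/dc/elements/1.1/"}

def harvest_titles(xml_text):
    """Extract (identifier, title) pairs from a ListRecords response."""
    root = ET.fromstring(xml_text)
    records = []
    for rec in root.findall(".//oai:record", NS):
        ident = rec.findtext(".//oai:identifier", namespaces=NS)
        title = rec.findtext(".//dc:title", namespaces=NS)
        records.append((ident, title))
    return records

print(list_records_url(BASE_URL))
print(harvest_titles(SAMPLE_RESPONSE))
```

Because every compliant repository answers the same six verbs, a single harvester can sweep metadata from all the local IRs Harnad has in mind, which is exactly why he argues the central service need only harvest, not host.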