Open Access News

News from the open access movement

Saturday, January 19, 2008

More on the student campaign for OA

Jennifer McLennan, New SPARC Campaign Engages Students on Open Access, ARL Bimonthly Report, No. 256, February 2008.  Excerpt:

...The student dedication to “open” was first made clear to us [at SPARC] when Students for Free Culture at New York University devoted their annual regional meeting to open access, and invited SPARC, the Public Library of Science, and Science Commons to speak. The messages from the speakers were familiar, but the excitement and engagement of the audience—and the views they had to offer—were completely new. We expected the meeting to be an introduction to open access for students; instead it turned into a learning experience for us on the depth of the student commitment to making open sharing of information habitual—for everyone.

The potential that students clearly embody for shaping the future of scholarly exchange and the growing level of student activity throughout 2007—as well as the hire of the first SPARC summer intern—inspired the creation of the SPARC student campaign and companion guide: The Right to Research: The Student Guide to Opening Access to Scholarship. Developed in close collaboration with our student colleagues, the guide is a tool they will use to engage more of their peers.

Specifically, The Right to Research:

  • helps students recognize the problem of access, saying they shouldn’t have to skip over research that could be important to their papers;
  • introduces the principle of open access (OA), making a clear distinction between the principle and the ways OA is being realized—through OA journals, repositories, and copyright management;
  • indicates how open access can make life as a student easier, advance research, widen access to those who need it, and increase visibility for student scholars;
  • offers ways to support OA that pertain both to graduate students approaching publishing decisions and to undergraduates who want to take up the OA banner.

Please join us in inviting more students to the conversation on access. Visit the SPARC students Web site.

Jennifer adds new details in the January issue of the PLoS E-Newsletter for Institutional Members:

The Right to Research is introduced on the heels of several student-centered SPARC initiatives, including: the December 2007 SPARC Innovator Profile, which highlights five student leaders as Agents of Change; the first Sparky Awards for best videos illustrating the value of information sharing; and the SPARC-ACRL forum at ALA Midwinter, which focused on student engagement.

The Public Library of Science has been a key ally for SPARC in the conception and realization of the student engagement initiative. We'd like to extend thanks to everyone at PLoS for their support, especially Donna Okubo, Liz Allen, and Gavin Yamey....

PS:  The Right to Research web site will launch later this month.

Publishing trends to watch

The publishing consultants at Greenhouse Associates have put out a list of Five Trends to Watch in 2008.  The trends focus on non-academic publishing, but consider whether they apply to academic publishing as well, especially this pair, which are in tension:

3. The pay wall will stand

Online advertising was one of the biggest stories of 2007, evidenced by the growth of online advertising revenues, emergence of new forms of advertising, and the voracious acquisition of online advertising companies by Google, Yahoo, Microsoft, and AOL. Yet for all of that activity, the business and professional segments of the information business have been largely untouched. Paid content remains the rule for business and professional users, especially as such content becomes increasingly embedded in critical workflow applications used in healthcare, science, law, financial services, construction, manufacturing, energy, and other industries.

4. Content producers will adopt open-source approaches

Wikipedia established the viability of content that is created and maintained by users, rather than by publishers' editorial staffs. Commercial databases have been untouched to date by this open-source mode, and their publishers have even worked to try to prove the inferiority of such approaches as compared to traditional, centrally controlled approaches. However, we are now aware of several information companies that are considering opening their databases so that their users can make entries to supplement or correct the publishers' own data. While processes must be worked out in these cases to maintain the integrity of the data, the value of leveraging the resources of the user community appears too compelling for publishers to deny much longer....

The very idea of a funder OA mandate for books

Jan Velterop, Reviewed reviews, The Parachute, January 18, 2008.  Excerpt:

"Book self-archiving cannot and should not be mandated, for the contrary of much the same reasons peer-reviewed journal articles can and should be."

Stevan Harnad
18 January 2008
contribution to liblicense-l

I agree with him.

I think.

The reason I can't be entirely certain is that by peer-reviewed journal articles he may mean the same as the NIH in its description of the types of articles that fall under the mandate, which says:

"The Policy applies to all peer-reviewed journal articles, including research reports and reviews. The Policy does not apply to non-peer-reviewed materials such as correspondence, book chapters, and editorials."

That's a mistake, in my view. Review articles belong in the second sentence, with editorials and the like; not the first. More often than not, review articles are initiated by a publisher, inviting a distinguished author to write one. More often than not the author is offered some payment for writing it. Seldom if ever is a review article the result of a funded research project.

Review articles have a lot in common with books. And if self-archiving of books "cannot and should not be mandated", the same applies, grosso modo, to review articles.

Even the OA publisher par excellence, BioMed Central, requires subscriptions to access review articles, for instance in the journal Breast Cancer Research. I think they are right to do that....

Comments.  Whether OA mandates should ever apply to books is a fascinating question.  Some thoughts:

  1. First, see the whole discussion thread in which Stevan made his comment about books.
  2. Jan is right that review articles have important similarities to books.  But if a funder like the NIH mandates OA for review articles, then clearly the mandate only applies to those review articles that result from its funding.  If Jan is also right that "seldom if ever is a review article the result of a funded research project", then the question is generally moot --and in those exceptional cases where funding does give rise to a review article, the funder policy could be justified.
  3. Funders mandate OA because they want the results of their research to be disseminated or available as widely as possible.  They might not care whether the research results in journal articles, books, or some other genre.  But they might start to care (or one might argue that they ought to care) if they want to allow grantees who write books to earn royalties on them.  But even in that case, funders wouldn't have to give up on the idea of mandating OA to books.  They would have to evaluate the mounting evidence that for some kinds of books, including research monographs, OA is compatible with royalties and might even stimulate a net increase in sales. 
  4. One critical variable is consent.  Even a funder "mandate" is conditional, or depends on grantee consent.  A funder book mandate could never say, "You must make your book OA" --or if it did, it would be unenforceable.  It could only say, "If you take our money, and use it to write a book, then you must provide OA to the text; if you don't like that, then don't take our money."  I say more about the priority of consent over genre in the Richard Poynder interview at p. 40.
  5. Some researchers might not take grant money if it required OA to any books resulting from the grant.  But some would still take it.  I know this from my own case.  In the mid-1980s (I forget the year!) I received a grant from the National Endowment for the Humanities for research that resulted in a book.  The NEH didn't require OA, but I made an OA edition on my own as soon as I was legally allowed. 
  6. I've participated in drafting several funder OA policies, and in every case I recommended an exemption for royalty-producing books.  For practical and strategic reasons (to get policies adopted, to get researcher support), it's important to focus OA mandates on royalty-free literature.  But points 3 and 4 above show that a funder could depart from this advice for its own reasons (to disseminate its research results more widely) and with grantee consent.
  7. I can think of two policies that come interestingly close to OA mandates for books.  (1) The Scholarly Editions Program of the US National Endowment for the Humanities (NEH, not NIH) doesn't require OA for the books it funds, but since September 2006 it has given preference to proposals that promise OA for the resulting books.  (2) Ilmenau Technical University has required dual editions (OA and TA) for every book published by Ilmenau University Press since March 2007.  I applaud them both. 

Update.  Stevan Harnad has now blogged, and elaborated, his original forum comment on books.

January First Monday

The January issue of First Monday is now online.  Here are the OA-related articles:

  • Maureen O'Sullivan, Creative Commons and contemporary copyright: A fitting shoe or "a load of old cobblers"?  Abstract:   This article examines copyright's historic trajectory from a common law to a statutory privilege, turning almost full circle in recent years, in the current age of high technology. It simultaneously probes theories of intellectual property rights which are grounded in somewhat skewed ideas related to tangible property, and contextual parallels and contrasts are drawn between physical and ephemeral resources throughout. The founding and fomenting of various civil society organisations in response to the expansions in the term and scope of copyright law, such as Creative Commons, is then charted. This leads on to complex questions about what constitutes the public domain, and whether and how it should be facilitated. The aims of grassroots movements such as Creative Commons to persuade and assist authors, through voluntary means, to relax their legislative rights and its impact on copyright law and practice, are also critically evaluated.
  • Charles Bazerman and three co-authors, Open access book publishing in writing studies: A case study.  Abstract:   The publication of scholarly books has been shaped strongly in recent decades by two factors: assessments by publishers of the potential market for books and the influence of publisher's reputations on tenure and promotion decisions. This article reflects on the choices made by a group of senior scholars in the field of composition and rhetoric as they conceived of and published an open access book on activity theory and writing and, subsequently, published an open access book series in the area of rhetoric and composition. The implications of open access book publishing for access to scholarly work and tenure-and-promotion decisions are considered.

Friday, January 18, 2008

Three German articles on OA

Barbara Lison (ed.), Information und Ethik, a large PDF containing most of the proceedings of the Third Leipzig Kongress für Information und Bibliothek (Leipzig, March 19-22, 2007), Verlag Dinges & Frick GmbH Wiesbaden, 2007.  (Thanks to Klaus Graf.)  Here are the OA-related articles:

  • Ulrich Herb, Open Access – Ein Wundermittel?
    Wissenschaft, Gesellschaft, Demokratie, Digital Divide
  • Bernd Hagenau, Ulrich Herb, and Matthias Müller,
Auf dem grünen Weg – neue Aufgaben und Funktionen einer SSG-, Hochschul- und Landesbibliothek
  • Anja Beyer and Marion Irmer, Sicherheitsaspekte elektronischen Publizierens

Update. Herb's article has now been separately self-archived.

OA data archive from Google

Alexis Madrigal, Google to Host Terabytes of Open-Source Science Data, Wired Science, January 18, 2008.  Excerpt:

Sources at Google have disclosed that the humble domain will soon provide a home for terabytes of open-source scientific datasets. The storage will be free to scientists and access to the data will be free for all. The project, known as Palimpsest and previewed to the scientific community at the Science Foo camp at the Googleplex last August, missed its original launch date this week, but will debut soon.

Building on the company's acquisition of the data visualization technology, Trendalyzer, from the oft-lauded, TED-presenting Gapminder team, Google will also be offering algorithms for the examination and probing of the information. The new site will have YouTube-style annotating and commenting features.

The storage would fill a major need for scientists who want to openly share their data, and would allow citizen scientists access to an unprecedented amount of data to explore. For example, two planned datasets are all 120 terabytes of Hubble Space Telescope data and the images from the Archimedes Palimpsest, the 10th century manuscript that inspired the Google dataset storage project....

Attila Csordas of Pimm has a lot more details on the project, including a set of slides that Jon Trowbridge of Google gave at a presentation in Paris last year. WIRED's own Thomas Goetz also mentioned the project in his fantastic piece on freeing dark data.

One major issue with science's huge datasets is how to get them to Google. In this post by a SciFoo attendee over at business|bytes|genes|molecules, the collection plan was described:

(Google people) are providing a 3TB drive array (Linux RAID5). The array is provided in a “suitcase” and shipped to anyone who wants to send [their] data to Google. Anyone interested gives Google the file tree, and they SLURP the data off the drive. I believe they can extend this to a larger array (my memory says 20TB).

You can check out more details on why hard drives are the preferred distribution method at Pimm. And we hear that Google is hunting for cool datasets, so if you have one, it might pay to get in touch with them.

Comment.  For background, see my post from March 2007.  At that time, Google was offering to transfer huge datasets from lab to lab, at its own expense, provided it could make copies for offline storage and eventual Google-hosted OA.  I'm very glad to see that it hasn't forgotten the OA part of the plan and is even adding tools for visualization, annotation, and user comments.

Update. For a critical view of Palimpsest, see Chuck Humphrey's comments at the IASSIST blog.

Overcoming obstacles to a citation index for OA articles

The citation extraction process in CitEc, RePEc blog, January 16, 2008.  Excerpt:

CitEc is an experimental autonomous citation index: a software system that automatically extracts references from the full texts of documents and creates links between citing references and cited papers.

With its last update, the CitEc database has reached almost three million references and more than one million citations between documents available in RePEc. This is an important threshold, but it is still far from being a complete set of citations. There are some limits on the reference extraction process:

First, the system needs open access to an electronic version of the document's full text. Many journals listed in RePEc have restricted access....We try to get on board as many publishers as possible but unfortunately not all of them are willing to collaborate with us at this time....

Second, the URL provided by the RePEc archive maintainer must be correct and must point to the PDF file containing the document's full text, not to an intermediate abstract page or similar. Some archives provide such links to force researchers to pass through their institutional web pages....

The third limit is more technical. In order to extract references, the PDF files need to be converted into plain ASCII text....There is a wide variety of PDF files created in different ways, and not all of them can be converted.

Finally, the system parses the references section, which first needs to be isolated, to identify each reference and split it into its parts: title, author, year, etc. The parsing uses pattern-matching techniques, which in some cases cannot identify the full list of existing references.

As of the last update, December 31, 2007, the CitEc numbers are: 527,357 articles and working papers available in RePEc. Of these, 343,441 cannot be processed by the system due to the limitations mentioned in the first two points above....

That leaves 183,916 documents available to be processed by CitEc. Of these, processing was successfully completed for 134,130 papers, that is, 73% of the available documents. The complete list of sources and the number of processed documents for each series or journal is available here....
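The pipeline the excerpt describes—get the full text, convert the PDF to plain text, isolate the references section, then pattern-match each entry into title, author, and year—can be sketched in miniature. This is not CitEc's actual code: the heading regex and the single "Author (Year). Title." pattern below are illustrative assumptions, and real references vary far more, which is exactly why CitEc reports that some references go unrecognized.

```python
import re

def isolate_references(text):
    # Find the references section by a heading match (a big
    # simplification of CitEc's isolation step).
    m = re.search(r'^(references|bibliography)\s*$', text,
                  re.IGNORECASE | re.MULTILINE)
    return text[m.end():] if m else ""

def parse_reference(line):
    # Split one entry into author / year / title with a single
    # "Author (Year). Title." pattern; real formats vary far more.
    m = re.match(r'(?P<author>[^(]+)\((?P<year>\d{4})\)\.?\s*(?P<title>[^.]+)',
                 line)
    return {k: v.strip() for k, v in m.groupdict().items()} if m else None

def extract_references(text):
    # Walk the isolated section line by line, keeping entries the
    # pattern recognizes and dropping the rest.
    refs = []
    for line in isolate_references(text).splitlines():
        line = line.strip()
        if line:
            parsed = parse_reference(line)
            if parsed:
                refs.append(parsed)
    return refs

# Hypothetical plain-text output of a PDF-to-ASCII conversion.
sample = """Introduction
Some body text.
References
Smith, J. (2001). A study of citation indexing. Journal of Data, 5.
Doe, A. (1999). Parsing bibliographies at scale. Working Paper.
"""
print(extract_references(sample))
```

In practice the hard part is upstream of the parsing: converting heterogeneous PDFs to text at all, which is where CitEc reports losing many documents.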

UK barcamp on the re-use of public sector information

Michael Cross, 'No one in government IT will have done this before', The Guardian, January 17, 2008.  Excerpt:

In a small way, we made history. Last Saturday, an ad hoc group of citizens interested in improving an aspect of public policy sat down informally with the civil servants responsible and designed a web service to do the job. In all, it took less than five hours.

The event was a barcamp held by the National Archives as part of its response to the Cabinet Office's "government 2.0" report, the Power of Information (PDF). What's a barcamp? Dude, you've obviously been out of the loop for the past, ooh, seven weeks. It's the buzzword for a self-organising meeting to tackle an issue of interest. Everyone welcome, no timetable, no rules, no speakers' podium; lots of laptops, flipcharts and Post-It notes. You get the idea.

This barcamp's purpose was to explore how the government can electronically collect and assess requests to re-use information gathered and held by public bodies. Last summer the government agreed to set up such a channel....

I went along for two reasons. First, the web channel is very much in line with Technology Guardian's Free Our Data campaign. We think there is vast potential for businesses and communities to build new services based on government information. It's our strongly held belief that this resource - possibly the government's most valuable asset - is underused because of the difficulty in finding data and negotiating its re-use. The proposed web channel would address both issues.

The second reason for going along was a long-standing interest in the way governments design information systems. Too often, I have seen users consulted only after crucial design decisions have been taken. The barcamp approach offers an attractive alternative....

We opened with a couple of case-study presentations...That led to a useful discussion on the difference between an individual's freedom to view official information and their freedom to mash or otherwise republish it - and whether that distinction can survive in the web age....

After lunch, barcamp really got down to business. With a bit of jollying along from Steinberg, we reached a consensus on what the web channel should aim to do, who the users would be, what sort of requests it should handle and very roughly what would be on the home page....

By teatime, we'd pretty well covered it. [John] Sheridan [of the National Archives] folded up the flip-chart sheets and promised to start work (after similar consultations with academics and IT suppliers). No one could quite believe it was happening. Even [Tom] Steinberg [co-author of the government report] marvelled: "No one in government IT will ever have done anything like this before." ...

Also see Jonathan Gray's blog notes on the BarCamp (London, January 12, 2008).

Calling on librarians to improve Wikipedia and Wikia

Mark Chillingworth, Wales urges librarians to help build better Wikipedia, Information World Review, January 17, 2008.  Excerpt:

Jimmy Wales, one of the founders of online encyclopaedia phenomenon Wikipedia, has called on librarians to become deeply involved in the web-based communities that surround his products....

Wales is looking to improve the quality of information available to Wikipedia users, and to extend the number of languages that Wikipedia can be read in.

To do this, he is calling on information professionals to form and join Wikipedia Academies, which are working with communities across the world teaching wiki editing skills. Wales hopes that they will create a generation of wiki editors who will drive up standards.

Academies in South Africa and Germany have already appeared.

“Librarians are not engaging with the Academies,” Wales complained.  “If libraries throughout the world formed regional groups and made an effort, they would be playing a positive role within Wikipedia.  The job of the librarian is about highlighting the weaknesses and strengths of information.”

Wales is also looking to harness the expertise of information professionals to improve web search results....

Nature supplement on Planet Earth

Nature has created another free online supplement, Year of Planet Earth.

SPARC's panel on student views at the ALA meeting

Bright Futurists: Student Speakers Offer Unique Perspective at ACRL/SPARC Forum, Library Journal Academic Newswire, January 17, 2008.  Excerpt:

This year's SPARC/ACRL forum at the American Library Association Midwinter meeting, entitled Working with the Facebook Generation, began on a high note: a round of applause for SPARC executive director Heather Joseph. Moderator Ray English, librarian at Oberlin College, praised SPARC for its work in passing the recent NIH mandate for public access, saying it would free as many as 65,000 peer-reviewed research articles funded in whole or in part by taxpayers. English then turned the program over to three student presenters, who suggested that the upcoming generation, enabled by technology and somewhat disenchanted with the status quo, is poised to help change the information ecology....

Andre Brown, a Ph.D. student in physics and astronomy at the University of Pennsylvania and co-blogger for Biocurious, engaged librarians with his presentation on the potential power of blogs in scholarly communication, saying they facilitated the "open practice of science." He explained how blogging, easy and cheap with "a low barrier to entry" thanks to free software, offered the potential to get ideas into the field more quickly, a key point for scientists, who often lament that it can take years for journal articles to be published. He acknowledged, however, that concerns over credit and citation remain a key stumbling block to more widespread use of blogs in science. He suggested that at some point scientists would desire and get used to more "rapid communication." ...

[T]he forum represented a big step for SPARC in engaging the next generation, whose information ideals are still forming, and are being increasingly molded by powerful technology. While the session wasn't without its glitches, it nicely raised the importance for librarians of engaging students, not just their faculty, when it comes to information access issues....

One of the presenters..., Stephanie Wang, an economics student at Princeton, gave librarians a heartening reminder of what they are advocating for. "We just want to do kick-ass science," she said.

ERC will continue to pay publication fees

In my post last week about the OA mandate at the European Research Council (ERC), I couldn't tell whether ERC would continue its previous policy to pay publication fees at fee-based OA journals:

The ERC's March 2007 grant guidelines make clear (pp. 12, 35) that when grantees submit their work to fee-based OA journals, ERC is willing to pay the publication fees.  But the new document [on the OA mandate from January 2008] is silent on the subject.  Do the older grant guidelines stand, because not modified here?  Or is ERC silently rescinding its willingness to pay publication fees?

I'm happy to report that the ERC will continue to pay publication fees.  (Thanks to the ERC-UK via Matthew Cockerill.)  From the latest ERC Guide for [Grant] Applicants, December 27, 2007, p. 16:

Direct and indirect costs

Direct eligible costs are those which support all the research, management, training and dissemination activities necessary for the conduct of the project, such as....Publication Costs (page charges and related fees for publication of results)....

This version of the Guide will apply at least until Summer 2008.

UK search engine of human-vetted OA content

Universities' alternative to Google launched, a press release from the University of Manchester, January 16, 2008.  Excerpt:

An internet search engine rivalling the multimillion-pound Google is to be launched at the end of January by The University of Manchester's national data centre Mimas.

The free service will add thousands of documents to the 'Intute' service which already allows academics, teachers, researchers and students to search for information relating specifically to their subject area....

At the end of January, researchers will be able to automatically access papers from research databases within universities and other institutions.

The £1.5 million per year collaboration between seven UK universities and partners enlists a team of full-time specialists who are scouring the internet.

They are backed by an army of PhD students and a range of organisations - including the massive Wellcome Trust - who have added their own information to the Intute database....

[Intute's Executive Director Caroline Williams] added: "This chimes with calls for open access across the UK....So this database is really a showcase of what the UK academic community has achieved...."

Update. Also see Hurley Goodall, U. of Manchester Adds Digital Repositories to Academic Search Engine, The Wired Campus, January 23, 2008. Excerpt:

The University of Manchester announced yesterday a reintroduction of the academic search engine Intute, slated for the end of the month.

The newest development for the relaunch is the Intute Institutional Repository Search. It will be a search engine for university digital repositories, allowing researchers to easily find academic material in one place.

Caroline Williams, Intute’s executive director, says the search engine has gathered data from 86 institutions, and that number is growing....

OA news from eIFL

The January/February issue of the eIFL Newsletter is now online.  Some tidbits:

Open access and awareness raising in universities in Southern Africa

With the sponsorship of eIFL and the Dutch Minister of Foreign Affairs, the South African Regional Universities Association (SARUA) hosted an Open Access leadership summit in association with the African Access to Knowledge Alliance (AAKA) in Botswana on November 20-21, 2007.

The Chair of SARUA, Professor Njabulo Ndebele of the University of Cape Town, the Botswana Minister of Education, J D Nkate and the CEO of SARUA, Piyushi Kotecha, opened the conference with strong statements on the value of Open Access in their respective constituencies....

Universities in the southern African region need to explore open research and open science in order to become research intensive in the next 10-20 years, making a contribution not only to global scholarly communications, but also creating links between research, teaching and learning, and ensuring the contribution of universities to socio-economic development in the region.

eIFL provided funding for Alma Swan of Key Perspectives, Leslie Chan of Bioline International and Jasper Maenzanise of Zimbabwe Open University Library to participate in this conference and was represented by Susan Veldsman. Presentations can be found [here]....

Cape Town Open Education Declaration

OSI has been working with the Shuttleworth Foundation (South Africa) to develop a declaration around open educational resources. A soft launch of the site took place recently. The hope is that the declaration will spur the development of a movement around open educational resources as the Budapest Open Access Initiative did for Open Access to research literature.

The declaration will be formally launched on January 22. Individuals and organizations interested in the further development of open educational resources can show their support for the declaration by signing on, so that they can be listed as early signatories to the declaration.

A big thanks goes out to the eIFL community for assisting in the translation of the declaration into many languages of the network.

Portal to Institutional Repositories in eIFL member countries

Further developments on the federated eIFL repository have taken place in the last two months, resulting in:

  • continued maintenance and development as a demonstrator to give countries feedback on the status of their institutional repositories;
  • updated general data, e.g. base URLs, contact people, etc.;
  • a new interface to match the eIFL website;
  • a statistics service to monitor usage from across the world....

More on the NIH OA mandate

Jocelyn Kaiser, Uncle Sam's Biomedical Archive Wants Your Papers, Science Magazine, January 18, 2008 (accessible only to subscribers).  Excerpt:

If you have a grant from the U.S. National Institutes of Health (NIH), you will soon be required to take some steps to make the results public. Last week, NIH informed its grantees that, to comply with a new law, they must begin sending copies of their accepted, peer-reviewed manuscripts to NIH for posting in a free online archive. Failure to do so could delay a grant or jeopardize current research funding, NIH warns....

Only about 12% of authors are complying with the [current] voluntary policy...and of 80,000 eligible articles per year, only 20% to 25% are being submitted either by authors or directly by journals, says David Lipman, who oversees PMC.

NIH says it is ready for the glut of manuscripts it will soon receive, but there are signs that some scientists may be confused about what to submit. For example, NIH is already removing old papers that authors mistakenly posted in PMC. Lipman acknowledges that there will be "a learning process" but notes that traffic on the site is already "huge," with 12 million article views each month.

NIH's brief notice simply states that the policy is mandatory for all articles accepted on or after 7 April. Initially, NIH requested that only original research be archived at PMC, but now the agency says the policy has been expanded to include review articles if they were peer-reviewed. Many journals retain copyright of the manuscripts they publish, so authors must obtain permission to post a copy on the NIH site. It is up to investigators and their institutions to figure out whether their submissions comply with the journals' policy.

To give scientists a nudge, NIH will require them to include the PMC number when they cite their own papers in grant applications and progress reports. Other possible ways of forcing scofflaws to comply range from having a program director call with a reminder to "the most extreme: suspending funds," says NIH Deputy Director for Extramural Research Norka Ruiz Bravo. "We hope we're not going to get there," she says.

The new law puts NIH in line with some other funding agencies that require grantees to send their papers to PMC or a U.K. version of the archive; these include the U.K.'s Medical Research Council and Wellcome Trust, which adopted such policies in 2006, and the Howard Hughes Medical Institute (HHMI) in Bethesda, Maryland, whose rule goes into effect this month (see table). All three institutions require that papers be posted within 6 months of publication in a journal. NIH differs in one way: Whereas other funders help pay author fees that some journals charge to make the full text immediately available, NIH is not offering any extra money for "open access," Ruiz Bravo says....

Scientists who have been sending their papers to PMC say the process is relatively easy, but keeping track of each journal's copyright policy is not....

As for journals, although most major biomedical publications (including Science) already allow authors to submit manuscripts to PMC, some publishers say they will need to police the site for articles mistakenly posted, such as those not yet released from the journal's embargo or those published before 2005. Martin Frank, executive director of the American Physiological Society, says APS asked NIH to remove 78 papers last year, and he expects "hundreds" of similar errors when the mandatory policy kicks in. Lipman acknowledges that NIH had to remove some papers. But complying with copyright, he says, is not NIH's responsibility; it's "between the author and the publisher."

Comment.  I'm confused on one point.  Kaiser paraphrases Norka Ruiz Bravo: "Whereas other funders help pay author fees that some journals charge to make the full text immediately available, NIH is not offering any extra money for 'open access.'" Ruiz Bravo should know, of course, but her statement seems to conflict with Question E3 of the new FAQ:

Will NIH pay for publication costs?

Yes. The NIH will reimburse publication costs, including author fees, for grants and contracts on three conditions: (1) such costs incurred are actual, allowable, and reasonable to advance the objectives of the award; (2) costs are charged consistently regardless of the source of support; (3) all other applicable rules on allowability of costs are met.

As I read it, this passage covers publication fees at fee-based OA journals as well as page charges at TA journals.  One way to reconcile it with Ruiz Bravo's statement in Kaiser's article is to be very literal.  Perhaps NIH won't offer "extra money" to pay publication fees at fee-based OA journals, but it will allow grantees to use grant funds for the purpose.  More later, if I learn more on this point.


Today I'm facing a double whammy of software and connectivity problems.  A bad crash corrupted some data files, forcing me to rely on dated back-ups, and an ice storm is causing intermittent outages.  Please bear with me as I catch up.

Thursday, January 17, 2008

UKPMC user survey

If you use UK PubMed Central, please take this new survey.  Responses are due by February 8, 2008.

The Eco-Patent Commons

Katherine Nightingale, Scheme to 'share environmentally-friendly patents', SciDev.Net, January 17, 2008.  Excerpt:

Large corporations have joined forces in an 'open innovation' project to allow public access to patents with environmental benefits.

IBM, Nokia, Pitney Bowes and Sony, in partnership with the World Business Council for Sustainable Development (WBCSD), have compiled a portfolio of patents — the Eco-Patent Commons — that can be used in manufacturing and business processes....

Also see the WBCSD press release, January 14, 2008.

CC releases CC-Zero draft

Mike Linksvayer, CC0 beta/discussion draft launch, CC blog, January 15, 2008.  Excerpt:

CC0 is a Creative Commons project designed to promote and protect the public domain by 1) enabling authors to easily waive their copyrights in particular works and to communicate that waiver to others, and 2) providing a means by which any person can assert that there are no copyrights in a particular work, in a way that allows others to judge the reliability of that assertion.

As announced on CC’s 5th anniversary, today we are announcing a beta of the CC0 user interface and technical specification and discussion drafts of the CC0 legal tools:

The CC0 Waiver will enable the author or owner of a work to affirm the copyright and related or neighboring legal rights that he or she has in a work, and then to fully, permanently and irrevocably waive those rights. By making this waiver, the Affirmer effectively dedicates all copyright or related legal interests he or she held in the work to the public domain – “no rights reserved”. The CC0 Waiver (United States) will be an effective legal tool within the US and any other jurisdictions with equivalent law. It will also be offered as a template indicating the scope of most of the rights that must be covered in other jurisdictions in order to effect an equivalent dedication to the public domain. Some jurisdictions may need to address additional rights, for example “sui generis” database rights and specific rights to data.

The CC0 Assertion will provide a means by which any person may assert that there are no copyrights in a work, within a system that permits others to judge the reliability of the assertion, based on the Asserter’s identity and other information the Asserter may provide. The CC0 Assertion (United States) is intended to address copyright status under US law. The Assertion may not be appropriate for Works created in or whose copyright status is governed by the law of other jurisdictions.

As with our existing core legal tools (six licenses ranging from Attribution to Attribution-NonCommercial-NoDerivatives), we want the CC0 waiver and assertion legal tools to be valid worldwide and eventually ported to many jurisdictions worldwide to take into account the nuances of copyright law in those jurisdictions....

One of the use cases for CC0 is the Protocol for Implementing Open Access Data, also announced in conjunction with CC’s 5th birthday. In addition to fulfilling the protocol’s legal requirements, the CC0 technical infrastructure will also support the assertion of non-legal community norms in conjunction with a work, beginning with the norm of citation in the context of science....

Another misleading survey

Cameron Neylon, Biosciences Federation Survey on Open Access - Please do this survey! Science in the Open, January 16, 2008.  Excerpt:

Ok, having flagged up two surveys in my previous post I have now done the second one. It seems to be for anyone worldwide but I wanted to bring it to people’s attention because it further clouds the definition of Open Access, whether deliberately or through ignorance I can’t say.

Fairly early on we have the following question:

6. What do you understand by the term ‘Open Access’? (Tick all those that apply)

  • Journals that are free to the reader
  • Journals that are free to the author
  • Journals that charge the author
  • Copies of journal articles freely available online (other than in the journal itself)
  • Not sure
  • Never heard the term
  • Other (please give details)

BBB doesn’t seem to even exist as an option!...Now, we can (and have) argued for a long time over definitions of OA and the role of BBB etc. But to not mention it at all does not seem helpful. Might I humbly suggest that all those who feel it appropriate do the survey and put something in the ‘Other’ box for Question 6? ....

Comment.  The Biosciences Federation is willing to support OA journals if the OA business models can provide revenue assurances that BF doesn't even get from the subscription business model.

New Greek OA organization

The Open Research Society has officially launched.  From today's announcement:

The Open Research Society (ORS) is a Non-Governmental Organization (NGO) and does not depend on any government, political party, political or religious organization, or entities representing financial interests....

We are happy to invite you to join our worldwide initiative.
Please visit our site and learn more.

At a glance:  Our ten new international journals - 100 journals planned for 2008-2010....

Our three sponsored world summits and forums....

And also: Download our calendar 2008....

ORS Projects Brochure (Download)....

ORS Poster (Download)....

Let's say no more to:
- Barriers to Knowledge Diffusion
- Copyrights Dynasty of Publishers
- Closed Eyes to Knowledge based Social Exclusion
- Poor Exploitation of Scientific Research Outcomes....

PS:  For background, see these notes from October 2007 on the Greek conference where ORS President Miltiadis Lytras first introduced the new society.

New Dutch OA organization

The EU-project Science Education and Learning in Freedom (SELF) has evolved into the independent Free Knowledge Institute.  (Thanks to Glyn Moody.)  From the January 14 announcement:

This coming Wednesday a new foundation will be launched, aiming to stimulate a society in which technology, educational materials, and cultural and scientific works can be freely shared. The organisation has the support of the Internet Society Netherlands....

The Free Knowledge Institute is an initiative of three Amsterdam-based professionals who currently work for Internet Society Netherlands. In recent years they coordinated SELF, a large-scale EU project that embraced the same objectives. The need to share knowledge freely has become so important that the initiative has now become an independent organisation.

"More and more governments realise the benefits of free knowledge and free information technology", says Wouter Tebbens, the president of the new institute. The Free Knowledge Institute intends to be a knowledge partner helping to show the way in available free knowledge and technology. "That way, we can elaborate on the existing pool of free knowledge and free software, which is growing enormously. Look at projects such as Wikipedia, Linux, and the internet itself", Tebbens states. "Why reinvent the wheel yet again?"

Its main lines of activity are Free Knowledge in technology, education, culture and science. Free Knowledge in education focuses on the production and dissemination of free educational materials; Free Knowledge in IT mainly refers to free software, open standards and open hardware; Free Knowledge in culture includes open content; and Free Knowledge in science includes open access and anti-privatisation of scientific knowledge....

Update. Also see the press release from the Internet Society Netherlands.

Latest CLADDIER report on data-publication linkage

Brian Matthews and four co-authors, Citation, Location, And Deposition In Discipline & Institutional Repositories:  Recommendations for Data/Publication Linkage, November 30, 2007.  Report III for the CLADDIER Project.  (Thanks to Charles Bailey.)

Abstract:   A key aim of the CLADDIER project is to investigate the cross-linking and citation of resources (in particular data and their associated publications) held in institutional and subject-based repositories within the research sector. Typically, traditional citations are partial in that they are “backward citations”, referring to work which influenced the current research, and they cite only other formal publications, ignoring other artefacts which are the output of research, in particular research data. Online repositories storing more dynamic digital objects give the opportunity to provide a more complete picture of the relationships between them, with backward and forward citations to data and publications being propagated between repositories.

This report motivates the cross-citation of data from the CLADDIER use case example, and considers the approaches which have been implemented to harvest and propagate citation information. Most of these existing approaches depend on centralised services, which were considered unsatisfactory in an environment where independent repositories wish to maintain control of their resources and do not wish to be dependent on third-party services. Criteria are identified for building a Citation Notification Service to propagate citation references and links between repositories, including using a peer-to-peer protocol. A number of different architectures are proposed and evaluated.

The requirement for a lightweight peer-to-peer service which is as widely applicable as possible led to the selection of Linkback services, in particular Trackback, which provides an existing simple specification that can be implemented quickly and adapted to the requirements of citation notification. A detailed description of the Trackback protocol is then given, together with the design of the adaptations and extensions identified as required for citation notification. This extended Trackback protocol has been implemented in the STFC ePubs institutional repository; the implementation and a use case are described.

The resulting protocol provides a flexible mechanism for propagating information between repositories. The report considers the status and possible future applications and extensions of the Trackback Citation Notification Service.
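Since the report settles on Trackback as its notification mechanism, a minimal sketch of a base Trackback exchange may help. This is only a sketch of the standard Trackback specification (a form-encoded HTTP POST of url, title, excerpt, and blog_name fields, answered by a small XML document); CLADDIER's citation-notification extensions are not modeled here, and the repository names, URLs, and field values are hypothetical.

```python
from urllib.parse import urlencode
import xml.etree.ElementTree as ET

def build_trackback_ping(url, title, blog_name, excerpt):
    """Build the form-encoded body of a basic Trackback ping.

    Per the Trackback specification, this body is POSTed (as
    application/x-www-form-urlencoded) to the Trackback endpoint
    advertised by the cited resource."""
    return urlencode({
        "url": url,            # address of the citing item
        "title": title,
        "blog_name": blog_name,
        "excerpt": excerpt,
    })

def ping_succeeded(response_xml):
    """Parse the endpoint's XML reply: <error> is 0 on success,
    1 (accompanied by a human-readable <message>) on failure."""
    root = ET.fromstring(response_xml)
    return root.findtext("error") == "0"

# Hypothetical example: one repository notifying another that a new
# publication cites one of its datasets.
body = build_trackback_ping(
    url="http://repository.example/eprints/1234",
    title="A paper that uses dataset 42",
    blog_name="STFC ePubs",
    excerpt="Results derived from dataset 42.",
)
assert "blog_name=" in body
assert ping_succeeded("<response><error>0</error></response>")
```

A citation-notification extension along CLADDIER's lines would carry additional fields identifying the citing and cited items, but the POST-and-XML-reply shape of the exchange stays the same.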

AAP pressures universities to limit fair use

Andrea Foster, Despite Skeptics, Publishers Tout New 'Fair Use' Agreements With Universities, Chronicle of Higher Education, January 17, 2008 (accessible only to subscribers).  Excerpt:

The battle line between publishers and colleges, who have been fighting over campus access to digital versions of books and journals, shifted slightly in favor of the publishers on Wednesday.

The Association of American Publishers announced it had reached an agreement with Hofstra, Marquette, and Syracuse Universities to limit distribution of electronic content for students. The policies may be too vague, however, to actually help professors and librarians figure out what they can rightfully access. And one of the universities said the agreement was made under duress.

Each university, urged by the publishers, has produced guidelines governing electronic reserves, a system that libraries and professors use to make portions of books and journals available free online to students. The documents broadly state that the colleges will respect copyright law, will consider four factors in deciding whether to distribute course material, and will not assume that material elsewhere on the Internet can be redistributed without publishers' approval....

Most colleges agree that material placed on electronic reserves, or e-reserves, should be password protected and available only to students. But some college lawyers say that providing the material to students is allowed under fair use and that their institutions don't need publishers' prior approval.

Further, many colleges complain that publishers are making unreasonable demands with respect to e-reserves, said Prudence S. Adler, associate executive director of the Association of Research Libraries. Some publishers have asked colleges to pay for e-reserve content, even though the colleges have license agreements that already allow them to use the material in digital format, she added....

The publishing group cast [the three new agreements] as the outcome of cordial discussions between its representatives and college officials. But the group's news statement was not issued jointly with the universities. And an official from one of the colleges, who asked for anonymity, said his institution had drafted its e-reserve document after the publishing group threatened to sue the university for failing to restrict the availability of some works through a password-protection system....

Georgia Harper, a copyright expert who is the scholarly communications adviser for the University of Texas at Austin libraries, is skeptical that the guidelines from the three universities will clear up much confusion about how to use electronic content.

"I find the some of the statements to be ambiguous and unhelpful," she said in an e-mail message, noting that the documents include the word "may."

Comments.
  • I worry that publishers are forcing universities to err on the side of non-use when interpreting the vague and flexible guidelines for fair use.  Universities, more than most institutions, should be pushing the envelope instead.  I worry that publishers are extorting agreements with threats of litigation.  In the absence of these threats, publishers have no bargaining power to demand capitulation to their one-sided reading of the law.
  • When I first read Foster's article, I also worried that the universities had agreed not to "assume that material elsewhere on the Internet can be redistributed without publishers' approval."  Did Hofstra, Marquette, and Syracuse just agree to block local redistribution of OA content?  I turned to the Syracuse agreement for clarification.  Here's what it says on this point:  "Permission may also be required for the use of copyrighted material as electronic course content even when such material is...available elsewhere on the internet...."  That's true as far as it goes.  All that's missing is equal emphasis on the converse:  Permission may not be required for the use of copyrighted material as electronic course content (e.g. when the use is covered by fair use or when the copyright holder has already given blanket consent through a CC license or the equivalent).
  • The Marquette agreement is identical to the Syracuse agreement on this point, but the Hofstra agreement adds the following:  "When available, it is preferable to link to materials already legally available at another site rather than scan or make a digital copy of the materials."  This is out of place if the document is really about the law rather than user convenience or carbon footprint.  If making a copy is permissible, then it's permissible and there's no reason for universities to tie their own hands, especially when they need local copies as classroom hand-outs.  Sometimes copying is permitted as fair use even when the copyright holder wished it were not, and sometimes the copyright holder has deliberately removed permission barriers in order to facilitate copying and redistribution.
  • The general approach of the three agreements is (1) to minimize rather than maximize fair use, (2) to assume that all copyrighted works stand under all-rights-reserved copyrights rather than some-rights-reserved licenses, and therefore (3) to serve conventional or toll-access publishers and disserve students, teachers, librarians, universities, and open-access publishers.
  • The good news?  The AAP now acknowledges that fair use applies to online content --hence, the need to limit it.  As recently as May 2007, it took the indefensible position (jointly with ALPSP and STM) that "there are exceptions and limitations to copyright laws that may in certain limited circumstances permit the copying of journal articles for certain purposes, but...these exceptions are thus far limited to traditional photocopying and do not permit the exploitation of such materials over the Internet."

Update. Also see the story in Library Journal Academic Newswire for January 22, 2008.

Wednesday, January 16, 2008

Another scientific society converts its journal to OA

The Journal of Cardiovascular Magnetic Resonance has published its first OA articles after converting to OA and moving from Taylor & Francis to BioMed Central.  Details in today's press release:

This journal (JCMR), the official journal of the Society for Cardiovascular Magnetic Resonance, is an open access, online journal that publishes articles on all aspects of basic and clinical research on the design, development, manufacture, and evaluation of magnetic resonance methods applied to the cardiovascular system....

Update.  Also see the BMC blog post on JCMR's conversion.

More on Nature's OA policy for genome research

Kim Thomas, Nature makes genome chain officially free, Information World Review, January 16, 2008.  Excerpt:

Nature Publishing Group has introduced a Creative Commons licence for articles in scientific journal Nature that publish the primary sequence of an organism’s genome.

Nature already makes reports on genome sequences freely available for use by other researchers. The new licence formalises that arrangement, according to David Hoole, head of content licensing for Nature....

The licence being adopted will let researchers freely share and adapt the work, provided the original work is attributed and not used for commercial purposes, and that any resulting work is distributed under a similar licence. The licence will be applied retrospectively to genome sequencing articles already published.

“In effect, [genome sequences are] just large amounts of data, and we’ve always been in favour of open data, sharing, reusing and archiving data in relevant databases, so it just seemed like a logical extension of that approach,” he said.

Hoole added that NPG did not plan to introduce Creative Commons more generally: “It’s not a step towards broader use of the licence; it’s finding very specific applications for it and using it where it makes sense.”

PS:  For background, see my blog post from December 6, 2007.

Happy birthday, Wikipedia

An IR for Leeds Metropolitan U

Leeds Metropolitan University has launched an institutional repository.

Update (1/17/08).  My mistake:  Leeds Metropolitan has only set up a pilot repository and is evaluating its options before launching an official, working repository. 

NPG compatible with funder OA mandates

Alf Eaton, Depositing Nature articles in PubMed Central, HubLog, January 15, 2008.  Excerpt:

Nature Publishing Group journals don't submit articles automatically to PubMed Central, so they don't appear on the NIH's Journals That Submit Articles To PubMed Central list.

However, as per the NPG author license policy,

When a manuscript is accepted for publication in an NPG journal, authors are encouraged to submit the author's version of the accepted paper (the unedited manuscript) to PubMedCentral or other appropriate funding body's archive, for public release six months after publication. In addition, authors are encouraged to archive this version of the manuscript in their institution's repositories and, if they wish, on their personal websites, also six months after the original publication.

See also the ROMEO page for NPG, which confirms compliance with all the Open Access mandates produced by funding bodies so far (though it doesn't include the new ERC mandate yet).

(BTW, the 'offprint request' bookmarklet works well on NPG article pages. Might be worth updating that to give more of a reminder to deposit articles in PubMed Central.)

Update.  Here are some further details on Nature's self-archiving policy, thanks to Maxine Clarke, Nature's Publishing Executive Editor:

When a paper is accepted for publication at a Nature journal, the author receives a letter of acceptance that confirms this fact and also encourages him or her to upload to PMC or the appropriate archive. This is a standard message on every single letter of acceptance for publication and has been the case for some years now.

See also our author and referees’ website  -- all authors are referred to this for policy guidance upon submission:

Contributions being prepared for or submitted to a Nature journal can be posted on recognized preprint servers (such as ArXiv or Nature Precedings), and on collaborative websites such as wikis or the author's blog. The website and URL must be identified to the editor in the cover letter accompanying submission of the paper, and the content of the paper must not be advertised to the media by virtue of being on the website or preprint server.

Our policy on the posting of particular versions of the manuscript is as follows:

  1. You are welcome to post pre-submission versions or the original submitted version of the manuscript on a personal blog, a collaborative wiki or a preprint server at any time (but not subsequent pre-accept versions that evolve due to the editorial process).
  2. The accepted version of the manuscript, following the review process, may only be posted 6 months after the paper is published in a Nature journal. A publication reference and URL to the published version on the journal website must be provided on the first page of the postprint.
  3. The published version -- copyedited and in Nature journal format -- may not be posted on any website or preprint server.

Posting of articles on authors', institutions' and funders' websites after publication is explained in NPG's license to publish policy.

And this:

When a manuscript is accepted for publication in an NPG journal, authors are encouraged to submit the author's version of the accepted paper (the unedited manuscript) to PubMedCentral or other appropriate funding body's archive, for public release six months after publication. In addition, authors are encouraged to archive this version of the manuscript in their institution's repositories and, if they wish, on their personal websites, also six months after the original publication. In all these cases, authors should cite the publication reference and DOI number on any deposited version, and provide a link from it to the URL of the published article on the journal's website (see publications A-Z index).

After a long delay, OA for NZ statutes

Stephen Bell Wellington, Public Access to Legislation project online at last, Computerworld NZ, January 16, 2008.  Excerpt:

At long last, the Public Access to Legislation (PAL) system is accessible to the public. The system, formally released by Attorney-General Michael Cullen today, puts New Zealand statutes, regulations and Bills progressing through Parliament on a revamped website, [New Zealand Legislation].  The same XML-based database used on the site will generate the hard-copy versions of legislation.

PAL was originally scheduled for completion in February 2003.

Its progress was marked by delays and budget blowouts remarkable even for government computer projects....

The database of legislation will be brought fully up-to-date over about the next three months, Cullen says, and it will then be possible to close down the “interim” website run by legal publisher Brookers and the information broker Knowledge Basket, which has to date been the public’s only free online access to Bills in their progress through Parliament....

On completion of the database, the PAL output will become the definitive expression of New Zealand law.

More on the EU Council Conclusions on OA

Stevan Harnad, Critique of EU Council's Conclusions (again heavily influenced by the publisher anti-OA lobby), Open Access Archivangelism, January 16, 2008.  Excerpt:

Here is the video of my presentation to the DRIVER Summit:

Institutional Versus Central Deposit:
Optimising DRIVER Policy for the OA Mandate and Metric Era

Also to be discussed at the DRIVER Summit is this statement by the EU Council (not to be confused with the European Research Council (ERC), which has mandated OA self-archiving!) The EU Council's Conclusions show the tell-tale signs of penetration by the publisher anti-OA lobby; familiar slogans, decisively rebutted many, many times before, crop up verbatim in the EU Council's language, though the Council does not appear to realize that it has allowed itself to become the mouthpiece of these special interests, which are not those of the research community:
Council of the European Union: Conclusions on scientific information in the digital age: access, dissemination and preservation
Here is my critique of this EU Council statement (all quotes are from the Council's statement; emphasis has been added):
"the importance of scientific output resulting from publicly funded research being available on the Internet at no cost to the reader under economically viable circumstances, including delayed open access"

(1) 'At no cost to the reader' conflates site-licensing and Open Access (OA). This wording was no doubt urged by the publisher lobby. The focus should be on providing free online access webwide. That is OA, and that makes the objective clear and coherent.

(2) 'Delayed open access' refers to publisher embargoes on author self-archiving. If embargoes are to be accommodated, it should be made clear that they apply to the date at which the access to the embargoed document is made OA, not to the date at which the document is deposited, which should be immediately upon acceptance for publication. The DRIVER network of Institutional Repositories (IRs) can then adopt the 'email eprint request' button that will allow individual users to request and receive individual copies of the document semi-automatically....

"ensure the long term preservation of scientific information -including publications and data"
This is an example of the complete conflation of OA-provision with digital preservation, including a conflation of authors' supplementary postprints with the publisher's original, as well as a conflation of research publications with research data....

"promoting, through these policies, access through the internet to the results of publicly financed research, at no cost to the reader, taking into consideration economically sustainable ways of doing this, including delayed open access"

Economic sustainability is again a red herring introduced by the publishing lobby into language that should only concern the research community and research access. The economic sustainability of publishing is not DRIVER's concern.
DRIVER's concern should be interoperable OA-provision....

"B. Invitation to the Commission to implement the measures announced in the Communication on "scientific information in the digital age: access, dissemination and preservation", and in particular to: 1. Experiment with open access to scientific publications resulting from projects funded by the EU Research Framework Programmes by: defining and implementing concrete experiments with open access to scientific publications resulting from Community funded research, including with open access."

This is a vague way of saying that the publishing lobby has persuaded the EU not to do the obvious, but to keep on 'experimenting' as if what needed to be done were not already evident, already tested, already demonstrated to work, and already being done, worldwide (including by RCUK, ERC, NIH, and over a dozen universities):

The EU should mandate that all EU-funded research articles (postprints) are deposited in the fundee's IR immediately upon acceptance for publication. Access can be set in compliance with embargoes, if desired. And data-archiving should be strongly encouraged. DRIVER's concern should be with ensuring that the network of IRs has the requisite interoperability to make it maximally useful and useable for further research progress....

PS:  Also see my own comments on the EU Council Conclusions (one, two).

OA conversion complete for nutrition journal

Last October, the Scandinavian Journal of Food & Nutrition announced that in January 2008 it would convert to OA, move from Taylor & Francis to Co-Action Publishing, and change its name to Food & Nutrition Research.  It has now done all that and more.  It's also providing OA to its six-year backfile. 

Bentham's OA publishing program

I recently asked Matthew Honan, Editorial Director at Bentham Science Publishers, to describe the recent progress and future plans of Bentham's OA publishing program. I thank him for permission to post his response:

Bentham Science Publishers (BSP) publishes 89 subscription journals, many of which are leaders in their respective fields. BSP embarked on an ambitious program for the development and launch of up to 300 open access journals from May 2007. The open access publishing division was launched under the banner Bentham Open at

Originally, the plan was to launch separate Letter/Short Communication journals as well as journals publishing review and research articles. This initial plan was changed some months later after credible suggestions from some editorial board members that stratification of journals in the same area in open access would probably not succeed. Ultimately, BSP decided to publish all three article types in one subject journal, as this would provide a much stronger journal and one that would attract more papers; this policy appears to be effective. Bentham Open now plans to publish 200 open access journals. As of 31 December 2007, Bentham Open had launched 163 journals.

Most major subject areas in the natural and physical sciences, medicine, engineering etc. are now covered by new open access journals launched by Bentham Open.

All submitted articles are extensively peer-reviewed, with an emphasis on processing submissions rapidly.

The Editorial Director, Dr. Matthew Honan, says that the future plan is to strengthen the existing open access publishing program as well as to launch a number of new open journals. Bentham Open aims to be one of the largest and most successful open access journal publishers. Currently, Bentham Open's publication fees are among the lowest in the industry, at under USD 1,000 per published article. In 2008, Bentham Open will announce a new, improved, innovative publication fee model for its potential authors.

Three new OA books on OA repositories

An announcement from SURF today points to three new books on OA repositories from Amsterdam University Press.

  1. Kasja Weenink, Leo Waaijers, and Karen van Godtsenhoven (eds.), A DRIVER's Guide to European Repositories:  Five Studies of Important Digital Repository Related Issues and Good Practices, Amsterdam University Press, January 2008.  Here's the OA edition.  For more details, see my blog post from yesterday.
  2. Muriel Foulonneau and Francis André, Investigative study of standards for Digital Repositories and related services, Amsterdam University Press, January 2008.  Here's the OA edition.
  3. Maurits van der Graaf and Kwame van Eijndhoven, The European Repository Landscape: Inventory study into present type and level of OAI compliant Digital Repository activities in the EU, Amsterdam University Press, January 2008.  Here's the OA edition.

From the SURF announcement:

...The EC-funded DRIVER project is leading the way as the largest initiative of its kind in helping to enhance repository development worldwide. Its main objective is to build a virtual, European-scale network of existing institutional repositories, using technology that will manage the physically distributed repositories as one large-scale virtual content source. As part of the DRIVER project, three strategic and coordinated studies have been conducted on digital repositories and related services. They are aimed at repository managers, content providers, research institutions and decision makers - all key stakeholders who are taking an active part in the creation of the digital repository infrastructure for e-research and e-learning. SURFfoundation is the Dutch partner in the DRIVER project, and responsible for the publication of the studies. The studies were published on 16 January 2008 together with Amsterdam University Press....

Public funding for Germany's OA-Netzwerk project

Building the German Network of Open-Access-Repositories, a press release from Germany's DINI (Deutsche Initiative für Netzwerkinformation) project, January 16, 2008.  Excerpt:

The DFG (German Research Foundation) supports a project for the networking of certified repositories in Germany

Today, almost all German universities and numerous scientific institutions systematically provide free online access to scientific publications and other materials (open access). With over 100 digital repositories, Germany takes second place behind the USA. In order to increase the worldwide visibility and impact of the German research contribution, the recently started project "Open-Access-Network" (OA-Network) [OA-Netzwerk] seeks to intensify the national networking of these repositories.

Supported by the DFG, the OA-Network project is a joint collaboration of the Humboldt-Universität zu Berlin and the Universities of Göttingen and Osnabrück. It aims to virtually integrate all document and publication services holding a DINI certificate and to increase the number of DINI-certified repositories. The DINI certificate was developed by the German Initiative for Network Information (DINI) to evaluate the quality of publication services. By referring to international standards and quality criteria, Germany takes a leading role in certifying document and publication services. These certified repositories blend easily into larger networks such as the DRIVER pan-European repository infrastructure (Digital Repository Infrastructure Vision for European Research).

Networking will be pushed forward not only organisationally, but also technically and infrastructurally. OA-Network therefore supports repository managers in the certification process and, at the same time, offers a number of services on its platform. Documents are accessible through full-text search, metadata search, and browsing. The platform offers additional services, e.g. notification of new documents (alerting), export functions for common reference formats, and links to printing services (print on demand). Furthermore, the project will integrate future developments from other electronic publishing projects, such as usage statistics and citation analysis.
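
One of the services mentioned in the excerpt, export functions for common reference formats, can be illustrated with a small sketch. The record fields and BibTeX key scheme below are assumptions for illustration, not taken from the OA-Netzwerk platform:

```python
# Render a simple repository metadata record as a BibTeX @article entry.
# The field names and key scheme (surname + year) are illustrative choices.
def to_bibtex(record):
    """Return a BibTeX entry string for a metadata dict."""
    key = record["author"].split()[-1].lower() + record["year"]
    fields = ",\n".join(
        f"  {name} = {{{record[name]}}}"
        for name in ("author", "title", "journal", "year")
    )
    return f"@article{{{key},\n{fields}\n}}"

record = {
    "author": "Erika Mustermann",          # placeholder author
    "title": "Open access in German repositories",
    "journal": "Example Journal",
    "year": "2008",
}
print(to_bibtex(record))
```

A real export service would read these fields from the repository's stored metadata (e.g. harvested Dublin Core) and support several target formats, but the mapping step looks much like this.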

Australia's science minister supports open dissemination of science

Kim Carr, Liberating the voices of science, The Australian, January 16, 2008.  Senator Carr is Australia's Minister for Innovation, Industry, Science and Research.  (Thanks to Colin Steele.)  Excerpt:

...The [Australian] Government looks to its science and research agencies to provide cutting-edge scientific research from which policy can be formulated....

In this context, it is essential to communicate new ideas and to infuse public debate with the best research and new knowledge.

Public debate must be as well informed as possible....

I support and commend this policy position [that "a national research agency...should discharge its public role by being readily and rapidly available to provide information on the most up-to-date science and technology and its implications for the nation"], and believe we can do better....

The Rudd Government is committed to creating a charter (akin to that of the ABC) for public research agencies, including the CSIRO, the Australian Institute of Marine Science and the Australian Nuclear Science and Technology Organisation.

These charters will identify and guarantee the responsibilities and obligations of each organisation. They will enshrine not only the right, but the obligation, of scientists and other researchers to participate in public research debates [without censorship or political interference].

As an initial step, I will be consulting with public research agencies on developing a policy addressing these issues.

The principles guiding such a policy will include: ...

- Support for the open communication, dissemination of information and debate about the results of scientific, technical and social research....

Comment.  Senator Carr is responding to the problem of political interference with science, not the problem of price barriers to publicly funded research.  But the remedy may address both problems, showing that they are connected.  We saw a similar development in the US in June 2006 when Sen. John McCain introduced an amendment in the Senate to ensure "the open exchange of data and results of research by Federal agency scientists" as a response to the political interference with science by the Bush administration.  (Several Australian funding agencies already have OA policies to eliminate price barriers to publicly funded research.)

Fight to enforce a Russian OA mandate for public info

According to a 2005 Russian law, certain government information and national standards must be OA on a government web site.  But the key agency is not complying and appears to have ties to businesses which have been selling the same information to the public.  Last month, a new cabinet decree clarified the law (spelling out OA as access that is "free of charge") and a private institute is pushing for its enforcement. 

For details, see State standards become open and free of charge, C-News: Russian IT Review, January 15, 2008.  Excerpt:

The Institute for Information Freedom Development (IIFD), which has been fighting for the state standards to be available on the internet, has persuaded the Russian government that it is right. On the eve of the New Year’s holidays, the decree of the RF cabinet of ministers setting the procedure for publishing the national standards on the site of the Federal Agency for Metrology and Technical Regulation (Rostehregulirovanie) came into force....However, the officials do not intend to give up.

The decree contains an item according to which Rostehregulirovanie ‘has to publish the standards on the official site of the Federal Agency for Metrology and Technical Regulation on a regular basis and free of charge’....

The institute hopes it will be hard for the officials to question the official document. ‘The public servants, who are used to selling the state standards, have no loopholes this time, as the document specifies that access should be free of charge. The term “open access” left some room for manoeuvre. However, now many will have to accept the fact that their profitable business exists no longer,’ says Ivan Pavlov....

"Building a web of data"

Aaron Swartz has launched a wiki-based collection of tools and tips for those who scrape, process, view, and host large data sets.  (Thanks to Jonathan Gray.)  From the site:

This is a site for large data sets and the people who love them: the scrapers and crawlers who collect them, the academics and geeks who process them, the designers and artists who visualize them. It's a place where they can exchange tips and tricks, develop and share tools together, and begin to integrate their particular projects....

Some of us have spent years scraping news sites. Others have spent them downloading government data. Others have spent them grabbing catalog records for books. And each time, in each community, we reinvent the same things over and over again: scripts for doing crawls and notifying us when things are wrong, parsers for converting the data to RDF and XML, visualizers for plotting it on graphs and charts.

It's time to start sharing our knowledge and our tools. But more than that, it's time for us to start building a bigger picture together. To write robust crawl harnesses that deal gracefully with errors and notify us when a regexp breaks. To start converting things into common formats and making links between data sets. To build visualizers that will plot numbers on graphs or points on maps, no matter what the source of the input.

We've all been helping to build a Web of data for years now. It's time we acknowledge that and start doing it together.
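
As a concrete illustration of the "crawl harnesses that deal gracefully with errors and notify us when a regexp breaks" that the post calls for, here is a hedged sketch; the page markup, pattern, and notification hook are all invented for illustration:

```python
# A scraping step that alerts its operator when its pattern stops matching,
# the usual sign that the target site's markup has changed.
import re

# Hypothetical pattern for a price field on some scraped page.
PRICE_RE = re.compile(r'<span class="price">\$(\d+\.\d{2})</span>')

def extract_prices(html, notify=print):
    """Return all prices found on a page.

    Calls notify() when the pattern matches nothing, so a broken regexp
    surfaces as an alert instead of silently yielding empty data.
    """
    prices = PRICE_RE.findall(html)
    if not prices:
        notify("regexp matched nothing -- page layout may have changed")
    return prices

good_page = '<span class="price">$19.99</span>'
bad_page = '<div data-price="19.99"></div>'   # redesigned markup breaks the regexp

print(extract_prices(good_page))
alerts = []
extract_prices(bad_page, notify=alerts.append)
print(alerts)
```

In a real harness, `notify` would send email or log to a dashboard, and the check would run on every crawl so layout changes are caught the day they happen.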

Tuesday, January 15, 2008

Entomology converts to OA after 100+ years of publication

Psyche: A Journal of Entomology has converted to OA after 100+ years of publication.  See today's announcement from Hindawi, the journal's new publisher:

The Hindawi Publishing Corporation is pleased to announce that "Psyche: A Journal of Entomology" has joined the company's growing collection of open access journals. Since its inaugural issue in 1874, Psyche has been published by the Cambridge Entomological Club and has served as a leading forum for the publication of fundamental entomological research. Hindawi is currently in the process of digitizing the back volumes of Psyche, which will be made freely available on the journal's website over the coming months....

"The Cambridge Entomological Club is pleased that Hindawi has taken on the job of publishing the Club's journal, Psyche" said Matan Shelomi, President of the Cambridge Entomological Club. "We look forward to the establishment of a modern, professional editorial and production process for the journal and are happy to see the journal on a road to financial self-sufficiency. As a nonprofit educational organization, the Club is especially gratified to see the public interest served through the vehicle of open access publication."

Psyche is currently accepting new submissions, and all published articles will be made freely available under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited....

OA articles for continuing medical education

Melissa Norton, Medscape develops Continuing Medical Education based on open access research articles, BioMed Central blog, January 14, 2008.  Excerpt:

Several open access articles published by BioMed Central have recently been used by Medscape to produce Continuing Medical Education (CME) material.  These articles span a broad range of specialties, including general practice, psychiatry, surgery, endocrinology, and preventive medicine. 

It is wonderful to see open access research being used in this way to create CME. A key advantage of open access is the opportunity it provides to reuse published research to educate researchers, clinicians and the general public. BioMed Central is planning increased activity in the area of medical education over the coming months. We are working with Medscape to increase the number of CME activities created from BioMed Central articles, including the use of case reports from the Journal of Medical Case Reports. The latest journal metrics also show that BMC Medical Education is rapidly establishing a reputation for itself in this field, and we hope to build on the success of this journal. Lastly, plans are being developed for a new Journal of Clinical Education Resources, scheduled to launch in the first half of 2008....

Five studies of institutional repositories

Kasja Weenink, Leo Waaijers, and Karen van Godtsenhoven (eds.), A DRIVER's Guide to European Repositories:  Five Studies of Important Digital Repository Related Issues and Good Practices, Amsterdam University Press, January 2008.  This is a dual-edition book.  The print edition costs €35.00, and the OA edition is released under a CC-BY-NC-ND license.  (Thanks to Napoleon Miradon.)  Here are the five studies:

  • Alma Swan, The business of digital repositories (Chapter 2)
  • Vanessa Proudman, The population of repositories (Chapter 3) (Proudman's case studies are also online.)
  • Wilma Mossink, Intellectual property rights (Chapter 4)
  • René van Horik, Data Curation (Chapter 5)
  • Barbara Sierman, Long Term Preservation for institutional repositories (Chapter 6)

Update.  Alma Swan's chapter was separately self-archived in September 2007 (and blogged here at the same time).

Update.  Also see the SURF press release on this book and two other recent, related publications (one, two).

Update. Vanessa Proudman has excerpted some highlights from her chapter, Seventeen guidelines for stimulating the population of repositories.

Oversimplifying the evaluation of sources

Edward Bilodeau, Academic banning of Google and Wikipedia misguided, Edward Bilodeau's Weblog, January 14, 2008.  Excerpt:

I came across a news item this morning reporting that a lecturer in the UK has banned her students from using Google and Wikipedia for their assignments. While railing against the variable quality of resources available on the Web is popular in academic circles, calls to restrict usage of the web in higher education are misguided and, in my opinion, do a great disservice to students....

In presenting assignments, professors should specify what kinds of works are acceptable and which are not. The acceptability of a resource should be based on its intellectual content, and not on the media with which it is delivered, nor the tool or process by which it is located....

Back in the 1990's, [the] assumption [that online resources were not appropriate for a university paper] would have been a safe one to make....Today, that same assumption is far less likely to be accurate. Are books no longer acceptable once they are digitized? The trend in academic libraries is towards e-journals, and many of these e-journals are or will be indexed by Google and other search engines. Will they still be considered appropriate resources? What about the 3,000 or so journals listed in the Directory of Open Access Journals, which have no paper manifestation? Are they valid academic resources? ...

Teaching [students] an oversimplified model for assessing information resources, while potentially effective at getting them to change their habits, does little to develop the skills they need to be successful in their studies and their future careers.

More on the BMC upgrade to Open Repository

Tracey Caldwell, BioMed Streamlines Open Repository With DSpace Features, Linux Insider, January 15, 2008.  Excerpt:

Open access publisher BioMed Central has boosted its Open Repository with open source features from digital content distributor DSpace to make it easier for customers to browse and submit material to the hosted repository solution online.

The latest version of Open Repository allows institutions to convert their Word, Excel, PowerPoint, text and RTF (Rich Text Format) documents to PDF (Portable Document Format). Customers can also set up RSS feeds, and customize lists and search fields.

Dominic Tate, business development manager at BioMed Central, said: "We have become increasingly involved with DSpace, and our technical architect, Graham Triggs, is responsible for the strategic direction of DSpace. We, in turn, feed some development back to DSpace."

Some of the features introduced to Open Repository were under development by DSpace. BioMed Central commissioned others to appeal to new and potential customers.

BioMed Central runs repositories for the Universities of Wales, Chester and Roehampton. The latter has recently switched from Proquest Digital because Open Repository represents "much better value for money," according to Pat Simons, bibliographic and technical services manager at Roehampton University....

PS:  For background, see BMC's announcement of the upgrade in November 2007.

Archaeology journal retreats from free online access

Naomi J. Norman, A Letter from the Editor-in-Chief, American Journal of Archaeology, January 2008.  (Thanks to Sebastian Heath via Charles Ellwood Jones.)

The editorial is a free PDF, but it's locked to prevent users from cutting/pasting.  (Why?)  So here's a paraphrase in lieu of an excerpt:  AJA used to charge for print subscriptions and make read-only PDFs available free of charge on its web site.  But starting now, it will offer electronic subscriptions and (except for this editorial) stop offering free PDFs. 

Norman hopes this change will expand AJA's readership.  My bet is that it will have the opposite effect. 

AJA will still offer OA to museum exhibitions, books, images and data for selected articles, and some bibliographies.

French geology journal makes its backfile OA

All the back issues of Carnets de Géologie (Notebooks on Geology) are now OA through HAL.  You can also access the six-year OA backfile through INIST and the Geoscience e-Journals portal.  (Thanks to Hélène Bosc.)

Harnessing an intellectual commons to master complexity

John Wilbanks, Complexity and the Commons, John Wilbanks' blog, January 9, 2008.  (Thanks to Glyn Moody.)  Excerpt:

...[I've been] preoccupied with the reasons for why it takes so long and costs so much to get a drug to market....

I am very skeptical that the law is the key, single solution here. Mainly because I believe the law is only a symptom of the big problem. I do believe that the commons itself is the solution, but not because of intellectual property.

It’s the solution because it’s an incredibly efficient way to generate knowledge rapidly, at low costs. And knowledge is what we need. A commons is the infrastructure for distributed collaboration and innovation in the life sciences, and we should be thinking about the law in terms of how to expose and integrate as much knowledge as possible at the lowest possible transaction cost.

The primary reason drugs are so expensive to discover and develop is that we just don’t know very much about the human body. The problem isn’t the law. To paraphrase Bill Gates, the problem is complexity (read the whole thing, it’s worth it)....

Complexity is the core problem. Human bodies are so mindbogglingly complex that we can’t accurately predict…well, very much of anything about them. There is a knowledge gap a mile wide and ten miles deep between where we are today and a world in which it costs less in terms of time and money to get a drug to market....

As a result, the best way we have to test drugs is to give them to people and see what happens....

The solution to the knowledge gap has, to date, been to simply take more shots on goal. We throw eight million potential drug compounds at a target and check to see what sticks, then we cull the field, again and again, tweaking here and there.

It’s like playing roulette and winning by betting on every square, then patenting the one that wins and extracting high rents from it. Patents are much less the problem in such a scenario than the fact that we are playing roulette. Changing the odds is the better answer – it lowers the pressure to rely on patents! ...

One of the reasons I believe so deeply in the commons approach (by which I mean: contractually constructed regimes that tilt the field towards sharing and reuse, technological enablements that make public knowledge easy to find and use, and default policy rules that create incentives to share and reuse) is that I think it is one of the only non-miraculous ways to defeat complexity. If we can get more people working on individual issues – which are each alone not so complex – and the outputs of research snap together, and smart people can work on the compiled output as well – then it stands to reason that the odds of meaningful discoveries increase in spite of overall systemic complexity.

This is not easy as far as solutions go. It requires open access to content, journals and databases both. It requires that database creators think about their products as existing in a network, and provide hooks for the network, not just query access. It requires that funders pay for biobanks to store research tools. It requires that pharmaceutical companies take a hard look at their private assets and build some trust in entities that make sharing possible. It requires that scientists share their stuff (this is the elephant in the lab, frankly). It requires that universities track sharing as a metric of scientific and societal impact.

It is not easy. But it is, in a way, a very simple change. It just requires the flipping of a switch, from a default rule of “sharing doesn’t matter” to one of “sharing matters enormously”. It is as easy, and as hard, as the NIH mandate on open access. It’s a matter of willpower....

Complexity is the enemy. Distributed innovation, built on a commons, is a strong tonic against that enemy.

CODATA activity in 2007

CODATA has published its Highlights from 2007.  Excerpt:

...CODATA proposes guidelines for implementing the GEOSS Data Sharing Principles

In 2006, the Group on Earth Observations (GEO), established Task DA-06-01, Furthering the Practical Application of the Agreed GEOSS Data Sharing Principles, and invited GEO Members and Participating Organizations to help implement the task....

CODATA helps to form COMMUNIA, a European Thematic Network

Working back through the year, in November CODATA signed a European Thematic Network Agreement with the Politecnico of Turin, Italy on the “Public Domain in the Digital Age.” This 36-partner project, known as COMMUNIA, aims to build a network of organizations to become the single European point of reference for high-level policy discussion and strategic action for all issues relating to the public domain in the digital environment....

CODATA hosts Workshop in Paris in collaboration with the Global Biodiversity Information Facility (GBIF) and Science Commons

In September 2007, CODATA, in collaboration with GBIF and Science Commons, hosted a meeting at the Sorbonne in Paris on common use licensing of scientific data products....

CODATA Task Groups

The Task Group on Preservation of and Access to Scientific and Technical Data in Developing Countries organized a major workshop in Brazil in May....

More on the need for faculty education

Andrea Foster, Librarian: Ohio State Professors Need Copyright Refresher, The Wired Campus, January 14, 2008.

Beware of faculty members who are clueless about whether they hold the copyrights to their research papers, Trisha Davis, a librarian at Ohio State University, told a group of librarians today at the midwinter conference of the American Library Association.

She made the remark while discussing the challenges Ohio State faced in building an institutional repository. The university has over 21,000 articles — including conference papers, teaching materials, photographs, and multimedia works — in the archive.

Faculty members will submit research papers to the repository often unaware that they have signed away the rights to their work to a journal publisher, Ms. Davis said. “They are stunned that they have not retained the copyrights,” she said. “They’re vehemently adamant” that they still have rights to the work....


  • True and deplorable.  But for perspective, ask which is worse:  faculty members who don't realize that they regularly sign away their copyrights and are appalled to learn it, or faculty members who know what they're doing and are happy to do it.  Or:  faculty members who don't know what they do with their copyrights and don't want to self-archive either, or faculty who at least want to self-archive.
  • Also for perspective:  it's equally true, and for most faculty members even less well-known, that the vast majority of publishers allow authors to self-archive their preprints, their postprints, or both, even after transferring copyright.

Following up OA mandates with faculty education

John Willinsky, When Free Access to Research Is Mandated by Law, January 14, 2008.  Excerpt:

My hope for 2008 is that Stevan Harnad will prove prophetic once again, this time by declaring this the year of the mandate, or as he puts it with his notable precision, “the year of institutional Green OA self-archiving mandates.” ...

Coming into 2008, archiving mandates are in place for both the Canadian Institutes of Health Research and the U.S. National Institutes of Health. Given that the NIH has an annual budget of $28 billion leading to 80,000 articles annually, this particular mandate has all the makings (and weight surely) of a tipping point (with credit going to Heather Joseph, Director of SPARC, who led the fight in Washington on the NIH policy, as well as other strong archiving advocates, such as Peter Suber and Stevan Harnad).

The archiving mandate may seem to strike the perfect balance among three of the critical factors at issue in scholarly communication at this point. The posting of the articles leads to greater openness in science; it doesn’t violate the academic freedom of researchers (who are free to publish anywhere); and it protects the business side of scholarly publishing. Or does it? Well, the American Association of Publishers says the NIH mandate threatens their interests, even though it is designed to protect the value of journal subscriptions, both by delaying the availability of the copy for typically 12 months after publication and by allowing the posting of only the “electronic version of their final, peer-reviewed manuscripts,” as the NIH policy puts it, rather than the publisher’s official final version.

While I’d love to add further points to Peter Suber’s already effective rebuttal of the publishers’ position, as well as consider the not-always-salutary implications of archiving for establishing two-tiered access to scholarship, let me save all of that for a future blog post. It is going to be the year of the mandate, after all. What’s needed at this point, far more urgently than critical musings, are some ideas for kindling a little more faculty interest and enthusiasm on university campuses for archiving scholarly work, whether it is mandated by a funding agency or not.

The mandates have proven necessary because of a certain indifference and lack of awareness among faculty members of the scholarly benefits of open access as it can “help advance science and improve human health” as the NIH mandate puts it. It falls, then, to those who can see the value and good of greater access to help their colleagues realize that archiving is more than just another bit of bureaucratic bumph thrown in their path, slowing down their scholarly productivity....

The first step is to ask whether the university has an archive available for posting this work....

The next step is to help the institution make sure that, whether researchers face an archiving mandate or not, people know why they should archive. The archive, as well as any emails sent out about the archive, should have links to a webpage that does its best to inform and encourage faculty and students by presenting the basic case for increasing access to research in this way. That page could look something like this (and do feel free to use these examples):

    Reasons to Archive Journal Articles

    (a) Archiving advances the very spirit of openness and exchange that is basic to scholarship, as it demonstrably increases the readership and citation of work that is made freely available, according to a continuing series of studies;

    (b) Archiving contributes to current efforts (e.g., INASP) to ensure that scholars worldwide have access to more of the research which their institutions would otherwise not be able to afford;

    (c) Archiving is currently permitted (whether there are mandates in place or not) by the majority of journal publishers (including such major players as Elsevier, Springer, Wiley, and Sage) with the publishers’ archiving policies now part of a convenient database that can be readily searched and consulted;

    (d) Archiving can be further ensured and the author’s intellectual property rights further asserted, through alternative licensing agreements that researchers are now using with publishers when their articles are accepted;

    (e) Archiving is now being mandated, out of a recognition for what it contributes to the circulation of knowledge, by a number of funding agencies and institutions;

    (f) Archiving is part of a larger movement to increase access to knowledge that also includes alternative forms of open access publishing for journals and conference papers, involving dedicated open access publishers (e.g. BMC), university libraries (e.g. UBC) or groups using open source software (e.g. DpubS), while at the same time, this new spirit of openness is contributing to open data (e.g., Dataverse Network) and open notebook science (e.g. Useful Chem) initiatives....

Monday, January 14, 2008

Milestone for UC eScholarship Repository

UC eScholarship Repository exceeds 5 million full-text downloads, a press release from the University of California, January 14, 2008.  Excerpt:

The University of California announced this week that its widely used eScholarship Repository has surpassed the 5 million mark for full-text downloads of its open access scholarly content. This major milestone reflects the impressive adoption and usage rate the repository has enjoyed since its inception in 2002, with University of California academic units and departments from its 10 campuses publishing or depositing more than 20,000 papers and works.

The eScholarship Repository, a service of the California Digital Library, provides a robust full-spectrum, open access publishing platform for pre-prints, post-prints, peer-reviewed articles, edited volumes and peer-reviewed journals. The repository houses a broad range of scholarly content from disciplines across the humanities, social sciences, mathematics and sciences.

The rate of usage of these materials has grown exponentially in the past five years, now often exceeding 55,000 full-text downloads per week.

As evidenced by this rate of activity, the eScholarship Repository represents one of the University of California’s most successful and sustained efforts to improve and provide innovative alternatives to the troubled scholarly publishing system – a system that increasingly struggles to serve the needs and requirements of the academic community....

Update. Also see the brief note on this news in The Wired Campus, which is generating some interesting reader comments. (Thanks to George Porter.)

Update (1/23/08).  Also see Anna Opalka's story in The California Aggie.  Excerpt:

..."The CDL is delighted by the substantial deposit rate and traffic sustained by the eScholarship Repository thus far," said Catherine Mitchell, acting director of the eScholarship publishing group and manager of the California Digital Library.

Mitchell said 90 percent of the visitors to the repository are directed from Google.

"We believe publishing in open access both increases the visibility of UC's substantial research efforts and provides a service to those academic institution and policymakers who might not otherwise have access to these kinds of materials - either because of their own restricted library holdings or because of a lack of awareness of the work being done at UC," Mitchell said....

Mitchell also said the CDL hopes to expand its faculty base, especially in the humanities, which "have been historically underrepresented in institutional repositories." ...

New OA projects in Sweden

Sweden's national OA programme has updated its English-language page of new projects, January 14, 2008.  Excerpt:

The Autumn 2007 Call for Proposals in the programme was met with considerable interest. Seventeen project proposals were received representing a total sum of funding requested of 11 million SEK. Following the proposal of the programme's Steering Committee the National Librarian decided to fund 10 projects with a total sum of 4 million SEK during 2008-2009. A new co-funder, the Knowledge Foundation, entered the programme during this period with a special interest in promoting access to digital learning resources created within universities.

Content in Open Archives - Creating a Critical Mass

A number of projects aim to stimulate the growth of content in institutional repositories and to provide researchers with more information about new alternatives for them to make their research visible.

Two of these are continuations of earlier projects:
Information for Researchers about OA will further develop a web resource created within the project Open Access - tutorial for researchers, and, using it as a starting point, arrange a series of seminars for researchers about Open Access at different universities across Sweden in close co-operation with local repository managers.

Journal Information to Researchers - Journal Info 2.0 will continue to develop the already internationally well-known and well-received service Journal Info....

Parallel publishing of scientific articles will, together with researchers from different universities, study and test workflows to identify obstacles and needs for support functions in connection with parallel publishing.

Open Access Domain models … subproject II Domain modeling of rights and terms in connection with parallel publishing of scientific articles will work with structuring and adding information about publisher policies on rights and terms, aiming for services that connect rights data to individual articles in a consistent way.

Widening the Range of Material in Open Archives...

Open Educational Resources (OER) in open digital archives will survey the present state of production and use of OERs within a number of Higher Education institutions. It will then explore models and methods to make these resources more accessible within the framework of present IRs.

The two other projects are explorative studies concerned with managing complex, digital multimedia objects in open archives....

Hot Topics of Discussion...

Citation patterns in OA journals will make a systematic comparison of citation patterns in OA and non-OA scientific journals.

Open Access and information provision to private businesses will study the potential role of Open Access material as an information source for a number of Swedish companies and how private firms could contribute to the cost of publishing in an Open Access publishing model.

Practical Support for Open Access Journals Publishing

Best Practices Guide to Open Access Journals Publishing aims to create a practical, comprehensive, electronically available guide that addresses the key issues for editors, researchers, librarians and university presses. The project is led by people with a long experience of commercial journal publishing and will systematically collect existing experience from the field and from earlier projects.

Interview with Christopher Leonard

John Dupuis, Interview with Christopher Leonard, Associate Publisher of PhysMath Central, Confessions of a Science Librarian, January 14, 2008.  Excerpt:

Welcome to the latest installment in my occasional series of interviews with people in the scitech world. This time around the subject is Christopher Leonard, Associate Publisher of PhysMath Central....

Q0. Hi Chris, tell us a little about yourself and how you ended up as Associate Publisher of PhysMath Central?

...As a publisher in Elsevier I certainly got a feel of the issues facing researchers and librarians very quickly. I was in charge of various portfolios of journals in mathematics, physics and computer science and all of these disciplines had the same concerns - namely the prices of subscription journals, 'big deal' packages for online access and the questioning of the value-add by the publisher -- this by people fully au fait with TeX and the issue of hosting articles for retrieval in a large database.

It didn't take me long to see that Open Access was the future. I was aware that the biosciences were well served by PLoS and BioMed Central, but there were few options for chemists, physicists, computer scientists etc, to get research published in peer-reviewed, open access journals. Then I heard that BioMed Central were looking to expand into exactly these areas. My former colleague at ChemWeb, Bryan Vickery, explained what they wanted to do with PhysMath Central and I was sold. So here I am, back in Middlesex House again, sat next to Bryan again!

Q1. Please tell us a little about PhysMath Central and how it works? What is your vision of a sustainable publishing business model, if I may use that term.

PhysMath Central is based on the same model as BioMed Central - namely that the costs of publishing are covered before publication. This way we can make the article available, permanently, for free. The articles are published under a Creative Commons license, which allows anyone to use the article freely, so long as the original authors, citation details and publisher are identified. This means that an individual could, in theory, have a copy of all articles published by BMC or PMC, in full, on his hard disk, or even website. It also opens up a host of possibilities with data mining since all our articles are available in full text as XML/MathML.

We also make sure all of our articles are deposited and permanently archived in a number of national archives. For PhysMath Central, this includes arXiv, which we work closely with (indeed authors can submit to us using just their arXiv IDs).

These 3 elements are what I call the ABC's of true open access publishing: Access (immediate and free), Back-ups (permanently archived by third parties) and Creative Commons (copyright model).

As to what makes a sustainable model, that answer will vary slightly from discipline to discipline. BioMed Central are just about to break even after 7 years of publishing with a model which combines institutional prepay subscriptions, payment from author research grants, sponsorship from funding bodies, and many other smaller sources. We also offer waivers automatically to all authors from countries with low-income economies (as defined by the World Bank).

In addition, in physics we are witnessing the birth of an exciting project called SCOAP3, which intends to centralize library budgets in high-energy physics to pay for open access to every paper in this field.

In other areas, research grants are typically either non-existent or very small. Here it is unreasonable to assume a grant-paying model will work, and as such we will investigate other options for covering open access charges. A central fund for an institute, or a whole field (as with SCOAP3), or even sponsorship would be a more likely way to achieve sustainable open access for these disciplines.

Open access publishing doesn't cost more than traditional publishing (indeed, it is much less)....

Q5. First it was BioMed Central, then Chemistry and PhysMath Central, what's next? Is computer science going to be included in PhysMath Central? It seems that the engineering disciplines are particularly poorly served by Open Access.

We certainly plan to launch our first computer science journals in 2008, so yes, computer science journals will be included in PhysMath Central, but given the name of that platform, it may make sense to have a dedicated computer science platform instead. Expect to hear some news on this before the summer.

You are right in saying that Engineering is not well served by Open Access - although Hindawi have some titles in this area. However once central funds for open access are in place, you will see an expansion of journals in all areas, including engineering and the humanities....

Q7. Is there an Open Access tipping point in the near future, where suddenly all sane scientists will suddenly think to themselves, "Of course! It all makes sense now! Why would I ever publish in a journal that wasn't Open Access?"

There was an excellent 'guerilla' project in some major US university libraries recently where students placed stickers on journals informing the reader of the price of a subscription to that journal. When you're suddenly confronted with the fact that your Nuclear Physics A subscription costs the library $25000, that's a shock to the system....

I think the big tipping point, however, may be a financial one. As library budgets plateau or decrease, the 'big deal' packages simply become unaffordable. Having access to 1000s of journals switched off overnight may jolt people into realizing that open access isn't an option anymore, it's the only way which makes sense.

Q9. How about the recent push back from some of the commercial publishers, like all the fuss about the PRISM Coalition?

I view a lot of this 'push back' with a nagging sense of unease and disappointment. Of course directors of multi-million pound companies are going to defend their positions, but to do so with a disinformation campaign with the aim of spreading Fear, Uncertainty and Doubt leaves a sour taste in one's mouth. I think the number of publishers who have publicly dissociated themselves from PRISM is a sign that they seriously misjudged the mood of most of their members -- and the majority of scientists, if the number of blog postings is anything to go by.

Thankfully with people like Peter Suber on our side, the open access movement is always ready to strike back with some facts. As each day passes it seems to me that open access for all areas of research is inevitable and unstoppable. My favourite quote on open access actually predates the movement and is from Albert Einstein: "The free, unhampered exchange of ideas and scientific conclusions is necessary for the sound development of science, as it is in all spheres of cultural life". Open access publishing is a huge step forward to make this happen sooner, rather than later.

More on OA in Cuba

The new issue (vol. 10, no. 1, 2008) of Medicc Review is devoted to eHealth: Cuba Faces the Digital Divide.  (Thanks to Matt Cockerill.)  Here are the OA-related articles:

Austrian physics institute joins SCOAP3 project

Setback for solving the orphan works problem

Wendy Davis, Copyright Protection Stymies Online Archive, Online Media Daily, January 9, 2008.  Excerpt:

A campaign to make it easier to create online archives of books, films and software was dealt a setback this week by the U.S. Supreme Court, which declined to hear a case about the copyright law's constitutionality.

In the case, Internet Archive founder Brewster Kahle challenged the copyright protection given to so-called "orphan" works, or material for which the owner can't be found. Currently, if Kahle or other Web companies post such material, but the owner later steps forward and sues, the companies can't defend themselves on the grounds that they couldn't locate the owner.

Kahle, represented by [Lawrence Lessig and] the Center for Internet and Society at Stanford, had hoped to change that by arguing that the law protecting orphan works--passed before widespread Internet access--violated the First Amendment.

"The freedom to cultivate and spread our culture, enabled by these technological changes, is now significantly restricted by unnecessary legal regulation," Kahle argued to the federal appellate court. The 9th Circuit ruled against him last year....

Sometimes books and the like have been out of circulation for so long that the actual owners don't even realize they have a legal claim to a potentially valuable intellectual property right.

Legal experts say that locating the copyright owners in these situations is often a daunting and expensive task. "There are works that are locked up under copyright law where it is cost-prohibitive to find the owner and ask him for permission," says Eric Goldman, director of the High Tech Law Institute at Santa Clara University School of Law. In such cases, Goldman says, no one benefits by the current regime, which exposes anyone who posts orphan works to liability for copyright infringement.

The U.S. Copyright Office also recognized that the current system is problematic. "There is good evidence that the orphan works problem is real and warrants attention," states a 2006 report by that office.

Also see Lessig's reflections on the defeat and Aaron Swartz's observations of Lessig's oral argument.

Interview with editors of an OA journal

Elke Ziegler interviews Katja Mruck and Günter Mey on OA.  Read the interview in German or in Google's English.  Mruck is editor-in-chief of the OA journal, Forum Qualitative Sozialforschung (FQS), and Mey edits FQS Conferences, FQS Interviews, and FQS Reviews.

Research access in Cuba

Cameron Neylon, Open Science and the developing world: Good intentions, bad implementation?  Science in the open, January 14, 2008.  Excerpt:

I spent last week in Cuba....I had the opportunity to talk to a range of scientists and to see the conditions they work under. One of the strong arguments for Open Science (literature access, data, methods, notebooks) is that it provides access to scientists in less privileged countries to both peer reviewed research as well as to the details of methodology that can enable them to carry out their science. I was therefore interested to see both what was available to them and whether they viewed our efforts in this area as useful or helpful. I want to emphasise that these people were doing good science in difficult circumstances by playing to their strengths and focussing on achievable goals. This is not second rate science, just science that is limited by access to facilities, reagents, and information.

Access to the literature

There is essentially no access to the subscriber-only literature....

I talked to a few people about our protein ligation work and they were immensely grateful that this was published in an open access journal. However they were uncertain about publishing in open access journals due to the perceived costs.  While it is likely that they could get such costs waived I believe there is an issue of pride here in not wishing to take ‘charity’....

Overall though, it is clear that access to the peer reviewed literature is a serious problem for these people.  Open Access publishing provides a partial solution to this problem. I think to be effective it is important that this not be limited to self-archiving, as for reasons I will come back to, it is difficult for them to find such self-archived papers. It is clear that mandating archival on a free access repository can help.

Access to primary data

Of more immediate interest to me was whether people with limited access to the literature saw value in having free access to the primary data in open notebooks. Again, people were grateful for the provision of access to information as this has the potential to make their life easier....

There were two major concerns; one is a concern we regularly see, that of information overload....The other concern, relating to them adopting  such approaches, was one that we have seen over and over again, that of ‘getting scooped’. Here though the context is subtly different and there is a measure of first world-developing world politics thrown in. These scientists are, understandably, very reluctant to publicise initial results because the way they work is methodical and slow. Very often the key piece of data required to make up a paper can only be obtained on apparatus that is not available in house or requires lengthy negotiations with potential overseas collaborators. By comparison it would often be trivially easy for a developed world laboratory to take the initial results and turn out the paper.

The usual flip side argument holds here; by placing an initial result in the public domain it may be easier for them to find a collaborator who can finish off the paper but I can understand their perspective....

The catch…

[There is] a very simple problem: bandwidth....

Call for open courseware in Canada

Michael Geist, Our universities could learn plenty from MIT, Toronto Star, January 14, 2008.  Excerpt:

In 1999, the Massachusetts Institute of Technology (MIT) faculty gathered to consider how they could use the Internet to advance knowledge and educate students around the world in science and technology.

The result was an ambitious plan – make the institute's course materials, including syllabi, lecture notes and exams, freely available online for a global audience.

Two years later, a pilot project called the MIT Open Courseware debuted with 50 courses. A year later, the project formally launched with 500 courses. Today, MIT Open Courseware features nearly every course offered by the institute – about 1,800 in all....

More than 90 per cent of MIT's faculty voluntarily participates in the program....

MIT Open Courseware attracts over 2 million visits each month....

What started with just MIT has grown into a consortium of dozens of universities from around the world that has published 5,000 courses in many different languages....

The sole Canadian participant in the Open Courseware consortium is Capilano College, a relatively small school with 6,700 students located in North Vancouver, B.C. The rest of Canadian higher education – Toronto, York, UBC, Western, Alberta, Queen's, Ottawa, McGill, Dalhousie, Waterloo and dozens more – are inexplicably missing in action....

Canadians pride themselves in being one of the world's most connected countries; however, the failure to lead on issues such as the Open Courseware consortium and open access to the results of Canadian research suggests that we are still struggling to identify how to fully leverage the benefits to education of new technology and the Internet.

Many of Canada's top universities may liken themselves to MIT, but the near-total absence of Canada from the Open Courseware consortium suggests that there is still much to learn.

Comment.  MIT is also considering an OA mandate for the institution's research output.

Sunday, January 13, 2008

EPL converts to hybrid OA, joins SCOAP3

EPL and Open Access Articles, Europhysics Letters, January 2008.  A "Publishers' Note" from the six-person EPL Management Committee.  Excerpt:

In May 2007 the EPLA [Europhysics Letters Association] Board of Directors welcomed the CERN initiative for the creation of a Sponsoring Consortium for Open Access Publishing in Particle Physics (SCOAP3) and agreed to enter into negotiations to enable high energy physics papers to be published in EPL with selective open access.

At a subsequent meeting in August 2007, the Board decided to offer substantial initial discount while open access remained a small fraction of the content of EPL. A necessary precursor to negotiation with SCOAP3 is a general open access policy. The Directors agreed that this policy should offer a free-to-read option for all authors in all sections of EPL and so provide fair opportunities across the broad range of physics covered by EPL. The policy for the journal should allow individual authors, their institutions, funding agencies or sponsoring consortia to pay for published articles to be freely available to all, permanently....

EPL would remain a subscription journal for content that is not free to read, and authors, institutions or funding agencies may choose to pay for their articles to be open access.

As an initial step in this open access venture, a single-article fee of € 1000 ($ 1330) can now be paid by individuals who choose to have their article published free to all. This pricing, which is substantially discounted, ensures that EPL remains competitive with other similar journals. EPL will continue to ensure this policy is sustainable although the journal must remain financially viable and the pricing scheme will be under continual review.

At this stage we welcome enquiries concerning an institutional membership fee that would allow that institute to pay in advance for open access publications in EPL for authors from that institute....

Remember all articles are already free to read for 30 days from their online publication date....

OA in India

Frederick Noronha, Open access publishing takes off in India, IANS, January 13, 2008.  Excerpt:

A small but growing number of Indian journals are moving to the free open access format of internet publishing.

"Many leading journals published in India are already open access. These include the journals published by the Indian Academy of Sciences, the Indian National Science Academy, Indian Council of Medical Research and the Calicut Medical College," Subbiah Arunachalam, a prominent Indian campaigner for open access, told IANS....

Both the government of India-run National Informatics Centre and the Mumbai-based private firm MedKnow publish open access journals on behalf of about 75 societies.

"India publishes about 100 OA journals. Actually, these are [dual-edition] journals - print plus electronic, with the print version sold against a subscription. No Indian journal charges a fee from the authors for publishing papers," said Arunachalam.

In Mumbai, the MedKnow model, run by a young medico, D.K. Sahu, who opted out of practising medicine and chose publishing, is considered an innovative model by standards in India and beyond.

Beyond individual journals, OA is making its impact at the level of repositories too.

About 30 institutions have set up their own institutional open access repositories using free software and open source software such as EPrints and DSpace.

The Indian Institute of Science (IISc) was the first to set up the IISc EPrints archive, which has over 8,000 records.

The National Institute of Technology at Rourkela is the only Indian institution to have mandated open access for all faculty and student research publications.

There are three subject-based central repositories - one each for library and information science, medicine (NIC) and catalysis (Indian Institute of Technology at Madras)....

Meanwhile, the National Knowledge Commission has recommended mandating open access to all publicly funded research and the recommendation is now with the Prime Minister....

Groups like the Indian National Science Academy have also been looking deeper at the potential of OA.

The Indian Academy of Sciences is reportedly planning to place all papers by all fellows, past and present, on an open access archive. But such plans take time to implement....