Open Access News

News from the open access movement

Saturday, January 31, 2009

New OA journals of pharmacy science

Vikas Anand Saharan points out several new OA journals of pharmacy science:

New OA journal of Flaubert criticism

Flaubert. Revue critique et génétique is a new OA journal published in French by the Institut des textes et manuscrits modernes on the platform. See the January 30 announcement.

Flat World Knowledge starts public beta

David Wiley, Flat World Knowledge Public Beta!, Iterating Toward Openness, January 29, 2009.

[Flat World Knowledge], the open source textbook publishing company, has come out of private beta! ...

As a quick recap, FWK textbooks are much like traditional textbooks in that they are:

  • beautiful looking printed books,
  • written by world-class authors,
  • supported with all the supplementals and teaching aids (like an instructor manual, slides, and assessments) teachers expect, and
  • available as review copies (for teachers).

FWK textbooks are UNLIKE traditional textbooks in that they are:

  • licensed CC BY-NC-SA,
  • always available in full-text online for free,
  • offered in a variety of additional, affordable formats (paperback black-and-white ($30), full-color ($60), audio book ($30), individual book chapters as audio ($3), etc.),
  • supported by a variety of study aids available at the student’s option (NOT forcibly bundled with the book)

I’m SO excited about FWK because we’re going to show the world that extremely high quality open educational resources can be produced and disseminated in a way that is sustainable over the long term. Jump over to the Catalog page, choose a book with a Feb 2009 publication date, and click “Start Reading” to see what I’m talking about.

Another call for OA in Canada

Stephane Couture, Pour l’accès libre à la connaissance scientifique, Alternatives, January 29, 2009. Read it in the original French or Google's English.

The end of an early OA medical journal

Ivan Oransky, RIP: The Medscape Journal of Medicine, an open access pioneer, stops publishing new papers, Scientific American, January 31, 2009.  Excerpt:

A pioneering medical journal has fallen victim to the dramatic and wrenching changes that are overtaking the publishing industry: The Medscape Journal of Medicine (MJM), the first open access general medical journal, will no longer publish new papers, editor-in-chief George Lundberg and colleagues announced yesterday....

MJM came before BMC and PLoS. It was not the first open access medical journal when it was established in April 1999 as Medscape General Medicine, but it was among the first. Its offerings included original research, video commentaries, and even letters "to help clean up the messes made by any medical journals."

Unlike BMC and PLoS, which charge authors up to several thousand dollars per paper, MJM never charged author or publication fees. (Such fees are generally paid out of research grants, and are waived for those in developing countries.)

Perhaps because MJM did not have that source of revenue, Medscape -- owned by WebMD -- has decided to adopt a strategy similar to that of many other publishers: "We believe that we can provide the most value to our members by focusing on our role as an aggregator and interpreter of medical information and not as the primary source for original scientific articles." Its archives will remain live at Medscape, according to yesterday's announcement....

PS:  See our past posts on MJM and George Lundberg.

German regional law recognizes role of OA in a university's mission

The new Brandenburg Higher Education Act, adopted last month, refers to the fostering of OA as a forward-looking part of a university's mission.  Read it in German or Google's English.  (Thanks to Eric Steinhauer via Klaus Graf.)

Dutch agreement on digitizing orphan works

FOBID (Netherlands Library Forum) and VOI©E (Vereniging van Organisaties die Intellectueel eigendom Collectief Exploiteren, or Netherlands Association of Organisations for the Collective Management of Intellectual Property Rights) have issued a joint declaration on the digitization of orphan works.  From yesterday's press release:

...As far as is known, this is the first agreement of this type anywhere in the world between libraries and right holders. There is concern in many other countries too regarding how to deal with the rights of right holders who cannot be traced, i.e. the holders of rights in “orphan works”. If the arrangement that has now been accepted in the Netherlands is imitated in other European countries, it will have an enormous effect on the availability of recent works in the “Europeana” digital library....

The essence of the agreement is that the libraries that are represented receive permission, on certain conditions, from virtually all right holders to digitise their collections and make them publically available on their own premises for teaching or research purposes. The works concerned must be part of the Dutch cultural heritage and no longer commercially available. The libraries do not need to pay the right holders as long as the works are only made available on their own premises.

Separate consent is required, however, if the digitised works are made more widely available, for example by means of remote access or via the Internet. In that case, an agreed payment must be made....Even then, the library will not need to go in search of the right holders because this will be done by collecting societies such as Lira and Pictoright....

Individual right holders can naturally still object to their work being digitised and made accessible. In that case, the libraries and archives concerned are required to cease making the works accessible; in practice, very few titleholders actually object.

Kees Holierhoek, the chairman of the Lira copyright holders’ organisation and of the digital right holders working party, has this to say about the new agreement: “I’m very pleased about this agreement. It’s important for us that copyright should be respected, and that has been done in this case. At the same time, the agreement has done away with a major obstacle to making texts and photos accessible. Authors, freelance journalists, photographers, and publishers will all have a veto right if they do not wish to participate. If they do wish to participate, they can claim payment if their material is made accessible outside the institution’s own premises.”

Martin Bossenbroek, the acting General Director of the National Library of the Netherlands, says: “This agreement is a real breakthrough. It’s extremely good news for libraries like the National Library of the Netherlands whose core task is to manage nationally important heritage collections and make them available. The agreement regulates digitisation and the availability of digitised collections on our own premises. But that is only the first step, because we naturally want to also make the digitised collections available online. I think the real benefit of this agreement is that it shows how all the various interested parties understand one another’s positions and arguments. That constructive attitude will also make it possible to arrive at good follow-up arrangements for provision of material on the Internet.”

From yesterday's declaration:

...The Digiti©E Committee, made up of representatives of the right holders acting together as VOI©E...and representatives of libraries, archives, and museums...acting together as FOBID; ...

Declares on the authority of the Foundation of Copyright Interests and VOI©E that institutions shall be given general consent to digitise their collections and make the digitised versions available subject to the following conditions:

1. The institution is a publicly accessible library, museum, or archive which does not have as its object – either in general or with the activity concerned – the achievement of any direct or indirect economic or commercial advantage;

2. Only works forming part of the institution’s collection will be digitised;

3. The works to be digitised form part of the Dutch cultural heritage;

4. The works to be digitised have been lawfully acquired by the institution;

5. To the best knowledge of the institution, the works to be digitised are no longer commercially available;

6. To the best knowledge of the institution, the rights regarding the works to be digitised are vested in Dutch right holders or in right holders who can be represented – or most of whom can be represented – by a Dutch collecting society;

7. It is difficult for the institution to contact individual right holders;

8. As long as no other arrangements have been made with or on behalf of the right holders, the digitised works shall be made available solely via a closed network on the premises of the institution and for the purposes of teaching, research, or private study;

9. In the case of visual material, the quality of the digitised representation shall be such that digital reproduction cannot have a negative effect on the opportunities for exploitation on the part of the original right holder;

10. Either on the institution’s website or in some other way directly associated with display of the digitised works, right holders will be offered the opportunity to object to access being provided to the digital copy of the work in which they can still exercise copyright; Should such objection be received, the institution shall immediately cease the provision of access to the digital copy until agreement has been reached with the party or parties concerned; ...

Comment.  This agreement does not directly grant rightsholder consent to OA.  But it does grant consent for digitization and restricted online access, and it streamlines the process of obtaining consent for OA.  For orphan works still under copyright, these are significant steps forward.

Friday, January 30, 2009

Views of SCOAP3 from the U of Texas

Bavi Selk, Physicists push for free online journal access, Daily Texan, January 30, 2009.  Excerpt:

...[T]he plan has met skepticism from journal publishers as well as some scientists.

UT physics professor Charles Chiu said journals should be priced in the free market but the group’s plan might be an improvement on the current system.

“Certainly there is room in improving the present journal-publication system,” he said. “It is conceivable that [the group] may lead to an efficient program, which has the potential to win out in competing with the current system of journal publication. Personally, I am comfortable using either system.”

[Joseph Serene, publisher and treasurer of the American Physical Society] said the plan was well-intentioned and the American Physical Society would participate if the consortium’s organizers collect enough money. But he said he worries that some of the participating libraries and research groups might eventually pull out of the project, leaving the journals with no financial support.

“If it comes unglued, getting the subscription money back would be a real challenge,” he said....The American Physical Society is a nonprofit group, and Serene said the price increases of journals are probably coming from commercial publishers....

Serene said UT pays about $17,000 a year for an online-only subscription to the American Physical Society’s seven major journals, which includes about 16,000 papers a year. He said that after adjusting for inflation, that price had remained basically unchanged since 2003.

Additionally, Serene said his society allows its contributors to publish their papers on open-access Web sites like arXiv, a free database that hosts draft versions of scientific papers.

“Most anybody who really wants to see a paper can see it,” Serene said, though he added that access was more restricted in other fields like biomedicine.

“It’s the kind of thing that sounds wonderful,” Serene said of the consortium’s plan. “We’d all love if scientific literature could be open to everyone. But we’re concerned it may have stability problems.”

An argument for irreversible libre OA licenses

Klaus Graf, Embargo-Zeitschriften: Kein Open Access, Archivalia, January 30, 2009.  An argument for libre OA licenses.  Because these licenses are irreversible, they prevent publishers from adding access restrictions to an article retroactively.  Read it in German or Google's English.

EDINA launches an OA repository for geospatial data

ShareGeo launched, a press release from EDINA, January 28, 2009.  Excerpt:

A new geospatial data sharing facility has been released today which enables Digmap users to find and share geospatial datasets.

ShareGeo forms part of the EDINA Digimap suite of services....

ShareGeo is free to use for all Digimap users.

Use ShareGeo to contribute your own (derived or user-generated) geospatial datasets, or to download datasets that are already there for your research, teaching or personal use....

Geospatial datasets in a number of formats (raster, vector and tabular) can be contributed for anywhere in the world. A minimal amount of metadata is required for each dataset....

Licensing restrictions may apply to some datasets, so for example if a dataset is derived from Ordnance Survey data, only Digimap users registered for the Digimap Ordnance Survey collection will be able to access it....

PS:  ShareGeo is the successor to EDINA's GRADE (Geospatial Repository for Academic Deposit and Extraction).  See our past posts on GRADE and EDINA's other geospatial projects.

Wikileaks will create OA for 10,000 CRS reports

Clay Shirky reprints a message from the Wikileaks announcement list:

Wikileaks to release nearly 10,000 Congressional Research Service reports

Wikileaks has obtained nearly 10,000 US Congressional Research Service (CRS) reports which it is preparing for publication. The CRS spends around $100M a year preparing high quality reports for members of Congress and Congressional committees. When members feel publication of a report is in their political interest, they are released. Alternatively reports that are not viewed as politically favorable are kept from the public eye....

Comment.  This is excellent news.  It's not the first project to provide retroactive OA to CRS reports, but it's probably the largest.  I don't say "retroactive and unauthorized OA" because CRS reports are uncopyrightable from birth, and no permission or authority is needed.  All that is needed is to get one's hands on a copy.  For details on the other CRS OA projects, and background on the quality and access barriers to CRS reports, see our past posts.

Overview of OA, especially in Australia

Arthur Sale, Beyond Open Source, a slide presentation at Free as in Freedom (Hobart, Tasmania, January 20, 2009).  (Thanks to Dorothea Salo.)

Abstract:   The Open Source movement, of which Linux is a shining example, is a showcase of how accessibility makes for excellence. A parallel thrust is currently being conducted in the research institutions and the publishing industries of the world to create Open Access to the world’s publicly funded research. Arthur Sale will trace the origin of the movement, its economics and the forces holding it back, and where we are now, particularly in Australia. Open Access, or OA, has very many more active participants than Open Source, and many more nay-sayers, cautious Scrooges, and ignorant people. The struggle is titanic – the benefits equally large!

ALA site on the Google settlement

The American Library Association (ALA) Office for Information Technology Policy has launched Google Books Settlement --"a small informational site about the Google Books settlement."  (Thanks to Charles Bailey.)  It contains the settlement documents and a blog to track new developments and commentary.

ALA panel on the Google settlement

Norman Oder, At Panel on Google Book Settlement, Support, Criticism, Contentiousness, Library Journal, January 29, 2009.  Excerpt:

In a lively, sometimes contentious discussion Saturday at the American Library Association (ALA) Midwinter Meeting in Denver, Dan Clancy, engineering director for the Google Book Search Project, diligently explicated the proposed settlement with publishers and authors over books scanned from libraries, but was unable to answer some pressing questions from librarians, noting that the settlement itself remains unresolved.

While the cost for institutional subscriptions to the entire database would be based on FTE students, Clancy was not ready to talk pricing. Paul Courant, dean of libraries at the University of Michigan, a partner in the Google product, acknowledged that pricing was key, but speculated that the “pricing’s going to be pretty good,” because the retail market places a limit on institutional prices and Google’s business model is “eyeballs on their pages.”

Moreover, should prices be too high for the database, he said, “We’re no worse off. It just becomes a richer finding tool, better than search and snippet.”

Laura Quilter, a librarian and attorney, was a bit less optimistic. “The pricing is clearly set up in favor of the rights holders,” she warned.

Clancy later said he was surprised no one had pointed out that, for consumers, rights holders can set a price of zero and display the book under a Creative Commons license. “My personal feeling is there are more than enough rights holders” who don’t consider books their main income and “who will just want their books read." ...

While Courant acknowledged compromises in the settlement, he suggested no one else would have done it, taking aim at the idea, “very nicely articulated by Bob Darnton in the New York Review of Books,” that the library community could have achieved a similar result. “The world I live in was not going to produce that,” he said. “The key issue is that literature of most of the 20th century is unfindable and undigitized.” ...

Mitch Freedman, past president of ALA, wondered about changes to the “free to all” ideology of libraries, asking whether Google would permit, as do other databases, site licenses for public libraries. Clancy said that, given the consumer market, there was no agreement on remote access, but that could change down the road. “Authors and publishers were not comfortable with remote access.” ...

Clancy noted that, according to the agreement, should Google restrict anything for editorial reasons —he said pornography has been cited as an example, but “at this point, we have no plans for anything like that”— the scan would be made available so libraries can work with the registry to provide access....

Near the end of the Q&A, [Brewster] Kahle posed a series of questions, getting Clancy to acknowledge that no other registry can serve the same role, that the registry can be modified, and that —as with other class-action settlements— it was negotiated under a non-disclosure agreement.

Does the settlement, Kahle asked Courant, make it more difficult for others to enter the arena of out-of-print digital book provisioning?  “My completely amateur opinion is, yeah,” Courant responded....

Thursday, January 29, 2009

Call for OA to publicly-funded research in Canada

Jonathan Vianou, L'infrastructure technologique devra attendre, Le Devoir, January 29, 2009. Read it in the original French or Google's English. (Thanks to Olivier Charbonneau.) Le Devoir is a daily newspaper in Montréal.
... Another disappointment: the government is investing close to three billion dollars in research of all kinds, but it is not implementing a technology infrastructure allowing free access to the results that research will produce. ...

Cairn and collaborate

Partenariat Cairn/, announcement, undated but apparently recent. Read it in the original French or Google's English. A partnership to cross-publish between the two sites, similar to the agreement between and Persée which we covered recently. (Thanks to Marlène Delhaye.)

More on OA to bibliographic data

Norman Oder, As ‡ Emerges, a New Opportunity for Catalogers (and Competition with OCLC)?, Library Journal, January 27, 2009.

... After beta testing since November, with 200 testers, [] was unveiled just before the American Library Association Midwinter Meeting. More than 1000 people have signed up in a week. A presentation Monday by LibLime CEO Joshua Ferraro drew a curious audience.

The ‡ record database is licensed under the Open Data Commons license, with some 35 million records, most from OpenLibrary, which aims to build one web page for every book ever published and has drawn on major universities and the Boston Public Library. Ferraro acknowledged to LJ that OCLC’s WorldCat has some 135 million records ...

Ferraro said ‡ grew out of an idea he had "when I came to libraries in 1999, that libraries should have a free repository of metadata—it really just meshes with the whole philosophy of libraries, access to open ideas and information. It doesn’t make sense that we’re freely making materials available, but the stuff we’re creating ourselves is not available freely."

Does LibLime face legal constraints? "There are records that were donated by libraries to the OpenLibrary project that are also in OCLC. Those records were uploaded to the Internet Archive in MARC format, forever placing them into the public domain according to U.S. law," Ferraro said. ...

Norman Oder, OCLC Defends Records Policy, Faces Questions, Suggestions, and Criticisms, Library Journal, January 27, 2009.

... [I]n much of [OCLC Vice President Karen Calhoun's] presentation Monday at the American Library Association Midwinter Meeting in Denver, she firmly defended the intent of the policy, suggesting that critics in the blogosphere had an unrealistic view of the library ecosystem. In response, some panelists suggested that OCLC itself was failing to modernize. ...

A revision is expected by the third quarter of 2009. Calhoun acknowledged that “OCLC is caught, along with many other organizations, in this painful transition,” one in which new business models emerge and potential competitors like the upstart ‡ provide similar services.

Calhoun said the factors driving the policy change included documented efforts to systematically download and copy the entire database; commercial entities reselling records without helping support the shared data creation system; and the need for a legal mechanism to encourage organizations outside the cooperative to negotiate with OCLC.

She likened OCLC records to a swimming pool, a shared community asset supported by the community at large but open only to those who contribute. ...

OCLC’s study group noted that the prevailing opinion in the blogosphere is that data should be free and open. However, she noted, “nearly every organization has terms and conditions for data sharing.” (She cited a presentation at IFLA.) ...

John Mark Ockerbloom, of the University of Pennsylvania Libraries, suggested the policy posed another cost, that of missed opportunities for librarians to create new features, for example, expanding on work he’s done on subject maps. He noted some tough language in the policy, which cited “reasonable use,” and noted that the policy was subject to unilateral change by OCLC.

He suggested, by contrast, taking cues from the world of open source, which does include terms and conditions for sharing. He acknowledged that he hadn’t done a fiscal analysis. “There is a significant cost to coordinating a high quality knowledge base,” he said, “but there’s also a substantial cost in keeping our metadata closed off from full access and use”—a cost that is hard to quantify. ...

One way to handle the messy situation is to put the data out in some kind of open data license, said [Peter Murray, of OhioLINK], noting that “anything other puts up roadblocks.” ...

No one in the audience was particularly happy with the policy; one called it draconian ...

Richard Wallis, Sharing Usage Data – Dave Pattern & Patrick Murray-John, Talking With Talis, January 22, 2009; podcast.

Last month, Huddersfield University’s Dave Pattern announced that he was sharing usage data derived from circulation transactions held in their Library Management System ...

Within a matter of days Patrick Murray-John from Mary Washington University had taken a copy of that data, transformed the data to RDF and published it in a Semantic Web form.

In this conversation we explore the motivations behind Dave’s work and the benefits to the sharing process of the Open Data Commons license he chose to release the data under. Patrick then takes us through how he worked with the data and demonstrates how simple it was to produce an RDF version.

We then explore how the principles demonstrated by their work could be expanded upon to add wide value to the library scene ...
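The transformation described in the podcast — taking circulation counts out of a library management system and republishing them as Semantic Web data — can be sketched in a few lines. The namespace, property names, and data below are invented for illustration; they are not the vocabulary or records Murray-John actually used.

```python
# Sketch: rendering circulation usage data as RDF triples in Turtle.
# The namespace, property names, and sample records are hypothetical,
# chosen for illustration only.

NS = ""  # hypothetical namespace

USAGE = [
    ("9780131103627", 42),   # (ISBN, number of loans) -- made-up data
    ("9780262033848", 17),
]

def to_turtle(records):
    """Render (isbn, loan_count) pairs as Turtle triples."""
    lines = [f"@prefix ex: <{NS}> ."]
    for isbn, count in records:
        lines.append(
            f'ex:isbn{isbn} ex:loanCount '
            f'"{count}"^^<> .'
        )
    return "\n".join(lines)

print(to_turtle(USAGE))
```

Once serialized this way, the data can be loaded into any RDF store and linked against other open datasets — which is what makes the choice of an open data license consequential.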

Notes from academic publishing conference

Janneke Adema, Highlights from APE 2009 – Preconference Day, Open Reflections, January 27, 2009. Notes on Academic Publishing in Europe (Berlin, January 19-21, 2009).

Update. See also the notes on days 1 and 2.

Consultation on access management for repositories

JISC has launched a consultation on "scenarios in which access to material in repositories needs to be managed" (i.e. restricted). See background here, or see the consultation site, open until the end of February.

HathiTrust items to get WorldCat entries

HathiTrust and OCLC to work together to enhance discovery of digital collections, press release, January 26, 2009.

HathiTrust, a group of some of the largest research libraries in the United States collaborating to create a repository of their vast digital collections, and OCLC will work together to increase visibility of and access to items in the HathiTrust’s shared digital repository. ...

OCLC and HathiTrust will work together to increase online visibility and accessibility of the digital collections by creating WorldCat records describing the content and linking to the collections via and WorldCat Local. The organizations will launch a project in the coming months to develop specifications and determine next steps. ...

On the value of repositories

Gideon Burton, The Coming Change in Humanities Publishing (7): The Online Archive, Gideon Burton's Blog, January 28, 2009.

... [W]ait a minute, you might say, archiving and publishing are very different things. Scholars publish; librarians archive. Right? Well, the lines have blurred, as has the very definition of what constitutes publishing today. ...

"Publishing" one's work in an institutional or disciplinary repository exposes that work intelligently to the web through metadata--and this goes for scholarly work that has already appeared in conventional journals as well as for work undergoing review. All work submitted to these archives has its content described in appropriate metadata fields, making it machine readable (see my prior post about how this works with data harvesters). The result is that scholarship (even work still under review) can have relevance far beyond the limited scope of the subscribers to a specific academic journal--and it can appear far sooner than in that journal where the work will finally get its more limited release. Of course, any pre-print gets linked to the "official" version (wherever that may reside) once that version appears.

Depositing one's scholarship in one of these electronic archives can be seen as a kind of co-publishing. The organizations that run them don't review the scholarship; they leave that to traditional methods. ... or Social Science Research Network are now high-profile, high-traffic hubs of scholarly inquiry. They are what journals have been to a prior generation, or perhaps journals and academic conferences combined. They are not just huge databases passively hosting documents until a search awakens their sleeping contents. No, they are destination sites for scholars keeping up in their fields. ...

The humanities lack a significant central disciplinary repository such as, but minor disciplinary repositories for the humanities have begun to appear, such as the Nordic arts and humanities e-print archive ( ...

Still want to keep your publishing and research limited to that small set of reputable journals?

OA to PSI: Paradigm shifted?

John Wonderlich, an Important Moment, The Sunlight Foundation Blog, January 28, 2009.

Something’s missing from the public debate about, the Web site proposed for public stimulus disclosure. ...

What’s missing ... is a recognition of what has changed.

The Internet has been recognized as having a central — even fundamental — role in enabling oversight and public access. The gap between public expectations and legislative initiatives has become significantly smaller, as Congress and President Obama recognize that a recovery or economic stimulus without a public Web site would be an immediate public disappointment — a car without a dashboard, or a school without grades. Beyond that, we’re even seeing lawmakers and staff explore the positive aspects of realtime disclosure.

Recent amendments show that Congress is being forced to think through the Internet’s real policy relevance. ...

This is a new enthusiasm, and shouldn’t be overlooked. The same Congress that fought their way through enacting and implementing the Honest Leadership and Open Government Act is learning to apply digital public disclosure lessons to everything they do, albeit slowly. This is no small task.

Somewhat devoid of technological expertise, lawmakers, staffers, legislation drafters, Congressional Leadership, and even Presidents are learning what they can require, and what they have to gain, from engaging in our publicly networked culture. The Open Government community has swiftly pivoted from pressuring for digital disclosure to prescribing how it should be done better. ...

[W]e should recognize sea changes as they occur., now appreciated only as a matter of course, has been accepted by lawmakers as an integral part of the recovery plan - emblematic of just such a fundamental shift. ...

Comment. One wonders how widely this thinking applies. Lawmakers have recognized public access as vital for government spending data -- to what extent does the same thinking carry to other types of public sector information, or to publicly-funded scientific information?

Collective action and open science

Michael Nielsen, The Logic of Collective Action, Michael Nielsen, January 28, 2009.

It is a curious fact that one of the seminal works on open culture and open science was published in 1965 (2nd edition 1971), several decades before the modern open culture and open science movements began in earnest. Mancur Olson’s book “The Logic of Collective Action” is a classic of economics and political science, a classic that contains much of interest for people interested in open science.

At the heart of Olson’s book is a very simple question: “How can collective goods be provided to a group?” Here, a “collective good” is something that all participants in the group desire (though possibly unevenly), and that, by its nature, is inherently shared between all the members of the group.

For example, ... Scientists desire shared access to scientific data, e.g., from the Sloan Digital Sky Survey or the Allen Brain Atlas.

What Olson shows in the book is that although all parties in a group may strongly desire and benefit from a particular collective good (e.g., a stable climate), under many circumstances they will not take individual action to achieve that collective good. In particular, they often find it in their individual best interest to act against their collective interest. The book has a penetrating analysis of what conditions can cause individual and collective interests to be aligned, and what causes them to be out of alignment. ...

The notes in the present essay are much more fragmented than my standard essays. Rather than a single thesis, or a few interwoven themes, these are more in the manner of personal working notes, broken up into separate fragments, each one exploring some idea presented by Olson, and explaining how (if at all) I see it relating to open science. ...

New OA journal of political philosophy

Dissensus is a new OA journal of political philosophy published in French by the Université de Liège. The first issue was released in December 2008. (Thanks to Michel Bauwens.)

Notes on IR discussion

Dorothea Salo, Light at the end of the tunnel, Caveat Lector, January 25, 2009. Notes on ALCTS Acquisition Managers and Vendor Discussion Group (Denver, January 25, 2009).

... Some conclusions were come to regarding the successful IR that I think are worth retailing here. One is that technology platform counts, but isn’t itself a determiner of success. Another is that clear mission, sensible planning process, and quantifiable, evaluable (and evaluated) goals do determine success, and any IR without them is swimming in fog.

A third conclusion, which I can’t emphasize enough, is that a lone IR is a dead IR, and the “maverick manager” model of IR staffing is an antipattern. The examples of successful IRs presented during the panel all had broad-based library support, with the main muscle for content development often coming from technical services or acquisitions. A fourth conclusion is that if you’re relying on self-archiving for your content, you’re delusional.

And then there was a fifth conclusion, which I myself will take to heart. A repository with broad-based support and active content recruitment can indeed succeed. We have real examples of this now (and I’m very happy that the librarian involved with MoSpace came forward to speak), and we should make much of them and take them as our models. ...

Launch of a new collection of OA video lectures

Academic Earth is a new collection of OA "video lectures from the world's top scholars".  (Thanks to Richard Ludlow.)  From the about page:

Academic Earth is an organization founded with the goal of giving everyone on earth access to a world-class education.

As more and more high quality educational content becomes available online for free, we ask ourselves, what are the real barriers to achieving a world class education?  At Academic Earth, we are working to identify these barriers and find innovative ways to use technology to increase the ease of learning.

We are building a user-friendly educational ecosystem that will give internet users around the world the ability to easily find, interact with, and learn from full video courses and lectures from the world’s leading scholars.  Our goal is to bring the best content together in one place and create an environment in which that content is remarkably easy to use and in which user contributions make existing content increasingly valuable....

The videos are OA.  Registered users may create their own collection of favorites, but even unregistered users may grade the lectures.  (From the FAQ:  "Lectures start out with a grade of B.  From there the grade is an average of the grades that our users have given it.  Course grades are based on the grades given to the lectures in that course.  Highly rated content will show up first in browse results and in the Top Rated sections in our homepage.")
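The grading rule in the FAQ can be sketched in a few lines of Python.  (The function and variable names here are my own invention; Academic Earth has not published its algorithm, so this is only a plausible reading of the FAQ.)

```python
# Sketch of the grading scheme described in the Academic Earth FAQ:
# every lecture starts out with a B, and once users grade it the
# displayed grade is the average of the user grades.  A course grade
# is the average of its lectures' grades.  Names are hypothetical.

GRADE_POINTS = {"A": 4.0, "B": 3.0, "C": 2.0, "D": 1.0, "F": 0.0}
DEFAULT_GRADE = "B"  # "Lectures start out with a grade of B"

def lecture_grade(user_grades):
    """Average the users' letter grades; fall back to the default B."""
    if not user_grades:
        return GRADE_POINTS[DEFAULT_GRADE]
    return sum(GRADE_POINTS[g] for g in user_grades) / len(user_grades)

def course_grade(lectures):
    """A course's grade is the average of its lectures' grades."""
    points = [lecture_grade(grades) for grades in lectures]
    return sum(points) / len(points)
```

On this reading, an ungraded lecture sits at 3.0 (a B), and one "A" plus one "C" average back to the same 3.0, which is presumably why highly rated content floats to the top of browse results only after enough users have graded it.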

Update (2/2/09).  More from Jeffrey Young at Wired Campus:

...The company has simply grabbed the videos off the universities’ own Web sites and plans to offer tools to students who want to talk about the content — along with a chance to grade the quality of the lectures.

Richard Ludlow, the company’s CEO and founder, said in an interview today that it is allowed to republish the videos because they were released by the universities under Creative Commons licenses....Mr. Ludlow says that his company will not place any advertising on Web pages that contain university videos, though he hopes to expand the site in the future to include sections where videos from other sources are shown with advertising....

Mr. Ludlow...[will] meet this week with officials from MIT to talk about [his company's] plans.

How do the universities feel about the company republishing their lectures?

“I haven’t looked at his example enough to give you a definite answer,” said Steve Carson, external relations director for MIT’s Opencourseware project, which publishes free materials from the institute’s courses, including complete videos from some 30 courses. “It might be OK—as long as the use adheres to the terms and conditions on our site, we encourage the material to be redistributed for educational purposes....They’re putting interactive services around it — it could be very complementary to what we’re doing.” ...

Mr. Ludlow points out that some of the colleges and universities use more open Creative Commons licenses than others. MIT and Yale allow “derivative use” of their content, meaning that the company can cut the lectures into various sections, based on topics, he said. Berkeley does not allow such derivative use, nor does Stanford for some of its courses, he added....

Two approaches to OA for public domain art

I saw these two developments within two days of one another and thought they should be blogged together.

  1. Wikipedia Loves Art.  (Thanks to Jonathan Gray.)  Excerpt: 

    Wikipedia Loves Art is a scavenger hunt and free content photography contest among museums and cultural institutions worldwide, aimed at illustrating Wikipedia articles. The event is planned to run for the whole month of February 2009. Although there are planned events at each location, you can go on your own at any time during the month.

    The project is coordinated by the Brooklyn Museum, with the participation of the Art Gallery of New South Wales, the Carnegie Museum of Art, the Film Society of Lincoln Center, the Honolulu Academy of Arts, the Houston Museum of Natural Science, the Hunter Museum of American Art, the Indianapolis Museum of Art, The Jewish Museum (New York), the Los Angeles County Museum of Art, the Metropolitan Museum of Art, the Museum of Modern Art, the New York Historical Society, the Smithsonian American Art Museum, the Taft Museum of Art, and the Victoria and Albert Museum. In all, there are 16 different museums and cultural institutions participating....

  2. Adam Hodgkin, Google Pictures and Google Books, ExactEditions, January 27, 2009.  Excerpt:
    Did you investigate the recent news stories about Google putting 14 masterpieces from the Prado into Google Earth, so that you can zoom and pan these great works in extraordinary detail (search for 'Museo del Prado' on Google Earth)? Over 3 months Google engineers/photographers took 8,000 highly detailed photos of the 14 paintings and in painstaking fashion they have been pieced together to form magnificent reproductions. Here is a tiny detail from Goya's picture of Executions on Principe Pio hill....

    This is a great project and for me it immediately raises the question: "Will Google attempt to do for pictures and the world's great art what it is now doing for books and all the world's published literature?" ...

    I suspect that museums and art galleries will do a lot of this sort of work themselves. This virtual Prado does not carry any ads, but let us face it, it's a spectacularly good advertisement for the Prado itself....Mind you, if it turns out that museums and art galleries are quite capable of databasing their own collections, it may also turn out that more libraries decide to do a similar job for their own collections. The fact is that the tools used by Google for capturing information held in books, libraries, pictures and museums are increasingly available to us all. Capturing information on your digital camera or your iPhone is almost a democratic right. It will be very hard for companies to build exclusive monopoly holds over information which anyone can collect with a wave of their hand.

Update on INASP and PERii

The Winter 2008 issue of the INASP Newsletter is now online.  (There are no deep links to individual articles.)

From Sara Gwynn, Launch of PERii!:

We are delighted to share news of the launch of PERii —the second five-year phase of INASP’s Programme for the Enhancement of Research Information. PERii focuses on the needs of people in developing and emerging countries, working with partners to support global research communication by helping to strengthen existing knowledge, skills and capacity.

Taking advantage of the possibilities offered by ICTs, PERii focuses on a number of core programme areas [including] access to international scholarly literature....

In [participating] Network countries INASP commits the time and funds to enable access to resources such as training materials, publications, and a directory of free and open access resources via the country-specific web pages....

From Lucy Browse, PERii: Information Delivery:

Through PERii, INASP works with over 50 publishers and aggregators to negotiate access agreements for their resources on behalf of partner and network countries. It is important that this access is sustainable in the long term, so the models for access are a combination of heavily discounted and free subscriptions. Each agreement is negotiated in a way that is context-specific to the socio-economic and infrastructure situation within each country. Because equity is at the heart of INASP’s activities, along with the hope of encouraging growth and strengthening of consortia within countries, the information delivery team negotiate for countrywide access for all eligible institutions whenever possible.

Our activities are demand-led by Country Co-ordination (CC) teams and the resources available comprise online journals, books, databases and grey literature across all disciplines. Currently, consortia are able to select from online resources comprising over 25,000 journals (including 18,000+ full text) and 30 book collections containing over 11,000 books. Because the developments in open-access materials also provide an important source of information, INASP maintains a Directory of Resources that contains links to programmes complementary to PERii, open access resources available to all researchers and resources that are freely available to researchers in qualifying developing and emerging countries. At the current time the directory includes multi-disciplinary, subject-specific resources and links to institutional repositories....

Each year we are seeing growth in the number of institutions registering for access, and at the end of 2007 this rose to 989....

In 2007 we saw over two million downloads from the partner countries, and these present only a fragment of the picture as they represent downloads from institutions with fixed IP addresses rather than from those accessing using usernames and passwords (an important route of access for researchers in the field, open universities, etc)....

From Peter Burnett, PERii: Library Development:

The library development component of PERii works with librarians to enable them to develop and enhance their skills to exploit and provide access to the resources of their institutions, to train their users appropriately, and develop digital services to meet the growing expectations of scholars and researchers....

From Julie Walker, Publishing Support at INASP:

Following on from the success of African Journals Online (AJOL), which celebrates its 10th anniversary this year, INASP has been working in conjunction with the Public Knowledge Project (PKP) at Simon Fraser University in Canada to establish four new Journals Online (JOL) services in Asia: Vietnam Journals Online (VJOL), Nepal Journals Online (NepJOL), Bangladesh Journals Online (BanglaJOL) and Philippine Journals Online (PhilJOL)....

In order to strengthen and consolidate the JOLs set up in Asia, we have launched AsiaJOL, a harvester developed by PKP which allows searching across all the JOL content. AsiaJOL will also include Indonesia Journals Online (IJO), which was set up independently and includes content from 14 journals....

Support has been given to the International Centre for Theoretical Physics (ICTP) for the production of a Practical Guide to implementing Open Access Archives in Developing Countries. Although open access archives are growing in number across the developing world, there are often no instructions on how to set up and maintain them or how to address key implementation issues in countries with poor ICT infrastructure. This guide will seek to address that gap and to promote the sustainability of open access archives....

Visualizing lines of research impact

The Eigenfactor Project and Moritz Stefaner have produced some stunning visualizations of Eigenfactor data.  (Thanks to Emma Marris.)

Comment.  Eigenfactor is an OA measurement of research impact.  For background, see our past posts on it.  Because Eigenfactor data are OA, they are free for mashing up or analysis by other projects as well.  It's easy to be inspired by the beauty and intelligibility of these images.  But we should also be inspired by their suggestion of what can be done with open data.

Guide for developing a scholarly communication program

The ARL has launched a new web site to help libraries develop a scholarly communication program, January 29, 2009.  Excerpt:

Want to start a scholarly communications program at your institution? Do you have an existing program and are looking for tools that other institutions have found useful? The goal of this site is to enable ARL/ACRL members to find help at your point of need, and to leverage the expertise and experience of library colleagues everywhere.

Scholarly communication initiatives can take many different forms and focus on many different issues, such as the Harvard Arts & Science faculty’s Open Access resolution, the University of California’s innovative recruitment of faculty publications into its eScholarship Repository, the University of Minnesota’s author’s rights education program, or SPARC’s student-focused “Right to Research” campaign. Whatever the issues particularly relevant to your institution, librarians can engage faculty members, students, and administrators to make a significant impact on the scholarly landscape....

The site was created by Kris Fowler, Gail Persily, and Jim Stemper.

GreyNet buys rights to deposit papers in its OA repository

GreyNet secures open access for its former collections in the GL-Series, a press release from GreyNet, January 26, 2009.  Excerpt:

GreyNet has recently purchased permission from Emerald to make available the papers published in the GL Conference Proceedings, 1994-2000. Since its relaunch in 2003, GreyNet has sought to recover this earlier research in the field of grey literature and make it available to librarians, researchers, educators, students, and net-users alike. These earlier collections will be included in the OpenSIGLE Repository and will rely on the efforts of INIST-CNRS to OCR the full-text of the conference papers and GreyNet to provide their corresponding metadata records.

In 2008, GreyNet's conference-based collections, 2003-2007, were included in the OpenSIGLE Repository. And, by the close of this year, it is anticipated that all of the papers in the International Conference Series on Grey Literature will be fully accessible via the OpenSIGLE Repository.

Comment.  GreyNet started making its conference proceedings OA through its repository in May 2008.  I applaud its determination to complete the collection retroactively, even if it means buying permission from a publisher.  Note to other conference organizers:  This is a reason to self-archive your proceedings as you go, or at least to retain the right to self-archive them without a fee.

More on the EU's sui generis database right

John Wilbanks, Data, Copyrights, And Slogans, Part II, Common Knowledge, January 26, 2009.  Excerpt:

...Most of the time the [copyright] conversation assumes that a creator needs an incentive that excludes others - that a monopoly is the best way to give creators the reason to create. It's pretty rare that the incentive conversation focuses on the rights of the users. But that's a different argument, so let's talk about the incentives to create.

In the US, the courts have been pretty clear that simply collecting things doesn't make them worthy of protection. That's the Feist decision I wrote about over the weekend. This decision hasn't hurt the US database industry, so it doesn't appear that the monopoly protection is essential to create a thriving economy around data and databases.

But in the EU, the database industry successfully convinced the European Parliament to create a completely new law for data: the Database Directive....

The Directive also creates a sui generis right around databases....

Like the platypus, the Directive's sui generis right appears to have been cobbled together from spare parts. It's got some ideas from copyright - like prohibiting copies. Copying that is regulated gets defined as when someone performs "substantial extraction" of data. This concept of substantial is of course itself *not* defined. Rights last 15 years (better than copyrights, for sure).

But the Directive also has the idea that these rights ought to be re-booted any time there is a "substantial change" or "substantial investment" - again, without determining substantial....

The Directive was written to provide incentives to creators - the right to sue will surely generate more of an industry, the thinking goes. However, the first evaluation by the EU didn't find evidence for this supposition having been true. Their own words are the best ones:

"The economic impact of the "sui generis" right on database production is unproven. Introduced to stimulate the production of databases in Europe, the new instrument has had no proven impact on the production of databases." ...

The key result of all this is that despite the absence of the incentives supposedly created by "property" rights, database creation in the US not only stayed healthy but out-competed the same industry in the EU. So the incentive argument appears to be a failure here.

On top of that, the existence of the sui generis right screws up the international regimes around data. When we do database integration at Science Commons, we can't use data from the EU because of the impacts of the regime. Weird data licensing regimes screw up data integration (even when they're "open" regimes - this is why the HapMap "clickwrap" license was removed, because it was preventing integration with other genomic databases! See the US government's exact words on the issue, down in Appendix B)....

PS:  See John's first post in this series (1/24/09) or the excerpt blogged here.

The future of repositories

Stephanie Taylor, Repositories - The Future, a five-slide presentation at Repository Services Day (Bath, January 15, 2009).

More on the Google settlement

Peter Brantley, A fire on the plain, Peter Brantley's thoughts and speculations, January 26, 2009.  Excerpt:

...Recently, I spoke with several individuals who had been involved in the settlement negotiations as rightsholders (author or publisher representatives), and it became apparent from their perspective that one of the things that most rankled in these recent commentaries was [Robert] Darnton's suggestion that a single terminal per public library building constituted inadequate open access to the GBS collection – an important issue that I had earlier addressed with considerably less eloquence and historical context.

From the rightsholders' perspective, one terminal per library, instead of being stingy, quivered with profligacy. I heard remarked by several individuals (and often enough now to feel it corroborated) that indeed this concession started out far more restricted: either no public access, or starkly limited access – perhaps apocryphally, a single terminal in each State capitol, or one terminal in each city. In short, it was impressed upon me that libraries were lucky to get as much as they did.

As I understand it, rightsholders feared that having unhindered access to books online at libraries might (among other issues) encourage libraries to decelerate buying print books, thereby reducing royalties to authors and profits to publishers. In this equation, more public access = less revenue.

It is difficult to credit that frustrating access is ever able to delay or stem fundamental social trends – for example, the increasing importance of visual and interactive media. Or impact essential library acquisitions....Or the fact – hold on, this is wild-assed stuff – that access to online books that can't be satisfactorily printed because of restrictive rules might actually generate sales of both print and ebooks. Or the possibility that searching and reading networked books for anyone under the age of 40 might be an inherently social activity that generally increases enthusiasm for all forms of reading. These  conceptions of social engagement, participation, and interaction did not prejudice the conversations in New York City where content owners made decisions about what libraries could do with their new digital books....

Let us consider a far more basic, more fundamental concern: the proposed Google Book Search settlement is embedded in a set of conceptions about books, reading, and information access which is as profoundly obsolescent as the printed Encyclopedia. This settlement was crafted by well-established actors: authors and publishers whose primary cognitive map of the world of books was established in 1965, and the days that they now inhabit are only a reaction to it.

I understand and appreciate that the settlement proposal provides for an increase in the number of public terminals, should it pass muster with the rightsholders. But let us call this for what it is: an appropriation of sponsorship of access to our culture that is inadequately informed by imagination, possibility, and fairness....

Two mechanisms for treating knowledge as a public good

James Love, Knowledge as a Public Good: Two Mechanisms, a presentation at the Fórum Mundial Ciência E Democracia (Belém, Pará, Brazil, January 26, 2009).  The two mechanisms are medical innovation prizes and WTO agreements.

New OA journal of art history

Kunstgeschichte is a new peer-reviewed OA journal from DiPP.  For more detail, see the announcement (January 2009).

The journal posts submissions first as discussion papers, and after six months of peer review accepts some of them as journal articles.  The first wave of 10 discussion papers is now online.

Notifying authors when they are cited

Elsevier has launched CiteAlert, a free service notifying authors when one of their papers is cited by an Elsevier journal.  (Thanks to ResourceShelf.)  The service only covers citations to articles published since 2005 in journals indexed by Scopus.

Comments.

  • This is useful as far as it goes, and I can see why Elsevier can't take it much further on its own.  But imagine if all journal publishers offered similar services.  The utility of receiving their reports, knowing that they comprehensively covered the field, would be immense.  But the labor of signing up for each one separately would also be immense, not to mention the labor of re-creating the service at thousands of different publishers.  The bother of reading separate reports from separate publishers would also be immense.  I understand that Elsevier's portfolio is larger than anyone else's, but the long tail of academic publishing means that Elsevier's titles still constitute less than 10% of all peer-reviewed journals.
  • I'd like to see a service that notifies authors when one of their works is cited by any journal, regardless of its publisher.  If this can't be done by a creative developer harvesting online information (because the harvester doesn't have access to TA sites), then how about a consortial solution from the publishers themselves?  And don't stop at emails to authors.  Create RSS feeds which users can mash-up in any way they like.  Imagine getting a feed of your citations from this hypothetical service and a feed of your downloads from your institutional repository.  Imagine your IR feeding the citations in your articles to an OA database, upon which anyone could draw, including this hypothetical service.
  • Who could do this?  OpenURL?  CrossRef?  ParaCite?  Google Scholar?  OCLC (after it acquires OAIster)?  A developer at an institution like Harvard with access to the bulk of TA journals?  Perhaps someone could build the OA database now, with the citation-input and email- and RSS-output functions, and worry later about how to recruit publishers and repositories and/or how to harvest their citations.

Update.  In my list of players in the final bullet, I should have mentioned ISI Web of Knowledge, which has offered citation alerts since 1965.  (Thanks to Eugene Garfield.)  I knew about ISI, of course, but I didn't know (or forgot) about its email citation alerts.  They come very close to the service I was describing, if only they covered citations published in any journal (not just ISI-indexed journals) and were available without charge to any user (not just those at ISI-subscribing institutions).

Update.  Also see Stevan Harnad's comments on this post.  [They are now also on his blog.] Excerpt:

It is clear who should notify whom -- once the global research community's (Green OA) task is done....Once all current and future articles are being immediately deposited in their authors' IRs, the rest is easy:

The articles are all in OAI-compliant IRs. The IR software treats the articles in the reference list of each of its own deposited articles as metadata, to be linked to the cited article, where it too is deposited in the distributed network of IRs. A citation harvesting service operating over this interlinked network of IRs can then provide (among many, many other scientometric services) a notification service, emailing each author of a deposited article whenever a new deposit cites it. (No proprietary firewalls, no toll- or access-barriers: IR-to-IR, i.e., peer-to-peer.)

A brief response:  Of course I agree with Stevan that authors should self-archive, and that the service I was describing would be easier to build if 100% of peer-reviewed research articles were on deposit (with accurate metadata) in OA repositories.  And I imagine that Stevan agrees with my point that, until we reach that goal, the service will be thwarted by the difficulty of harvesting citations published in TA journals.  My question was not whether to launch this service instead of working toward 100% OA, but who could launch this service, or make a significant start on it, before we reach 100% OA.
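The IR-to-IR notification loop Stevan describes can be sketched in miniature.  (The record format, the OAI identifiers, and the matching-by-identifier rule below are my assumptions for illustration; a real service would harvest this metadata over OAI-PMH and would have to parse and disambiguate free-text reference strings.)

```python
# A minimal sketch of Harnad's scheme: treat each deposit's reference
# list as metadata, match references against other known deposits, and
# alert the cited authors.  All identifiers here are hypothetical.

deposits = {
    "oai:ir-a:1": {"author": "alice@example.org", "cites": []},
    "oai:ir-b:7": {"author": "bob@example.org", "cites": ["oai:ir-a:1"]},
}

def citation_alerts(deposits):
    """Return (cited_author, citing_deposit) pairs for each resolvable reference."""
    alerts = []
    for citing_id, record in deposits.items():
        for ref in record["cites"]:
            if ref in deposits:  # the cited article is itself on deposit
                alerts.append((deposits[ref]["author"], citing_id))
    return alerts
```

In this sketch, running citation_alerts over the two records would tell Alice that oai:ir-b:7 cites her paper -- and the `if ref in deposits` test is exactly where the scheme depends on 100% Green OA: a reference to an article that is not on deposit anywhere resolves to nothing.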

Update (1/29/09).  From Michael Kurtz on the AmSci OA Forum:

The weekly myADS-arXiv service includes a notification when papers in arXiv cite the subscriber, for an example see the one for me.

There is an equivalent myADS service for the Astronomy and Physics journals (nearly all TA).

Update (1/31/09).  Christian Zimmermann of RePEc tells me that "RePEc has been notifying authors about found citations for several years already. Our CitEc project is analyzing bibliographies of all documents it can grasp in OA, plus bibliographies that some publishers provide. Interestingly, Elsevier explicitly prohibits RePEc from analyzing its articles."  Also see the CitEc FAQ.  From the CitEc home page:

Citations in Economics provides citation analysis for documents distributed on the RePEc digital library. For each document made available in electronic format we automatically extract and parse its list of references. In this way we know which documents have been cited, how many times and what the citing documents are.

Citations in Economics uses CiteSeer algorithms in the process of identification and parsing of references....

Data created by Citations in Economics is not intended for direct user access. On the other hand, it is made available to RePEc services so that they may improve their added value to the research community. The following RePEc services have already implemented citation data: Socionet, EconPapers and IDEAS....
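The bookkeeping CitEc describes (which documents have been cited, how many times, and by which citing documents) amounts to inverting the parsed reference lists.  A toy sketch, with invented identifiers standing in for the parsed references:

```python
# Toy illustration of citation bookkeeping: given each document's
# parsed reference list, build an index from each cited document to
# the documents citing it.  Identifiers are invented for the example.

from collections import defaultdict

def citation_index(references):
    """Map each cited document to the list of documents that cite it."""
    cited_by = defaultdict(list)
    for citing_doc, refs in references.items():
        for cited_doc in refs:
            cited_by[cited_doc].append(citing_doc)
    return dict(cited_by)

refs = {
    "paper-1": ["paper-3"],
    "paper-2": ["paper-3", "paper-1"],
}
# citation_index(refs) yields:
# {"paper-3": ["paper-1", "paper-2"], "paper-1": ["paper-2"]}
```

The hard part, of course, is not this inversion but the step CitEc borrows CiteSeer's algorithms for: identifying and parsing the references in the first place.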

Notes on Free as in Freedom workshop

Brianna Laugher, Free as in Freedom miniconf recap + slides, All The Modern Things, January 26, 2009. Notes on Free as in Freedom (Hobart, January 20, 2009).

All Slideshare slides are available at [Slideshare] and you can also download plain pdf slides if you like. They’re all available under the CC-BY-SA license.

Video has not yet been released but I will post again when it is available. ...

If you attended, and you took pictures of any of my speakers and published them on the web, let me know and I will add links to my photos page. ...

First up was Arthur Sale talking about open access and progress in Australian universities under the “green road”. He explained that the only way to make open access a reality is through a “distributed” model of institutional repositories (IR), because it’s the institutions that employ the researchers and thus have the power to mandate something.

Students or researchers in universities might be interested in checking out the Wheeler declaration, and finding ways to pressure their university to adopt an OA mandate.

Next up was Laura Simes on future directions for copyright law, which looked at the mixed messages we are getting at the moment from the government, with the Anti-Counterfeiting Trade Agreement looking like a disguised booby trap on one hand, and the encouraging words from the National Innovation Review, and other government departments exploring the use of free licensing, on the other. ...

Last one before lunch was Jessica Coates with Freedom Fighting: How do we convince TPTB to relax their grip?. Jessica put this together virtually at my direct suggestion, because I know Creative Commons Australia has done lots of excellent quiet lobbying with government departments in particular. Just look at the ABS decision to put everything CC-BY. Government departments just don’t make awesome decisions like that all by themselves. :) So if you’d like to do a bit of lobbying for your cause, this talk should be helpful. ...

After Jeff was another hit session, Simon Greener on Free and open geodata in Australia. This was a great “cliff’s notes” introduction to what geodata refers to, who the providers in Australia are, and a number of open source or open source-like projects in Australia to extend such collections. Simon has many years’ experience in this field and it really showed in his talk. He spent some time talking about accuracy, and was critical of projects such as OpenStreetMap for re-creating road networks from scratch, instead of concentrating on what he saw as more useful — Point of Interest (POI) data. ...

Richard Poynder interviews Tom Hill of Libertas Academica

Richard Poynder, The Open Access Interviews: Libertas Academica, Open and Shut? January 28, 2009.  Excerpt:

About a year ago people began to contact me to complain about the activities of a number of start-up Open Access (OA) publishers....

[One] concern was that the quality of the peer review process provided by some of these new publishers appeared as though it might not be adequate....

So I began to contact these new companies in order to ask them about their activities. To my surprise, some of those I contacted greeted my enquiry with a variable mixture of silence, stonewalling, criticism, and even threats of legal action....

I should stress that the more established OA publishers — e.g. Public Library of Science (PLoS), BioMed Central (BMC), Hindawi, etc. — have always responded readily to questions about their business practices, and with a good deal of openness.

Similarly, amongst the new crop of OA publishers both Dove Medical Press and Libertas Academica replied willingly to my questions when I contacted them. My interview with Tim Hill was published last November, and below I attach an email interview I undertook recently with the managing director of Libertas Academica, Tom Hill (Tim Hill's son)....

It seems likely...that — sooner or later — OA publishers are going to have to become much more transparent than traditional subscription publishers ever had to be, both with regard to the process that papers undergo before they are published, and in explaining what the research community is getting for its money.

This transparency will be all the more important given the high degree of automation that OA publishers utilise. A recent incident in which Libertas Academica mistakenly published a paper that had been rejected by the referees (a consequence says Tom Hill of a database error) highlights the potential dangers of automation, and the bad press that can circulate in the blogosphere when such an error occurs....

I discuss these and other issues below with Libertas Academica's managing director Tom Hill.

Spain's new OA telescope

Eduardo Martínez, The world’s first free, open access astronomical observatory officially opened, Innovations Report, January 27, 2009.  Excerpt:

The Montegancedo Astronomical Observatory, based at the Universidad Politécnica de Madrid’s School of Computing and part of the Madrid Region’s ASTROCAM network, was officially opened on 23 January 2009, coinciding with the International Year of Astronomy 2009.

In his brief talk, [Francisco Sánchez Moreno the head of the observatory and professor at the School of Computing] explained...that this is the first free, open access astronomical observatory in the world....

[The observatory software, developed at the school] provides a number of tools for running astronomical experiments, building scenarios and remotely controlling telescopes, cameras and domes. It also enables Internet users to access the observatory from their own homes and experience different astronomical events. Last December, [the software] was awarded second prize in the 8th New Applications for Internet contest, organized by the New Generation Internet Chair....

PS:  If I understand it, the Montegancedo observatory provides open access to the equipment as well as --or because-- it provides open access to its data.

Wednesday, January 28, 2009

What goes in an IR?

Leslie Carr, Repositories vs Learning Object Repositories, RepositoryMan, January 27, 2009.

I got into a bit of an argument on the JISC-REPOSITORIES list yesterday, about whether general repositories (EPrints, DSpace, Fez, etc.) could take on the functions of a bespoke learning object repository (e.g. Intralibrary). My position is that a general repository is made to be adapted - you should be able to change the schema and the services to adapt to local requirements, but the contrary position is that a learning object repository is just too different and specialised.

We'll see. The EdSpace project at Southampton is running a learning resources repository based on EPrints, but they are experimenting with the nature of a learning object repository by introducing open access practices and sensibilities ...

However, the discussion got me thinking about the limits of plasticity inherent in an open source repository such as EPrints (or DSpace etc). ...

In theory you can adapt your repository so far in the direction of any particular agenda that you could encompass all the needs and requirements of users concerned with that agenda. However, that may require an awful lot of effort - or just more understanding and insight than you have time to achieve. ...

This does put some constraints on the amount of the terrain (agendas and services) that your repository's perimeter can encompass. So perhaps a way forward is to cheat by redefining the problem in terms of something that the repository can already do. I've already mentioned that EdSpace are getting results by making the "educational resources" problem look more like "Open Access + preservation". This approach seems to be working in other areas as well - scientific data (eCrystals), archiving fine arts (KULTUR). ...

Interview with OKF's Rufus Pollock

Jed Sundwall, Interview with Rufus Pollock of the Open Knowledge Foundation, NetSquared, January 22, 2009.
... [M]ost science people don't give out the data. They give out the research results, i.e., the summary of the data. There's huge incentive to do that: "A," it's actually secrecy, you don't want to give out your data because you can keep using it, and "B," it's much harder to check whether you've done some weird stuff with the stats to get your nice results if your data isn't actually available there, right? So there are large incentives to not put it out there ...

Planned features at ticTOCs

Roddy MacLeod, I want to be completely honest with you about ticTOCs, News from ticTOCs, January 27, 2009. Discusses feedback on the OA indexing and current awareness system ticTOCs, including which features are under development (and which aren't).

Web 2.0 vs. digital libraries

Alexey Maslov, et al., Cooperation or Control? Web 2.0 and the Digital Library, Journal of Digital Information, 2009. (Thanks to Fabrizio Tinti.) Abstract:
The Web 2.0 trend has placed a renewed emphasis on interoperability and cooperation between systems and people. The digital libraries community is familiar with interoperability through technologies like OAI-PMH, but is disconnected from the general Web 2.0 community. This disconnect prevents the digital library from taking advantage of the rich network of data, services and interfaces offered by that community. This paper presents a case study of a collection within the Texas A&M Repository that was improved by adopting the principles of cooperation embodied by the term Web 2.0.
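The OAI-PMH interoperability the abstract mentions boils down to repositories exposing Dublin Core metadata as XML over HTTP. As a rough, hypothetical sketch (the sample response and record identifier below are invented for illustration, not taken from the Texas A&M Repository), the parsing step of a harvester might look like this in Python:

```python
# Minimal sketch: extracting Dublin Core titles from an OAI-PMH
# ListRecords response. The XML here is an inlined sample; a real
# harvester would fetch it from a repository's OAI endpoint over HTTP.
import xml.etree.ElementTree as ET

DC = "{http://purl.org/dc/elements/1.1/}"  # Dublin Core namespace

sample_response = """<?xml version="1.0"?>
<OAI-PMH xmlns="http://www.openarchives.org/OAI/2.0/">
  <ListRecords>
    <record>
      <header><identifier>oai:example.org:1</identifier></header>
      <metadata>
        <oai_dc:dc xmlns:oai_dc="http://www.openarchives.org/OAI/2.0/oai_dc/"
                   xmlns:dc="http://purl.org/dc/elements/1.1/">
          <dc:title>Cooperation or Control?</dc:title>
        </oai_dc:dc>
      </metadata>
    </record>
  </ListRecords>
</OAI-PMH>"""

def titles(xml_text):
    """Return all Dublin Core titles found in an OAI-PMH response."""
    root = ET.fromstring(xml_text)
    return [t.text for t in root.iter(DC + "title")]

print(titles(sample_response))  # ['Cooperation or Control?']
```

A live harvester would issue a ListRecords request against a repository's base URL and follow resumption tokens for large collections; the parsing logic stays the same.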

Cancer journal converts to OA with BMC

Hereditary Cancer in Clinical Practice has converted to OA with BioMed Central. The article-processing charge is £850/US$1175/€900, subject to discounts and waivers. Authors retain copyright and articles are published under the Creative Commons Attribution License.

New OA legal education project launches

Center for Computer-Assisted Legal Instruction, Legal Education Commons Launches with Open Access to 700,000 Court Decisions, press release, January 26, 2009.

Starting today, legal educators will have the capability to search, make use of, and share more legal educational materials than ever before. The Center for Computer-Assisted Legal Instruction (“CALI”), in collaboration with Harvard’s Berkman Center for Internet and Society, is launching an open, searchable collection of resources designed specifically for use in legal education: a “Legal Education Commons” (“LEC”).

“All teachers of law have materials and notes they use in teaching,” says John Mayer, CALI Executive Director. “Many freely share their materials with colleagues, but there has never been a singular searchable, taggable space to serve that function for the entire legal academy,” he explains, “until now.”

The Legal Education Commons launches with over 700,000 federal court decisions readily available to its users. This initial collection of cases makes the LEC one of the largest gatherings of case law freely available in one place under a Creative Commons license.

CALI has also donated 300 original illustrations from its popular online tutorials, “CALI Lessons,” making the Legal Education Commons the first and largest pool of free images designed specifically for use in legal education.

While the LEC opens with an extensive collection of court cases and images, it can expand its collection of resources only through contributions and donations from the legal education community. ...

Faculty and staff at CALI member schools may upload nearly any type of file – from most text documents to presentations and MP3s as large as 50 MB – and share it with the LEC community under a Creative Commons Share Alike license. ...

See also our past post announcing the project.

More on Smithsonian 2.0

Joel Garreau, Smithsonian Click-n-Drags Itself Forward, Washington Post, January 26, 2009. (Thanks to techPresident.)

The Smithsonian has decided this whole online contraption may not be a fad after all.

Over the weekend it invited 31 luminaries of the digital age to talk with what the institution hopes are its most energetic thought leaders. The subject: dragging the world's greatest museum complex into the current century.

No small task.

Chris Anderson, editor in chief of Wired, the technorati monthly, tells of one Smithsonista who proudly observed that her operation's curators had already carefully picked 1,300 photos and uploaded them to the social-sharing Web site Flickr.

The problem is that the Smithsonian has 13 million photos.

Well, it's a start. Only 99.99 percent to go.

At this gathering -- "Smithsonian 2.0," it was called -- there was much talk among the institution's handpicked staffers about the difficulties of moving this battleship, ocean liner, glacier . . . pick your metaphor. The invited techies, meanwhile, stressed how deathly soon might come the day the Smithsonian wakes to discover itself General Motors.

The forward-looking Smithsonistas have a formidable ally. That would be G. Wayne Clough, who became the Smithsonian's new secretary in July. His previous gig, fortuitously enough, was being president of Georgia Tech. This initiative is his idea, and a major thrust of his young administration. He claims "Smithsonian 2.0" will not be one of those feel-good events after which hibernation resumes.

"With digitization and with the Web, we can see it all. We can see it all!" he exults. ...

Of the Smithsonian's 137 million artifacts, however, not only is less than 1 percent on display, but most of that is in Washington. You have to come to the Smithsonian. It doesn't much come to you. ...

The question for attendees of the "Smithsonian 2.0" discussion was, how can you get everything -- every thing, every last dung-rolling beetle -- out there where everyone in the world can get equally excited about it?

The core group of outsiders were heavy dudes, as these things go. The principal presenters included Bran Ferren, co-chairman and chief creative officer of the legendary Applied Minds Inc. (Before that he was president of research and development and creative technology for Walt Disney.) Clay Shirky is the author of the acclaimed Here Comes Everybody, the recent book that looks at the seismic changes being brought about by decentralized, bottom-up, peer-to-peer technologies. George Oates is one of the founders of Flickr, the photo-sharing phenomenon. ...

The discovery of the "long tail" principle has implications for museums because it means there is vast room at the bottom for everything. Which means, Anderson said, that curators need to get over themselves. Their influence will never be the same. ...

The problem is, "the best curators of any given artifact do not work here, and you do not know them," Anderson told the Smithsonian thought leaders. ...

It's like Wikipedia and the Encyclopedia Britannica, Anderson said. Some Wikipedia entries certainly are not as perfectly polished as the Britannica. But "most of the things I'm interested in are not in the Britannica. In exchange for a slight diminution of the credentialed voice for a small number of things, you would get far more for a lot of things. Something is better than nothing." And right now at the Smithsonian, what you get, he said, is "great" or "nothing." ...

See also Peter's recent post on the Smithsonian's new chief.

Update (Peter). Also see Dan Cohen's notes on the same meeting.

OA in Poland

Mirosław Garbacz, et al., Krajowe czasopisma Open Access, presented at Komputerowe wspomaganie badań naukowych (Polanica Zdrój, Poland, October 22-24, 2008); self-archived January 25, 2009. English abstract:
This paper elaborates on the basic definitions of Open Access (OA) and the principles governing journal publication in this model. The paper presents the origins of the so-called Open Access idea, placing great emphasis on its role in Poland. The report includes an analysis of the domestic OA journal market, based on data from the DOAJ service and publishers’ web pages. Particular emphasis was placed on the status of OA journal editors, the availability of journals, the cost of article publication, and issues connected with copyright.

On OA and mandates

Hélène Bosc, Open access to the scientific literature: a peer commons open to the public, presented at Copyright Regulation in Europe – An Enabling or Disabling Factor for Science Communication (Berlin, November 14-15, 2008); self-archived January 19, 2009. (Thanks to Fabrizio Tinti.) Abstract:
Ninety percent of research worldwide is publicly funded, hence the results of this research should be made publicly accessible online. Research publications, a common good, created by researchers for researchers, need to be freely accessible to all. Immediate "open access" can be provided through author self-archiving in the growing number of institutional repositories (IRs) (1100) created in recent years. But these IRs currently contain only about 15% of global research output and are not filling rapidly and reliably enough. Self-archiving mandates by funders and institutions are accordingly needed. Arthur Sale has shown that if deposit is mandated, IRs achieve 100% self-archiving within 2 years. The number of mandates is steadily increasing worldwide, particularly from research funders. In 2008, the library association SPARC (together with Creative Commons) called on universities in particular to adopt IR deposit mandates. In Australia in September 2008, at the Brisbane Conference on Open Access, academics and politicians called for the adoption of both funder and institutional mandates, stressing especially the benefits to industrial R&D applications and progress from the open online sharing of access to research results. Open online access to research is creating a distributed "cognitive commons" that endows the human mind with a new power to accelerate research progress at the speed of thought (Dror and Harnad).

New discussion list on OA in India

Indian Open Access Forum is a new discussion list devoted to achieving OA in India, now open for new subscribers.

New OA journal of oral microbiology

The Journal of Oral Microbiology is a new peer-reviewed OA journal from Co-Action Publishing.  For more detail, see today's press release or Ingar Olsen's editorial in the inaugural issue.  From the editorial:

...Publication of an article in JOM involves a relatively modest cost. However, to emphasise our commitment to extend an arena for publication in which developing settings can publish their research results, the publication fee may be waived for authors from institutions or projects unable to pay.

Open Access serves the interests of all readers: authors, teachers, students, libraries, universities, funding agencies, governments and citizens. It increases the visibility of individual authors’ work. Key resources are equally accessible to rich and poor. The mission of most universities to disseminate and share knowledge is facilitated, and funders (including governments) are given return on investment. It is high time to make research results in the field of oral microbiology freely available!

An OA portal for economics

AccessEcon is a new OA portal for the field of economics, launched January 1, 2009.  (Thanks to Mike Carroll.)  From the home page:

AccessEcon is an outgrowth of software written in 1999 and 2000 to support the publication of the Economics Bulletin, an open-access online journal meant to compete with the Elsevier publication Economics Letters. EB has been more successful than we ever would have imagined and had over 1000 submissions in 2008. In 2005 and 2006 we began to adapt this software to help us organize the Public Economic Theory meetings and soon realized that it was possible to go further to create a general tool that would allow us to host open access journals, editorial back offices for paper-based journals, and working paper series, and to provide turnkey conference organization interfaces. Since our objective is to return scholarship to scholars, we offer it as a free service to qualified members and groups in the economics research community. If you are interested, please contact John P. Conley or Myrna Wooders for more information.

From the mission statement:

In the...paper-centered era before 2000, commercial publishers served a useful and necessary purpose. In the electronic era, post 2000, the academy has very little to gain from commercial publishers who may actually impede rather than facilitate scholarly communication. It seems to us to be ridiculous that we as scholars write, referee, and edit economic research, give it away to commercial publishers, and then buy it back through our libraries at enormous prices.

The mission of the AccessEcon is to foster free and rapid scientific communication across the entire community of research economists. Specifically, AccessEcon is a sophisticated web-based workflow and content management tool that makes it possible to publish electronic journals, run editorial offices, publish working paper series, and organize conferences....

From the page of services and costs:

Why "Free" is a good business model

As economists, you should be asking: why is “free” a sensible business model? There are several reasons. First, our purpose in writing this software to begin with was to support JPET, APET, and EB. Thus, we would have incurred the fixed cost of creating this software in any event. There are very few additional fixed costs to recover. Second, the marginal cost of allowing others to use the system is very close to zero. We are good enough public economists to know that the efficient price is zero in this case. The only marginal costs to us are the time it takes to help get others started on the system. This is the reason for the “mutual support” condition. Third, after careful reflection, we realized that we simply are not business people. The cost in terms of time away from research of shilling, billing and advertising is just not worth the potential financial benefit. To mangle the old joke about arbitrage: if there are five dollar bills lying on the ground it must cost ten dollars to pick them up. Finally, because of our experience at JPET and EB, we sincerely want open-access to spread as rapidly and widely as possible, especially in economics. To nickel and dime people who share this vision seems completely self-defeating....

Also see the co-founders' undated article (apparently a preprint), providing more of the background and rationale.  Excerpt:

Our conclusion is that, in the electronic era, open-access electronic journals have an overwhelming cost advantage over commercial publishers. In addition, open-access is consistent with our mission as scholars to increase and spread knowledge and also feeds our personal and professional interests much more directly. However, we are still largely living with the system of scholarly communication we inherited from the papyrocentric era. This system will not go quietly into the night. Commercial publishers will do their best to hang on to and exploit this inherited capital as long as they can.

As individual scholars, we will continue to act in our own self-interest and so will try to place our research in the most reputable journal we can. Whether this location is an open-access journal, a cheaper society journal, or an expensive commercial journal will play little role in this decision. There is no reason to complain about this, and very little that can be done to change it.

What we can do is undertake the hard work of creating credible open-access alternatives. Creating such alternatives means finding areas that are not well served by the current set of commercial journals, enlisting the support of leaders in these research areas, and then building communities around these new journals. The incentives are there for entrepreneurial editors to do so. With AccessEcon and other electronic workflow/content-management tools available, the financial barriers to entry are close to zero.


Doing science in the open

Michael Nielsen, Doing science online, Michael Nielsen, January 26, 2009; presented at Quantum Information Processing 2009 (Santa Fe, January 12-16, 2009).

... Let me show you an example of a blog. It’s a blog called What’s New, run by UCLA mathematician Terence Tao. Tao, as many of you are probably aware, is a Fields Medal-winning mathematician. He’s known for solving many important mathematical problems, but is perhaps best known as the co-discoverer of the Green-Tao theorem, which proved the existence of arbitrarily long arithmetic progressions of primes.

... To understand how valuable Tao’s blog is, let’s look at [an] example post, about the Navier-Stokes equations. ...

The post is filled to the brim with clever perspective, insightful observations, ideas, and so on. It’s like having a chat with a top-notch mathematician, who has thought deeply about the Navier-Stokes problem, and who is willingly sharing their best thinking with you.

Following the post, there are 89 comments. Many of the comments are from well-known professional mathematicians, people like Greg Kuperberg, Nets Katz, and Gil Kalai. They bat the ideas in Tao’s post backwards and forwards, throwing in new insights and ideas of their own. It spawned posts on other mathematical blogs, where the conversation continued. ...

Many of the best blog posts contain material that could not easily be published in a conventional way: small, striking insights, or perhaps general thoughts on [an] approach to a problem. These are the kinds of ideas that may be too small or incomplete to be published, but which often contain the seed of later progress. ...

I’ve started this talk by discussing blogs because they are familiar to most people. But ideas about doing science in the open, online, have been developed far more systematically by people who are explicitly doing open notebook science. People such as Garrett Lisi are using mathematical wikis to develop their thinking online; Garrett has referred to the site as “my brain online”. People such as chemists Jean-Claude Bradley and Cameron Neylon are doing experiments in the open, immediately posting their results for all to see. They’re developing ideas like lab equipment that posts data in real time, posting data in formats that are machine-readable, enabling data mining, automated inference, and other additional services.

Stepping back, what tools like blogs, open notebooks and their descendants enable is filtered access to new sources of information, and to new conversation. The net result is a restructuring of expert attention. This is important because expert attention is the ultimate scarce resource in scientific research, and the more efficiently it can be allocated, the faster science can progress. ...

These new forms of contribution - blogs, wikis, online markets and so forth - might sound wonderful, but you might reasonably ask whether they are a distraction from the real business of doing science? Should you blog, as a young postdoc trying to build up a career, rather than writing papers? Should you contribute to Wikipedia, as a young Assistant Professor, when you could be writing grants instead? Crucially, why would you share ideas in the manner of open notebook science, when other people might build on your ideas, maybe publishing papers on the subjects you’re investigating, but without properly giving you credit?

In the short term, these are all important questions. But I think a lot of insight into these questions can be obtained by thinking first of the long run.

At the beginning of the 17th century, Galileo Galilei constructed the first astronomical telescope, looked up at the sky, and turned his new instrument to Saturn. He saw, for the first time in human history, Saturn’s astonishing rings. Did he share this remarkable discovery with the rest of the world? He did not, for at the time that kind of sharing of scientific discovery was unimaginable. Instead, he announced his discovery by sending a letter to Kepler and several other early scientists, containing a Latin anagram, “smaismrmilmepoetaleumibunenugttauiras”. When unscrambled this may be translated, roughly, as “I have discovered Saturn three-formed”. The reason Galileo announced his discovery in this way was so that he could establish priority, should anyone after him see the rings, while avoiding revealing the discovery.

Galileo could not imagine a world in which it made sense for him to freely share a discovery like the rings of Saturn, rather than hoarding it for himself. Certainly, he couldn’t share the discovery in a journal article, for the journal system was not invented until more than 20 years after Galileo died. Even then, journals took decades to establish themselves as a legitimate means of sharing scientific discoveries, and many early scientists looked upon journals with some suspicion. The parallel to the suspicion many scientists have of online media today is striking.

Think of all the knowledge we have, which we do not share. Theorists hoard clever observations and questions, little insights which might one day mature into a full-fledged paper. Entirely understandably, we hoard those insights against that day, doling them out only to trusted friends and close colleagues. Experimentalists hoard data; computational scientists hoard code. Most scientists, like Galileo, can’t conceive of a world in which it makes sense to share all that information, in which sharing information on blogs, wikis, and their descendants is viewed as being (potentially, at least) an important contribution to science. ...

[W]e are going to change the way scientists work; we are going to change the way scientists share information; we are going to change the way expert attention itself is allocated, developing new methods for connecting people, for organizing people, for leveraging people’s skills. They will be redirected, organized, and amplified. The result will speed up the rate at which discoveries are made, not in one small corner of science, but across all of science. ...
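As an editorial aside, Galileo’s anagram in Nielsen’s story lends itself to a quick mechanical check. The concealed sentence was the Latin “Altissimum planetam tergeminum observavi” (“I have observed the highest planet three-formed”), and since classical Latin spelling did not distinguish u from v, folding v into u shows the two strings are exact letter-for-letter rearrangements:

```python
# Verify that Galileo's Saturn anagram is an exact transposition of the
# Latin sentence it concealed. Classical Latin did not distinguish u
# from v, so v is folded into u before comparing letter multisets.
from collections import Counter

anagram = "smaismrmilmepoetaleumibunenugttauiras"
solution = "Altissimum planetam tergeminum observavi"

def letter_counts(text):
    """Multiset of letters: lowercased, spaces dropped, v folded into u."""
    return Counter(text.lower().replace(" ", "").replace("v", "u"))

print(letter_counts(anagram) == letter_counts(solution))  # True
```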

E-Science Talking Points now under a CC license

The E-Science Talking Points for ARL Deans and Directors from October 2008, which cover OA and open data in Point 8, now carry a CC-BY-NC-SA license.

If you weren't already spreading them around, start now.

Comparing LOC on Flickr and German Archives on Wikimedia Commons

Noam Cohen, Historical Photos in Web Archives Gain Vivid New Lives, New York Times, January 18, 2009.

... [T]here are the relics from the earlier age of photography, historical photographs that have been preserved in national libraries and archives or photo agencies and news media operations. Their relative scarcity alone can make them seem like treasures.

They ... are finding their way onto the Internet. ... [O]ver the last year there have been important new efforts to put these classics online, both to find new audiences for material typically used by researchers and to use those audiences to breathe new meaning into photographs from long ago.

Last month, in what is believed to be the largest donation online of “free” photographs — that is, unrestricted for commercial or noncommercial use — the German national archive uploaded nearly 100,000 historical photographs to the Wikimedia Commons, the virtual archive for material used in Wikipedia articles. ...

The photographs donated by the German archive have a lower resolution than what you would see in print (those still cost money), but are fine for online use. These lower-resolution photographs have been available at the archive site, although watermarked and with rules against commercial use (an unreasonable restriction by Wikipedia terms). The archive agreed to change, recognizing that the number of people who visit Wikipedia so dwarfs its own online visitor traffic. ...

The archive’s motives were not entirely selfless; it hopes to harness the Wikipedia editors to improve the cataloging of the photographs, said Oliver Sander, who is responsible for the collection at the archive. There are 58,000 people in these photographs who lack an ID number assigned by the German library, and the archive would like Wikipedia editors to help identify who is in these photographs and add these codes. “Unfortunately, we don’t have the capacity to implement this with our list of people,” Dr. Sander said. “Maybe Wikipedia members could add this ID to our list. That was the first benefit from Wikipedia.”

Thus far, 29,000 photographs of people have been so coded, Dr. Sander said.

In a similar move to harness the public’s knowledge about old photographs, the Library of Congress a year ago began adding photographs with no known restrictions to a Flickr service called the Commons. The Library of Congress started with 3,500 photos and adds 50 a week.

The project relies on Flickr’s ability to allow users to leave comments, below the picture or even within the picture to fill in the blanks. In a report assessing the project (conclusion: it has been a huge success) the library detailed the information that had been gleaned from Flickr users. ...

The Library of Congress photographs, in the first 24 hours of being posted last January, received 11,000 tags — ways of categorizing and connecting the photographs. ...

Jeanne Kramer-Smyth, German Federal Archives, Crowdsourcing & the Wikimedia Commons, Spellbound Blog, January 26, 2009. (Thanks to Klaus Graf.)

... In contrast to the Library of Congress addition of 50 photos a week, the German Federal Archive plans to add “a few thousand images a month”. The Commons:Bundesarchiv To Do list is also interesting reading. ...

I will say that the learning curve for classifying images within the Wikimedia Commons in general, and the Bundesarchiv project in specific, is much higher than tagging images in the Flickr Commons. There is a handy CommonSense tool (available via the ‘find categories’ tab on any image) that will suggest categories based on keywords, but even that is a bit overwhelming for a beginner. ...

I am very curious to see comparison stats of the assignment of categories/tags to images in both the Flickr & Wikimedia Commons a year from now. How will we measure success? How will we grade the accuracy of metadata assigned by the public? Which images will get more public views and usage - those added to the Flickr Commons or those added to the Wikimedia Commons? ...

OA to archival material vs. contractual restrictions

Peter Hirtle, Recent News on Open Access to Archives, LibraryLaw Blog, January 26, 2009. (Thanks to Garrett Eastman.)

A continuing source of controversy is the desire of some archives, libraries, and museums to control through contract the downstream use of reproductions and digital files of public domain items. A number of recent news items make me wonder if there is much of a future for this common practice:

First, a working group of the Max Planck Institute for the History of Science, organized to address issues relating to the use of images in scholarly research, released its final recommendations early in January. While the report, "Scholarly Publishing and the Issues of Cultural Heritage: Fair Use, Reproduction Fees, and Copyrights," is concerned primarily with visual images, its conclusions would apply to all public domain archival material. Everyone should read it.

Among the many interesting recommendations for cultural institutions and scholars, perhaps the most important is the recognition that in order "to promote creative scholarship in the humanities and to foster a deeper understanding of cultural heritage, access to visual sources not covered by copyright" is needed. That means that institutions should not use their ability to control access to limit non-commercial, scholarly use of public domain material.

The report also recommends that if scholars sign agreements to secure access to a public domain repository, they "must abide by the terms of use stipulated in the contract." Three relatively recent examples of users apparently ignoring the restrictions repositories attempt to place on the use of their holdings make me wonder whether there is any future for such restrictions: [Note: omitting examples.] ...

Given that repository-based use restrictions are being ignored by scholars and at least two commercial products, one has to wonder what is the point of imposing them at all. Wouldn't scholarship be better-served if repositories sold reproductions for whatever the market would bear but allowed public domain material to be freely available via open access solutions?

See also our past post on the Max Planck Institute recommendations.

Comparison of biological wikis

Andrew Su has posted a spreadsheet comparing various wikis in biology and related fields, including topical focus, number of recent edits and number of recent editors. See also the related FriendFeed discussion. (Thanks to Michael Nielsen.)

Dutch funder supports 12 OA-related projects

SURFshare has funded 12 projects for the online dissemination of knowledge:

Knowledge dissemination at Universities of Applied Sciences

  • National Knowledge Forum for Care and Treatment of Addiction
  • Automotive Knowledge Bank for Universities of Applied Sciences
  • DIGIPUB – Digital Publication environments
  • FUTURE – Thematised Access to Expertise, Knowledge & Research for SMEs, Students and Experts

Enriched publications

  • JALC – Journal of Archaeology in the Low Countries, enriched publications in Dutch archaeology
  • DatapluS – Repositories for Enhanced Survey Publications
  • ESCAPE – Enhanced Scientific Communication by Aggregated Publications Environments
  • Theses Plus – Enriched theses in the Utrecht repository
  • Veteran Tapes – Enriched publication based on multidisciplinary re-use of qualitative research files

Collaboratories

  • Tales of the Revolt Collaboratory: Sharing, Enhancing and Disseminating Sources
  • Hublab-2 – Toward successful implementation of the Liferay platform in historical research
  • Virtual Knowledge Studio Collaboratory – Understanding Scholarly Collaboration in Practice

Wheeler Declaration now in Polish

The Wheeler Declaration on open universities has been translated into Polish.

OA to 56 year run of Portuguese science journal

Biblioteca Digital Camões has provided OA to the full run (1887-1943) of Revista Lusitana.  Read the details in Portuguese or Google's English.  (Thanks to Klaus Graf.) 

This is at least gratis OA.  I can't find any licensing information on the scanned articles. 

OA for UK oil paintings and Arts Council films

Leigh Holmwood, BBC to put nation's oil paintings online, The Guardian, January 28, 2009.  Excerpt:

The BBC is to put every one of the 200,000 oil paintings in public ownership in the UK on the internet as well as opening up the Arts Council's vast film archive online as part of a range of initiatives that it has pledged will give it a "deeper commitment to arts and music"....

A partnership with the Public Catalogue Foundation charity will see all the UK's publicly owned oil paintings – 80% of which are not on public display – placed on the internet by 2012.

The BBC said it wanted to establish a new section of its website, called Your Paintings, where users could view and find information on the UK's national collection.

The Public Catalogue Foundation, launched in 2003, is 30% of the way through cataloguing the UK's collection of oil paintings.

In addition the BBC said it was talking to the Arts Council about giving the public free online access to its archive for the first time, including its wide-ranging film collection dating back to the 1950s....

Interview with James Boyle

William New, Inside Views: The Last Defence Of The IP System: An Interview With Jamie Boyle, Intellectual Property Watch, January 28, 2009.  Read the whole interview; Boyle makes great good sense.  I've had to limit this excerpt to the parts with the strongest OA connection:

James Boyle is a leading thinker on copyright and knowledge access, and is author of a new book called The Public Domain: Enclosing the Commons of the Mind. He is a law professor and cofounder of the Center for the Study of the Public Domain at Duke Law School....

IPW: ...What actions do [policy-makers] need to be taking if they accept that there is in fact a problem?

BOYLE: ...I really think that some of what they should be doing is just what they do in every other area. So if I come to you and I say: “I really need you to approve this new drug, and I think it will really help, and I have a friend who took some and he feels much better.” That wouldn’t pass the laugh test....But we do exactly the equivalent of that in the intellectual property arena.

If you look at the history of the database directive, [a study of other countries] was simply not done. The EU simply presumed that since the United States was doing very well with no protection for unoriginal compilations of fact, that they would clearly do much better if they had that protection. In other words, they made the assumption that more rights obviously leads to more innovation, to more investment. In other words, they focused only on the output side and not on the input side. That’s a basic conceptual error.

The first thing that our policymaker should do is realise that every time you protect somebody’s output, their intellectual work, you extend their trademark, you give them control over some gene sequence, some line of code, you have extensive software patents, you are raising the costs of the inputs to another innovator further upstream....

IPW: You cast in a positive light, and you embody your own practice as your book is available through the Creative Commons licence, as our materials are. The benefits of open access, even of new creative works…

BOYLE: Yes, and I think my publisher, Yale University Press, commendably, allowed me to make this available under Creative Commons, and the book is, so far at least, selling very well. I think open access and commercial forms of distribution are not incompatible....

So, one of the other points I make in the book is there is a really sad tendency of the intellectual property establishment to assume that any new business method which doesn’t use the rights and methods they’re familiar with - that is to say use them to exclude - is automatically anti-intellectual property. This is just ridiculous. It’s like saying that if you have a condominium with a shared stairwell, that that’s anti-physical property....It’s not an attack, it’s a use.

IPW: Do you have a sort of formula for a creator to benefit from a more open approach? Is preventing commercial reproduction enough?

BOYLE: My position is one of agnosticism, again. I think it works for some kinds of creators in some places. I think the set of creators for whom it works is probably larger than we imagine. In the book, I argue that we have something that I call “cultural agoraphobia,” that we persistently underestimate the benefits of open approaches and overestimate the benefit of closed approaches, underestimate the dangers of closed approaches and overestimate the dangers of open approaches. So I think that in lots of situations, probably more than we imagine, it might well be consistent with it....

IPW: Would you advise the new US administration that the embracing of some of these concepts could help to spark innovative capabilities, etcetera, and help to turn around economies? ...

BOYLE: Well, absolutely. Obviously enormous amounts of money have to be pumped into all of the developed economies over the next year and a half. Everyone agrees on that....What should we be pumping it into? Right, that’s the question. And under what conditions? I think there is an extremely strong case for pumping money into research that will lead to levels of innovation and technological growth and making that research available as a public good....

We have not yet facilitated the flows of scientific knowledge in ways that I describe in the book that we could. We’ve bizarrely made a world in which the internet works incredibly well for porn and for shoes, but doesn’t work very well for the flow of scientific data....I think that changing course in that direction could actually have a powerful multiplier effect, and I give some examples in the book of places where free provision of government data has been an extraordinary method to prime the economic pump of activity....

Update (2/4/09).  Boyle has given another interview to Powell's Books and elaborates his position on OA:

[Q] By the end of your life, where do you think humankind will be in terms of new science and technological advancement?

[A] It depends. I really think we have a choice. One path actually makes the web work for science as well as it works for porn or shoes. It leaves open the basic building blocks of science, provides open access to scientific literature, and unleashes the power of the semantic web by tying together data sets, articles, and research results in a knowledge ecology more fruitful than anything we have now. The other path is a fallow set of walled private gardens of knowledge, protected by digital-rights management and strong copyright enforcement, a world where the fundamental components of new technologies such as synthetic biology are patented. (Imagine trying to develop computer science, if someone had a patent over Boolean algebra.) It's one of the most fundamental decisions in the direction of our culture. I think the choice is a real one and it is a tragedy that we don't see it.

Tuesday, January 27, 2009

Free university launches, building on OERs

The University of the People is a forthcoming university that plans to offer tuition-free classes. UoP will begin enrolling students in April 2009, will offer degrees and plans to seek accreditation. See coverage by the New York Times or Wired Campus. (Thanks to Ellen Marie Murphy.) From the NYT story:

... “The open-source courseware is there, from universities that have put their courses online, available to the public, free,” [founder Shai] Reshef said. “We know that online peer-to-peer teaching works. Putting it all together, we can make a free university for students all over the world, anyone who speaks English and has an Internet connection.” ...

The University of the People, like other Internet-based universities, would have online study communities, weekly discussion topics, homework assignments and exams. But in lieu of tuition, students would pay only nominal fees for enrollment ($15 to $50) and exams ($10 to $100), with students from poorer countries paying the lower fees ...

Cato Institute meeting on open government data

A video and podcast of the presentations at the Cato Institute meeting, Just Give Us the Data! Prospects for Putting Government Information to Revolutionary New Uses (Washington DC, December 10, 2008), are now online.  (Thanks to Jonathan Gray.)

New storage features for EPrints

David Tarrant, EPrints gets Cloud Storage Support, EPrints News, January 23, 2009.

As some people may know, as part of the move towards EPrints 3.2 we have rewritten the way file storage works within EPrints entirely (after the recode we had fewer lines of source code!). Basically we have implemented an abstracted storage layer called the EPrints Storage Controller that is able to utilize new "storage plug-ins" to store files in different places.

As a starting point, Tim Brody and I have just successfully tested an EPrints install which is storing its content in Amazon S3/CloudFront! ...

Of course the Amazon S3 plug-in is just one example of a storage plug-in; you could just as easily write your own.
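
The abstraction Tarrant describes is easy to picture: a controller that delegates file operations to whichever back-end plug-in is registered. EPrints itself is written in Perl, so the following is only a language-neutral sketch of the pattern in Python; all class and method names are hypothetical, not EPrints' actual API:

```python
class StoragePlugin:
    """Interface that every storage plug-in implements."""
    def store(self, file_id: str, data: bytes) -> None:
        raise NotImplementedError

    def retrieve(self, file_id: str) -> bytes:
        raise NotImplementedError


class InMemoryPlugin(StoragePlugin):
    """Trivial back-end for illustration; an S3 plug-in would
    implement the same two methods against Amazon's API."""
    def __init__(self):
        self._blobs = {}

    def store(self, file_id, data):
        self._blobs[file_id] = data

    def retrieve(self, file_id):
        return self._blobs[file_id]


class StorageController:
    """Routes all file operations through the registered plug-in,
    so the rest of the repository code never touches storage directly."""
    def __init__(self, plugin: StoragePlugin):
        self.plugin = plugin

    def store(self, file_id, data):
        self.plugin.store(file_id, data)

    def retrieve(self, file_id):
        return self.plugin.retrieve(file_id)


controller = StorageController(InMemoryPlugin())
controller.store("eprint-42.pdf", b"%PDF-...")
print(controller.retrieve("eprint-42.pdf"))
```

The point of the design is that swapping local disk for S3/CloudFront (or anything else) means writing one new plug-in, with the controller and repository code unchanged.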

Relaunch of MediaCommons

MediaCommons re-opened on January 20.

See also our past posts on MediaCommons; from this past description:

... At its core, MediaCommons will be a social networking site where academics, students, and other interested members of the public can write and critically converse about a mediated world, in a mediated environment. ...At the same time, MediaCommons will be a full-fledged electronic press dedicated to the development of born-digital scholarship ...

Study of social media in science

BioInformatics, LLC, The Brave New World of Scientific Publishing, report, November 2008. From the description:

The Brave New World of Science Publishing is the most comprehensive study of scientists’ use of social media ever undertaken. This report is designed to help publishers understand what scientists expect—and prefer—as Web 2.0 capabilities become the new industry standard. Scientists and publishers rely on each other to establish the priority of discoveries, to validate the scientific process by peer review, to disseminate findings and to establish the scientific record. However, social media threatens current business practices, and publishers that do not respond to these challenges, or respond by further entrenchment of traditional positions, could find themselves becoming increasingly unnecessary and irrelevant.

... [T]he scientific publishing industry has weathered many storms. It has had to adapt rapidly to disruptive technologies, such as the emergence of the Web as a rival medium, as well as grapple with a changing business model resulting from societal and economic forces. Some scientists, especially pioneers of the Open Access movement, have prophesied the end of traditional publishing as we know it.

Now come “Web 2.0” and “social media”—two related phenomena that again present publishers with perils but also unparalleled opportunities, at least for those willing to accept new challenges. ...

From their growing use of discussion boards, blogs, wikis, video and podcasts, scientists are learning how to employ Web 2.0 and social media tools to good effect. The Brave New World of Science Publishing will help publishers keep pace with the expectations of their readers while reinforcing their positions of respect and authority.

The report itself is not OA, but a description, table of contents, and methodology are, along with this executive summary and topline findings.

Award for CCCOER

Emilie Doolittle, Foothill-De Anza receives award for online texts, Palo Alto Online, January 20, 2009.

Foothill-De Anza Community College District received a statewide technology award for developing a national consortium to make educational resources freely available online.

The district is putting textbooks and other course materials online to lower the cost of higher education.

The California Community College Board of Governors presented the 2008 Technology Focus Award to the district for its work on the Community College Consortium for Open Educational Resources.

The consortium, established by the district in 2007 and made up of 85 community colleges, promotes the development of online-instructional material for community college faculty and students. In March 2008, the consortium launched the Community College Open Textbook Project with a $527,000 grant from The William and Flora Hewlett Foundation.

The first free online textbook, "Collaborative Statistics," was released in August 2008. The consortium worked in conjunction with Rice University's Connexions, an open-educational-resources site, to republish the book. ...

See also our past posts on CCCOER.

Persée and to collaborate

See the January 22 announcement in the original French or Google's English. Persée, which provides access to digitized backfiles, and, which hosts recent issues, will link to each other for issues of the same journal, and will offer full-text searching of both sites. Journals currently included:

On the Public Knowledge Project

G. W. Brian Owen and Kevin Stranack, The Public Knowledge Project and the Simon Fraser University Library: A partnership in open source and open access, Serials Librarian, July 2008; self-archived January 23, 2009. Abstract:
The Public Knowledge Project is an ongoing collaboration between academics, librarians, publishers, editors, and software developers, working together to build alternatives in scholarly publishing. The project has developed a suite of open source software that significantly reduces the time and expense required for producing academic journals and conferences, and facilitates making research results freely available through open access. This article examines the history of the project, provides an overview of its open source software, discusses the growing community participating in the project, and considers its future directions.

More on the Conyers bill to overturn the NIH policy

John Willinsky, The Publisher's Pushback against NIH's Public Access and Scholarly Publishing Sustainability, PLoS Biology, January 27, 2009.  Excerpt:

The dying light of the George W. Bush presidency was marked by, among other things, a legislative move to derail recent gains in the federal government's opening of science. In particular, the innocuous sounding “Fair Copyright in Research Works Act” (HR 6845) introduced into the House by John Conyers, Jr. (DEM-MI), on 9 September 2008 was poised to shut down the National Institutes of Health (NIH) Public Access Policy, as well as forestall the spread of this open-access spirit to other areas of federally sponsored research and scholarship. Hearings were held, but the bill did not make it through the House. End of story? Not quite....

What is strangely amiss in the publishers' support for outlawing the NIH Public Access Policy is that they support the upshot of this initiative with their own current copyright policies.  While a growing number of open-access journals provide immediate access to the published version, even subscription-based publishers had originally been happy enough to grant authors the right to do what they were already doing, which was putting PDFs of their articles up on their websites....Now, as indicated by their support for the Fair Copyright in Research Works Act, publishers are taking a stand against archiving. As the International Association of STM Publishers recently put it, “publishers do not believe that self-archiving offers a sustainable alternative for scientific publishing”.

One reason for the publishers' change of heart is that archiving is catching on, amid growing public expectations that research is a public good that should be made freely available online....

In testifying last September in support of the bill...,Martin Frank, Executive Director of the American Physiological Society (APS), insisted that the issue was not access rights but revenue streams....

Frank's stance makes it clear that access itself is not the issue. After all, APS's 17 journals already make their contents free after 12 months,...exceeding the terms of the NIH mandate (as APS makes the final published version freely available)....What matters, then, as Frank pointed out, was that APS “can modify [its current free access policy] should 12 months prove disadvantageous to the Society's business model”....

For STM publishers, the NIH mandate “puts at risk a system which has enabled more research to be available to more scientists in more countries than at any point in the history of science”. But is the opposite not true? To insist that the current publishing economy must be sustained places the system at risk, and all the more so amid the current economic downturn, which is bound to affect research library budgets in the coming year. The point has been brought home by Heather Joseph, executive director of the Scholarly Publishing and Academic Resources Coalition (SPARC), who testified in opposition to the Fair Copyright bill on behalf of the leading research libraries, which can not sustain subscriptions to as many journals as they would like: “This situation [in which libraries can afford access to only a portion of the literature] is exacerbated by the continued rapid escalation in price of journal subscriptions, which puts libraries in the position of having to cancel subscriptions”.

[T]he real battle for the future of science is not about NIH's provision of delayed access to author's drafts. It is about efforts to protect unsustainable revenue streams amid the capitalization of a public good to a degree that arguably undermines what is basic to the progress of science and useful arts. If the publishers' exclusive ownership of this body of work enables them to charge what they will, on their own terms, then what is at risk is the delicate-at-best balance between public interests in such learning and private investments in bringing it to the widest possible audience....

Publishers, who could once count on no one giving a second thought, least of all researchers, to transferring to them the copyright for the entire research corpus, article by article, should have sufficient vision to see that a new civil rights issue is emerging over access to, as well as control over, such knowledge....

PS:  Hear, hear.  See my article from October 2008, analyzing the Conyers bill in detail and showing the false assumptions in the rhetoric of the publishing lobby in support of it.  Also see my point by point rebuttal to the STM briefing document.

PubMed flags articles that are OA from journal or PubMed Central

Annette M. Nahin, Free Article Indicators on PubMed Summary Display, NLM Technical Bulletin, January 26, 2009.  Excerpt:

The PubMed Summary display format has changed to include information about the availability of free articles from publishers. In addition to the existing link on citations showing what is free in PubMed Central (PMC), a new link indicates if the article is freely available at the journal's Web site. Some citations have a link indicating the article is free from both PMC and the journal. The links all go to the AbstractPlus display where icon links to PMC and journal sites display.

For some LinkOut library participants, their subscriptions to journals that provide online access can greatly increase the number of "free" articles available to patrons as compared to what is shown by free indicators on the Summary display. Patrons can use My NCBI to create a filter for their library in order to easily determine what additional articles they can view. Instructions for creating filters can be found in the PubMed Help....


  • Here's a sample search with some examples of the flags.  Look for the green text which says "Free article at journal site", "Free article in PMC", or "Free article in PMC | at journal site".
  • Here's my translation of the second paragraph above:  The new free-article flags do not cover subsidized TA articles, or articles that are not OA but free to you because your employer has prepaid the access fees.  However, you can still link to such articles from PubMed and filter your searches to display only such articles.  BTW, this is in contrast to the practice at Google Scholar, whose free-article flag covers both OA articles and subsidized TA articles.
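
The distinction between the flags in the web display and what is actually retrievable matters for anyone who scripts against PubMed as well. As a minimal hedged sketch, PubMed search filters can be combined with NCBI's public E-utilities; the `free full text[sb]` subset token and endpoint details below are assumptions drawn from NCBI's documentation, not something the announcement specifies:

```python
from urllib.parse import urlencode

# NCBI E-utilities search endpoint (public API).
BASE = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def free_fulltext_query(term: str) -> str:
    """Build an esearch URL restricted to free-full-text articles.

    The 'free full text[sb]' subset filter is assumed here; it should
    match articles flagged free at PMC or the journal site.
    """
    params = {
        "db": "pubmed",
        "term": f"{term} AND free full text[sb]",  # restrict to free articles
        "retmax": 20,                              # cap the result list
    }
    return BASE + "?" + urlencode(params)

print(free_fulltext_query("aspirin"))
```

Fetching the resulting URL would return an XML list of matching PMIDs, which could then be passed to efetch or esummary.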


More on OA from the Smithsonian

Wayne Clough, the current secretary of the Smithsonian Institution, is already leagues ahead of his predecessor, Lawrence Small, on access issues as well as ethics issues.  See Brett Zongker's story today from the Associated Press:

[Clough] said the Smithsonian should focus on tackling the issues of education, climate change and biodiversity, and American identity and diversity. And he wants curators to help open access to the Smithsonian's 137 million-object collection.

"Our job is to authenticate and inform the significance of the collections," he said, "not to control access to them." ...


  • In this passage, "open" is a verb, and Clough may simply mean making access wider and easier, not providing free online access.  But even wide and easy access would be an improvement over his predecessor's deals with Corbis and Showtime to enclose portions of the national commons.
  • Or he may really mean free online access.  In September 2008 he announced plans to digitize all 137 million objects in the Smithsonian's collection.  The announcement didn't say whether the digital versions would be OA, but reading his recent language in light of the digitization plans is very promising.  Moreover, it was under Clough that the Smithsonian joined the Flickr Commons in June 2008, an expressly OA rebuke to the Corbis deal.  And of course his principle is the right one:   "Our job is to authenticate and inform the significance of the collections, not to control access to them."
  • Clough has been on the job since May 2008.

January edition of repository rankings

The January 2009 edition of the Ranking Web of World Repositories is now online.  See today's announcement in plain text or Stevan Harnad's edition enhanced with links.  From the latter:

The number of repositories is growing fast worldwide but still many of them do not have their own domain or subdomain, and for this reason it is not possible to add them into our analysis. Some institutions maintain several databases with completely different URLs which penalize the global visibility they have [PS:  as measured by this service].

We are still unable to add usage/download statistics but there are many initiatives already working on standardization of the collecting methods, so we expect that global data should be available soon.

Following several requests we now show two global Rankings. One that covers all repositories as was shown in previous editions (Top 300), and a new one that focuses only on Institutional Repositories (Top 300 Institutional).

There is a minor change regarding the calculation of the number of rich files: in this new edition we are again using formats other than pdf (doc, ppt, ps) to obtain the data. Contrary to the methodology we used for the other Rankings, the figures for rich files are combined and not treated individually.

The French HAL central repository, and its subsets like INRIA, Social Sciences and Humanities (HAL-SHS) or IN2P3, are at the top of the institutional repository list.

Important repositories like PubMedCentral, CiteSeerX and Smithsonian/NASA Astrophysics Data System do not use standard suffixes to designate their contents (e.g. papers in Acrobat format with file names whose extension is not .pdf). This is a bad practice as it reduces the visibility of these documents to the search engines....

Update on SCOAP3

Jeffrey Young, Physicists Set Plan in Motion to Change Publishing System, Chronicle of Higher Education, January 30, 2009.  Excerpt:

In what some are calling a peaceful revolution, researchers have mounted a takeover of high-energy-physics publishing. One signature at a time, national research agencies and university libraries have pledged to support a radical new system that would replace expensive subscriptions to leading journals with membership in a nonprofit group. The new organization would then dole out money to journal publishers, while pushing them to distribute all articles free online and to keep their prices in check.

The key: By teaming up, the libraries, which pay the bills, and the researchers, who provide the articles, will exert unprecedented leverage. The strategy might also convince journal editors — who have been reluctant to give away all of their content for fear of losing money — that libraries will continue to pay them even in an open-access system.

The group is called Scoap3, the Sponsoring Consortium for Open Access Publishing in Particle Physics....

[L]eaders in physics hope that if their experiment works, other disciplines will follow suit....

So far the journal publishers say they are willing to consider such a model, but they are hardly enthusiastic. "We must show some good will," said Christian Caron, an executive editor at the publishing conglomerate Springer Science+Business Media, which oversees a major high-energy-physics journal. "We pledge that we will sit down at the table for negotiations." He described his attitude toward the project as "a very cautious 'Let's see and discuss it.'" ...

[CERN's Salvatore Mele] says journals still play a crucial role in the professional life of scientists, even though readership has declined. "We do not buy journals to read them, we buy journals to support them," he said. "They do something crucial, which is peer review." ...

So far more than 19 countries have pledged to participate, including Belgium, France, Germany, Greece, Hungary, Israel, Italy, Norway, Sweden, Switzerland, and Turkey....

Colleges in the United States have been a tougher sell...The librarians [at a February 2008 meeting in Berkeley] praised the goals of the project, but some asked whether it was sustainable. After all, if the journals make their contents free online, why should college libraries use their shrinking resources to pay for them?

Some librarians at public institutions say they cannot participate even if they want to. "Most states require that public funds allocated for purchasing have to be used to actually purchase something," said Dennis Dillon, associate director for research services at the University of Texas at Austin....

Some journal editors are also anxious about whether the project will work.

"We are gravely concerned about the difficulty of reassembling our subscription model were Scoap3 to fail," said Gene D. Sprouse, editor in chief at the American Physical Society, in a written statement....

Paul Ginsparg, a physics professor at Cornell who started arXiv, also expressed skepticism about the new project's viability, echoing concerns about the project's financial model.

He said he hoped that open-access options would become so compelling — and incorporate new features that are so useful — that researchers would only want to publish their papers in journals that choose to be completely open. "Such systems are currently under construction," Mr. Ginsparg said, "but some of my colleagues argue that it's useful to have additional mechanisms to force the materials out there — to hasten the transition to 21st-century scholarly-communications infrastructure."

Despite such skepticism, more than 30 colleges and several library consortia in the United States have pledged to participate....

Peter Suber...praised the project for involving publishers in the discussions and for searching for a compromise....

Major new report on the economic implications of OA

John Houghton and eight co-authors, Economic implications of alternative scholarly publishing models:  Exploring the costs and benefits, January 2009.  A major (256 pp.) report to JISC.

From the press release:

...The research and findings reveal that core scholarly publishing system activities cost the UK higher education sector around £5 billion in 2007. Using the different models, the report shows what the estimated costs would have been:

  • £230 million to publish using the subscription model,
  • £150 million to publish under the open access model and
  • £110 million to publish under the self-archiving with peer-review services model, plus some £20 million in operating costs for the different models.

When considering costs per journal article, Houghton et al. believe that the UK higher education sector could have saved around £80 million a year by shifting from toll access to open access publishing. They also claim that £115 million could be saved by moving from toll access to open access self-archiving.
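
The headline saving can be sanity-checked from the per-model figures above (the £ figures are the press release's; the subtraction is ours):

```python
# Estimated annual UK higher-education publishing costs, £ millions,
# as quoted in the JISC press release.
subscription = 230          # toll-access (subscription) publishing
oa_publishing = 150         # open access publishing
self_archiving = 110 + 20   # self-archiving with peer-review services, plus operating costs

# Shifting from toll access to OA publishing:
saving = subscription - oa_publishing
print(saving)  # 80 -> the roughly £80 million/year saving quoted above
```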

In addition to that, the financial return to UK plc from greater accessibility to research might result in an additional £172 million per annum worth of benefits from government and higher education sector research alone.

JISC’s Chair Professor Sir Tim O’Shea said, “The argument for moving from more traditional subscription or toll-based publishing to a model that allows for greater accessibility and makes full use of the advances in technology cannot be ignored. This report shows there are significant savings to be made and benefits to be had.

“JISC will work with publishers, authors and the science community to identify and help to remove the barriers to moving to these more cost-effective models,” he added....

From the summary on the landing page:

...Scholarly publishing is central to the efficiency of research and to the dissemination of research findings and diffusion of scientific and technical knowledge. But, advances in information and communication technologies are disrupting traditional models of scholarly publishing, radically changing our capacity to reproduce, distribute, control, and publish information. The key question is whether there are new opportunities and new models for scholarly publishing that would better serve researchers and better communicate and disseminate research findings (OECD 2005, p14).

Debate on the economics of scholarly publishing and alternative publishing models has focused almost entirely on costs. And yet, from an economic perspective, the aim is to have the most cost-effective system, not (necessarily) the cheapest, and however much one studies costs one cannot know which is the most cost-effective system until one examines both costs and benefits. Hence, the aim of this project was to examine the costs and benefits of three alternative models for scholarly publishing (i.e. subscription publishing, open access publishing and self-archiving). In so doing, it seeks to inform policy discussion and help stakeholders understand the institutional, budgetary and wider economic implications.

The project involved two major phases:

  • Phase I: Identification of costs and benefits – sought to describe the three models of scholarly publishing, identify all the dimensions of cost and benefit for each of the models, and examine which of the main players in the scholarly communication system would be affected and how they would be affected; and
  • Phase II: Quantification of costs and benefits – sought, where possible, to quantify the costs and benefits identified; identify and where possible quantify the cost and benefit implications for each of the main players in the scholarly communication system; and, where possible, compare the costs and benefits of the three models....

From the section on comparing costs and benefits (pp. 211f) in the body of the report:

As noted, it is not possible to compare toll with open access publishing directly at the national level as they perform very different roles: toll access publishing seeks to provide UK subscribers with access to worldwide research (to the limits of affordability), whereas open access seeks to provide worldwide access to UK research. Therefore, we approach the question from both sides....

Because of the lag between research expenditure and the realisation of economic and social returns to that research, the impact on returns to R&D is lagged by 10 years....[T]his has the effect that over a transitional period of 20 years we are comparing 20 years of costs with 10 years of benefits....

[Gold OA:]  We estimate that an all author/producer side funded OA publishing system for all journal articles produced in the UK would have cost around £170 million nationally in 2007, of which around £150 million would have related to higher education outputs – approximately 0.74% of GERD and 2.43% of HERD, respectively....

Ignoring potential cost savings and given the assumptions outlined above (including inflating costs at the higher 5% per annum), we estimate that over 20 years:

  • The cost of OA publishing for higher education would be around £1.8 billion in Net Present Value, whereas the estimated impact on returns to Higher Education R&D (HERD) would be around £615 million, a benefit/cost ratio of 0.3 (i.e. the benefits would be less than the costs); and
  • The cost of OA publishing nationally would be around £2 billion in Net Present Value, whereas the estimated impact on returns to UK Gross Expenditure on R&D (GERD) would be around £2.4 billion, a benefit/cost ratio of almost 1.1 (i.e. the benefits would be marginally greater than the costs) (Table 6.1)....

[Green OA:]  We estimate that a system of OA (publications) repositories for journal articles with all outputs posted once, would have cost the UK around £23 million per annum nationally in 2007, of which £18 million per annum would have related to higher education. Ignoring potential cost savings and given the assumptions outlined above, we estimate that over 20 years:

  • The cost of OA self-archiving for higher education would be around £189 million in Net Present Value, whereas the estimated impact on returns to Higher Education R&D (HERD) would be around £615 million, a benefit/cost ratio of 3.2; and
  • The cost of OA self-archiving nationally would be around £237 million in Net Present Value, whereas the estimated impact on returns to UK Gross Expenditure on R&D (GERD) would be around £2.4 billion, a benefit/cost ratio of 9.9....

These comparisons suggest that the additional returns from enhanced accessibility and efficiency alone would be sufficient to cover the costs of OA self-archiving in parallel with subscription publishing (i.e. ‘Green OA’ self-archiving without subscription cancellations), independent of the activity cost savings noted above.

Indicatively, putting the notional impacts of enhanced access into year one to simulate a post-transition ‘steady-state’ alternative OA self-archiving system, returns a benefit/cost ratio of 36 for higher education and 110 nationally. This suggests that the benefits of an OA self-archiving system with overlay services would substantially outweigh the costs....
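The transitional-period ratios quoted above come from discounting a 20-year cost stream against a benefit stream that only begins after the 10-year lag between research expenditure and returns. A minimal sketch of that arithmetic in Python; the discount rate, time horizon, and annual figures below are purely illustrative assumptions, not the report's actual parameters:

```python
def npv(cashflows, rate):
    """Net present value of a stream of annual cashflows (year 0 first)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def benefit_cost_ratio(annual_cost, annual_benefit, years=20, lag=10, rate=0.035):
    """Compare `years` of costs against benefits that only begin after `lag`
    years, mirroring the report's transitional-period comparison."""
    costs = [annual_cost] * years
    benefits = [0.0] * lag + [annual_benefit] * (years - lag)
    return npv(benefits, rate) / npv(costs, rate)

# Illustrative only: £18m/year self-archiving cost for higher education
# against an assumed £60m/year in enhanced returns once benefits kick in.
ratio = benefit_cost_ratio(18e6, 60e6)
print(round(ratio, 2))
```

Note how the lag depresses the ratio: the same annual benefit with `lag=0` would score far higher, which is why the report stresses that a post-transition "steady-state" comparison returns much larger benefit/cost ratios than the transitional model.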

From the conclusions and recommendations (pp. 231f):

...There are...major differences between impacts [of OA] during a transitional period and those in a hypothetical alternative ‘steady-state’ system....We took the view that it was more realistic and of more immediate concern to model the transition, but it must be emphasised that a transitional model returns significantly lower benefit/cost ratios than would a hypothetical alternative ‘steady-state’ model. Hence, while the findings presented should be interpreted with caution, the assumptions and modelling are very conservative....

[D]ifferent models for scholarly publishing can make a material difference to the returns realised, as well as the costs faced....

[W]hile net benefits may be lower during a transitional period they are likely to be positive for both OA publishing and self-archiving alternatives (i.e. Gold OA) and for parallel subscription publishing and self-archiving (i.e. Green OA). This suggests that there are gains to be realised from moving towards more open access publishing models and that, despite the lag between the costs and the realisation of benefits, the transition would probably be affordable within current system-wide budgetary allocations....


Overcoming the barriers...

  • Ensuring that research evaluation is not a barrier to innovation (e.g. by developing and using metrics that support innovation in scholarly publishing, rather than relying on traditional evaluation metrics that reinforce and reward traditional publishing models and behaviours);
  • Ensuring that there is funding for author or producer side fees (e.g. encouraging all research funders to make explicit provision for publication charges, and encouraging higher education and research institutions to establish funds to support publishing fees);
  • Encouraging and funding the further development of institutional and/or subject repositories; and
  • Supporting advocacy initiatives to inform and educate funders, researchers and research managers about the potential impacts of alternative publishing models....

Realising the benefits...

Our analysis suggests that open access self-archiving, either in parallel with subscription publishing or with overlay services, may be very cost-effective, although more information is required on repository costs and the potential benefits of greater integration of publications with other forms of research output, their integration into learning materials, and the curation and sharing of research data (Box E-I). Hence, there is scope to focus greater attention on the development of repositories. This might include:

  • Encouraging and supporting the development of institutional and/or subject repositories;
  • Encouraging greater focus on the operational effectiveness of repositories (e.g. enhancing metadata standards and quality, effective federation, enhanced
    discoverability and searchability, supporting the development and use of metrics and reporting suitable for research evaluation, etc.); and
  • Encouraging greater sharing of information and experiences to enable stakeholders to better understand the costs and benefits involved and build more effective ‘business cases’ for repositories.

Our analysis also suggests that there may be considerable benefits available from a shift to open access scholarly book publishing....

PS:  Also see our past posts on John Houghton's research on the economic impact of OA.

Update (2/6/09). For interactive what-if analysis, see the online model, which makes a small subset of the EI-ASPM Project cost-benefit modelling available to those interested in further exploring results and wanting to explore national, sectoral or institutional costs and benefits. It runs as an executable application within MS Excel: simply click on the file after downloading. Each of the model elements is presented as a single-screen worksheet. Copies of the model can be saved locally to record results, and each of the worksheet models can be printed as a single page. Simply enter your preferred values into the Variables column of the Parameters Table and the results will be recalculated automatically. You can TAB between the active cells.

Update (2/13/09).  For a response from publishers, see the joint statement from the Publishers Association, the Association of Learned and Professional Society Publishers, and the International Association of Scientific, Technical and Medical Publishers, February 13, 2009.


Monday, January 26, 2009

Richard Poynder's Basement Interviews at Bloomsbury Academic

Richard Poynder's Basement Interviews "with leaders and thinkers from the growing number of free and open initiatives" should now have a higher profile and reach more readers.  A new page collecting links to the interviews, and a preface introducing them, has been posted at the site of Bloomsbury Academic, the OA imprint of Bloomsbury Publishing.  (Disclosure:  I'm the subject of one of the interviews.)  From his preface:

A few years ago I could see an increasing number of "free" and "open" movements beginning to develop. And while they all had different aims, they appeared to represent a larger and more generalised development than their movement-specific objectives might suggest.

Indeed, I felt that they looked set to exemplify the old adage that the sum of some phenomena is always greater than the constituent parts. But if that was right, I wondered, what was the sum in this case?

I was also intrigued as to why they were emerging now. For while it was apparent that these movements — including Open Source and Free software, Creative Commons, Free Culture, Open Access, Open Content, Public Knowledge, Open Data, Open Source Politics, Open Source Biology, and Open Source Journalism etc. — all owed a great debt to the development of the Internet, it was not clear to me that the network was the only driver....

Additionally, I was curious about the individuals who had founded these movements: What had motivated them? Why did they feel so passionate about the cause that they had adopted? What did they think the various movements had in common (if anything) with one another? What was the big picture?

All in all, it seemed to me to be good material for a book; a book that I envisaged would consist primarily of a series of Q&A interviews with the key architects and advocates of what I had come to call the Free Knowledge movement — people like John Perry Barlow, John Gilmore, Michael Hart, Richard Stallman, Eric Raymond, Linus Torvalds, Jay Rosen, Lawrence Lessig, Joe Trippi, Harold Varmus, Vitek Tracz, Stevan Harnad, Paul Ginsparg, Cory Doctorow, Yochai Benkler, Richard Jefferson, Michel Bauwens etc.

I eventually started publishing the interviews on my blog, as The Basement Interviews. And much to my pleasure I began to receive positive feedback almost immediately. I also felt the big picture was beginning to emerge, although the project remains ongoing for now....

Many of those who have contacted me have urged me to seek out a publisher....Others were less sure [that was necessary]....

What do you think? I'd be interested to hear. I'd also be interested for suggestions as to who is missing from my list of interviewees. Who else, that is, do you think of as a key architect or advocate for the Free Knowledge movement that has not been mentioned here? I can be contacted at ....

OA to South African heritage

OpenSA! is a recently-launched project by SA Rocks and the African Commons Project. (Thanks to Tectonic.) From the announcement:
OpenSA! launches in Johannesburg today with a pilot project to make South African heritage more accessible for remixing and re-publishing by online creators. In collaboration with SA Rocks and the African Commons Project, OpenSA! is collecting, tagging and managing donations from people who are willing to make their material freely available online. ...

LOC to study distribution of bibliographic data

Library of Congress Announces Study of Bibliographic Record Publication, press release, January 23, 2009. (Thanks to ResourceShelf.)

The Library of Congress today announced the next phase of its investigation into the creation and distribution of bibliographic data in U.S. and Canadian libraries. The Library has commissioned a study to research and describe the current marketplace for cataloging records in the MARC format, with primary focus on the economics of current practices, including existing incentives and barriers to both contribution and availability. The study will be carried out by R2 Consulting LLC of Contoocook, N.H.

The Library has recognized that its role as a producer of bibliographic data is changing and that other libraries have options as they consider sources for cataloging records. The conclusions outlined in a report issued last year, "On the Record: Report of the Library of Congress Working Group on the Future of Bibliographic Control," indicate that cataloging activity must be shared more broadly and equitably among all libraries. Before the Library considers any changes to its cataloging commitments or priorities, however, it is vital to understand the extent to which other libraries rely on its contributions. The study will examine cataloging production and practice across all library types, including cooperative activity through OCLC, the Program for Cooperative Cataloging (PCC), the National Library of Medicine, the National Agricultural Library, library consortia, and other shared cataloging initiatives.

Under the general direction of Deanna Marcum, Associate Librarian for Library Services at the Library of Congress, R2 will develop a description of the current economic model and will determine the extent of library participation in and reliance on existing structures and organizations. The study will show the degree to which sources other than the Library of Congress are supplying quality records in economically sufficient quantities, or whether most libraries use records created by the Library. This project is oriented toward fact-finding and reporting rather than solutions, and it is intended to produce a snapshot of the existing market. The project is scheduled for completion by June 30, 2009, with a written report and visual representation of the existing marketplace. Progress reports, along with various other data collection and communication tools, will be made available via the R2 Web site and the Bibliographic Control Working Group site ...

Podcast on ‡biblios.net

Richard Wallis, Interview with Josh Ferraro of ‡biblios.net, Talking with Talis, January 23, 2009. A podcast.

‡biblios.net is a free service for librarians to create, edit, and share bibliographic records backed by an equally free and open store of over 30 million library records available for all to access, search and download.

LibLime CEO Josh Ferraro joins me in conversation as he launches ‡biblios.net at ALA Midwinter in Denver.

We explore how this is a really free and open service that has been made possible, not only by technology and open source software, but also by the availability of open data licensing in the form of Open Data Commons. Josh also explains how the core software behind ‡biblios.net is itself open source.

See also our past post on ‡biblios.net.

Progress report on OA in Indonesia

L. T. Handoko, A New Approach for Scientific Data Dissemination in Developing Countries: A Case of Indonesia, Earth, Moon, and Planets, January 14, 2009.  Only an abstract and short preview are free online, at least so far.

Abstract:   This short paper is a progress report on our experiences in Indonesia of collecting, integrating and disseminating both global and local scientific data across the country through the web. Our recent efforts are concentrated on improving local public access to global scientific data, and encouraging local scientific data to be more accessible for global communities. We have maintained a well-connected infrastructure and web-based information management systems to realize these objectives. This paper is especially focused on introducing the ARSIP system for mirroring global data and sharing local scientific data, and the newly developed Indonesian Scientific Index for integrating local scientific data through an automated intelligent indexing system.

PS:  Does anyone know what "ARSIP" stands for in this context?  Is it the old US military acronym for "Accuracy, Reliability, Supportability Improvement Program" or is it something more data-specific?

Update.  Stian Håklev, who once lived in Indonesia, tells me that arsip is Indonesian for archive.  That explains why they chose that acronym but not what the acronym stands for.  (Thanks, Stian.)  Anyone have the missing piece of the puzzle --or access to full text?

Update (1/27/09).  Stian Håklev has solved the rest of the puzzle:

ARSIP is a mirroring service located in Indonesia that mirrors a number of foreign datasets. They also discuss the Indonesian Science Index (ISI), which includes a calculator to compute an individual researcher's scientific performance. All of these services are hosted at the Indonesian Institute of Sciences.

Update (3/8/09). An OA edition of the paper is now available from arXiv.

OA in biomedical engineering

Jennifer A. Flexman, Open access to biomedical engineering publications, Engineering in Medicine and Biology Society, August 25, 2008.  Only this abstract is free online, at least so far.

Scientific research is disseminated within the community and to the public in part through journals. Most scientific journals, in turn, protect the manuscript through copyright and recover their costs by charging subscription fees to individuals and institutions. This revenue stream is used to support the management of the journal and, in some cases, professional activities of the sponsoring society such as the Institute of Electrical and Electronics Engineers (IEEE). For example, the IEEE Engineering in Medicine and Biology Society (EMBS) manages seven academic publications representing the various areas of biomedical engineering. New business models have been proposed to distribute journal articles free of charge, either immediately or after a delay, to enable a greater dissemination of knowledge to both the public and the scientific community. However, publication costs must be recovered and likely at a higher cost to the manuscript authors. While there is little doubt that the foundations of scientific publication will change, the specifics and implications of an open source framework must be discussed.

Advantages of converting to OA and electronic-only

Back in June 2008 we learned that Computational Linguistics would convert to OA with the first issue in 2009.  Here are the editor's reflections on the conversion, in the last issue of 2008:

Robert Dale, What’s the Future for Computational Linguistics?  Computational Linguistics, December 2008.  An editorial.  (Thanks to the Alex Project.)  Excerpt:

1. Why Open Access?

...At the time of writing, the vast bulk of scholarly literature is not open access....There is an increasingly widely held view that this is just not right. Given that almost all the research published in scholarly journals is paid for by the taxpayer, it's reasonable to ask what justification there can be for restricting public access to this research by requiring that a further payment be made to read about it. And 'toll access', as it is sometimes called, is not only bad for the reading public; it has been frequently argued that it is bad for authors too, because any barriers to access may decrease the likelihood of citation and the general impact of the reported work.

So...Computational Linguistics will become an open access journal as of the first issue of 2009. In fact, for some time the journal has had a rather unusual access status, whereby its content becomes freely available as part of the online ACL [Association for Computational Linguistics] Anthology one year after publication. As of the first issue of Volume 35, even this delay will be removed, with the journal's contents being freely available as soon as they are published.

2. Why Electronic-only?

That the journal is going electronic-only and open access at the same time are not unrelated events. Print publication is a major part of the cost of producing the journal, and this is a cost that is very considerably subsidized by the revenues generated by institutional subscriptions. Going open access means that institutions no longer pay these subscriptions, and so we need a sustainable financial model that does not depend on this income. Ceasing the production of the printed version of the journal is an important step in that direction. In an environmentally conscious age, it also has the additional benefit of removing the need to fly, four times a year, small plastic-wrapped wads of paper to two thousand ACL members around the world....

This is not to say that going electronic-only makes the journal free of production costs. Although the journal's editor, editorial board, and external reviewers provide their labor free of charge, it still costs real money to produce the journal: In particular, CL will continue to use the professional services of the MIT Press in managing the high standards of copyediting and typesetting to which our readers have become accustomed. For the foreseeable future at least, these costs will continue to be met by the ACL, as owner of the journal.

3. What Else is Changing?

The shift to electronic publication means that publishing an article in Computational Linguistics should be considerably faster than before, for a number of reasons. Under the old model, it can take up to six months from the time that the editorial office dispatches the author's final copy of an article to the publisher, to the time the article appears in print. Around half of this time is taken up by the print production process. The delay is further exacerbated by the fact that the first article that is ready for an issue has to wait until the remainder of the issue is ready before it can be published. By moving to electronic-only publication, articles will now appear online as soon as they have gone through the copy-editing process, on an article-by-article basis. You'll be able to access articles even before they have completed the copy-editing cycle. This, in fact, is not a new thing: for the last year or so, we have been publishing...the proofs that are sent to the authors for checking before final edits are made...via the MIT Press Web site under the label 'Early Access'. But the unavailability of an easy way to let individual subscribers see this toll-access content meant that very few people were aware of its existence. With the move to open access, the Early Access articles will also be freely available to all.

Another consequence of being electronic-only is that there will be no artificial backlog resulting from the page limits imposed by physical print production....

Digesting some recent literature on scholarly information behavior

Carole L. Palmer, Lauren C. Teffeau, and Carrie M. Pirmann, Scholarly Information Practices in the Online Environment:  Themes from the Literature and Implications for Library Service Development, OCLC Research, January 2009.  (Thanks to the JISC Information Environment Team.)  Excerpt:

Research libraries exist to support scholarly work. In recent years, the literature on scholarly practices and information use has been growing, and research libraries should be prospering from this increased base of knowledge. Unfortunately, the profession has no effective means for systematically monitoring or synthesizing the published results. This review begins to address the problem by reporting on the state of knowledge on scholarly information behavior, focusing on the information activities involved in the research process and how they differ among disciplines. It provides an empirical basis for identifying promising directions and setting priorities for development of digital information services to support and advance scholarship....

[Over time], faculty have developed more informed and positive perceptions of open-access and alternative models for publishing, but some scholars still perceive e-publishing to be risky and less rigorously reviewed. Studies have found that senior faculty tend to be more comfortable sharing early stages of work in online venues and that Web presentation and self-archiving is increasing across fields. For example, chemical engineering faculty have been shown to consider digital alternatives highly viable, and some archaeologists are now willing to share field observations on open-access sites (Harley, Earl-Novell, Arter, Lawrence, & King, 2007).

As discussed by Kling and McKim (2000), “scholarly societies play a major role in the shaping of communications forums within a field, both because they are typically major publishers within a field, and also because they articulate and disseminate research and publishing standards for a field” (p. 1312). They note that both the American Chemical Society and the American Psychological Association have had policies directing authors not to put publications on the Web at any stage of production. A survey examining scientists’ use of e-print archives for dissemination reported that they were used by a small number of psychology faculty and less so by chemists who indicated it was “against the policy of the publishers.” Nearly one-quarter of psychology scholars also cited publisher policies as a reason for non-use of e-print archives (Lawal, 2002)....

Recently, the Consolidated Appropriations Act (2008) in the United States mandated that any research conducted on behalf of the National Institutes of Health must be made freely accessible, and other funding agencies like the National Science Foundation have been strong proponents of openly accessible research. Motivated in part by the rising cost of serials and the Web’s influence on scholarship, many universities across the world are developing their own institutional repositories (IRs) to preserve and freely disseminate the work of their scholars. The use of IRs by faculty has been associated with self-archiving behavior (e.g., Kim, 2007; Xia & Sun, 2007). But while one international survey of over 1,200 scholars showed that nearly half of the respondents engaged in self-archiving behavior (Swan & Brown, 2005), deposit in IRs has been slow in general. A range of factors have been identified, including faculty not understanding potential benefits and continued preference for traditional peer review venues over open access alternatives (Bell, Foster, & Gibbons, 2005; Crow, 2002; Palmer, Teffeau, & Newton, 2008; Park & Qin, 2007). At the same time, librarians and other proponents stress that IRs, author-pay models, and other open access options are “viable alternatives to the problem of unsustainable journal costs” (Harley et al., 2007, p. 8)....

"Data isn't copyrightable"

John Wilbanks, Data, Copyrights, And Slogans, Oh My, Common Knowledge, January 24, 2009.  Excerpt:

I got drawn into a debate about copyrights and factual data this week that felt like it merited its own blog post....

So, the whole thing started when Jon Philips, a dear friend and running-dog creative commoner, posted that we need to have a slogan-level campaign about data. He suggested "data is not copyrightable" and the comments started to fly. Some jumped in and said it was an empty slogan. Being a pedantic wonk, I jumped in to point out that this was a technically correct and truthful statement. And then it got interesting, at least, from my perspective.

We got embroiled in the weeds of the issue - the definitions of data, and importance of compilation and selection and arrangement, the funky international regimes around data protection, and so forth. I'm going to try to untangle them a little bit here....

[The Feist case] was good news for telephone book competition. It's fantastic news for science. It means that at least in the US, there is a right granted to us as users to extract, republish, integrate, federate, query, mash, mix, fold, spindle, and mutilate data to our own ends. It is an essential legal component of the emerging web of data. If copyrights traveled with either an individual datum or a data set, we'd have attribution stacking problems that make the miserable 27 pages of illegible wikipedia attribution look like a walk in the park, and that's just for today. In 30 years, which is less than halfway towards the end of a copyright whose death-clock began ticking today, it'd be a nightmare.

So "Data isn't copyrightable" might be a poor slogan. But it's an essential truth. It sits at the basis of a lot of really important legal aspects around data. If data were copyrightable it might be easier to understand, but it'd be a lot worse to use.

This creates what a lot of folks seem to think is an incentive problem. How can we create incentives for people to create data collections if there's no protection? I'll come back to this in another post....

EPSRC adopted an OA mandate

If you recall (1, 2), the Engineering and Physical Sciences Research Council (EPSRC) is the only one of the seven Research Councils UK that has not yet adopted an OA mandate.  Back in 2006, when the other councils were adopting or announcing theirs, the EPSRC said it would wait until 2008.

The new policy isn't public yet, but over the weekend we got the first public clue: the EPSRC updated a key paragraph on its OA policy page.

Old paragraph:

In addition, an independent study will start late in 2006 and report in late 2008. Once this has been completed a full assessment of all the factors and implications can be made. This will include the question of whether the mandatory deposit of papers in repositories is a beneficial and cost effective option.

New paragraph:

The independent study commissioned by Research Councils UK was completed in late 2008. The findings from the study are now being taken forward by the Cross-Council Research Outputs Group and will be used to inform future policy on open access. EPSRC Council agreed at its December meeting to mandate open access publication, but that academics should be able to choose whether they use the green option (ie, self-archiving in an on-line repository) or gold option (ie, pay-to-publish in an open access journal). Further details will be published in spring 2009.


RoMEO goes to Spain

Dulcinea is a recently-launched database of the self-archiving and copyright policies of Spanish journals. Named for the Quixote character (by comparison to SHERPA's RoMEO), the beta service currently lists 249 journals. (Thanks to Universo Abierto.)

On the significance of wiki knowledge projects

Larry Sanger, Why wiki knowledge projects are so fascinating to so many, Citizendium Blog, January 23, 2009.

The sheer number of fields that are intensely interested in wiki knowledge communities is staggering. ...

Why are wiki knowledge projects of such intense and broad-based interest?

There’s a good reason. It’s because of what wiki knowledge projects are.

They are a new thing under the sun: international communities of volunteers that collaboratively produce free knowledge, information of use to everyone, distributed online; and, in the form of Wikipedia and soon the Citizendium too, they are remarkably huge and well-used. ...

But there is an even more essential explanation: wiki knowledge projects are an enormous coming-together of people to understand the world. ...

I think most people have vaguely, but not quite, realized that we are coming to grips with a new kind of knowledge institution – one that has the potential to be as powerful as any that has come before it, or more so. ...

Inderscience offers a hybrid OA option

On January 24, Inderscience Publishers launched an OA option for its journals. (Thanks to Jim Till.) Key points:

Thesis finds 16% of LIS journals are OA

Sebastian K. Boell, A Scientometric Method to Analyze Scientific Journals as Exemplified by the Area of Information Science, thesis, Saarland University, 2007; self-archived January 23, 2009. Abstract:

Background. In most academic disciplines journals play an important role in disseminating findings of research among the disciplinary community members. Understanding a discipline's body of journals is therefore of grave importance when looking for previous research, compiling an overview of previous research, and in order to make a decision regarding the best place for publishing research results. Furthermore, based on Bradford's Law of scattering, one can assume that in order to be able to compile a satisfying overview of previous research a wide range of journals has to be scanned, but also that there are some 'core' journals which are of more importance to specific disciplines than others.

Aim. This thesis aims to compile a comprehensive master list of journals which publish articles of relevance to Library and Information Science (LIS). A method to rank journals by their importance is introduced and some key characteristics of the discipline's body of journals are discussed. Databases indexing the discipline's journals are also compared.

Method. The master list of LIS journals was created by combining the journal listings of secondary sources indexing the field's literature. These sources were six databases focusing on LIS literature: INFODATA, Current Contents, Library and Information Science Abstracts, Library Information Science Technology Abstracts, Information Science and Technology Abstracts, and Library Literature and Information Science, the LIS subsection in three databases with a general focus: Social Science Citation Index, Academic Search Premier, and Expanded Academic ASAP, and the listing of LIS journals from the Elektronische Zeitschriften Bibliothek. Problems related to editorial policies and technical shortcomings are discussed, before comparing: predominant publication languages, places of publication, open access, peer review, and the ISI Journal Impact Factors (JIF). Journals were also ranked by the number of occurrences in multiple databases in order to identify 'core' publications. The number of journals overlapping between databases is estimated and a matrix giving the overlap is visualized using multidimensional scaling. Lastly, the degree of journals overlapping with other disciplines is measured.

Results. A comprehensive master list of 1,205 journals publishing articles of relevance to LIS was compiled. The 968 active journals are mostly published in English, with one third of the journals coming from the US and another third from the UK and Germany. Nearly 16% of all journals are open access, 11% have an ISI JIF, and 42% are peer reviewed. Fifteen core journals could be identified and a list of the top fourteen journals published in Germany is introduced. Databases have between five and 318 journals in common and the journal collection shows a substantial overlap with a wide range of subjects, with the biggest journal overlap with Computing Studies, and Business and Economics.

Conclusion. The aim of compiling a comprehensive list of LIS journals was achieved. The list will contribute to our understanding of scholarly communication within the LIS discipline and provide academics and practitioners with a better understanding of journals within the discipline. The ranking approach proved to be sufficient, showing good similarity with other studies over the last 40 years. The master list of LIS journals also has potential use for further research.

Update on Open Economics

Rufus Pollock, Open Economics: Recent Progress, Open Knowledge Foundation Blog, January 23, 2009.

Recently we made some substantial improvements/additions to our Open Economics project including:

  1. Improved JavaScript graphing.
  2. Extended the Millennium Development Goals package and added a web interface.
  3. First efforts at Where Does My Money Go
    • Aim: Dig up govt finance info and visualize the results (online)

More details on each of these can be found below. ...

Australia asks for comments on OA to PSI

Australia's Department of Broadband, Communications and the Digital Economy has issued a consultation paper asking for public input on, among other topics, OA to public sector information. Background and how to comment:

Industry and other stakeholders are invited to provide input to and comments on the specific topics raised below. Please forward your responses to by Wednesday 11 February 2009, clearly indicating any material that is commercial-in-confidence. The Department of Broadband, Communications and the Digital Economy will collate and edit input for the Digital Economy Future Directions paper before publishing it in the first half of 2009.

From the introduction to the issue, see especially:

... There is considerable interest in increasing access to publicly-funded cultural, educational and scientific collections. In some instances, publicly-funded institutions have already made their material available on open access terms. For example, in 2008, the Powerhouse Museum in Sydney became the first museum in the world to release publicly-held historical photographs on the photo-sharing platform Flickr under a ‘no known copyright’ identifier. Geotags are added to create an interactive map documenting the position of the photographic content. The Museum is also releasing its ‘Photo of the Day’ online under a Creative Commons license. Similarly, in January 2008, the NSW State Library released 100 images of historical Australian ‘firsts’ on Flickr, also under a ‘no known copyright’ identifier. In the educational space, the University of Southern Queensland’s OpenCourseWare program provides access to free and open educational resources across several disciplines for students and teachers worldwide. ...

The questions:

  • What categories of Public Sector Information (PSI) are most useful to industry and other stakeholders to enable innovation and promote the digital economy?
  • What are priority issues that will facilitate the use of PSI?
  • If PSI is made open access, what are the best formats to enable and promote use and reuse?
  • If PSI is made open access, what licensing terms would best facilitate and promote its use and reuse?
  • Should licensing terms distinguish between commercial uses and non-commercial uses and reuses?
  • Are there other examples of innovative, online uses of PSI?
  • Is there any additional economic modelling or other evidence to show the benefit to Australia from open access of PSI?

Comment. The paper's overview of OA to PSI is worth reading and generally favorable. It's especially promising to include "publicly-funded cultural, educational and scientific collections" in the discussion alongside government data. Australians shouldn't miss this opportunity to weigh in.

See also our previous post about the government's blog on the issue.

Sunday, January 25, 2009

Speak up on dropping the Ingelfinger rule

David Linden, Chief Editor at the Journal of Neurophysiology, is soliciting opinions on whether to consider submissions which have already circulated as preprints (i.e. whether to drop the journal's use of the Ingelfinger rule).  Thanks to DrugMonkey for the alert and for reprinting Linden's letter and survey.  Excerpt:

I am writing today to solicit your opinion on an important policy at the Journal of Neurophysiology. Presently, the journal will not review manuscripts that it considers to have undergone "prior publication." For most submissions this is not a problem: brief abstracts presented at meetings like Society for Neuroscience or Keystone conferences or Gordon conferences are not considered prior publication. However, in recent years it has become more popular for authors, particularly in the systems and computational neuroscience communities, to post full-length draft manuscripts on preprint servers like arXiv or Nature Precedings. This is considered "prior publication" by the policy of the American Physiological Society....

These preprint servers have become a standard initial mode of scientific communication in the physics, astronomy and chemistry communities. They are permanent archives that are moderated (so they do not fill up with spam or political rants) but are not peer-reviewed. Authors submitting manuscripts to preprint servers retain the copyright to their work, which can then be transferred to the publisher when a later version of the work is accepted at a peer-reviewed journal....

There is little reason to believe that a change of policy to allow preprint server manuscripts at Journal of Neurophysiology will have a negative impact on either the editorial function or the business model of the journal. Institutions are not going to cancel their JN subscriptions in favor of free preprint access. Journal of Neurophysiology should benefit from getting manuscripts that are better for having received more feedback prior to submission and from receiving submissions from authors who otherwise would avoid JN due to the prior publication policy.

Other publishers are rapidly adopting policies that allow preprint server manuscripts.... [PS: Omitting a list of journals in the same field.]

It is my view that changing the guidelines at Journal of Neurophysiology to allow for submission of preprint server manuscripts can only improve scientific communication and benefit the Journal. It is important to stress that preprint server submission would be voluntary and that the authors' decision to make use of this process would have no bearing on editorial decisions.

The Publications Committee of the American Physiological Society will meet on March 18, 2009 and I have placed a motion to change the guidelines to allow preprint server manuscripts to be reviewed at JN on the agenda. When this issue was considered at the 2008 meeting of the committee, it was rejected. A key factor in the committee's decision will be a measure of where the neuroscience community stands on this issue. I'm asking you to take a moment to register your opinion. You do not have to be a member of APS to vote in this poll. You do not have to be faculty either: students, postdocs and staff are all welcome. You are encouraged to distribute this message to your colleagues. You can vote by filling in the info below and emailing to

Your name:

Your institution:

______ Yes, I support amending the guidelines to allow preprint server manuscripts to undergo review at Journal of Neurophysiology.

______ No, I do not support amending the guidelines to allow preprint server manuscripts to undergo review at Journal of Neurophysiology.

Your comments (optional):

Comment.  It's time to retire the dinosaur Ingelfinger rule, and Linden's argument for doing so is beautifully done.  Until the rule is publicly dropped, many researchers will hesitate to make their preprints OA through preprint servers or repositories, fearing that it would disqualify them from publishing the same manuscripts later in a journal.  Please take a moment to fill out Linden's survey question and send it in. 

Law journal editor calls for OA to law journals

Benjamin J. Keele, Open access to student-edited law journals, Student Lawyer, February 2009.  Keele is a third-year student at Indiana University Maurer School of Law, and editor in chief of the Indiana Journal of Global Legal Studies.  Excerpt:

...[H]ow can we be sure that even more people have access to important legal scholarship? What about readers unfamiliar with typical legal databases or scholars, especially those in other countries, who cannot afford the access fees? How can we be sure that the literature will remain accessible online for the long term, safe from business failures and obsolete file formats? Open access can address all these problems....

Law journals are generally run by volunteers (authors and student editors usually are not paid for their work) and subsidized by schools that serve the legal profession and general public. The general public often cannot afford the fees for subscription databases, thus leaving unmet a need for high-quality legal information.

Open access also is good for authors and journals because it increases their visibility and scholarly impact....

Many journals already put their print issues online and even have online supplements. This is a great start, but journals cannot fully seize the opportunity offered by open access without taking advantage of IRs' capacities for increased accessibility and preservation....

Like most worthwhile projects, there are costs to making legal scholarship freely available online. There are operational costs, such as creating digital files and hosting the content, but school libraries may be willing to contribute. Some revenue from hard-copy subscriptions and database royalties may be sacrificed, but open access and subscription models serve different parts of the market. And open access tends to increase the visibility of scholarly work, thus increasing revenue from reprint royalties.

Many recent journal articles are online, and law professors are accustomed to posting working drafts on sites like the Social Science Research Network and Bepress. It is time to put law journals online, easily accessible to all, and to keep them online by preserving them in IRs. Tech-savvy student editors are in an ideal position to make that happen.

Another example of free editions stimulating sales of the same content

Adam Frucci, Monty Python Puts Free Videos Online, Sells 23,000% More DVDs, Gizmodo, January 23, 2009.  Excerpt:

Monty Python started a YouTube channel with tons of their sketches streaming for free. They included links to their DVDs at Amazon. The result was a whopping 23,000% increase in sales....

PS:  Another in a series with this footnote:  It's not about research literature, but how far does it transfer?

Presentations on OA in Germany

The presentations from the HBZ symposium, The Open Access Landscape in Germany (a session within the Berlin 6 meeting, Düsseldorf, November 11-13, 2008), are now online.

Bayh-Dole-style tech transfer laws threaten open science in developing countries

David Dickson, Time to rethink intellectual property laws? SciDev.Net, January 23, 2009.  An editorial.  Excerpt:

Patents on scientific knowledge may not be as useful — or valuable — as many claim them to be....

[T]here is a...danger in trusting strong science patents to promote social development....

Take, for example, the aura that surrounds the 1980s US Bayh–Dole Act, which gave US universities, for the first time, ownership of patents arising from government-funded research.

There is a widely-held belief this helped the US economy's explosive growth in the following two decades, making many universities — and the scientists who work for them — rich in the process. Those with interests in the commercial, rather than the social value of science, actively promote this view.

This conviction, for example, has led South Africa to introduce similar legislation. And it has recently taken hold in India, where the government, urged on by its pharmaceutical and biotechnology industries (and supported by the US Chamber of Commerce), is proposing tightened patent legislation based explicitly on the Bayh–Dole approach, namely making it easier for publicly-funded research to be 'owned' by private entities.

Yet there is very little empirical evidence to show that the Bayh–Dole Act has had the claimed effect in the United States, let alone that it is appropriate for developing countries (see 'Indian patent bill: Let's not be too hasty')....

Conversely, there is widespread anecdotal evidence that the act created a mind-set among many researchers that their knowledge represents a potential goldmine not to be shared with potential competitors (i.e. those working in other universities) — at least until it has been protected by a patent application....

As a group of academics recently stated, the present impetus for similar legislation in developing countries "is fueled by overstated and misleading claims about the economic impact of the Act in the US, which may lead developing countries to expect far more than they are likely to receive" (see 'Is Bayh–Dole good for developing countries? Lessons from the US experience')....

There are alternatives available to developing country governments. For example, they can focus patent legislation on genuine technological inventions, while leaving publicly-funded research openly accessible, and rewarding researchers who come up with socially-valuable inventions through other mechanisms, such as prizes.

More radically, governments could promote 'open innovation', where a wide range of individuals are encouraged to work towards technological breakthroughs. This approach has already been suggested in India, for example, to design new tuberculosis treatments....

Protecting intellectual property will legitimately remain part of such new strategies. But science can only effectively contribute to these if it remains as open as possible. Duplicating the Bayh–Dole approach, and building expectations only of science's commercial value, is not the way to go.

PS:  See our past posts on the new tech transfer laws in India and South Africa, inspired by the US Bayh-Dole Act. 

Wikipedia may tighten restrictions on user contributions

At the same time that the Encyclopedia Britannica is inviting user contributions, Wikipedia is tightening restrictions.  For details, see Noam Cohen's article in Friday's New York Times, Wikipedia May Restrict Public’s Ability to Change Entries.  Excerpt:

Stung by criticism after vandals changed Wikipedia entries to erroneously report that Senators Edward Kennedy and Robert Byrd had died, Wikipedia appears ready to introduce a system that prevents new and anonymous users from instantly publishing changes to the online encyclopedia.

The new system [is] called Flagged Revisions....The idea in a nutshell is that only registered, reliable users would have the right to have their material immediately appear to the general public visiting Wikipedia. Other contributors would be able to edit articles, but their changes will be held back until one of these reliable users has signed off, or “flagged” the revisions. (Registered, reliable users would see the latest edit to an article, whether flagged or not.)

The system has been used by German Wikipedia since May as a test case....

While long discussed as something to be used by the much-larger English Wikipedia, Flagged Revisions was given new life as a proposal after Wikipedia’s mastermind — Jimmy Wales — all but ordered it be adopted after the Kennedy and Byrd false-death reports, which remained on the site for about five minutes.

On his user page, under the header “Why I Am Asking Flagged Revisions Be Turned On Now,” Mr. Wales observed: “This nonsense would have been 100% prevented by Flagged Revisions.” ...

The response was immediate and deafening, with headlines like: “Jimbo Wales, stop acting dictator.” ...