Open Access News

News from the open access movement

Saturday, January 24, 2009

More on Britannica's wiki-like features

Stephen Hutcheon, Watch out Wikipedia, here comes Britannica 2.0, Sydney Morning Herald, January 22, 2009.  (Thanks to Resource Shelf.)  Excerpt:

In a move to take on Wikipedia, the Encyclopedia Britannica is inviting the hoi polloi to edit, enhance and contribute to its online version.

New features enabling the inclusion of this user-generated content will be rolled out on the encyclopedia's website over the next 24 hours, Britannica's president, Jorge Cauz, said in an interview today....

Mr Cauz, who is visiting Australia, said the changes were the first in a series of enhancements to the website designed to encourage more community input to the 241-year-old institution and, in doing so, to take on Wikipedia in the all important search engine rankings....

Mr Cauz said that any changes or additions made to Britannica entries online would have to be vetted by one of the company's staff or freelance editors before the changes were reflected on the live site.

He said the encyclopedia had set a benchmark of a 20-minute turnaround to update the site with user-submitted edits to existing articles, which are written by the encyclopedia's paid expert contributors.

Many of those changes will eventually appear in the printed version of the encyclopedia, which is published every two years.

In addition to the community editing features, [the site] will enable approved users to add their own creative input which will sit beside the authorised articles....

Would-be editors on the Britannica site will have to register using their real names and addresses before they are allowed to modify or write their own articles....


  • Also see Britannica's announcement from June 2008, previewing these features. 
  • While the Britannica has been gratis OA for registered bloggers and other web-based writers since April 2008, it still charges the general public for access to what it calls "premium topics".  What's a premium topic?  You won't learn the answer by putting the term in the site's search box.  A little browsing suggests that Britannica has not made some articles free and some priced.  Instead, it shows you the first screenful of the one and only article on a topic, for free, and then pops up a pay-per-view screen when you try to scroll.  This may be an innovation for Britannica, but it's not the way to compete with Wikipedia.  Nor is it the way to compete with other major print-era encyclopedias.  France's Larousse not only allows wiki-like user contributions, but also plans an entire OA edition.  Germany's 30-volume Brockhaus Encyclopedia has already converted to OA.

Update (1/25/09).  A sentence in my comment yesterday was imprecise:  "This may be an innovation for Britannica, but it's not the way to compete with Wikipedia."  Let me distinguish competing with Wikipedia (1) for quality, (2) for scope, and (3) for eyeballs and links.  Britannica is already competitive with Wikipedia for quality, and the limited nature of its new wiki-like features is designed to preserve its quality.  (Conversely, Wikipedia is competitive with Britannica for quality.)  Britannica will never be competitive with Wikipedia in scope and isn't apparently trying, which is wise.  Hutcheon's article suggested, however, that Britannica is trying to compete with Wikipedia for eyeballs and links.  In passages I didn't include in my excerpt, Cauz criticized Google for ranking Wikipedia articles above Britannica articles, as if Google rank were about quality, or as if the quantity of links to TA articles would ever rival the quantity of links to OA articles.  When I said that Britannica hadn't found a way to compete with Wikipedia, I was referring to eyeballs and links.  Entirely apart from Britannica's quality, and its partial openness to user contributions, it will never compete for eyeballs and links as long as the bulk of its content is TA. (Disclosure: I'm on the advisory board of the Wikimedia Foundation.)

Correction (1/25/09).  My information on Brockhaus is outdated.  Here's better information from Mathias Schindler, posted with permission (thanks Mathias):

Last year, Brockhaus announced a change to the business model for its 30-volume encyclopedia, with free (as in beer) access at or some other online page. The project was scheduled to start in late spring 2008; it was postponed several times and finally indefinitely. In December 2008, the company which owns the Brockhaus trademark ("Bibliographisches Institut & F. A. Brockhaus", BIFAB) announced that it intends to sell the Brockhaus brand, the trademark, and the rights of usage to Bertelsmann. The sale is scheduled for 31 January 2009, pending permission from the monopoly commission. There will be no free (as in beer) online Brockhaus.  For a while, Brockhaus had a substitute page online, the "", which mostly contains content from the 30-volume encyclopedia or content specifically created for the upcoming Brockhaus online portal.

Presentations from Berlin IuK conference

The presentations from IuK im Wandel - im Zentrum der Wissenschaft (Berlin, September 25-26, 2008) are now online.  Several are on OA.

Author of an OA/TA book recounts his experience to date

Christopher Kelty, Two Bits at Six Months, Savage Minds, January 24, 2009.  Excerpt:

Last June I announced that I had published my book, Two Bits: The Cultural Significance of Free Software. It was released both as a book by Duke University Press and as an open access publication via a website that I created and maintain. For scholars in my fields —anthropology, history, science studies, media studies— this is one of the first experiments, if not the first, of this kind. As such, I’ve been doing my best to keep some notes on the process, with a mind towards reporting on the results of going open access with a first book....

Google Analytics [shows]...the initial spike of interest,...the role of small communities in creating attention....

My book is in kind of a strange space. On the one hand, it is a conventional academic book, a first book by an assistant professor (now tenured, thank you very much Duke University Press); it is accessible, but not popular; it has a large potential audience beyond academics because of its subject matter; and it is beautifully designed and people tell me it is well written. So much for the pro column. In the con column: it is long, it contains complicated theory in the first chapter, including Habermas, which is fatal to any reader even in small amounts; it doesn’t have any sound-bitable arguments and people tell me it is poorly written.

In short, it’s a pretty standard academic book, and therefore a good candidate for this experiment. People always ask what I had to do to convince Duke to let me release the book. On the one hand the subject matter made it easy: I couldn’t respect myself, or Duke, if a book about Free Software were not freely available. On the other hand, I think they were really eager to experiment, to see what would happen. I created the website, so they didn’t have to; and they agreed to use a CC (By-NC-SA) license and to give me the pdf and a very clean HTML copy (thank you Achorn) for distribution. The designer, Cherie Westmorland, used an open source font and the Boston Public Library let me use the cover image....Duke is making as little or as much money on the book as they do on others of its ilk, and yet I am getting much more from it being open access than I might otherwise.

So what have I learned so far? A few things:

1) The Internet [is so] saturated with everything and everybody that it now feels more like normal life and less like some special place. This amounts to saying that things have returned to normal levels of hard work. To get a book to sell, one has to invest a lot of work in marketing it, promoting it and distributing it....All the spikes in traffic correlate precisely with mentions in major and minor media outlets, ranging from Savage Minds to the New Yorker....[T]he ratio of print sales to downloads has been 3 to 1. Not 1000 to 1 or even 100 to 1, but 3 to 1. That’s kind of amazing. It means that neither my outsized expectations of hordes of geeks downloading the book, nor Duke’s fears of massive numbers of lost sales have come true.

2) I have tenure. Putting my book online did not ruin my career. Having Duke publish it, as opposed to, say, some online vanity press, contributed to my tenure case, but simply having it available for free is not career suicide. Quite the opposite, I would say. I have more requests now for talks, reviews, contributed papers, conferences, interviews and projects than I can accept....Lots of people are assigning the book in class, or bits of it, which I can only assume is facilitated by the ease of access....

3) I’ve had a pretty excellent amount of media attention....I have had mentions in The New Yorker Blog, The Times Higher Education Supplement, Technology Review, Inside Higher Ed, and others. I’ve had conversations with people from Korea, Argentina, Brazil, and India about the book. I’ve had excellent response from European scholars interested in the book. In short, I can’t complain. According to Duke, the amount of marketing that went into my book was more intensive than most, and this no doubt accounts for some of that attention. Frankly, it’s more than enough. I’m not quite sure what I would do with more, but I do know that with a bit more marketing, the dynamics of attention might conceivably change much more dramatically than just ten years ago. For some books that university presses publish, this fact is worth mulling over.

PS:  Also see our past posts on Kelty.

New OA journal of libertarian thought

Libertarian Papers is a new peer-reviewed OA journal of libertarian thought.  The inaugural issue from January 2009 is now online.  (Thanks to Kimmo Kuusela.)  From the about page, a libertarian defense of OA:

[W]herever possible, content on this site is licensed under a Creative Commons Attribution 3.0 License. We even provide [alongside the PDF] the Word file of the article —our “source code”— to make it easier for others to republish, incorporate, print-on-demand, or cut-and-paste (this also gives authors a final copy of their published article in editable form, which is unheard of). We want our ideas read, spread, and copied. As Cory Doctorow notes, “for pretty much every writer—the big problem isn’t piracy, it’s obscurity.” We do not, of course, oppose the profit motive, as some who cling to state-granted intellectual property laws might allege, but we do recognize the stifling effect copyright has had on the communication of ideas.  And as spreading the ideas of liberty is the end of our action, in Misesian terms we are indeed seeking a handsome profit by unshackling these ideas to spread them as widely as possible. And in the spirit of open discourse and the free flow of ideas, unlike other academic journals, we allow comments on our articles, via the blog posts announcing them.

Background on the UNAM network of repositories

Clara López Guzmán and 19 co-authors, 3R-Red de Repositorios Universitarios de Recursos Digitales : informe de la etapa 3: desarrollo del sistema y de aplicaciones, an unpublished September 2007 report from the Universidad Nacional Autónoma de México (UNAM), self-archived January 21, 2009.  The report is in Spanish with an English-language abstract:

The 3R project is part of a UNAM megaproject. This project consists of four stages: research, conceptual models, development and implementation. The project aims to create the prototype of a network of repositories of UNAM, which will allow greater use and visibility of the intellectual output of the members of the community. This technical report presents the results of the third phase of the project, Development of the system, which enables the operation of the conceptual model. To that end, the software to activate a University Repository was implemented, including the analysis for the standardization of metadata and the development of a compact disc that provides the software installation.
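For readers curious what the metadata-standardization stage involves in practice, repositories of this kind typically expose Dublin Core records over the OAI-PMH protocol. Here is a minimal harvesting sketch in Python; the endpoint URL is a placeholder of my own, not taken from the report:

```python
import urllib.request
import xml.etree.ElementTree as ET

# Namespace URIs fixed by the OAI-PMH and Dublin Core standards.
DC_NS = "http://purl.org/dc/elements/1.1/"

def parse_titles(xml_bytes: bytes) -> list:
    """Extract dc:title values from an OAI-PMH ListRecords response."""
    root = ET.fromstring(xml_bytes)
    return [t.text for t in root.iter("{%s}title" % DC_NS)]

# Usage (network call, not run here; the URL is a hypothetical example):
# url = "http://repositorio.example.unam.mx/oai?verb=ListRecords&metadataPrefix=oai_dc"
# titles = parse_titles(urllib.request.urlopen(url).read())
```

Because the record format is standardized, the same harvester works against any repository in the network, which is exactly what makes a distributed prototype like 3R feasible.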

India's new wiki-based repository for agricultural research

India has launched the beta edition of Agropedia, a publicly-funded, wiki-based repository of Indian agricultural research.  From the site:

agropedia is an agriculture knowledge repository of universal meta models and localized content for a variety of users with appropriate interfaces built in collaborative mode in multiple languages. agropedia aims to develop a comprehensive digital content framework, platform, and tools in support of agricultural extension and outreach. In other words, it aspires to be a one stop shop for any information, pedagogic or practical knowledge related to extension services in Indian agriculture – an audiovisual encyclopedia, to enchant, educate and transform the process of digital content creation and organization completely.

Using state of the art practices and techniques of the semantic web, agropedia is a platform where both specialists in the agriculture research and education domain and students and others interested in agriculture can make lasting contributions to the vast knowledge base. The specialists have a choice to contribute towards the gyan dhara (certified content) or participate in the interaction space to contribute to janagyan (emergent knowledge). All other registered users are co creators of janagyan (emergent knowledge) through their participation in the agrowiki, agro-blog, agro-forum and agro-chat like interaction spaces....

For more details, see M. Sreelata's article in SciDev.Net, India debuts 'agricultural Wikipedia', January 21, 2009.  Excerpt:

...The government-backed initiative, Agropedia, was launched last week (12 January).

It aims to disseminate crop- and region-specific information to farmers and agricultural extension workers — who communicate agricultural information and research findings to farmers — and provide information for students and researchers....

Content will be continually added and validated through review and analysis by invited agricultural researchers, in a manner similar to that used by Wikipedia and using open source tools, says V. Balaji, head of knowledge management and sharing with the International Crop Research Institute for the Semi-Arid Tropics (ICRISAT), a partner in the project.

The site also houses blogs and forums where anyone can provide and exchange knowledge.

The 85 million-rupee (around US$1.7 million) project is being implemented over 30 months and is backed by the National Agricultural Innovation Project, a six-year government programme intended to modernise agriculture.

The World Bank and the Indian government have provided the funding for the project and six Indian agricultural and technology institutions are partners in the project, providing information and technological expertise....

It is hoped that even where farmers have no access to the Internet, the Agropedia information can be used as a basis for radio plays, for example, says Balaji.

Agropedia's lead architect, T. V. Prabhakar of the Indian Institute of Technology in Kanpur, initially envisioned the website as the equivalent of Wikipedia for global agriculture three years ago, but for now it will concentrate on India-specific information.

He says that the initial phase of the project — developing a mechanism to manage the vast repository of knowledge — is nearly completed, and the next step is to develop ways to disseminate the knowledge.

Trials will soon begin in six locations around the country.

Friday, January 23, 2009

Update on Obama's first days

A few updates on U.S. President Barack Obama's first days in office:

Medical journal brings backfiles online with PMC

The Journal of the National Medical Association has made its backfile, back to 1909, OA on PubMed Central. The journal was already OA, but its own site only provides backfiles back to 2008.

Update. See also this February 2 press release from the National Library of Medicine, PubMed Central Adds Historically Significant Journal of the National Medical Association (1909-2007) to Its Free Online Holdings:

In celebration of Black History Month, the National Library of Medicine is pleased to announce an important addition to PubMed Central (PMC), its free digital archive of full-text journal articles: the complete archive of the Journal of the National Medical Association (JNMA), which observes its centennial this year.

The National Medical Association (NMA), established in 1895, is the largest and oldest national organization representing African American physicians and allied health professionals in the United States. ... The archive currently represents over 77,000 digitized pages of issues, cover to cover, through 2007. Current content will be coming at a later date. ...

Building an OA database of genome-wide association studies

Andrew D. Johnson and Christopher J. O'Donnell, An Open Access Database of Genome-wide Association Results, BMC Medical Genetics, January 22, 2009. Abstract:

Background: The number of genome-wide association studies (GWAS) is growing rapidly leading to the discovery and replication of many new disease loci. Combining results from multiple GWAS datasets may potentially strengthen previous conclusions and suggest new disease loci, pathways or pleiotropic genes. However, no database or centralized resource currently exists that contains anywhere near the full scope of GWAS results.

Methods: We collected available results from 118 GWAS articles into a database of 56,411 significant SNP-phenotype associations and accompanying information, making this database freely available here. In doing so, we met and describe here a number of challenges to creating an open access database of GWAS results. Through preliminary analyses and characterization of available GWAS, we demonstrate the potential to gain new insights by querying a database across GWAS.

Results: Using a genomic bin-based density analysis to search for highly associated regions of the genome, positive control loci (e.g., MHC loci) were detected with high sensitivity. Likewise, an analysis of highly repeated SNPs across GWAS identified replicated loci (e.g., APOE, LPL). At the same time we identified novel, highly suggestive loci for a variety of traits that did not meet genome-wide significant thresholds in prior analyses, in some cases with strong support from the primary medical genetics literature (SLC16A7, CSMD1, OAS1), suggesting these genes merit further study. Additional adjustment for linkage disequilibrium within most regions with a high density of GWAS associations did not materially alter our findings. Having a centralized database with standardized gene annotation also allowed us to examine the representation of functional gene categories (gene ontologies) containing one or more associations among top GWAS results. Genes relating to cell adhesion functions were highly over-represented among significant associations (p < 4.6×10^-14), a finding which was not perturbed by a sensitivity analysis.

Conclusions: We provide access to a full gene-annotated GWAS database which could be used for further querying, analyses or integration with other genomic information. We make a number of general observations. Of reported associated SNPs, 40% lie within the boundaries of a RefSeq gene and 68% are within 60 kb of one, indicating a bias toward gene-centricity in the findings. We found considerable heterogeneity in information available from GWAS suggesting the wider community could benefit from standardization and centralization of results reporting.

Comment. The datasets compiled by the researchers are provided via links at the end of the article.
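The bin-based density analysis described in the Results is straightforward to sketch. The toy schema, rows, and positions below are my own illustration, not the authors' actual database, which holds 56,411 SNP-phenotype associations:

```python
import sqlite3

# A miniature stand-in for a GWAS results table.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE assoc (
    snp TEXT, chrom TEXT, pos INTEGER, phenotype TEXT, pvalue REAL)""")
rows = [
    ("rs429358", "19", 45411941, "LDL cholesterol", 2e-30),  # near APOE
    ("rs7412",   "19", 45412079, "LDL cholesterol", 1e-25),
    ("rs328",    "8",  19819724, "Triglycerides",   5e-12),  # near LPL
]
conn.executemany("INSERT INTO assoc VALUES (?, ?, ?, ?, ?)", rows)

# Bin-based density: count associations per 1-Mb window, densest first.
BIN = 1_000_000
query = """
    SELECT chrom, pos / ? AS bin, COUNT(*) AS n
    FROM assoc GROUP BY chrom, bin ORDER BY n DESC
"""
for chrom, bin_id, n in conn.execute(query, (BIN,)):
    print(f"chr{chrom}:{bin_id * BIN}-{(bin_id + 1) * BIN}  {n} hits")
```

Regions whose bins accumulate hits across many independent studies (like the APOE region in this toy example) are exactly the replicated loci the authors report.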

Many new and forthcoming OA journals from Academic Journals

Academic Journals has a large number of new and planned OA journals listed on its Web site. Journals are published under the Creative Commons Attribution License, but authors transfer copyright to the publisher. The article-processing charge appears to be $550 at each journal, subject to waiver. (Thanks to Norzaidi Mohd Daud.) Organized by launch date:

First issue in January 2009:

Planned for March 2009:

Planned for April 2009:

Planned for May 2009:

Update (3/16/09). Jim Till has gathered a list of the author-side fees for some of the journals.

WHO releases cost estimates for implementing GSPA

The World Health Organization has released detailed cost estimates and time frames for implementing its Global Strategy and Plan of Action on Public Health, Innovation and Intellectual Property. For background, see posts at IP Watch or Knowledge Ecology International. From IP Watch:
... This list of expected costs is one of the outstanding elements on the plan of action the WHO secretariat had been tasked with completing by this week’s Executive Board meeting. The board, which advises and makes recommendations to the annual WHO World Health Assembly, is meeting from 19 to 27 January. ...
Funding for the OA element is included with costs to establish public health libraries; from the WHO document:

2.4(a) promote the creation and development of accessible public health libraries in order to enhance availability and use of relevant publications by universities, institutes and technical centers, especially in developing countries
Time frame: 2008-2015
Estimated funding needs in U.S. dollars: 42,210,000, plus share of subtotal in support unit costs

2.4(b) promote public access to the results of government funded research, by strongly encouraging that all investigators funded by governments submit to an open access database an electronic version of their final, peer-reviewed manuscripts
Time frame: 2008-2015
Estimated funding needs in U.S. dollars: Included in 2.4(a), plus share of subtotal in support unit costs

OA theses about Afghanistan

The Afghanistan Analyst hosts OA theses and dissertations about Afghanistan.

(Thanks to Javed Akhtar, who also links to non-OA sites hosting theses and dissertations about Afghanistan.)

More on patents as access barriers

Zhen Lei, Rakhi Juneja, and Brian D Wright, Patents versus patenting: implications of intellectual property protection for biological research, Nature Biotechnology, 27 (2009) pp. 36-40.  (Thanks to Michael Geist.)  Excerpt:

A new survey shows scientists consider the proliferation of intellectual property protection to have a strongly negative effect on research....

Here we report scientists' assessments regarding the overall effects of IP protection, as revealed in a survey of academic agricultural biologists. Scientists believe that, contrary to the current consensus, proliferation of IP protection [particularly, patents and material transfer agreements or MTAs] has a strongly negative effect on research in their disciplines....

[T]he major impediment to accessing research tools is not patents per se, but patenting as an institutional imperative in the post-Bayh-Dole era....

These findings challenge the inferences of social scientists that there are no real problems with policies encouraging increased patenting of research tools. They also help explain why agricultural biologists have become leaders in the exploration of open source biology (BiOS, Biological Innovation for Open Society) and in institutional collaborations to facilitate access to crucial enabling technologies (PIPRA, Public Intellectual Property Rights for Agriculture). They support the widespread adoption of the Uniform Biological Material Transfer Agreement (UBMTA) for exchanges among scientists, long advocated by the National Institutes of Health....

At first glance, our sample's views regarding IP protection might appear to be at odds with previous surveys attributing the research tool access problem to scientists' reluctance to share materials they control, motivated by increasing competitive pressures, cost of sharing or commercial concerns. However, none of those surveys asked scientists for their own assessment of the effects of IP protection on research in their fields. Our general conclusions are consistent with the fact that each of the AAAS' Project on Science and Intellectual Property in the Public Interest four-country studies reports (without comment) that more than half of respondents believe that IP rights impair the free and open exchange of materials and/or research results....

Also see the Supplementary information at the journal web site.

When depositing articles in a repository, include metadata about their cited references

Stevan Harnad, The fundamental importance of capturing cited-reference metadata in Institutional Repository deposits, Open Access Archivangelism, January 22, 2009.  Excerpt:

On 22-Jan-09...Francis Jayakanth wrote on the eprints-tech list:

"Till recently, we used to include references for all the uploads that are happening into our repository....Our experience has been that when the references are copied and pasted...from the PDF file, invariably non-ascii characters [are] found in almost every reference. Correcting the non-ascii characters takes [a] considerable amount of time. Also, as to be expected, the references from [different] publishers are in different styles, which may not make reference linking straightforward. Both these factors forced us [to] take a decision to do away with uploading of references, henceforth...."

The items in an article's reference list are among the most important of metadata, second only to the equivalent information about the article itself....If each Institutional Repository (IR) has those canonical metadata for every one of its deposited articles as well as for every article cited by every one of its deposited articles, that creates the glue for distributed reference interlinking and metric analysis of the entire distributed OA corpus webwide, as well as a means of triangulating institutional affiliations and even name disambiguation.

Yes, there are some technical problems to be solved in order to capture all references, such as they are, filtering out noise, but those technical problems are well worth solving (and sharing the solution) for the great benefits they will bestow....

(Roman Chyla has replied to eprints-tech with one potential solution:

"The technical solution has been there for quite some time, look at citeseer where all the references are extracted automatically (the code of the citeseer, the old version, was available upon request - I don't know if that is the case now, but it was in the past). That would be the right way to go, imo....")
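The non-ASCII cleanup that Jayakanth describes is largely mechanical, and automating it is cheaper than abandoning reference capture. A minimal sketch in Python; the replacement table and function name are my own illustration, not from the list thread:

```python
import unicodedata

# Common PDF copy-paste artifacts and plain-ASCII replacements.
# Illustrative, not exhaustive.
REPLACEMENTS = {
    "\u2013": "-",                  # en dash
    "\u2014": "-",                  # em dash
    "\u2018": "'", "\u2019": "'",   # curly single quotes
    "\u201c": '"', "\u201d": '"',   # curly double quotes
    "\u00a0": " ",                  # non-breaking space
}

def clean_reference(text: str) -> str:
    """Normalize a reference string pasted from a PDF to plain ASCII."""
    for bad, good in REPLACEMENTS.items():
        text = text.replace(bad, good)
    # NFKD decomposition splits ligatures (fi, fl) and accented letters
    # into base characters; then drop any remaining non-ASCII bytes.
    decomposed = unicodedata.normalize("NFKD", text)
    return decomposed.encode("ascii", "ignore").decode("ascii")
```

A pass like this over pasted reference lists would remove the character-level noise, leaving only the harder (and more valuable) problem of parsing the varied citation styles.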

Accessing Google-scanned books from outside the US

If you remember, Google Book Search tends to block access to users outside the US, even when they try to click through to books that are in the public domain in both the US and the user's country (1, 2, 3, 4, 5).

Klaus Graf has posted instructions on how non-US users can create a US proxy and regain access.
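I won't reproduce Graf's instructions here, but the general technique is simply to route requests through a US-based proxy so that Google sees a US address. A minimal Python sketch; the proxy address is a placeholder, to be replaced with a proxy you actually control or trust:

```python
import urllib.request

# Placeholder address; substitute a real US-based proxy.
US_PROXY = "http://us-proxy.example.com:8080"

def make_us_opener(proxy_url: str = US_PROXY) -> urllib.request.OpenerDirector:
    """Build a URL opener that routes http/https traffic via the proxy."""
    handler = urllib.request.ProxyHandler({"http": proxy_url,
                                           "https": proxy_url})
    return urllib.request.build_opener(handler)

# Usage (network call, not run here):
# opener = make_us_opener()
# page = opener.open("http://books.google.com/books?id=...").read()
```

The same effect can be had without code by setting the proxy in a browser's connection settings, which is closer to what most readers of Graf's instructions would do.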

Comment.  Google:  Why does it have to be this difficult?

Croatian OA journals of archaeology

Charles Ellwood Jones has compiled a list of 15 Croatian Open Access Archaeology Journals.

Open Data Commons moves to OKF

Rufus Pollock, Open Data Commons now at the OKF, Open Knowledge Foundation Blog, January 22, 2009.  Excerpt:

Just over a year ago Open Data Commons was launched as a home for the new open data licenses such as the PDDL which had been developed by Jordan Hatcher and Dr Charlotte Waelde.

From early on, Jordan, the legal expert, aficionado-of-openness, and main mover behind these efforts, had been talking with us about finding a more permanent home for ODC. After further discussions over the last few months, we are pleased to announce that Open Data Commons has now found a new home here at the OKF [Open Knowledge Foundation].

One immediate result of this is that there is now a new mailing list for discussion of the ODC licenses (and any related open data issues)....

Thursday, January 22, 2009

New IAP program will support OA in developing countries

The InterAcademy Panel on International Issues (IAP) has launched a new Program on Digital Knowledge Resources and Infrastructure in Developing Countries.  From the site:

The primary goal of this programme is to engage IAP and its Members in strengthening their own scientific and technical (S&T) capacity as well as [that] in developing countries and transitional economies by enhancing access to and use of digital knowledge and the related infrastructure. More specifically, it will promote the goal established in IAP's Strategic Plan to enhance the quantity and quality of information on issues of science and society that is being exchanged among member academies, thereby building the policy advisory capacity of individual academies. The programme also will build directly upon the activities and results of the 2004-2007 Initiative on Access to Scientific Information in Developing Countries. The two major areas of focus are:

  • access to and use of digital S&T data and information, and
  • access to and use of digital networks and infrastructure for research and education....

Thanks to Paul Uhlir for the alert and for adding these details:

This program is focused on the greater involvement of science academies in a number of areas in developing countries, including: OA digitization of valuable analog research material [and] open institutional repositories....


New US board on research data will host a meeting on OA mandates

The U.S. National Research Council has launched a new Board on Research Data and Information.  From the site:

The mission of the Board on Research Data and Information is to improve the management, policy, and use of digital data and information for science and the broader society....

Acting through the NRC, BRDI focuses on the following tasks within its primary mission areas:

  1. Address emerging issues in the management, policy, and use of research data and information at the national and international levels.
  2. Through studies and reports of the NRC, provide independent and objective advice, reviews of programs, and assessment of priorities concerning research data and information activities and interests of its sponsors.
  3. Encourage and facilitate collaboration across disciplines, sectors, and nations with regard to common interests in research data and information activities.
  4. Monitor, assess, and contribute to the development of U.S. government and research community positions on research data and information programs and policies.
  5. Initiate or respond to requests for consensus studies, workshops, conferences, and other activities within the Board's mission, and provide oversight for the activities performed under the Board's auspices.
  6. Broadly disseminate and communicate the results of the Board's activities to its stakeholders and to the general public.

Thanks to Paul Uhlir for the alert and this note:

The Board's inaugural meeting will be held January 29-30, and will include a mini-Symposium on Author Deposit Mandates for Government Grantees. The symposium, which is open to the public and will be netcast (audio only), will begin at 4:30 EST (Washington, DC time) on the afternoon of Thursday, 29 January. Comments and questions from remote participants will be possible....

Interview with an OA archive

Jennifer Howard, Archive Watch: Bohemian Rhapsody, Wired Campus, January 20, 2009. Interview with Edward Whitley, editorial director of The Vault at Pfaff’s, an OA "archive of art and literature by New York's nineteenth-century Bohemians".

... Q. Who uses the site? How much traffic does it get?

A. In 2008 we had over 114,000 visits to the site by over 39,000 unique visitors. As far as we can tell, it’s mostly scholars and students who are visiting the site, but we also hear from people outside of academia who stumble onto the site out of an interest in some aspect of American history and they end up falling in love with the Pfaffians. ...

Q. How does being involved in this kind of work affect a scholar’s chances for tenure and promotion?

A. When I started this project as an untenured assistant professor during my first year of employment, I vowed that I wouldn’t let myself become a test case for whether or not someone could get tenure based solely on digital scholarship. I’ve kept that promise over the past five years, and as tenure looms on the horizon I’m grateful that I’ve put the Vault at Pfaff’s aside often enough to work on publishing in traditional venues. Having said that, I can’t help but wonder how much further along I could be with this project if I’d dedicated myself to it 100 percent. ...

Q. How do you sustain the project?

A. Without the institutional support that Lehigh University has provided through the Digital Scholarship Center, this project would never have gotten off the ground. A strong institutional commitment to digital scholarship is absolutely essential to the survival of projects like this. Lehigh has also been very generous in providing additional funds to pay research assistants, and the Gladys Krieble Delmas Foundation gave us a grant a while back that really allowed us to get the bulk of the work done. ...

Presentations on UNAM's 3R repository project

3R (Red de Repositorios Universitarios de Recursos Digitales) is a project to develop a network of repositories at the Universidad Nacional Autónoma de México (UNAM). See this set of presentations on the project, recently self-archived, in Spanish with English abstracts. From the abstracts:
The 3R project is part of a UNAM megaproject. It consists of four stages: research, conceptual models, development, and implementation. The project aims to create the prototype of a network of repositories at UNAM, which will allow greater use and visibility of the intellectual output of the members of the community. ...
See also our past post which mentioned the 3R project.

An argument from the Enlightenment for OA and against the Google settlement

Robert Darnton, Google and the Future of Books, The New York Review of Books, February 12, 2009.  Darnton is the Director of the Harvard University Library.  At his direction, Harvard became the first and largest library to refuse to participate in the Google settlement.  Excerpt:

How can we navigate through the information landscape that is only beginning to come into view? The question is more urgent than ever following the recent settlement between Google and the authors and publishers who were suing it for alleged breach of copyright....After lengthy negotiations, the plaintiffs and Google agreed on a settlement, which will have a profound effect on the way books reach readers for the foreseeable future. What will that future be?

No one knows, because the settlement is so complex that it is difficult to perceive the legal and economic contours in the new lay of the land. But those of us who are responsible for research libraries have a clear view of a common goal: we want to open up our collections and make them available to readers everywhere....

Despite its principles, the [18th century] Republic of Letters, as it actually operated, was a closed world, inaccessible to the underprivileged. Yet I want to invoke the Enlightenment in an argument for openness in general and for open access in particular.

[M]ost of us would subscribe to the principles inscribed in prominent places in our public libraries. "Free To All," it says above the main entrance to the Boston Public Library....

Our republic was founded on faith in the central principle of the eighteenth-century Republic of Letters: the diffusion of light....This faith is embodied in the United States Constitution. Article 1, Section 8, establishes copyright and patents "for limited times" only and subject to the higher purpose of promoting "the progress of science and useful arts." The Founding Fathers acknowledged authors' rights to a fair return on their intellectual labor, but they put public welfare before private profit.

How to calculate the relative importance of those two values? ...

When the Americans gathered to draft a constitution thirteen years later, they generally favored the view that had predominated in Britain. Twenty-eight years seemed long enough to protect the interests of authors and publishers. Beyond that limit, the interest of the public should prevail....

How long does copyright extend today? According to the Sonny Bono Copyright Term Extension Act of 1998 (also known as "the Mickey Mouse Protection Act," because Mickey was about to fall into the public domain), it lasts as long as the life of the author plus seventy years....As things stand now, for example, Sinclair Lewis's Babbitt, published in 1922, is in the public domain, whereas Lewis's Elmer Gantry, published in 1927, will not enter the public domain until 2022....

[W]e live in a world designed by Mickey Mouse, red in tooth and claw....

[T]he Journal of Comparative Neurology now costs $25,910 for a year's subscription; Tetrahedron costs $17,969 (or $39,739, if bundled with related publications as a Tetrahedron package); the average price of a chemistry journal is $3,490; and the ripple effects have damaged intellectual life throughout the world of learning. Owing to the skyrocketing cost of serials, libraries that used to spend 50 percent of their acquisitions budget on monographs now spend 25 percent or less. University presses, which depend on sales to libraries, cannot cover their costs by publishing monographs. And young scholars who depend on publishing to advance their careers are now in danger of perishing....

The eighteenth-century Republic of Letters had been transformed into a professional Republic of Learning, and it is now open to amateurs —amateurs in the best sense of the word, lovers of learning among the general citizenry. Openness is operating everywhere, thanks to "open access" repositories of digitized articles available free of charge, the Open Content Alliance, the Open Knowledge Commons, OpenCourseWare, the Internet Archive, and openly amateur enterprises like Wikipedia. The democratization of knowledge now seems to be at our fingertips. We can make the Enlightenment ideal come to life in reality.

At this point, you may suspect that I have swung from one American genre, the jeremiad, to another, utopian enthusiasm....

Libraries exist to promote a public good: "the encouragement of learning," learning "Free To All." Businesses exist in order to make money for their shareholders —and a good thing, too, for the public good depends on a profitable economy. Yet if we permit the commercialization of the content of our libraries, there is no getting around a fundamental contradiction. To digitize collections and sell the product in ways that fail to guarantee wide access would be to repeat the mistake that was made when publishers exploited the market for scholarly journals, but on a much greater scale, for it would turn the Internet into an instrument for privatizing knowledge that belongs in the public sphere. No invisible hand would intervene to correct the imbalance between the private and the public welfare. Only the public can do that, but who speaks for the public? Not the legislators of the Mickey Mouse Protection Act....

Libraries say, "Digitize we must." But not on any terms. We must do it in the interest of the public, and that means holding the digitizers responsible to the citizenry....

Don't get me wrong. I know that businesses must be responsible to shareholders. I believe that authors are entitled to payment for their creative labor and that publishers deserve to make money from the value they add to the texts supplied by authors....

But we, too, cannot sit on the sidelines, as if the market forces can be trusted to operate for the public good. We need to get engaged, to mix it up, and to win back the public's rightful domain. When I say "we," I mean we the people, we who created the Constitution and who should make the Enlightenment principles behind it inform the everyday realities of the information society. Yes, we must digitize. But more important, we must democratize. We must open access to our cultural heritage. How?  By rewriting the rules of the game, by subordinating private interests to the public good, and by taking inspiration from the early republic in order to create a Digital Republic of Learning....

An enterprise on such a scale [as the digital collection contemplated in the Google settlement] is bound to elicit reactions of the two kinds that I have been discussing: on the one hand, utopian enthusiasm; on the other, jeremiads about the danger of concentrating power to control access to information....

Google can make the Enlightenment dream come true.  But will it? The eighteenth-century philosophers saw monopoly as a main obstacle to the diffusion of knowledge —not merely monopolies in general...but specific monopolies such as the Stationers' Company in London and the booksellers' guild in Paris, which choked off free trade in books.

By spreading the cost in various ways —a rental based on the amount of use of a database or a budget line in the National Endowment for the Humanities or the Library of Congress— we could have provided authors and publishers with a legitimate income, while maintaining an open access repository or one in which access was based on reasonable fees. We could have created a National Digital Library—the twenty-first-century equivalent of the Library of Alexandria. It is too late now. Not only have we failed to realize that possibility, but, even worse, we are allowing a question of public policy —the control of access to information— to be determined by private lawsuit....

The district court judge will pronounce on the validity of the settlement, but that is primarily a matter of dividing profits, not of promoting the public interest....

If Google makes available, at a reasonable price, the combined holdings of all the major US libraries, who would not applaud? Would we not prefer a world in which this immense corpus of digitized books is accessible, even at a high price, to one in which it did not exist?  Perhaps, but the settlement creates a fundamental change in the digital world by consolidating power in the hands of one company....

OA resources on Egypt's Karnak temple

Digital Karnak is a recently launched project at the University of California at Los Angeles with three goals:
... (1) to assemble databases of information related to Karnak, (2) build an interactive computer model of the site, and (3) create a series of resources using the model and databases that are available online free-of-charge through this website and can be easily used for undergraduate education. ...
See also the article in Wired Campus.

Nature expands green and hybrid gold OA options

Expanded green and gold routes to open access at Nature Publishing Group, a press release from the Nature Publishing Group, January 22, 2009.  Excerpt:

Nature Publishing Group (NPG) is expanding open access choices for authors in 2009, through both 'green' self-archiving and 'gold' (author-pays) open access publication routes. Eleven more journals published by NPG are offering an open access option from January 2009. NPG has also expanded its Manuscript Deposition Service to include 32 further titles.

An open-access option is now available to authors submitting original research to Molecular Therapy, published by NPG on behalf of the American Society of Gene Therapy, and to ten journals owned by NPG....

For a publication fee of £2,000 / $3000 / €2400, articles will be open access on the journal website and identified in the online and print editions of the journal with an open-access icon. The final full text version of the article will be deposited immediately on publication in PubMed Central (PMC), and authors will be entitled to self-archive the published version immediately on publication. Open access articles will be published under a Creative Commons license. Authors may choose between the Attribution-Noncommercial-No Derivative Works 3.0 Unported and the Attribution-Noncommercial-Share Alike 3.0 Unported Licence. The Attribution-Noncommercial-Share Alike Licence permits derivative works, ensuring that authors can comply with funders such as the Wellcome Trust. Other articles will continue to be published under NPG's exclusive License to Publish, and its usual self-archiving policy will apply....

Site licence prices will be adjusted in line with the amount of subscription-content published annually....

These titles join The EMBO Journal, EMBO reports and British Journal of Cancer, which already offer an open access option to authors. Molecular Systems Biology, published by NPG in partnership with the European Molecular Biology Organization, is a fully open access journal.

Continuing its support for the 'green route' to open access on high-impact journals, NPG has extended its Manuscript Deposition Service. Forty-three journals published by NPG now offer the free service to help authors fulfil funder and institutional mandates for public access. In addition to Nature and the Nature research journals, 28 society and academic journals published by NPG now offer a Manuscript Deposition Service to authors of original research articles. A full list of participating journals is available [here].

NPG's Manuscript Deposition Service will deposit authors' accepted manuscripts with PMC and UK PubMed Central (UKPMC). For participating journals, the service is open to authors whose funders have an agreement with PMC or UKPMC to deal with authors' manuscripts from publishers.

NPG's License to Publish encourages authors of original research articles to self-archive the accepted version of their manuscript in PMC or other appropriate funding bodies' archive, their institution's repositories and, if they wish, on their personal websites. In all cases, the author's version of the accepted manuscript can be made publicly accessible six months after publication. NPG does not require authors of original research articles to transfer copyright. NPG's policies are explained in detail [here].

Also see today's press release on the new OA and deposit service from Molecular Therapy.


  • I applaud the expansion of NPG's green and gold OA options.  For the NPG hybrid OA journals, I especially applaud the policy of reducing subscription prices in proportion to author uptake, in contrast to the majority of hybrid OA journals, which still use a double-charge business model. 
  • I have mixed feelings about NPG's self-archiving policy, and have had since 2005:  thumbs up for positively encouraging self-archiving, and not merely permitting it, but thumbs down for demanding the six-month embargo. 
  • I have another quibble which is minor by comparison but too large to let go.  It's true that green OA is OA through repositories or self-archiving, but it's not true that gold OA is "author pays".  Gold OA is simply OA through journals, regardless of the journal's business model.  Some OA journals charge publication fees, but most do not.  Even when OA journals do charge fees, it's misleading to describe them as "author pays".  The more NPG expands its OA options and advertises them, the more important it is to avoid this misleading terminology.


SPARC resources will foster campus-based publishing

SPARC has launched a guide, a bibliography, a set of case studies, a discussion forum on campus-based publishing, and a Resource Center to pull them all together.  From today's announcement:

SPARC (the Scholarly Publishing and Academic Resources Coalition) today released Campus-based Publishing Partnerships: A guide to critical issues, by Raym Crow. The guide is the core of a new Web site, the Campus-based Publishing Resource Center, designed by a panel of advisors from the library and university press communities to support successful publishing partnerships.

Campus publishing partnerships can offer universities greater control over the intellectual products that they help create. But to fully realize this potential, partnerships need to evolve from ad hoc working alliances to stable, long-term collaborations. SPARC’s guide will help partnering organizations to: 

  • Establish practical governance and administrative structures;
  • Identify funding models that accommodate the different financial objectives of libraries and presses;
  • Define objectives that advance the missions of both the library and of the press, without disrupting the broader objectives of either; and
  • Demonstrate the value of the collaboration to university administrators....

The guide reviews current library-press initiatives, describes the potential benefits of partnerships, and provides an overview of the financial and operating criteria for launching and sustaining a successful collaboration. It provides practical guidance on structuring a publishing partnership, including case studies that illustrate key concepts.

“This is a moment of great opportunity for academic publishing and for university presses, in particular,” said Laura Cerruti, Director of Digital Content Development for the University of California Press. “SPARC's efforts to survey those in the trenches – librarians, university press publishers, and other active campus publishing entities – have resulted in an invaluable resource for those who are just beginning to tap into their campus's publishing needs and priorities. It gives them a head start, if you will.” ...

“While the missions of libraries and presses differ,” said Heather Joseph, executive director of SPARC, “both entities recognize the growing need to address fundamental problems in scholarly publishing and to understand the interdependence of their organizations. By developing this resource, we hope to drive a shared exploration of new, innovative, sustainable publishing models.” ...

[T]he site will be expanded to include FAQs, sample planning documents, an index of collaborative initiatives, and other content suggested by the community....

DRIVER overview

Fleur Stigter, Who and What Drives Driver? Tell Fleur, January 19, 2009.  Excerpt:

A small warning…. This is a long post about (open source) e-Infrastructure and (Open Access) digital repositories....

Having established the necessary network of people and institutions, the Driver network is now working on an actual pan-European infrastructure for digital repositories....

According to the Driver website, the main aim is to build a Digital Repository infrastructure....

The repositories contain today the full spectrum of scholarly materials, from theses, technical reports and working papers to digitised text and image collections and they can contain sets of primary research data.(…)

Recently they announced the software release D-NET v. 1.0: “This open source software offers a tool-box for deploying a customizable distributed system featuring tools for harvesting and aggregating heterogeneous data sources. A variety of end-user functionalities are applied over this integration, ranging from search, recommendation, collections, profiling to innovative tools for repository manager users.”

The press release continues: “A running instance of the software, namely the “European Information Space”, maintained by the DRIVER Consortium to aggregate Open Access publications from European Institutional Repositories, can be accessed online at:  (Search the Repositories Portal).”

Additional Information:

“Towards one e-Infrastructure”: BELIEF’s portal allows you to explore everything about e-Infrastructures and the global virtual research communities that they empower.  [Also see] the related Digital Library....

Open licensing and open science

Michael Nielsen, The role of open licensing in open science, Michael Nielsen, January 21, 2009.

... When talking to some open science advocates, I hear a great deal of interest and enthusiasm for open licenses for science. This enthusiasm seems prompted in part by the success of open licenses in promoting open culture. I think this is great - with a few minor caveats, I’m a proponent of open licenses for science - but the focus on open licenses sometimes bothers me. It seems to me that while open licenses are important for open science, they are by no means as critical as they are to open culture; open access is just the beginning of open science, not the end. ...

I’m not sure what role licensing will play in open science, but I do think there are some clear signs that it’s not going to be as central a role as it’s played in open culture.

The first reason for thinking this is that a massive experiment in open licensing has already been tried within science. By law, works produced by US Federal Government employees are, with some caveats, automatically put into the public domain. ...

This policy has greatly enriched the creative commons. And it’s led to enormous innovation - for example, I’ve seen quite a few mapping services that build upon US Government data, presumably simply because that data is in the public domain. But in the scientific realm I don’t get the impression that this is doing all that much to promote the growth of the same culture of mass collaboration as open licenses are enabling. ...

The second reason for questioning the centrality of open licenses is the observation that the main barriers to remixing and extension of scientific content aren’t legal barriers. They are, instead, cultural barriers. If someone copies my work, as a scientist, I don’t sue them. If I were to do that, it’s in any case doubtful that the courts would do more than slap the violator on the wrist - it’s not as though they’ll directly make money. Instead, there’s a strong cultural prohibition against such copying, expressed through widely-held community norms about plagiarism and acceptable forms of attribution. ...

[T]he legal issues around openness are only a small part of a much larger set of issues, issues which are mostly cultural. The key to moving to a more open scientific system is changing scientists' hearts and minds about the value and desirability of more openly sharing information, not reforming the legal rights under which they publish content.

So, what’s the right approach to licensing? John Wilbanks has argued, persuasively in my opinion, that data should be held in the public domain. I’ve sometimes wondered if this argument shouldn’t be extended beyond data, to all forms of scientific content, including papers ... After all, if the scientific community is primarily a reputation economy, built around cultural norms, then why not simply remove the complication of copyright from the fray? ...

More on OCLC's WorldCat policy

Wendy M. Grossman, Why you can't find a library book in your search engine, The Guardian, January 22, 2009.

... Put any book title into your favourite search engine, and ... you won't find the nearest library where you can borrow that book. ...

Yet there is an alternative that few people seem aware of: Worldcat, which offers web access to the largest repository of bibliographic data in the world - from the 40-year-old Ohio-based non-profit Online Computer Library Center. But Worldcat suffers from the same problem on a larger scale. OCLC shares only 3m of its 125m records with Google Books; none of them show up in an ordinary search.

You might expect forward-thinking libraries to put their databases online, to encourage people through their doors. But they can't. Even though they created the data, pay to have records added to the database and pay to download them, they can't.

In November, OCLC announced new rules covering the use of Worldcat data due to go live on 19 February. Among other things, the new policy prohibits any use - transfer, sharing - that "substantially replicates the function, purpose, and/or size of WorldCat". In other words, no publicly searchable databases. ...

Presentations from CNI fall meeting

The presentations from the Coalition for Networked Information fall meeting (Washington, DC, December 8-9, 2008) are now online. Several are related to OA. (Thanks to Charles Bailey.)

See also: We previously posted John Wilbanks' and Michele Kimpton's presentations from the meeting.

Wednesday, January 21, 2009

U of Michigan will transfer OAIster to OCLC

University of Michigan and OCLC form partnership to ensure long-term access to OAIster database, a press release from OCLC, January 21, 2009.  Excerpt:

The University of Michigan and OCLC today announced that they have formed a partnership that will ensure continued public access to open-archive collections through the OAIster database, and will expand the visibility of these collections to millions of information seekers through OCLC services.

Launched in 2002 with grant support from the Andrew W. Mellon Foundation, OAIster was developed to test the feasibility of building a portal to open archive collections using the Open Archives Initiative Protocol for Metadata Harvesting (OAI-PMH). OAIster has since grown to become one of the largest aggregations of records pointing to open archive collections in the world with over 19 million records contributed by over 1,000 organizations worldwide.
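For readers unfamiliar with OAI-PMH, the protocol underlying OAIster: a harvester collects metadata by issuing plain HTTP requests to each repository's ListRecords endpoint and parsing the XML it gets back. Here is a minimal sketch in Python; the sample response and its identifier and title are fabricated for illustration, where a real harvester would fetch the XML over HTTP from a live repository:

```python
import xml.etree.ElementTree as ET

# A simplified, hypothetical OAI-PMH ListRecords response, of the kind a
# harvester like OAIster receives from each repository it polls.
SAMPLE_RESPONSE = """<?xml version="1.0"?>
<OAI-PMH xmlns="http://www.openarchives.org/OAI/2.0/">
  <ListRecords>
    <record>
      <header>
        <identifier>oai:example.org:1234</identifier>
        <datestamp>2009-01-15</datestamp>
      </header>
      <metadata>
        <oai_dc:dc xmlns:oai_dc="http://www.openarchives.org/OAI/2.0/oai_dc/"
                   xmlns:dc="http://purl.org/dc/elements/1.1/">
          <dc:title>An Example Preprint</dc:title>
        </oai_dc:dc>
      </metadata>
    </record>
  </ListRecords>
</OAI-PMH>"""

# Namespace prefixes used when searching the parsed tree.
NS = {
    "oai": "http://www.openarchives.org/OAI/2.0/",
    "dc": "http://purl.org/dc/elements/1.1/",
}

def harvest(xml_text):
    """Extract (identifier, title) pairs from a ListRecords response."""
    root = ET.fromstring(xml_text)
    records = []
    for rec in root.findall(".//oai:record", NS):
        ident = rec.find(".//oai:identifier", NS).text
        title = rec.find(".//dc:title", NS).text
        records.append((ident, title))
    return records

print(harvest(SAMPLE_RESPONSE))
# → [('oai:example.org:1234', 'An Example Preprint')]
```

A real harvester would also loop on the protocol's resumptionToken element to page through large result sets, repeating this request-and-parse cycle across the thousand-plus repositories OAIster aggregates.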

Under the partnership, the OAIster site will continue to function as the public interface to OAIster collections, through funding provided by OCLC to the University of Michigan. Later in 2009, metadata harvesting operations will transfer from the University of Michigan to OCLC.

"The University of Michigan approached OCLC about managing future operations for the OAIster project to ensure its long-term viability," said John Wilkin, Associate University Librarian, University of Michigan Library. "OCLC plays a pivotal role in the business of metadata creation and distribution. Situating OAIster with OCLC helps to create an increasingly comprehensive discovery resource for users." ...

OCLC recognizes that open archive collections are critical for scholarly research, communications and scholarship. To that end, OCLC commits to building on the success of OAIster by identifying open archive collections of interest to libraries and researchers, and ensuring that open archive collections will be freely discoverable and accessible to information seekers worldwide.

Starting in late January 2009, while OAIster continues to be freely available at the Web site, OCLC will host a version of OAIster on OCLC's FirstSearch platform and make it available through subscriptions to the FirstSearch Base Package at no additional charge.


  • Although OCLC will create a TA version of OAIster in FirstSearch, it's committed to maintain an OA version as well.  So far, so good.  But then, many people thought OCLC was committed to OA for WorldCat bibliographic data.  OCLC is taking steps to resolve the WorldCat data controversy, but the tacking raises questions about the future of OAIster.
  • I do want to see OAIster survive, and if OCLC can "ensure its long-term viability" and Michigan can't, then that's a ground for hope.
  • I'm not yet clear on how the move to OCLC will "create an increasingly comprehensive discovery resource for users."  Is OCLC thinking about some kind of OAIster-WorldCat hybrid?  Most resources indexed in OAIster are OA, and most resources indexed in WorldCat are not.  That creates some interesting possibilities for synergy, but also some risks.  Right now OAIster is the most comprehensive of all the cross-archive search engines for OA repositories.  Part of its utility is that it covers so many repositories and part is that it covers essentially nothing else.  If it survives this acquisition as a component of a larger service, and users can't filter searches for OA resources or identify OA resources from the hit list, then we will have lost something valuable.  There will be a niche for another OAIster.
  • Also see our past posts on OAIster and OCLC.

Update (1/22/09).  Also see Dorothea Salo's comments.

Update (1/23/09).  Dorothea Salo has posted (1) an interview with OAIster's Kat Hagedorn about the OCLC deal and (2) some further comments on the deal.

Update (1/28/09). Dorothea Salo has posted some clarifications from OCLC about the OAIster deal.


New issue of eIFL newsletter

The January/February 2009 issue of the eIFL newsletter is now available. See especially the section on eIFL-OA.

OA, drawn

Dave Gray, Free the facts!, January 17, 2009. Describes, in drawings, the problems of the toll-access journal system. See also his blog post.

Slidecast on open education, et al.

M. S. Vijay Kumar, Opening Up Education: The Collective Advancement of Education through Open Technology, Open Content, and Open Knowledge, presented at EDUCAUSE Live!, October 17, 2008.

See also our previous post on the book of the same name, edited by Kumar.

New version of PURE repository software

PURE is a Current Research Information System and full-text repository by Danish software developers Atira. Version 3.12.0 was released on January 13.

WHO backs away from R&D treaty

Kaitlin Mara, WHO Members Make Informal Progress On Plan Of Action As Executive Board Opens, Intellectual Property Watch, January 20, 2009.

A small, diverse group of [World Health Organization] member states meeting informally over the weekend were able to resolve questions of which institutions should be actors in a broader plan of action for the implementation of the World Health Organization Global Strategy on Public Health, Innovation, and Intellectual Property, according to sources. The informal agreement increases the likelihood of consensus when the document is discussed at the WHO Executive Board meeting this week, they said.

The informal meeting immediately preceded the Executive Board (EB) meeting, which advises and makes recommendations to the WHO’s decision-making body, the annual World Health Assembly (WHA) in May. The EB is taking place between 19 and 27 January.

The text produced on Sunday is available here.

The global strategy was the endpoint of a series of discussions and working groups begun in 2003, when WHO first began to explore the relationship between public health and innovation, in particular innovation on medical products related to diseases that disproportionately affect the developing world. Intellectual property protection, and its influence as an incentive for innovation and in determining prices of medical products, was necessarily a key component of the discussions and the final strategy.

The plan of action was not finalised at the close of the WHA in May 2008, and the WHO Director General was tasked with finishing several unfinished components, in particular indicators of progress for the strategy’s implementation, funding needs, and relevant stakeholders. The plan matches specific actions with stakeholder groups meant to carry them out. ...

[A key decision made at the informal meeting] was the removal of the WHO from a separate list of stakeholders in exploratory discussions on instruments or mechanisms for essential health and biomedical research and development, including a possible treaty on those issues. The agreed stakeholders for the possible treaty are interested governments, and other relevant organisations including NGOs. ...

The informal group meeting included, according to sources, representatives from Brazil, Canada, Chile, the EU, Norway, Thailand and the United States, with India and the African Group apparently unable to attend. ...

Comment. To be clear, nothing official was decided this week. But the idea is for influential member states to line up in advance, preparing the way for official agreement. We should hear this week or next whether the Executive Board follows suit.

OA for publicly-funded research was part of the discussions for an R&D treaty. See also our past posts on the R&D treaty.

Authors' rights: perceptions vs. reality

Publishing Research Consortium, Journal authors’ rights: perceptions and reality, presentation, undated but apparently recent. Preliminary report of a forthcoming paper.

The study, conducted for the publishing industry, compares what authors think journal agreements permit them to do with their work with a survey of journal policies. From the conclusions:

  • Authors underestimate what they are allowed to do with pre-publication versions
  • Authors significantly overestimate their rights to self-archive final published versions
  • Authors significantly underestimate all their other rights with regard to final published versions
  • Publishers need to make their policies (and the reasons for them) much more transparent

JISC launches an open education pilot project

Mandy Garner, The University of Europe: accessible to all, The Guardian, undated but published on January 16, 2009.  Excerpt:

Universities in Europe are looking to embrace a new form of learning, called open content, which could blow away the division between university students and the rest of the population.

In the UK, the Joint Information Systems Committee (Jisc) and the Higher Education Academy are launching a £5.7m pilot scheme to investigate the impact of open content and to look at issues of how to contextualise existing online material so anyone can make sense of it. Several European academics are already experimenting and the European Commission has expressed interest.

Malcolm Read, executive secretary of Jisc, says he wants to find out how many lecturers are willing to put their course material up for open debate and how difficult it will be to contextualise the material. Read thinks open content will boost the profile of university teaching, widen participation and raise standards, as the public will migrate to the best material. The pilot will fund individual lecturers, subject areas and institutions.

Read has had discussions with the European Commission and says he would be "very surprised if they did not start funding work in this area very soon". Allied to this is the fact that more and more European universities are offering some courses in English, which makes them more accessible internationally....

Another open-access facility creating a buzz in the world of elearning is Mooc — Massive Open Online Courses. These are super-sized open education courses....

Engineering association provides OA to theses in its field

The European Association for Signal Processing (EURASIP) has launched PhD Thesis Links.  From the site:

Each year hundreds of PhD theses are published by Universities around the world. Access to these documents is usually via contacting the library where the degree was awarded. Via this web page, EURASIP aim to grow a list and directory of theses that will allow a much wider dissemination for theses produced by EURASIP members or the students of current EURASIP members.

This page therefore links to relevant signal, speech and image processing PhD theses. The thesis documents are held on the EURASIP web site. Any relevant thesis will be listed and we will also list other higher degree documents such as MPhil or EngD's.

PS:  Also see our past posts on EURASIP.

Notes on Spanish OA conference

Javier Pérez Iglesias, 3ras Jornadas de OS-Repositorios: de la creación de repositorios institucionales a su promoción, El profesional de la información, January 16, 2009. Notes on La proyección de los repositorios institucionales (Madrid, December 10-12, 2008). Read it in the original Spanish or Google's English.

See also our past posts on the conference (1, 2).

What Obama can do to promote openness

Jonathan Gray, What Obama can do to promote openness, Open Knowledge Foundation Blog, January 20, 2009.
  1. ... Open government data. Make core government data open so that it can be re-used in mashups, visually represented, used in semantic web applications and so on! ...
  2. Open access to publicly funded research. ...
  3. Publish public information in way which makes it easy to re-use. For example, publish in XML or Text/CSV, not PDF files which data must be extracted from. Allow direct, bulk downloading, rather than access through an API or piecemeal access via a web service. ...
  4. Legal and licensing clarity. Be clear about what can and can’t be done with public content and data - with explicit legal and licensing statements, terms of use, and so on. Be clear what is in the public domain and what is free for re-use as long as attribution is given. Be clear about what is not available for use - including material where copyright is held by third parties. ...
  • Make it open by default. Make public content and data - whether it’s government data, or publicly funded digitisation of cultural heritage artefacts - open by default. ...
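
Gray's third point is easy to demonstrate in a few lines: structured formats like CSV round-trip losslessly, while data locked inside a PDF must be scraped back out. A minimal sketch (the dataset and field names are hypothetical, chosen only for illustration):

```python
import csv
import io

# Hypothetical public dataset: agency research spending by year.
rows = [
    {"year": "2007", "agency": "NIH", "spending_musd": "29178"},
    {"year": "2008", "agency": "NIH", "spending_musd": "29607"},
]

# Publishing as CSV: any consumer can re-parse the file exactly as released.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["year", "agency", "spending_musd"])
writer.writeheader()
writer.writerows(rows)

# A downstream user re-reads the data with no extraction step at all.
parsed = list(csv.DictReader(io.StringIO(buf.getvalue())))
assert parsed == rows  # the round-trip is exact
```

With a PDF release, by contrast, a reader would need layout-aware extraction tooling just to recover the table, and the result is rarely exact.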

New edition of ETD bibliography

Charles Bailey has released version 3 of his Electronic Theses and Dissertations Bibliography. See the announcement.

Reports on OA conferences in Russia, Lithuania, Ethiopia

eIFL has posted reports on three OA conferences, held in Russia, Lithuania, and Ethiopia. Each report also contains links to presentations from the conference.

OA to Scottish map collection

Great Reform Act plans for Scotland go online, National Library of Scotland, January 7, 2009.

The most significant body of maps drawn to illustrate urban boundaries in Scotland is now available on our website.

Plans of 75 Scottish towns were published in 1831-1832 to implement the Reform Act (Scotland) of 1832. ...

Our maps section on the 'Great Reform Act Plans and Reports, 1832' has zoomable digital images of all 75 plans. Also zoomable are the pages of a complete facsimile of the 153-page report to the House of Commons in which the plans were published. The report also contains details of each of the Scottish burghs, visited by the Parliamentary Boundary Commissioners who produced the town plans.

Columbia launches Korean digitization project

Columbia University Libraries and the National Library of Korea to Digitize Rare Books, press release, January 15, 2009.

Columbia University Libraries is partnering with the National Library of Korea to digitize rare materials from the C. V. Starr East Asian Library and make them available online. ...

The Preservation and Digital Conversion Division of Columbia University Libraries will digitize approximately 100 rare volumes and make them available through the Libraries' online catalog, CLIO. As a result, materials from the library’s collections will be easily accessible to students, scholars and people worldwide. The project will begin in January 2009 and conclude in June 2010. ...

Collection of resources by PKP

The Public Knowledge Project has launched an OA collection of presentations and course material created by PKP staff.

Leaders for open culture and open science

Juan Varela, Líderes para la cultura y la ciencia libres, La ciencia es la única noticia, December 30, 2008. Read it in the original Spanish or Google's English.

SPARC honors Preston McAfee as the latest SPARC Innovator

Preston McAfee is the latest SPARC Innovator.  See today's announcement from SPARC:

SPARC (the Scholarly Publishing and Academic Resources Coalition) has named economics scholar and author R. Preston McAfee the newest SPARC innovator for his pioneering contributions to the Open Educational Resources movement and passionate advocacy for Open Access.

McAfee is the J. Stanley Johnson Professor of Business, Economics and Management at the California Institute of Technology in Pasadena and is currently working as a vice president and research fellow at Yahoo! Inc.

In 2006, McAfee was the first to publish a complete textbook, Introduction to Economic Analysis, and make it openly available online. As “open source” material, available under a Creative Commons license that requires attribution, users can pick and choose chapters or integrate with their own material. McAfee’s book, which has been updated three times since it was first introduced, is currently used on campuses from Harvard to New York University....

Organizations that lobby for reasonable textbook pricing, such as Student Public Interest Research Groups (PIRGs), are hailing McAfee’s book as a break-through. “He’s the first who understands what an open text book needs to be. He’s provided an example that everyone should follow,” says Nicole Allen, leader of the Student PIRGs’ Make Textbooks Affordable Campaign....

As a proponent of Open Educational Resources, McAfee hopes that other educators will provide more free content online – but, he realizes for many, it is hard to pass up the payment from publishers. McAfee would like to see universities host competitions for scholars to write the best online textbook and then give the winner a substantial cash prize. This would satisfy the author’s need for compensation and provide the university with an online textbook to share with students at a much more reasonable price.

In addition to his work with textbooks, McAfee has illuminated the problem of rising journal prices and been a tireless advocate of Open Access.  In 2005, along with Ted Bergstrom of the University of California at Santa Barbara, McAfee helped build a Web site [Journal Cost-Effectiveness] to allow users to compare journal prices in detail....

From SPARC's longer profile of McAfee:

“I’m not in it for the money. I’m in it to change the world,” says [McAfee]....

“Knowledge that is created, but not looked at, is not useful,” he says. McAfee contends that universities are too focused on the creation of knowledge without paying enough attention to an equally important mission of dissemination....

Now, the OER movement is in “herding cats mode” with lots of different efforts being launched but not a lot of coordination, contends McAfee. There are too many people wanting to organize the movement, but not enough people willing to supply the content....

McAfee has served on the library committee at Caltech since his arrival in 2004, and University Librarian Kimberly Douglas says he has been a force for Open Access. “Preston stands out as not being willing to give up,” says Douglas. “That’s what it takes. This is not an easy transition. There are a lot of stakeholders for the status quo.” Douglas says McAfee has had a big impact in the field because of his energy level and the authority he brings to the issue as a faculty member, author and economist....

On the journal pricing front, McAfee worked with Ted Bergstrom of the University of California at Santa Barbara to build a Web site that is a one-stop shop on journal pricing and value.

“The light bulb went off in my head when I was doing a literature search – not at the library, but from home,” recalls McAfee of his early commitment to Open Access. “I paid $30 for an article to download and it was completely idiotic. I had spent $30 and had nothing to show for it. It was a scam and that woke me up.” ...

On the site, a plethora of information is compiled, including journal title, publisher, ISSN, subject, profit status, year first published, price per article, price per citation and composite price index. The site was launched in 2005 with 5,000 journals and updated in 2007 and again in 2008, bringing the total to 7,000.

In 2006, McAfee and Bergstrom wrote an open letter to university presidents and provosts concerning increasingly expensive journals and mailed it to about 150 universities. They received many responses from university librarians, some from a few provosts, but none from presidents. They argued that the large for-profit publishers are gouging the academic community for as much as the market will bear. So far, universities have failed to use one of the most powerful tools that they possess: charging for their valuable inputs. McAfee and Bergstrom recommend that universities assess overhead charges for the support services of editors working for expensive journals, and that university libraries refrain from buying bundled packages from large commercial publishers.

In the meantime, universities can promote Open Access by maintaining digital repositories and encouraging faculty to post articles on their own Web sites, says McAfee. “Universities, in my view, are in the creation of knowledge business, but also the dissemination of knowledge. The real bottleneck is not the creation, but the dissemination,” he says. “We ought to be in both businesses. Right now the crying need is to disseminate.”

McAfee says the United States is rich because of knowledge more than resources and that the country’s success is tied to distributing that knowledge. “I think this is a solvable problem,” he says. “The Web has made knowledge free. We have the content. What’s pathetic is that we aren’t talking about rocket science. We know how to get the information out there.” ...

PS:  Congratulations, Preston!

Update on the OA percentage of new articles

Bo-Christer Björk, Annikki Roos, and Mari Lauri, Scientific journal publishing: yearly volume and open access availability, Information Research, March 2009.  Abstract:  

Introduction. We estimate the total yearly volume of peer-reviewed scientific journal articles published world-wide as well as the share of these articles available openly on the Web either directly or as copies in e-print repositories.

Method. We rely on data from two commercial databases (ISI and Ulrich's Periodicals Directory) supplemented by sampling and Google searches.

Analysis. A central issue is the finding that ISI-indexed journals publish far more articles per year (111) than non ISI-indexed journals (26), which means that the total figure we obtain is much lower than many earlier estimates. Our method of analysing the number of repository copies (green open access) differs from several earlier studies which have studied the number of copies in identified repositories, since we start from a random sample of articles and then test if copies can be found by a Web search engine.

Results. We estimate that in 2006 the total number of articles published was approximately 1,350,000. Of this number 4.6% became immediately openly available and an additional 3.5% after an embargo period of, typically, one year. Furthermore, usable copies of 11.3% could be found in subject-specific or institutional repositories or on the home pages of the authors.

Conclusions. We believe our results are the most reliable so far published and, therefore, should be useful in the on-going debate about Open Access among both academics and science policy makers. The method is replicable and also lends itself to longitudinal studies in the future.

PS:  Also see the preprint of this article from April 2008.
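
For a sense of scale, the study's headline percentages translate into rough article counts. This is simple arithmetic on the figures quoted in the abstract above, nothing more:

```python
total = 1_350_000  # the study's estimate of peer-reviewed articles published in 2006

gold_immediate = 0.046 * total  # immediately openly available (4.6%)
gold_delayed   = 0.035 * total  # free after an embargo, typically one year (3.5%)
green_copies   = 0.113 * total  # usable copies in repositories or on home pages (11.3%)

print(round(gold_immediate))  # 62100
print(round(gold_delayed))    # 47250
print(round(green_copies))    # 152550

# All three routes combined: roughly one new article in five (19.4%).
print(round(gold_immediate + gold_delayed + green_copies))  # 261900
```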

Seeing repositories within an information ecosystem

R. John Robertson, Mahendra Mahey, and Julie Allinson, An ecological approach to repository and service interactions, JISC, Version 1.5, October 2008.  Though dated October 2008, apparently it wasn't made public until January 5, 2009.  Excerpt:

This work began in response to a perceived need to express something of how and why digital repositories and services interact. As a community of implementers and developers we have well understood technical models and architectures that provide conceptual mechanisms to promote interoperability. Articulating the details and challenges of actual interactions that occur, however, is not so widely understood and knowledge about them is not often shared. This is, in part at least, because we tend to share in the abstract through architectures and use cases and in these we focus on the technical. Articulating interactions or connections requires an engagement with and presentation of specific local details. Beginning to consider why particular interactions succeed or fail over time requires us to factor in more than the technical....

We think that ecology, and examples of the ecosystems it studies, may offer a useful analogy to inform the task of understanding and articulating the interactions between users, repositories, and services and the information environments in which they take place. This report outlines some concepts from ecology that may be useful and suggests some definitions for a common conversation about the use of this metaphor....

Update.  Also see Dorothea Salo's comments.

Springer's first US deal in which subscriptions cover publication fees for affiliated authors

The University of California Libraries and Springer have struck a deal to facilitate the publication of OA articles by UC authors in Springer journals.  From today's press release:

The University of California Libraries and Springer Science+Business Media (Springer) have concluded a ground-breaking experimental agreement to support open access publishing by UC authors. The arrangement is part of the journals license negotiated by the California Digital Library on behalf of the ten campuses of the University of California.

Under the terms of the agreement, articles by UC-affiliated authors accepted for publication in a Springer journal beginning in 2009 will be published using Springer Open Choice with full and immediate open access. There will be no separate per-article charges, since costs have been factored into the overall license. Articles will be released under a license compatible with the Creative Commons (by-nc: Attribution, Non-commercial) license. In addition to access via the Springer platform, final published articles will also be deposited in the California Digital Library's eScholarship Repository.

The University of California-Springer agreement is the first large-scale open access experiment of its type undertaken with a major commercial publisher in North America.

"UC faculty have told us that they want open access publishing options in order to increase the impact of their published work and eliminate barriers to educational and research use," said Ivy Anderson, Director of Collections for the California Digital Library, which licenses content on behalf of the University of California libraries. "Just as importantly, they want these options in the journals in which they routinely publish, without disrupting their normal research activity. The CDL agreement with Springer supports the transformation that our faculty seeks, while continuing the libraries' crucial role in facilitating access to research information. Springer is a leader among commercial publishers in open access experimentation, making it a natural partner for the University of California in this endeavor." ...

PS:  In the past, Springer has struck similar deals with the Max Planck Society (blogged here February 2008), the University of Göttingen (October 2007), and the Dutch library consortium, UKB or Universiteitsbibliotheken en de Koninklijke Bibliotheek (June 2007). 

Update (1/22/09). Also see Andrew Albanese's article in today's Library Journal Academic Newswire. Excerpt:

The announcement...clearly marks a watershed moment for the open access movement: it is the first large-scale open access licensing deal between a major, state-wide library in the United States and the world’s second largest commercial journal publisher -- a deal that, if successful, could have a significant impact on the wider marketplace for scholarly journals.


EThOS update

From the December 28 issue of the EThOSNet Newsletter:

EThOS [Electronic Theses Online Service] is due to go into Beta testing on Tuesday 9th December [2008]. In order to ensure that the service performs as expected, and to get initial feedback on the web interface, the beta version will initially be released to a small group of testers (from within the project and the wider community). The beta-testers will be able to check and feedback on EThOS, except the ‘order a thesis’ function, which will still only be accessible via the current hybrid service....

Once we are happy that the service is robust it will be made available to the whole community.

The hybrid EThOS/microfilm service

The hybrid service will enable the ordering and supply of microfilm copies where the thesis is already held by the British Library.  In cases where the required thesis is not held on microfilm by the British Library, the customer’s order will initiate the digitisation of the hardcopy source thesis and delivery as a searchable PDF file on CD-ROM....

More detail from the UoL Library Blog, quoting an offline press release:

EThOS goes into the second phase of beta testing this afternoon (20th January 2009). This means that you will now be able to order theses from EThOS and, should researchers request theses from your institution, you will be requested to send them to The British Library for digitisation.

PS:  Also see our past posts on EThOS.

Update (1/23/09).  Owen Stephens, the Project Director for EThOSNet, wrote to add this clarification (posted with permission):

I thought it might be worth mentioning that the public beta is the step on from the initial beta mentioned in the December EThOSNet Newsletter you quote from, and that now the public beta is available, the ‘hybrid service’ is no longer available. The public beta is now the way that all users can get access to PhD theses from the British Library....

Also see his detailed blog post from January 20:

...I’m incredibly excited about [the public beta] – of all the projects I’ve been involved in, this has the most potential to have an incredible impact on the availability of research. Until now, if you wanted to read a thesis you either had to request it via ILL, or take a trip to the holding university. Now you will be able to obtain it online. To give some indication of the difference this can make, the most popular thesis from the British Library over the entire lifetime of the previous ‘Microfilm’ service was requested 58 times. The most popular electronic thesis at West Virginia University (a single US university) in the same period was downloaded over 37,000 times....

One of the biggest issues that has surfaced several times during the course of these projects, is the question of IPR (Intellectual Property Rights). EThOS is taking the bold, and necessary, step of working as an ‘opt-out’ service....In order that authors can opt-out if they do not want their thesis to be made available via EThOS there is a robust take-down policy....

Update on Canada's Data Liberation Initiative

Seth Grimes, The Real Data Liberation Initiative, Intelligent Enterprise, January 15, 2009.  Excerpt:

The Data Liberation Initiative is a worthy project that aims to provide academic researchers with affordable and equitable access to Canadian current governmental statistics and other data. I had a chance to meet with three DLI team members back in 2003, and I'm glad to see that the initiative, approved by the Canadian government as a pilot in 1996, has grown into a robust effort with subprojects that benefit the spectrum of public data users. DLI and similar undertakings such as the US government's Fedstats, the Open Data Foundation, and IBM's Many Eyes are what real data liberation is all about.

For the DLI, "liberation" means freedom. Borrowing from another context, "to understand the concept, you should think of free as in free speech, not as in free beer." Government paid for the data; government employees (and like minded individuals working in industry) feel a responsibility to make sure it's widely and effectively used.

In the case of the DLI, that means backing and participating in the development of the Data Documentation Initiative (DDI), "an effort to establish an international XML-based standard for the content, presentation, delivery, and preservation of metadata for datasets in the social and behavioral sciences." ...

For the DLI, data liberation means providing direct access to datasets and SAS and SPSS code snippets — those packages are most commonly used for analysis of survey data — to facilitate dataset use. It means maintaining a metadata browser using an interesting tool, Nesstar Webview, developed in Norway and the UK by and for data archivists and their worldwide community of social-science data users they support....
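
To make the DDI idea concrete, here is a minimal metadata fragment and a parse of it. The markup is illustrative only: the element names echo the DDI Codebook vocabulary but are simplified, not valid against the actual DDI schema, and the study and variables are hypothetical:

```python
import xml.etree.ElementTree as ET

# Simplified, DDI-flavored study metadata (illustration, not real DDI markup).
doc = """<codeBook>
  <stdyDscr>
    <titl>Example Canadian Household Survey</titl>
    <var name="AGE" label="Age of respondent" type="numeric"/>
    <var name="PROV" label="Province of residence" type="categorical"/>
  </stdyDscr>
</codeBook>"""

root = ET.fromstring(doc)
title = root.findtext("stdyDscr/titl")
variables = {v.get("name"): v.get("label") for v in root.iter("var")}

print(title)             # Example Canadian Household Survey
print(variables["AGE"])  # Age of respondent
```

The point of a shared XML standard like DDI is exactly this: because the structure is predictable, tools such as Nesstar's metadata browser can index and display datasets from any compliant archive without custom code per dataset.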

Another argument for publicly-funded stem cell research

You know the main arguments for President Obama to reverse the Bush restrictions on federally-funded stem cell research. 

Here's a good one you may not have heard, from Deirdre Madden of the University College Cork Law Department:

“In my opinion, it is preferable that the research should be funded from public or governmental sources rather than private funding so as to ensure open access to research results – both positive and negative – and easier translation into clinical practice in due course,” Madden says.

Update.  Also see Toni Brayer:

Stem cell research from public funding will potentially ensure open access to research results and allow easier transition from research to development of new therapies and cures for patients with a wide range of illnesses.

Tuesday, January 20, 2009

More on OA for economic stimulus

Michael Geist, Fire Up the Digital Jobs Machine, The Tyee, January 20, 2009.  Excerpt:

Finance Minister Jim Flaherty will rise in the [Canadian] House of Commons next week to deliver the most anticipated federal budget in years....While financial support for hard-hit industries is a given, one of the most important elements in the budget will be the significant expenditures on infrastructure....

Canada has steadily increased funding for primary research at universities and colleges across the country, yet it continues to lag in implementing policies to ensure that all Canadians have access to the results of that publicly-funded research. Other countries, including the United States and the European Union, have moved toward mandated "open access" policies that link public access to research funded by taxpayers.

The Canadian Institutes of Health Research established an open access policy last year, but billions of research dollars are still spent in Canada without any guarantees of public access. Flaherty should use the budget to confirm the government's commitment to public funding for health, science and social science research, but require Canada's research agencies to implement open access policies that offer better hope for a return on the sizable investment....

PS:  Hear, hear.  Also see the similar proposals from Prue Adler and Charles Lowry and from me.

Factors inhibiting OA

Gavin Baker, What are the factors inhibiting OA? A Journal of Insignificant Inquiry, January 19, 2009.  Excerpt:

Michael’s comment on my earlier post got me thinking: What are all the factors inhibiting uptake of OA by authors, funders, institutions, and publishers? And how important is each factor? ...

I’ll suggest a few potential factors for authors, just as examples:

  • Familiarity with / usage of OA repositories / journals as a reader (if you read papers on arXiv, are you more likely to self-archive there?)
  • Familiarity with / knowledge of OA as an idea, the serials crisis, economics of scholarly publishing, etc.
  • Familiarity with / knowledge of copyright, authors’ rights, Creative Commons, etc.
  • Familiarity / comfort with IT in general, and OA tools (e.g. repositories) in particular
  • Resources / support provided by the author’s institution and/or funder
  • Information provided by / stance toward OA of societies of which the author is a member
  • Colleagues or co-workers who are OA practitioners or advocates....

Update (1/26/09).  Also see Gavin's further reflections, Incentives and disincentives to OA.  Excerpt:

I had inadvertently and lazily framed the question as one of disincentives: “What inhibits OA?” But that assumes, naively if implicitly, that some force (momentum? altruism?) would compel scholarly communications toward OA, if only some niggling matter didn’t stand in the way. The truth is likelier to be the opposite: that some mild force might nudge the system toward OA, but generally not forcefully enough to overcome inertia.

So I’m retreating toward a broader and more useful frame: looking at incentives and disincentives, both extant and potential, to OA. What are they, and how relatively influential are they? ...

So, in the spirit of open science, is this something other people would be interested in working on? ...

Official launch of Elsevier's free SciTopics

Paula Hane, Elsevier Launches SciTopics—Now a Fully Developed Research 2.0 Resource, Information Today, January 19, 2009.  Excerpt:

This week, Elsevier is officially launching its Scirus Topic Pages under the new brand name SciTopics....The company says that this is now a fully developed product, not just a beta offering, and this is the "right moment" to bring it to a broader audience. The newly expanded SciTopics offers additional functionality and features and...some 650 live topic pages with many more in draft format.

Scirus Topic Pages made its debut in June 2007....The main improvements, according to company representatives, are in the workflow processes and in quality controls....

The new SciTopics is a free, online expert-generated knowledge-sharing service for the research community that offers scientific, technical, and medical knowledge (STM) on a variety of subjects. Designed to complement the traditional peer-review process, the site allows invited experts to develop topic pages offering research summaries with the content moderated through 14 subject editors to ensure high standards and relevancy. The easy-to-use, wiki-like resource aims to give authors a platform to showcase their works and to facilitate scholarly debate....

PS:  See our past posts (1, 2) on SciTopics, under its old name, Scirus Topic Pages.  Also see a sample topic page, Does Open Access Increase Citations?

Preview of ALPSP survey on self-archiving

Stevan Harnad, Learned Society Survey On Open Access Self-Archiving, Open Access Archivangelism, January 20, 2009.  Excerpt:

On Mon, Jan 19, 2009 at 3:15 PM, Sally Morris [SM] (Morris Associates) wrote in liblicense: ...

SM: "Sue Thorn and I will shortly be publishing a report of a research study on the attitudes and behaviour of 1368 members of UK-based learned societies in the life sciences.  72.5% said they never used self-archived articles when they had access to the published version."

This makes sense. The self-archived versions are supplements, for those who don't have subscription access.

SM: "3% did so whenever possible, 10% sometimes and 14% rarely. When they did not have access to the published version, 53% still never accessed the self-archived version."

This is an odd category: Wouldn't one have to know what percentage of those articles -- to which these respondents did not have subscription access -- in fact had self-archived versions at all? (The global baseline for spontaneous self-archiving is around 15%.)

The way it is stated above, it sounds as if the respondents knew there was a self-archived version, but chose not to use it. I would strongly doubt that...

SM: "16% did so whenever possible."

That 16% sounds awfully close to the baseline 15% where it is possible, because the self-archived supplement exists. In that case, the right description would be that 100%, not 16%, did so. (But I rather suspect the questions were yet again posed in such an ambiguous way that it is impossible to sort any of this out.)

SM: "16% sometimes and 15% rarely. However, 13% of references were not in fact to self-archiving repositories - they included Athens, Ovid, Science Direct and ISI Web of Science/Web of Knowledge."

To get responses on self-archived content, you have to very carefully explain to your respondents what is and is not meant by self-archived content: Free online versions, not those you or your institution have to pay subscription tolls to access.

Comment.  While the full study is still forthcoming, Sally Morris reminds us that the ALPSP and the Biosciences Federation released a summary report last June.  See my comments on the June summary.

Forthcoming Dutch journal on digital libraries

The Dutch publisher, Essentials, has announced plans to launch a new journal on digital libraries, which will cover OA and copyright issues, among other topics.  The inaugural issue of Digital Library should appear in March 2009.  Read the December 2008 press release in Dutch or Google's English.

It appears that the journal will not itself be OA.

ACS provides free online access for "just accepted" papers

Sophie L. Rovner, ACS Speeds Web Publication: Society tests free online access to peer-reviewed, accepted manuscripts, Chemical and Engineering News, January 15, 2009.  Excerpt:

Yesterday, the American Chemical Society began a pilot program that will make some journal papers available online about two to seven weeks quicker than was the case previously. Authors publishing in ACS Chemical Biology, Biochemistry, Journal of Proteome Research, and Molecular Pharmaceutics can opt to have their peer-reviewed, accepted manuscripts posted on the ACS Publications website within three days of acceptance. These "Just Accepted" manuscripts (see example here), which are free to all readers, are assigned a digital object identifier (DOI) that can be used to cite the papers. "ACS is providing this service to expedite the dissemination of scientific information in a fully citable format," says Evelyn Jabri, senior acquisitions editor and project leader.

A "Just Accepted" manuscript then proceeds through the usual ACS production process: Technical editors edit and format it for the Web, and authors approve galleys. The final published article retains the same DOI as the "Just Accepted" manuscript, ensuring that citations link to the final scientific article of record when it becomes available.

Biochemistry Editor Richard N. Armstrong says the "Just Accepted" program moves a paper into the public domain as rapidly as possible. That's particularly useful for grant applicants, who can cite these papers as being already published. But some authors are leery about releasing a paper before it's in its final form, Armstrong says....

ASBMB's "Papers in Press" program reduces the time from acceptance to publication by six to eight weeks, Director of Publications Nancy Rodnan says. The society places each accepted manuscript in a permanent, publicly accessible archive on the respective journal's website, but this free access hasn't eroded subscription demand, she notes.


  • If I understand it, the "just accepted" manuscripts are peer-reviewed but not copy-edited or formatted for publication.  ACS will not only make them gratis OA within three days of acceptance, but will allow them to remain gratis OA even after the edited version of the article is published (as TA of course).  If I have this right, then I commend the ACS for this policy -- and I've often criticized the ACS for its opposition to OA.  I'm also glad to see it cite the experience of the ASBMB, whose similar program has not caused journal cancellations.
  • Note:  I'm assuming that the "just accepted" articles are gratis OA, and not libre OA, but this is only an assumption.  I just looked at the sample to which ACS points.  It has a copyright statement but no licensing information on the landing page.  It has neither a copyright statement nor licensing information on the PDF itself.
  • Currently, the ACS does not allow preprint or postprint archiving.  However, I hope it will now reconsider and allow both.  If it's willing to allow OA for peer-reviewed manuscripts on its own web site (prior to publication and with DOIs), then it should be willing to allow OA for peer-reviewed manuscripts in OA repositories (from the time of publication and without DOIs).  It should also be willing (a fortiori) to allow OA archiving for preprints.

Update (1/21/09).  I was right that the "just accepted" articles are gratis OA, but wrong that they remain online and remain gratis OA after the final version is published.  On the contrary, the TA published version replaces the OA version at the time of publication.  (Thanks to Evelyn Jabri, Senior Acquisitions Editor at the ACS, for her helpful correspondence.)  For other aspects of the "just accepted" article pilot project, see the FAQ.

PLoS ONE will offer more impact-related data on articles

Elie Dolgin, New impact metric, The Scientist, January 19, 2009.  Excerpt:

In an attempt to provide alternative metrics to the traditional journal impact factor, the open-access journal Public Library of Science ONE announced that it will release a slew of alternative impact data about individual articles in the coming months.

The new "article-level metrics project" -- which will post usage data, page views, citations from Scopus and CrossRef, social networking links, press coverage, comments, and user ratings for each of PLoS ONE's thousands of articles -- was announced yesterday (Jan. 18) by Peter Binfield, the journal's managing editor, at the ScienceOnline'09 conference in Research Triangle Park, North Carolina.

"No one has any data other than [ISI] impact factors," Binfield told The Scientist. "Our idea is to throw up a bunch of metrics and see what people use."

From its inception at the end of 2006, PLoS ONE has eschewed the notion of impact factors. (It is not currently listed by the ISI Web of Science's rankings.) Binfield argued that the traditional impact factor judges a journal's overall performance, rather than assessing impact at the article-level. The new scheme, however, is aimed at evaluating each article on its own merits, regardless of the other papers in the same journal, he said.

PLoS ONE doesn't plan to crunch the data itself, though. "We're not being arrogant enough to make our own metric," said Binfield. Rather, he hopes that the journal's readers will use the information to come to their conclusions. "We're putting the data out there and letting the world figure it out."

Eventually, Binfield hopes that readers will be able to personalize how they view the data, and sort articles according to the metric of their choice....
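To make the idea concrete, here is a minimal, hypothetical sketch of the reader-side sorting Binfield describes: a pile of per-article metrics, ranked by whichever metric the reader chooses. The field names and DOIs are assumptions for illustration only; PLoS has not published a schema or API for these data.

```python
# Hypothetical sketch: rank articles by a reader-chosen metric.
# Field names ("page_views", "scopus_citations", "comments") are assumptions.

def rank_articles(articles, metric):
    """Sort articles by the chosen metric, highest first.

    Articles missing the metric are treated as 0 and sort last.
    """
    return sorted(articles, key=lambda a: a.get(metric, 0), reverse=True)

articles = [
    {"doi": "10.1371/journal.pone.0000001",
     "page_views": 5200, "scopus_citations": 12, "comments": 3},
    {"doi": "10.1371/journal.pone.0000002",
     "page_views": 900, "scopus_citations": 30, "comments": 0},
]

by_views = rank_articles(articles, "page_views")        # first article leads
by_cites = rank_articles(articles, "scopus_citations")  # second article leads
```

The point of the sketch is the design choice PLoS made: publish the raw per-article numbers and let each reader pick the sort key, rather than collapsing them into a single house metric.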

Comment.  Kudos to PLoS ONE.  This is an important decision and I hope that other journals (OA and TA) will follow suit. All academics have an interest in breaking the stranglehold of impact factors, undoing their pernicious effects on hiring, promotion, and funding, and working toward more nuanced impact measurements.  Because most OA journals are new, friends of OA have a special reason for undoing the pernicious incentives created by impact factors to shun new journals as such, regardless of their quality.  Here's an excerpt from an article I wrote last September:

...If you've ever had to consider a candidate for hiring, promotion, or tenure, you know that it's much easier to tell whether she has published in high-impact or high-prestige journals than to tell whether her articles are actually good....

Impact factors (IFs) rose to prominence in part because they fulfilled the need for easy quantitative judgments and allowed non-experts to evaluate experts....

IFs measure journal citation impact, not article impact, not author impact, not journal quality, not article quality, and not author quality, but they seemed to provide a reasonable surrogate for a quality measurement in a world desperate for a reasonable surrogate.  Or they did until we realized that they can be distorted by self-citation and reciprocal citation, that some editors pressure authors to cite the journal, that review articles can boost IF without boosting research impact, that articles can be cited for their weaknesses as well as their strengths, that a given article is as likely to bring a journal's IF down as up, that IFs are only computed for a minority of journals, favoring those from North America and Europe, and that they are only computed for journals at least two years old, discriminating against new journals....
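For reference, the two-year impact factor criticized above is a simple ratio, which is part of why it spread so easily. A sketch of the computation, with made-up numbers:

```python
def impact_factor(citations_to_prev_two_years, citable_items_prev_two_years):
    """Two-year journal impact factor for year Y:
    citations received in year Y to items published in Y-1 and Y-2,
    divided by the number of citable items published in Y-1 and Y-2.
    """
    return citations_to_prev_two_years / citable_items_prev_two_years

# A journal whose 2007-2008 output (200 citable items) drew 500 citations
# in 2009 would have a 2009 impact factor of 2.5. A journal less than two
# years old has no denominator at all, which is the bias against new
# journals noted above.
jif = impact_factor(500, 200)  # -> 2.5
```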

When we want to assess the quality of articles or people, and not the citation impact of journals, then we need measurements that are more nuanced, more focused on the salient variables, more fair to the variety of scholarly resources, more comprehensive, more timely, and with luck more automated and fully OA....

I'm never surprised when OA journals report high IFs, often higher than older and better-known journals in their fields.  This reflects the well-documented OA impact advantage.  I'm glad of the evidence that OA journals can play at this game and win.  I'm not saying that journals shouldn't care about their citation impact, or that IFs measure nothing.  I'm only saying that IFs don't measure quality and that universities should care more about quality, especially article quality and candidate quality, than journal citation impact....

I do want to increase submissions to OA journals, but the present argument has the importantly different goal of removing disincentives to submit to OA journals....

Monday, January 19, 2009

Media from CC Tech Summit

Presentations, audio, and video from the Creative Commons Technology Summit (Cambridge, Mass., December 12, 2008) are now available. See especially:
  • David Torpie, Government Information Licensing Framework: A multidisciplinary project improving access to Public Sector Information
  • Jonathan Rees, Open Source Knowledge Management: What Comes After Access?

Geography journal adopts delayed OA policy

Norois has gone online. The journal will have a delayed OA policy with a 2-year embargo. The journal has been in publication since 1954; backfiles to 2004 are currently available.

English journal adopts delayed OA policy

The Journal of the Short Story in English has gone online. The journal will have a delayed OA policy with a 2-year embargo. The journal has been in publication since 1983; backfiles to 1997 are currently available.

Linguistics journal adopts delayed OA policy

Pragmatics, the journal of the International Pragmatics Association, has adopted a delayed OA policy with a one-year embargo as of 2009. (Thanks to Tom Van Hout.)

1300 repositories listed in OpenDOAR

Heather Morrison points out that OpenDOAR now lists 1,300 OA repositories, a 28% increase during 2008.

Importance of OA in biomedical research

Carlo Vinicio Caballero, et al., La importancia del Acceso Abierto en la investigación biomédica y científica, Revista Colombiana de Reumatología, June 2008. (Thanks to PANLAR Bulletin Online.) English abstract:
Access to high-quality, first-rate, freely available information on current medical topics and scientific knowledge of social importance makes Open Access (OA) a fundamental tool for the medical community worldwide and in Latin America. Open access is free, immediate, permanent, full-text, online access, for any user, web-wide, to digital scientific and scholarly material, primarily research articles published in peer-reviewed journals. OA means that any individual user, anywhere, who has access to the internet, may link, read, download, store, print off, use, and datamine the digital content of that article. An OA article usually has limited copyright and licensing restrictions. OA is a powerful source of information with great social impact in Latin America because of the benefits, accessibility, and economic advantages it brings to the region.

OA annotated Quebec Civil Code

Simon Fodden, Annotated Civil Code, Slaw, January 17, 2009.

LexUM has released a digital, annotated version of the Quebec Civil Code. With this release the Code for the first time obtains a hyperlinked table of contents, which even the version on the LexUM supported CanLII lacks. But the new LexUM version offers much more: each section is seeded with caselaw annotations extracted from CanLII and may be further annotated by viewers; there’s an “[add]” button at the bottom of the screen that in good AJAX fashion opens out a form to receive the relevant data about a case. As well there’s an “[annotate]” button allowing the viewer to add text commentary about a particular case annotation.

At the moment, according to LexUM’s Ivan Mokanov, all viewer annotations and comments are going to be moderated. It hasn’t been decided whether this filtering will continue. And they’re also considering whether to continue allowing anonymous annotations or to require annotators to sign in. ...

Blog notes on ScienceOnline'09

Alice Pawley, Open Access publishing at ScienceOnline 2009, Sciencewomen, January 17, 2009. Blog notes on ScienceOnline'09 (Research Triangle Park, N.C., January, 16-18, 2009).

Update. See also the notes by Kevin Zelnio and Molly Keener.

Update. See also the rest of Molly Keener's notes.

Wikipedia is 8 years old

January 15 was the 8th birthday of Wikipedia. See Mike Linksvayer's comment at the Creative Commons blog.

National Galleries of Scotland joins Flickr Commons

The National Galleries of Scotland has joined The Commons on Flickr. See the January 14 announcement. (Thanks to Creative Commons.)

A Digital Humanities Manifesto

Last month (December 15, 2008) the Mellon Seminar in Digital Humanities at UCLA issued A Digital Humanities Manifesto.  Excerpt:

[5] The digital is the realm of the open: open source, open resources, open doors. Anything that attempts to close this space should be recognized for what it is: the enemy....

[7] Copyright and IP standards must, accordingly, be freed from the stranglehold of Capital....

[8] The multi-purposing and multiple channeling of humanistic knowledge: no channel excludes the other. This is an abundance based economy, not one based upon scarcity. It values the COPY more highly than ORIGINALS and restores to the word COPY its original meaning of abundance: COPIA = COPIOUSNESS = THE OVERFLOWING BOUNTY OF THE INFORMATION AGE....

Librarian OA advocates as community organizers

Greyson at Social Justice Librarian responding to David Shumaker at The Embedded Librarian:

...This line of thinking, for me, developed from working on the issue of open access with the research community in which I am embedded. While traditional education methods (lectures, classes, seminars, websites) were of some utility, I found myself drawing heavily on the community organizing skills from my pre-MLIS days....

In order to organize my research community around open access I have had to find out what matters to my researchers, staff, programmers, etc., and work from there. I have worked collaboratively with them to understand the issue and the relative merits of open access in ways that make sense to them and reflect their motivations and values. Yes, I have specialised tools and access I can offer, and yes I can share my personal values and passions with my colleagues, but the goal we work toward has to be a common one, not one I want to impose upon them....

Time to add community organizing to the list of things not taught in library school? It’s certainly been bouncing around in my brain.

How well do repositories accept automated mass deposits?

Stuart Lewis, DSpace at a third of a million items, Stuart Lewis' blog, January 19, 2009.  Excerpt:

As part of the JISC-funded ROAD (Robot-generated Open Access Data) project we are load testing DSpace, EPrints and Fedora to see how they cope with holding large numbers of items. For a bit of background, see an earlier blog post: ‘About to load test DEF repositories’.

The project programmer Antony Corfield has created a SWORD deposit tool for this purpose....
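For readers unfamiliar with SWORD: version 1.x of the protocol makes a deposit essentially an HTTP POST of a zipped package to a collection's deposit URI, with headers describing the packaging format. The sketch below is not the ROAD project's actual tool; the endpoint URL, credentials, and package bytes are placeholders, and the METS packaging identifier is the one commonly used with DSpace.

```python
# Minimal sketch of a SWORD 1.3-style deposit as a plain HTTP POST.
# Endpoint, credentials, and package contents are placeholders.
import base64
import urllib.request

def build_sword_deposit(url, package_bytes, username, password,
                        packaging="http://purl.org/net/sword-types/METSDSpaceSIP"):
    """Build an HTTP POST request carrying one zipped SWORD package."""
    credentials = base64.b64encode(f"{username}:{password}".encode()).decode()
    return urllib.request.Request(
        url,
        data=package_bytes,
        headers={
            "Content-Type": "application/zip",
            "X-Packaging": packaging,                    # tells the server how to unpack
            "Content-Disposition": "filename=experiment.zip",
            "Authorization": f"Basic {credentials}",
        },
        method="POST",
    )

request = build_sword_deposit(
    "http://repository.example.org/sword/deposit/123456789/1",
    b"...zip bytes...", "depositor", "secret")
# urllib.request.urlopen(request) would perform the actual deposit;
# the server responds with an Atom entry describing the new item.
```

A bulk-deposit robot like the ROAD tool would simply loop this request over many packages, which is what makes the per-deposit timing below meaningful.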

The tool was left running over Christmas depositing 9MB (approx) SWORD packages, each containing the results of an experiment. It is now almost up to 1/3rd of a million deposits. Whilst we would have liked to keep running the experiment to take the deposits to maybe 1 million, it would take more time than we have....

[PS:  Here omitting the chart of the time it took to deposit each item.]

Some observations from this chart:

  • As expected, the more items that were in the repository, the longer an average deposit took to complete.
  • On average deposits into an empty repository took about one and a half seconds
  • On average deposits into a repository with three hundred thousand items took about seven seconds
  • If this linear looking relationship between number of deposits and speed of deposit were to continue at the same rate, an average deposit into a repository containing one million items would take about 19 to 20 seconds.
  • Extrapolate this to work out throughput per day, and that is about 10MB deposited every 20 seconds, 30MB per minute, or 43GB of data per day.
  • The ROAD project proposal suggested we wanted to deposit about 2GB of data per day, which is therefore easily possible.
  • If we extrapolate this further, then DSpace could theoretically hold 4 to 5 million items, and still accept 2GB of data per day deposited via SWORD....
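The extrapolations in the list above can be reproduced with a few lines of arithmetic: fit a line through the two observed points (an empty repository at ~1.5 seconds per deposit, 300,000 items at ~7 seconds) and project forward.

```python
# Reproducing the back-of-the-envelope figures above, assuming the
# linear trend between repository size and deposit time holds.

def deposit_time(items, t_empty=1.5, t_300k=7.0):
    """Estimated seconds per deposit for a repository holding `items` items."""
    slope = (t_300k - t_empty) / 300_000  # extra seconds of delay per item held
    return t_empty + slope * items

t_million = deposit_time(1_000_000)  # ~19.8 s, i.e. "about 19 to 20 seconds"

# Throughput: ~10MB deposited every ~20 seconds.
mb_per_day = 10 * (86_400 / 20)      # 43,200 MB/day
gb_per_day = mb_per_day / 1000       # ~43 GB/day, matching the figure quoted
```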

It is useful to note the overhead that SWORD puts on the deposit process. Those of you who have run normal imports into repositories such as DSpace will know that they zip along quite fast, probably several times (if not more) faster than the deposit by SWORD....

(Over the next few months we’ll run similar tests with EPrints and Fedora, and depending on time can try other tests such as performing the same tests but on a server which is under user-load....)

Sunday, January 18, 2009

Google announces 100,000 OA knols

The Google Knol project has passed the milestone of 100,000 knols.  From Google's January 16 announcement:

A few weeks ago the 100,000th knol was published, and we figured now is an excellent time to reflect on the first five months of Knol's existence.

Knols are authoritative [OA] articles about specific topics, written by people who know about those subjects....

The Knol interface is now available in eight languages (Arabic, English, French, German, Italian, Korean, Portuguese and Spanish) and we are excited that our users are helping us translate it into many more languages using the Google in Your Language console....[K]nols have been written in 59 different languages to date....

We have worked quickly to incorporate the features most requested by our early authors, such as usage stats showing reader activity on knols and rich media embedding (videos, spreadsheets, forms, slideshows, etc.). All of these improvements are tracked in our Announcement and Release Notes.

We are happy to see that most authors choose to accept moderated edits from their audience and that the volume of suggested edits from readers is steadily growing. So if you find yourself reading a knol and want to suggest an improvement, go ahead and press that edit button! You will be able to make the desired changes directly in the knol, and the author(s) will be able to review and act upon your suggestions....

An ETD repository for the U of Minnesota

The University of Minnesota has launched an OA repository for ETDs.  According to the U Minn Libraries blog,

Early data indicates that graduate students have embraced the option, with 83% participation in December 2008. Discussions are underway now about broadening the option to include Master's Plan A students.

Library records for OA/TA books

How should libraries catalog books that exist in both digital and print editions (hence including OA and TA editions)?  A task force has developed guidelines.  (Thanks to Charles Bailey.)  From its report, December 19, 2008:

The Provider-Neutral E-Monograph Record Task Group was formed shortly after the 2008 Annual Meeting of the American Library Association. The group's charge was to develop a monographic cataloging policy that would provide for a single electronic MARC bibliographic record to represent an online resource that is available from one or more providers. This proposal is only concerned with separate MARC records for the electronic resource -- it does not address the addition of electronic fields to the print record, otherwise known as the "Single Record Approach." The proposal is intended to encompass records for monographic titles which are simultaneously issued in print and online, digital reproductions of print resources, and born-digital resources. All e-monographic resources cataloged on OCLC should follow the Provider-Neutral (P-N) model from Day One, even if the resource is available from only one provider at the time of cataloging. E-monograph records created by the DLF Registry of Digital Masters should be combined with records from other providers onto the one P-N e-monograph record. Separate records may be created at the national level, whenever the cataloger determines that the content of a new digital manifestation is significantly different. Catalogers are also free to use the Single-Record Approach, at both the national and local levels....

U of Oslo launches an OA monograph series

The University of Oslo has launched a new series of OA books, Oslo Studies in Language.  Read the January 14 announcement in Norwegian or Google's English.

Thanks to Stian Håklev for the alert and for his own translation of parts of the announcement.  Excerpt:

...All the articles will be sent to peer-review....

Our desire is to contribute to the growing family of freely available electronic publications that are distributed to the entire research world. Through the establishment of Oslo Studies in Language we guarantee the control of both the academic content, the typographic presentation and storage for the future, without having to go through the traditional publishing houses, state Grønn and Haug. Both refer to the important power that lays with the traditional publishing houses.

– Today researchers and universities actually have to pay to have books published with international publishers. The peer-reviewers and editors are usually not paid. Then, the publishers charge good money for the journals and books. This happens, even if it is the universities and the state that have paid for the research. With this service, the researchers only have to send the content to us, before we pass it on to peer-review. If it is accepted, it will immediately be published in our electronic book series....

Both give credit to a member of the editing committee, Östen Dahl from Stockholm University, for sowing the seed about launching an electronic book series. – The only demand he had for joining the editing committee was that the book series should be made available through a publishing channel that was electronic and freely available. It is said that Dahl is the most cited linguist in the Nordic countries....

The OA discussion at the TACD IP meeting

Gavin Baker, OA at TACD IP, A Journal of Insignificant Inquiry, January 17, 2009.  Last week Gavin was live-blogging the TACD's Patents, Copyrights and Knowledge Governance conference (Washington DC, January 12-13, 2009).  Now he has summarized the presentations that bear most on OA, and added some comments of his own.  Excerpt:

...Two panels (at least) mentioned [OA] explicitly: the panels on Openness and on Innovation and Access for Medical Technologies. I’ll try to summarize what was notable:

Openness: Tim Hubbard of the Wellcome Trust: ...

  • There is an effort to develop a “PubMed Commons”, building on top of PubMed Central, adding e.g. features for comments and tags.
  • Recommendation: Policies for OA to publications and data should become standard for research funders.
  • There are some problems remaining: ...OA to data: cultural attitudes (e.g. how to give credit/attribution, adjusting to new competitive environment — people interpreting your data before you can), practical problems (e.g. format standardization)
  • The focus of the presentation was on the conflict between openness (e.g. OA to genetic data) and privacy. Even supposedly de-identified data can be re-identified, posing a challenge to OA to data on human subjects. He suggested a solution where researchers could have free access to the complete data (with full granularity), but without handing it out in raw form: researchers could load programs onto a trusted broker’s computer, which would execute the algorithm and provide anonymized results....
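Hubbard's trusted-broker idea can be illustrated with a toy sketch: researchers submit an aggregate computation, the broker executes it against the raw records inside the enclave, and only sufficiently aggregated results leave. All names, fields, and the suppression threshold below are illustrative assumptions, not a description of any real system.

```python
# Toy sketch of the "trusted broker" pattern: raw records never leave,
# and results covering too few subjects are withheld.

MIN_CELL_SIZE = 5  # illustrative threshold against re-identification

def broker_run(records, aggregate):
    """Run a researcher-supplied aggregate inside the enclave.

    The aggregate returns (subject_count, result); the broker releases
    the result only if it covers at least MIN_CELL_SIZE subjects.
    """
    count, result = aggregate(records)
    if count < MIN_CELL_SIZE:
        return None  # too few subjects: suppress
    return result

def mean_age_of_carriers(records):
    """Example query: mean age of subjects carrying a genetic variant."""
    carriers = [r for r in records if r["variant_present"]]
    if not carriers:
        return 0, None
    return len(carriers), sum(r["age"] for r in carriers) / len(carriers)

records = [{"age": 30 + i, "variant_present": i % 2 == 0} for i in range(20)]
summary = broker_run(records, mean_age_of_carriers)  # a mean, never raw rows
```

The design point matches Hubbard's description: the researcher gets full-granularity computation without ever holding the raw, potentially re-identifiable data.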


  • How many funders actually have open data requirements for their grantees? This seems like a good topic for the Open Access Directory: collecting information on funder policies, as well as on projects and institutions with open data policies for their intramural researchers. This is such a nascent space that there’s a big role for sharing best practices here....

Openness: Heather Joseph of SPARC:

  • There is a growing but limited awareness among policymakers of the benefits of openness. We need policies to remove both access and permission barriers to research. So far most uptake of OA policies has been in the biomedical arena. There’s pushback from opponents even there.
  • There’s a recognition that data is valuable and needs to be appropriately managed. But we need to move from mere “data management policies” toward open access to data.
  • So far, the signs from the Obama administration are promising....
  • Recommendation: Make OA policies throughout the government, e.g. through executive order or regulation, but not as part of IP policymaking.


  • I think this is the first time I’ve heard the suggestion that the U.S. adopt OA policies for its publicly-funded research via executive order or regulation. There’s some sense to this, I think: it has to do with the internal management of the government and doesn’t run afoul of any legislation. I see two challenges, though. First, other funding agencies don’t have the pre-existing cyberinfrastructure into which grantees could deposit their articles; funding for that has to come from somewhere (e.g. through the appropriations process). Second, as we saw with the “Fair Copyright in Research Works Act”, there’s some opposition to these policies in Congress: some members might take issue with executive policymaking — but I think that’s a fight we could win, especially with the administration on our side.

Openness: Questions:

  • Bruce Perens: What about other IP on publicly-funded research, e.g. Bayh-Dole? Joseph: We need to draw a distinction — we’re focusing on access to the literature and data.
  • Jamie Love: If there was an access to knowledge treaty, what specific issues would you want to include? Hubbard: OA data and literature; mandated protection of private data; greater openness to economic data; standards for government data. Joseph: OA literature and data.
  • Jonathan Band: Some countries hold copyright on government works (Crown copyright — unlike the U.S.) — this needs to change.


  • There’s a little surge of discussion lately about patents (and the management thereof) on publicly-funded research, e.g. owing to developments in India and South Africa. It’s an issue I’ve been involved with before, and I’m sympathetic toward efforts to improve the current system. There are some common principles, but they’re also very different questions. Specificity, so we can understand well what we’re discussing, is important in policy, and especially in these areas, which are often so challenging to understand. So: OA can advance without affecting the Bayh-Dole system; the Bayh-Dole system doesn’t necessarily do harm to OA....

Innovation and Access for Medical Technologies: Judit Rius of Knowledge Ecology International: ...

  • Recommendation: Innovation inducement prizes should include an “open source dividend” to encourage open sharing of knowledge, data, material, and technology
  • Recommendation: Governments should support the World Health Organization’s Global Strategy and Plan of Action on Public Health, Innovation and Intellectual Property (as adopted in World Health Assembly resolution 61.21), including open access to scientific articles and data from publicly-funded research

Comments: ...

  • OA advocates shouldn’t forget about the WHO’s Global Strategy, which includes very favorable language toward OA. It’s worth mentioning when speaking with funders, policymakers, researchers, etc. Unfortunately, the language as adopted in WHA 61.21 only said that states should “strongly encourage” publicly-funded researchers to make their manuscripts OA, retreating from earlier language that states should “require” it. Still, as we saw with the NIH policy — maligned as the earlier voluntary policy was (and rightly so), it laid the groundwork for a mandatory policy (establishing a repository, starting to educate researchers, gathering evidence on the policy’s impact and effectiveness, building a constituency in favor of OA, etc.). We have a document representing global consensus that governments should support OA: we should remind people of that more often.

More on the patentability of publicly-funded research

Catherine Saez, IP From Publicly Funded Research Should Benefit The Public, Experts Say, Intellectual Property Watch, January 16, 2009.  Excerpt:

...The symposium on public sector IP management in the life sciences, held at the World Intellectual Property Organization (WIPO) [Geneva, December 15, 2008] aimed at harvesting practical experiences to help policymakers develop a structured perspective on the policy question raised by public interest IP management, said Antony Taubman of WIPO.

In many countries there is a debate on how to manage publicly funded intellectual property, as a balance between private interests and the broader public interest must be reached....

The classical perspective of IP gives two different approaches: the public sector approach which does not make use of the market and simply publishes research to the public domain, and the private sector model in which market interest and technology is licensed in an exclusive way. However, according to Taubman, this was never a complete picture, and the emerging practice shows that between those opposite poles there is a range of different options like licensing strategies, public-private partnerships and patent pooling structures....

Some countries have changed their legislation to address the issue. In South Africa, a new law on the managing of publicly funded IP just came into effect in January....

A policy framework was subsequently developed and adopted by the Parliament. The framework created, among other things, an obligation for recipients of public funding research to disclose all IP coming out of that research. It also stated that the government should secure the IP if the institution did not. In certain cases, some patents can be secured to protect public interest and may not be licensed on commercial terms....

Activists however think that this new legislation might have a negative impact on South African research. Andrew Rens, intellectual property fellow for the Shuttleworth Foundation wrote in a post this week that donor organisations, for example, might be reluctant to fund research in South Africa as they often prefer research results to be open. Also, universities now have a commercial interest in the granting of patents, which could create a conflict of interest when South Africa moves to an examination system for patents and needs experts to evaluate research, he said.

In Germany, a “professor’s privilege” was introduced into the legislation, said Wolf-Michael Catenhusen, former state secretary of the German federal ministry of education and research. Until 2002, a German professor was automatically owner of all IP rights on his research results, he said.

In 2002, however, there was a revision of the “employee invention law” which adopted the principles of the US Bayh-Dole act of 1980 [pdf], which puts an obligation on universities to disclose inventions to the federal funding agency. “From that date, German universities got the right to patent inventions of members of the university,” said Catenhusen....“Fields which have the chance to get patents and to commercialise them are on the top agenda of research funding priorities, not only in Germany but in many industrialised countries,” he said.

Some researchers have voiced concerns about such priorities in research funding impeding freedom of research, but Catenhusen argued that a mixed system of research funding consisting of quality oriented basic research and innovation oriented programme research funding is a good instrument to guarantee the freedom of research....

Developed in the early 1980s, the International Centre for Genetic Engineering and Biotechnology (ICGEB) aims at advancing knowledge of the developing world in fields such as biomedicine, crop improvement and biopharmaceuticals through research and training, according to Decio Ripandelli of the ICGEB.

The ICGEB is an international institution owned by member countries. With 59 full member states, all developing countries, ICGEB, harbours scientists from over 50 countries in its laboratories. The centre has published 1,700 international peer reviewed publications and filed 55 patents.

The institution patents innovations discovered by its researchers within the context of a “social contract,” under which patents act as an incentive and an encouragement for further innovation, while society and taxpayers get public disclosure of the invention, said Ripandelli. Most developing countries do not have the culture or capability for proper management of IP, he said....

However, the ICGEB’s policy guidelines on patents and other IP rights make it clear that the technology should remain available. The results emanating from the institution’s laboratories, Ripandelli said, should “be granted to all developing countries even if they are not members of our centre.”

PS:  Also see our recent posts on the new tech transfer laws in South Africa and India.