Open Access News

News from the open access movement

Saturday, August 18, 2007

AnthroSource moves to Wiley-Blackwell

AnthroSource, the publishing arm of the American Anthropological Association, is moving from the University of California Press to Wiley-Blackwell.  Thanks to William Walsh for the alert, for this excerpt from the announcement, and for the related links and excerpts that put the move in perspective:

From today's announcement from Rachel Lee, UC Press:

Following a lengthy bidding process, the American Anthropological Association informed us earlier this week that they will be moving their publications program -- including AnthroSource -- to Wiley/Blackwell Publishers as of January 2008. Pricing for the 2007/2008 subscription year will be forthcoming directly from Wiley/Blackwell....

You may remember that the AAA publicly opposed FRPAA last year and that, prior to being disbanded, the AnthroSource Steering Committee opposed that opposition....

….Former AAA Director of Publications Susan Skomal's "Transformation of a Scholarly Society Publishing Program" may also be of interest. An excerpt:

Today’s electronic environment mitigates against a small scholarly publisher continuing to operate its entire program independently. Given the complex set of working parts, partnering with vendors, consultants, funding agencies, and even other publishers increases the likelihood of success. Moreover, AAA deliberately chose to work with as many nonprofit and likeminded partners as possible to ensure that it could meet its mission to provide scholarship affordably to both members and libraries.

Comment.  This seems to put an end to the hopes of many anthropologists that the AAA would convert AnthroSource to OA.

Update. Also see this comment by Tom Wilson:

The basis for the decision appears to be, in part, a report by the AAA's Director of Publishing which contains a truly amazing proposition:
Today’s electronic environment mitigates against a small scholarly publisher continuing to operate its entire program independently.
I don't think I've seen such an unintelligent statement about publishing in the electronic era. It is exactly the opposite of the true situation: the electronic environment makes it easier for scholarly societies to pursue an independent programme. I would urge members of the AAA to abandon their organization (since it has abandoned them to the vagaries of commercial decision making) and develop their own alternative publishing outlets. There are many examples of collaborative, non-commercial OA journals from which they could take models and encouragement.

Studying the redirection of university funds from TA to OA

Don Waters of the Mellon Foundation is willing to fund a study of “the feasibility and desirability of a massive reallocation of institutional funds [from journal subscriptions] to support open access”.  From his LibLicense post of August 15:

The "possibility of redeploying some of the funds available for 'indirect costs' from library subscriptions and site licenses to supporting open access initiatives" is frequently mentioned on this list.  It seems to me to be a very glib assertion that ignores the complexity of university organization and budgeting, a point that Jim O'Donnell and others have made in previous postings to this list.  The assertion nevertheless keeps arising and it may be time to test the likelihood of the "possibility."

I would be very interested in learning of research universities whose presidents or provosts in conjunction with their library and faculty, are conducting or would be willing to conduct (possibly with foundation support), a serious and intensive study of the feasibility and desirability of a massive reallocation of institutional funds to support open access.  Would members of this list identify such institutions and an individual whom I could contact for more information either by a posting to this list or by replying directly to me? [djw *at* mellon *dot* org]


  • I applaud this step, especially the focus on feasibility and desirability.  The idea of redirecting funds from TA journals to OA journals is serious and attractive.  My position is not that redirection is easy or inevitable, but that it’s feasible and very desirable.  (If there is a “glib assertion” about it floating around, it’s that redirection will be easy or inevitable.)  I don’t advocate —or haven’t yet advocated— what could be called preemptive redirection:  the cancellation of TA journals in order to free up funds for OA [*].  OA archiving is already growing without the need for redirecting funds.  But I do argue that if the growth of OA archiving causes the cancellation, conversion, or demise of TA journals, as many publishers fear, then the money formerly spent on their subscriptions can and should be spent first on the OA alternative.
  • [*] Qualification:  I support CERN’s SCOAP3 initiative, which I interpret as a cooperative plan of preemptive redirection worked out with the agreement of publishers themselves, and I would support similar initiatives in other fields.
  • In short, if it’s glib to say that redirection is easy or inevitable, then it’s also glib to say that the cancellation, conversion, or demise of peer-reviewed TA journals will entail the end of peer review. 

An OA table-of-contents service

Roddy MacLeod has written a useful introduction to the new ticTOCs project:

...ticTOCs is a project to develop a freely available service which may transform journal current awareness. The ticTOCs service will make it easy for academics, researchers and anyone else to find, display, store, combine and reuse journal tables of contents (TOCs) from multiple publishers in a personalisable web based environment. JISC is the primary funder of the ticTOCs project, which will run for two years from April 2007.

Fifteen partners are involved in the project. Led by the University of Liverpool Library, the consortium also includes Heriot-Watt University, Cranfield University, CrossRef, ProQuest, RefWorks, Emerald, Nature Publishing Group, SAGE Publishers, Institute of Physics, Inderscience Publishers, MIMAS, Directory of Open Access Journals, Open J-Gate and Intute....

The ticTOCs project will develop a freely available service which will benefit not only academics and researchers, publishers and authors, but also service providers such as libraries, aggregators, discovery services and journal directories....

The ticTOCs service will enable academics, researchers and anyone else, without having to understand the technical or procedural concepts involved in the process, to discover, subscribe to, search within, be alerted to, aggregate, export and re-use standardised Table of Contents RSS (Really Simple Syndication) feeds and their content for thousands of journals from numerous publishers. In addition, it will facilitate the re-use of aggregated journal TOC content on a subject basis by gateways, subject-based resource discovery services, library services and others, where it can act as a showcase of the latest research output. It will also make it easy for users of library and information services, commercial and open access journal publishers, online gateways, content aggregators and journal directories to subscribe to journal TOC RSS feeds of interest, with one click, via a freely available personalisable web-based interface. ticTOCs will encourage the production of standardised journal TOC RSS feeds, and thereby facilitate their interoperability and improve the quality of their data....

A prototype service is expected to be up and running by April 2008....

More information about ticTOCs is available at the project website....A ticTOCs news blog is available....
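The core idea ticTOCs describes (machine-readable tables of contents published as standardised RSS feeds that any service can aggregate) is simple enough to sketch with standard tools. The feed below is invented for illustration; real publisher feeds vary in structure and richness:

```python
# Minimal sketch of TOC-feed parsing in the spirit of ticTOCs.
# The feed content below is a made-up example, not a real publisher feed.
import xml.etree.ElementTree as ET

SAMPLE_FEED = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Journal of Examples 12(3)</title>
    <item><title>Article One</title><link>http://example.org/1</link></item>
    <item><title>Article Two</title><link>http://example.org/2</link></item>
  </channel>
</rss>"""

def toc_entries(feed_xml):
    """Return (journal title, [(article title, link), ...]) from an RSS 2.0 TOC feed."""
    channel = ET.fromstring(feed_xml).find("channel")
    journal = channel.findtext("title")
    items = [(item.findtext("title"), item.findtext("link"))
             for item in channel.findall("item")]
    return journal, items

journal, items = toc_entries(SAMPLE_FEED)
print(journal)
for title, link in items:
    print("-", title, link)
```

An aggregator like the one the project describes would fetch many such feeds, normalise them, and let users group them by subject; the parsing step itself is this small.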

Beyond open data to open methods

Cameron Neylon, Open methods vs open data - might the former be even harder? Science in the open, August 17, 2007.  Excerpt:

…The case for sharing methods is, at least on the surface, easier to make than sharing data. A community can really benefit from having all those tips and tricks available. You put yours up and I’ll put mine up means everyone benefits. But if there is something that gives you a critical competitive advantage then how easy is that going to be to give up? An old example is the ‘liquid gold’ transformation buffer developed by Doug Hanahan (read the story in Sambrook and Russell, third edition, p1.105 or online here - I think; it's not open access). Hanahan ‘freely and generously distributed the buffer to anyone whose experiments needed high efficiencies…’ (Sambrook and Russell) but he was apparently less keen to make the recipe available. And when it was published (Hanahan, 1983) many labs couldn’t achieve the same efficiencies, again because of details like a critical requirement for absolutely clean glassware (how clean is clean?). How many papers these days even include or reference the protocol used for transformation of E. coli? Yet this could, and did, give a real competitive advantage to particular labs in the early 1980s.

So, if we are to make a case for making methodology open we need to tackle this. I think it is clear that making this knowledge available is good for the science community. But it could be a definite negative for specific groups and people. The challenge lies in making sure that altruistic behaviour that benefits the community is rewarded. And this won’t happen unless metrics of success and community stature are widened to include more than just publications.

Special issue of CTWatch on the coming revolution in scholarly communication

The August issue of Cyberinfrastructure Technology Watch (CTWatch) is devoted to The Coming Revolution in Scholarly Communication and Cyberinfrastructure.  All the articles are OA-related.

More on citation impact of OA journals

Hajar Sotudeh and Abbas Horri, The citation performance of open access journals: A disciplinary investigation of citation distribution models, Journal of the American Society for Information Science and Technology, August 17, 2007.  Not even an abstract is free online, at least so far.

Update. Thanks to Tom Wilson for sending me an excerpt, from which I've pulled these snippets:

...[C]itations to OA articles increase at a faster rate relative to the increase in publication of OA articles, although this rate is slower than that previously reported for the world science system....

Finally, our results at the individual and global level suggest that almost half of the fields exceed or equal their expected citation rates compared to non-OA journals. The first global rank is occupied by Life Sciences, followed by the Natural Sciences and Engineering and Material Sciences in second and third positions, respectively. Medical and Health Sciences, Physics, and Chemistry are, respectively, the best known subdisciplines of Life Sciences and the Natural Sciences, enjoying a long reputation in embracing the OA model. Medical communities have taken quickly to OAJ publishing, whereas physicists and chemists traditionally show a higher tendency towards preprint dissemination and subject repositories, respectively. According to our results, Medical and Health Sciences is the best performing subdiscipline. There are many medical fields positioned among the highest ranked ones. Having a long and widespread tradition of supporting OAJs, their success is not unexpected....Despite [some] concerns [about the quality of OA articles in medicine], the citation findings show a wide embrace of OA articles in the discipline....

Friday, August 17, 2007

Brewster Kahle on the OCA, Google, book-scanning, and libraries

Andrew Richard Albanese, Scan this book!  Library Journal, August 17, 2007.

….To merge the texts and traditions of our print past and our web future, explains visionary technologist Brewster Kahle, represents a truly historic moment for our culture….

Despite all the librarians who eagerly identify themselves as book lovers, it's hard not to notice that books have had, well, a rather rocky start to the Internet Age. In the first iteration of a World Wide Web, they remained all but hidden on library shelves, and, unsurprisingly, circulation numbers dipped. That led some to surmise that the book was languishing in the throes of obsolescence. But as search technology improved and books became more discoverable through online library catalogs and keyword searches on the wider web, circulation surged back, by double-digit margins in many libraries. Overnight, books that went untouched for years were getting into patrons' hands again. Almost any librarian today will tell you their book circulation is going strong. The question now, however, is where is it going? …

Against this backdrop, in 2005 the forward-thinking Kahle launched the Open Content Alliance (OCA). An alternative scan plan to Google's controversial library project, Kahle's vision of putting books online embraces the values of openness central to librarianship and vital to the work of libraries….

OCA now counts 40 members and “regional scanning centers” in six cities scanning up to 12,000 books a month, over four million pages. For 10¢ a page, Kahle says, the OCA can now bring public domain books and other materials online, nondestructively, and offer them to the world. And unlike Google's plan, there are no restrictions on public domain books scanned by OCA members. Users are not forced to use proprietary interfaces; OCA scans are not hidden from rival search engines….

LJ recently visited with Kahle in San Francisco to get his take on the challenges of getting books on the web, the progress of the Open Content Alliance, and the intertwined future of books and libraries….

You've been critical of Google's library partnerships. What is Google doing right and/or wrong?

Two problems: one is perpetual restrictions on the public domain. Another is that these negotiations are all going on in secret. It shouldn't take a subpoena to get information from a librarian. But in this new world order, both perpetual restrictions and gag orders are being put in place on libraries by a corporate enterprise. The idea of making all books accessible online in new and different ways is all good news. But if you do this in a way that the materials that have been housed in libraries for centuries are made available only through one corporate interface, that is an Orwellian future.

Are you surprised to see libraries signing up with Google under restrictive terms?

I'm not surprised that a corporation wants to be the only place someone can get information, and I was not terribly surprised that some libraries went forward with this before they understood how they could do it on their own and how much it would cost to do it for themselves, not only to do the digitization but also to create services around these collections. I was surprised to see more libraries jumping on the Google bandwagon after demonstrating how libraries can do this and after actually doing it with the Open Content Alliance….

If libraries had the organization and the will, could we scan our collections ourselves, without such restrictions?

Yes. We've achieved mass digitization at 10¢ a page, on average about $30 a book. That includes high-resolution color imaging, optical character recognition, and compression and packaging into PDFs. And all of it open, meaning you can download and use these books in bulk. Take a million-book library, which is larger than most libraries in the world. What would it cost to make a million-book library online? At 10¢ a page, 300 pages in a book, it would price out at about $30 million, costs that could be spread out over many institutions. If the library market in the United States is about $12 billion a year, $3 billion to $4 billion of which goes to publishers' products, $30 million is about one percent of one year's budget. We can do this.
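Kahle's figures are easy to verify; here is the arithmetic from his answer worked out in a few lines (the $3 billion publisher-spend figure is the low end of his $3 to $4 billion estimate):

```python
# Back-of-the-envelope check of Kahle's digitization numbers,
# using integer cents to avoid floating-point rounding.
cents_per_page = 10                 # 10 cents a page
pages_per_book = 300                # his average book length
books = 1_000_000                   # a million-book library

cents_per_book = cents_per_page * pages_per_book   # 3,000 cents, i.e. $30/book
total_dollars = cents_per_book * books // 100      # $30,000,000 for the collection
publisher_spend = 3_000_000_000                    # low end of the $3-4B/year figure
share = total_dollars / publisher_spend            # fraction of one year's spend

print(f"${cents_per_book // 100}/book, ${total_dollars:,} total, "
      f"{share:.0%} of annual publisher spend")
# prints: $30/book, $30,000,000 total, 1% of annual publisher spend
```

So a million-book digital library really does price out at about one percent of a single year's U.S. library spending on publishers' products, as Kahle says.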

Google and OCA would seem to be natural allies. Why hasn't that alliance happened?

With the OCA, we originally tried to figure out whether to put in some restrictions in such a way that Google would come onboard. We found that when we put some restrictions in, the commercial guys just wanted even more. The public domain is small enough as it stands, we thought, let's not clobber it again as it goes digital. Let's let people use the public domain for whatever.

Microsoft was involved with OCA, but hasn't it since launched its own, more restrictive book project?

At the OCA launch, Microsoft committed to scanning a lot of books under OCA principles, but it changed after a year of scanning. It put in more restrictions that make it incompatible with OCA, such as it doesn't want its books surfaced in other commercial services. We're sad to see Microsoft putting more restrictions on its scans. But this is a reaction to the growing environment: if Google would take its restrictions off the public domain, I'm sure Microsoft would follow.

Google's pitch to libraries can be awfully attractive, and it is so ubiquitous. How does the OCA compete for library partners?

Revolutions aren't started by majorities. They come from leaders who see things that need to be done. Boston Public Library, for example, has been courted by Google, but it has said it is going to remain open. The Library of Congress also announced it is going to work with the Open Content Alliance. That's what it takes….

You mention digital rights–managed (DRM) interfaces. How much of a stumbling block is DRM for books online?

DRM used to be called copy protection, and it didn't work for the software industry, it's not working for music, and it won't work for books. It is a bad idea that contributes to the demise of an industry. In the software industry, it was a complete failure….I hope the book industry doesn't feel it needs to have centralized copy protection schemes. It's a trap.

How challenging are copyright issues, such as orphan works, in getting our literary past online and accessible?

It's bad out there….We're showing the results of decades of successful lobbyists with very narrow interests hijacking the information age….

As a “digital librarian” and an Internet pioneer, how do you view the library system?

I see the library system in this country as a $12 billion industry dedicated to preservation and access of materials that are not mediated through a corporate experience. You don't have to sign a nondisclosure form to come up with a new idea in a library. In libraries, materials are preserved in original form, uncensored. The alternative is that the materials people learn from are forever mediated by a relatively small number of commercial companies in terms of selection and presentation. This is one of the biggest issues facing libraries in the future: what services will they perform, and what services will be performed by companies or by nonprofits acting like companies. If all content is moderated by a few companies in the digital world, we'll have a giant bookstore rather than a library system….


My computer is on the mend.  It has a new hard drive, but I’m only half finished with the huge, annoying job of re-installing all my non-free software.  Some installed without trouble.  Some didn’t recognize my valid registration number, forcing me to live with a free trial edition until I resolved the problem or bought a new copy.  Some didn’t recognize my valid files of data and settings, forcing me to start over or plead with unnamed customer service people by email (since of course they don’t use the telephone).  Some merely required a day of problem-solving solicitude.  Two are still not working, and I still have 13 packages to go.

One side effect is that I may have missed some email sent between Wednesday and today.  If you sent me a personal message during that time, I hope you’ll try again.  Thanks for understanding.

Wednesday, August 15, 2007


My computer is failing and tomorrow will undergo a hard-drive transplant.  I’ll catch up as soon as we’ve both recuperated. 

Two more nonprofits join the ATA

An author addendum for Canadian scholars

CARL and SPARC offer Canadian Authors new tool to widen access to published articles, a press release from CARL and SPARC, August 15, 2007.  Excerpt:

The Canadian Association of Research Libraries (CARL) and SPARC (the Scholarly Publishing and Academic Resources Coalition) today announced the release of the SPARC Canadian Author Addendum, a new tool for authors in Canada to retain key rights to the journal articles they publish.

Traditional publishing agreements often require that authors grant exclusive rights to the publisher. The new SPARC Canadian Author Addendum enables authors to secure a more balanced agreement by retaining select rights, such as the rights to reproduce, reuse, and publicly present the articles they publish for non-commercial purposes. It will help Canadian researchers to comply with granting council public access policies, such as the Canadian Institutes of Health Research Policy on Access to Research Outputs. The Canadian Addendum reflects Canadian copyright law and is an adaptation of the original U.S. version of the SPARC Author Addendum....

An explanatory brochure complements the Addendum. Both the brochure and addendum are available in French and English on the CARL and SPARC Web sites and will be widely distributed. SPARC, in conjunction with ARL and ACRL, has also introduced a free Web cast on Understanding Author Rights. See [here] for details....

Recent developments in open geography

Steve Cisler, Open Geography: new tools and new initiatives, a preprint, June 5, 2007.  (Thanks to the IAPAD Bibliography.)  Excerpt:

In the past few years there has been a growing interest in digital mapping and the collection and use of geo-spatial information integrated with other kinds of information and media. The tools and commercial applications are suitable for use by nonprofessionals in a myriad of different contexts: social networks, web logs, multimedia presentations, and economic analyses....

The term “open” in a geospatial context is applied in five ways: to standards and organizations promoting them, development tools, data sets, public policies, and to the lowering of barriers for average users to make use of the tools and maps. In addition there is one open hardware development project. OpenMoko is the world’s first integrated open source mobile communications....

Not letting patents obstruct scientific sharing

Gerry Toomey, Sharing the fruits of science, University Affairs, August/September 2007.  Excerpt:

...We...know that the social behaviour of modern science, and of the broader domain of innovation, is marked by a continual tug-of-war. At one end of the rope we find the forces of collaboration and sharing. At the other end are the instincts to compete and to protect one’s hard-earned intellectual property. While both kinds of behaviour lubricate scientific discovery and technological innovation, IP protection via patenting, with a view to future profits, has become a dominant trend in recent decades, particularly in the life sciences.

But now an international scientific counterculture is emerging. Often referred to as “open science,” this growing movement proposes that we err on the side of collaboration and sharing. That’s especially true when it comes to creating and using the basic scientific tools needed both for downstream innovation and for solving broader human problems.

Open science proposes changing the culture without destroying the creative tension between the two ends of the science-for-innovation rope. And it predicts that the payoff – to human knowledge and to the economies of knowledge-intensive countries like Canada – will be much greater than any loss, by leveraging knowledge to everyone’s benefit....

“The reason we talk about open source,” explains Richard Jefferson, a California-born biotechnologist now living in Australia, “is because it was the first movement to embed in the creative process, in this instance software engineering, the permission not just to inspect inventions but to use them to create economic value. Open source imposes covenants of behaviour rather than financial agreements. Unrestricted use and the right to make a profit don’t usually get in bed together. In open source, they’ve done so quite productively.”

Dr. Jefferson is founder of an international research institute in Canberra called CAMBIA. He and his centre are among the most outspoken and active proponents of open science....

Major grant support for Fedora archiving software

Fedora Commons has received a $4.9 million grant from the Moore Foundation to enhance the open-source archiving software and cultivate an open community to support it.  From Monday's announcement:

Fedora Commons today announced the award of a four year, $4.9M grant from the Gordon and Betty Moore Foundation to develop the organizational and technical frameworks necessary to effect revolutionary change in how scientists, scholars, museums, libraries, and educators collaborate to produce, share, and preserve their digital intellectual creations. Fedora Commons is a new non-profit organization that will continue the mission of the Fedora Project, the successful open-source software collaboration between Cornell University and the University of Virginia. The Fedora Project evolved from the Flexible Extensible Digital Object Repository Architecture (Fedora) developed by researchers at Cornell Computing and Information Science.

With this funding, Fedora Commons will foster an open community to support the development and deployment of open source software, which facilitates open collaboration and open access to scholarly, scientific, cultural, and educational materials in digital form. The software platform developed by Fedora Commons with Gordon and Betty Moore Foundation funding will support a networked model of intellectual activity, whereby scientists, scholars, teachers, and students will use the Internet to collaboratively create new ideas, and build on, annotate, and refine the ideas of their colleagues worldwide. With its roots in the Fedora open-source repository system, developed since 2001 with support from the Andrew W. Mellon Foundation, the new software will continue to focus on the integrity and longevity of the intellectual products that underlie this new form of knowledge work. The result will be an open source software platform that both enables collaborative models of information creation and sharing, and provides sustainable repositories to secure the digital materials that constitute our intellectual, scientific, and cultural history....

According to Sandy Payette, Executive Director of Fedora Commons, “the new Fedora Commons can foster technologies and partnerships that make it possible for academic and scientific communities to publish, share, and archive the results of their own work in a free, open fashion, and make it possible to analyze and use content in novel ways.” ...

Payette also noted, “The open-source software that is developed and distributed by Fedora Commons can impact the entire lifecycle of what is often referred to as ‘e-Research’ and ‘e-Science,’ including storage of experimental data, analysis of experimental results, peer review, publication of findings, and the reuse of published material for the next generation of scholarly works....”

PS:  As the Fedora Project expands into the Fedora Commons, it is changing the URL of its home page (the project, old URL → the commons, new URL).

Copyright and the transition to OA in Sweden

Sweden's National Library has launched a study of Copyright in a new publishing environment.  From the announcement:

The copyright issues in scientific communication have a crucial role in moving to an Open Access model for scientific publishing. The project aims at providing all academic users with practical, uncomplicated and updated information about copyright connected to scientific communication. The project will

  • survey actual legal practice at Swedish institutions of higher education,
  • report on interesting cases of legal practice from foreign institutions of higher education,
  • illuminate the relations between author and institution, researcher and publisher, law and contract.

The project will use several channels of communication: a new website, a manual with a focus on practical situations, courses and seminars.

The project is co-funded by the Swedish Research Council. The project is funded for a first phase 2007-2008, followed by an external evaluation before deciding on future funding.

Grant from the National Library: 600 000 SEK.

For information contact either:  The Project leader: Ingegerd Rabow...[or] The Coordinator of the development programme of the National Library of Sweden: Jan Hagerlid....

Tuesday, August 14, 2007

August First Monday

The August issue of First Monday is now online.  None of the articles directly addresses OA, but readers of OAN may find several of them of interest.

TEL needs to remove permission barriers

Andy Powell, How open is The European Library? eFoundations, August 13, 2007.  Excerpt:

I note that the terms of use of The European Library state:

Copying of individual articles is governed by international copyright law. Users may print off or make single copies of web pages for personal use. Users may also save web pages other than individual articles electronically for personal use. Electronic dissemination or mailing of articles is not permitted, without prior permission from the Conference of European of National Librarians and/or the National Library concerned.

Seems a shame.  Surely some of the material found through the TEL portal could be made available on a more open basis?

As someone that would like to build experimental virtual exhibitions of European cultural heritage materials in Second Life, I'm scuppered at the first hurdle - I can't easily work out what is available for re-use.  Worse in fact - it looks like nothing is available for re-use!

As I've noted before, the US seems way ahead of us in terms of making digitised cultural heritage material openly available.

Monday, August 13, 2007

Where is the OA law and humanities from the Max Planck Society?

Klaus Graf shows that almost none of the research output of the Max Planck Society’s law and humanities institutes (as opposed to its natural science institutes) is OA through the Max Planck eDoc Server.  Read the original German or Google’s English translation.

The Max Planck Society is a major voice for OA and organized the Berlin Declaration on Open Access.

Licensing personal genome data

Jason Bobe, Can a personal genome sequence get a creative commons license?  The Personal Genome, August 13, 2007.  Excerpt:

The short answer is no. There is a long essay waiting to be written here. But for now, I can say that the reason it will not work is because there is no clear legal foundation to build a license on top of when it comes to sequence data. Creative Commons licenses have copyright to build on. Material Transfer Agreements (MTAs) have good old fashioned property law to build on (turns out important things still exist outside of the bitsphere). A personal genome sequence is, well, just bits.

If we could, why might a CC-like license for personal genome sequences be useful to an individual? ...

I may not want to give blanket permission to the entire research community to use my data. What about research that I find ethically or morally objectionable? A person might want to reserve some rights in these situations, but they might also want to give blanket permission for research on cancer for example. An efficient compromise might be for me to use a CC-like license to pre-approve the use of my sequence data for some purposes, but not for others. There might be other conditions I would want to place on the use of data. For example, I might require all researchers who use my data to publish in OA journals (kind of like the CC share-alike clause)....

Funding to improve data sharing in neuroscience

The NIH is funding a project on Sharing Data and Tools:  Federation using the BIRN and caBIG Infrastructures.  From the August 3 announcement:

  • Sharing data and tools across a research community adds tremendous value to the efforts of that community.  Search engines like Google show the power of sharing text-based data.  While strides have been made, the infrastructure necessary to share and query data sets that have more than just textual biomedical data is still under development.  Examples of such heterogeneous data sets include those that contain images, clinical data, or genomic/gene expression data.  Two large NIH-supported infrastructure projects to allow data and tool sharing are the caBIG™ program and the Biomedical Informatics Research Network (BIRN). 
  • Many of the communities involved in neuroscience research embrace the data/tool sharing idea.  Some communities, such as neuroimaging researchers, have seized it, and in so doing, have accrued scientific benefits that would have been otherwise out of reach.  As a specific example, three neuroimaging research communities are serving as the biological test beds for the BIRN infrastructure.  The BIRN infrastructure has now matured to the point where it can serve as a platform for data sharing and informatics tool sharing that extends beyond the neuroimaging researchers involved in the test beds, to include other areas of neuroscience beyond imaging, and to include biomedical research beyond neuroscience....

Modern biomedical research, including, but not limited to, neuroscience research, generates vast amounts of diverse and complex data.  Increasingly, these data are acquired in digital form, allowing sophisticated and powerful computational and informatics tools to help scientists organize, store, query, mine, analyze, view, and, in general, make better use and sense of their data.  Moreover, the digital form of these data and tools make it possible for them to be easily and widely shared across the research community at-large.  The federal investment in computational neuroscience and neuroinformatics research over the past 15 years has resulted in this research community being exceptionally well-poised to take advantage of these converging opportunities, and in so doing, accelerate the pace of discovery in neuroscience.

The purpose of this FOA [Funding Opportunity Announcement] is to encourage researchers to use the caBIG™ and BIRN infrastructures to share data and tools by federating new software tools under these infrastructures or using the infrastructure to federate significant data sets.  Awards issued under this FOA will NOT provide support to develop the tools or to measure data.  The goal is to make these tools/data broadly available to other researchers....

There are four waves of funding and four deadlines for letters of intent: December 18, 2007; August 18, 2008; December 22, 2008; and August 21, 2009.

Funding an assessment of the public domain

The EU is funding an Assessment of the Economic and Social impact of the Public Domain in the Information Society.  From the August 8 announcement:

The envisaged purpose of the assessment is to analyse the economic and social impact of the public domain and to gauge its potential to contribute for the benefit of the citizens and the economy.

The documentation:

Time limit for receipt of tenders: 3 October 2007 (16.00 h)

More on the OA portal of European ETDs

Maurice P.J.P. Vanderfeesten and Gerard van Westrienen, A Portal For Doctoral E-Theses in Europe:  Lessons Learned from a Demonstrator Project, SURF Foundation, July 2007.

Abstract:   For the first time various repositories with doctoral e-theses have been harvested on an international scale. This report describes a small pilot project which tested the interoperability of repositories for e-theses and has set up a freely accessible European portal with over 10,000 doctoral e-theses. Five repositories from five different countries in Europe were involved: Denmark, Germany, the Netherlands, Sweden and the UK. The Open Archives Initiative Protocol for Metadata Harvesting (OAI-PMH) was the common protocol used to test the interoperability. Based upon earlier experiences and developed tools (harvester, search engine) of the national DAREnet service in the Netherlands, SURFfoundation could establish a prototype for this European e-theses Demonstrator relatively quickly and simply.
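OAI-PMH, the protocol the pilot tested, runs over plain HTTP: a harvester issues requests such as ListRecords with a metadataPrefix (typically oai_dc, unqualified Dublin Core) and parses the XML response. The sketch below shows one such harvest step; the endpoint URL is a placeholder and the sample response is heavily abbreviated, but the request parameters and XML namespaces are the ones the protocol actually defines.

```python
# Minimal sketch of an OAI-PMH harvest step. The endpoint is a placeholder;
# a real harvester would fetch the URL over HTTP and follow resumptionTokens
# to page through large result sets.
from urllib.parse import urlencode
import xml.etree.ElementTree as ET

BASE = "https://repository.example.org/oai"  # placeholder endpoint
params = {"verb": "ListRecords", "metadataPrefix": "oai_dc"}
request_url = BASE + "?" + urlencode(params)

# Abbreviated example of the XML a repository returns for such a request.
SAMPLE_RESPONSE = """<?xml version="1.0"?>
<OAI-PMH xmlns="http://www.openarchives.org/OAI/2.0/">
  <ListRecords>
    <record>
      <metadata>
        <oai_dc:dc xmlns:oai_dc="http://www.openarchives.org/OAI/2.0/oai_dc/"
                   xmlns:dc="http://purl.org/dc/elements/1.1/">
          <dc:title>An Example Doctoral Thesis</dc:title>
          <dc:type>doctoral thesis</dc:type>
        </oai_dc:dc>
      </metadata>
    </record>
  </ListRecords>
</OAI-PMH>"""

NS = {"dc": "http://purl.org/dc/elements/1.1/"}
root = ET.fromstring(SAMPLE_RESPONSE)
titles = [t.text for t in root.findall(".//dc:title", NS)]
print(request_url)
print(titles)  # ['An Example Doctoral Thesis']
```

Because every compliant repository answers the same requests in the same XML vocabulary, one harvester built this way can aggregate theses from Denmark, Germany, the Netherlands, Sweden and the UK alike, which is the interoperability the pilot set out to demonstrate.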

Interview with John Willinsky

Dean Giustini, UBC's John Willinsky - Stanford Takes Him (For Now), Open Medicine blog, August 12, 2007.  Excerpt:

UBC's Dr. John Willinsky is no stranger to open access advocates. His book The Access Principle is 'required reading' for all those who believe in the connection between access to information and the economic and social well-being of knowledge-based societies. Recently, John accepted an appointment at Stanford University....

Dean: Open Journal Systems (OJS) has become an enormously popular and easy-to-implement open source publishing platform. Can you provide a little update about how many journals use OJS, and what sorts of developments you are planning for 2007-08?

JOHN:  "The growth in the use of OJS has provided an exciting opportunity to work with, and assist, new and old journals from around the world. Open Journal Systems is now being used by over 1,000 journals, with a little over half of the journals coming from developing nations and 35% of them in languages other than English. About half of our users are existing journals that are using OJS to move online and support their complete publishing process, from accepting submissions to publishing issues (including back issues from earlier days). Almost all of the journals are open access, although that includes 40% that offer a form of delayed open access while still selling subscriptions to their current issues. About half the journals are in the sciences, with a strong contingent of interdisciplinary journals as well.

As for what's next for PKP, we will be releasing the next version of OJS in a few months' time, in association with our parallel release of Lemon8-XML, developed by MJ Suhonos, which will automate XML conversion from Word and ODT documents. We'll also be including greater support for reference linking, and full PayPal support for subscriptions and delayed open access. Then, down the road, we see moving to greater modularity between Open Journal and Open Conference Systems to give users greater flexibility in the use of these basic scholarly publishing practices." ...

Dean: We've collaborated on Open Medicine, and, as you know, we have a unique funding model to publish it. Put simply, we require volunteer labour (ie. copy-editors especially) and fund-raising to produce the Journal. Do you see any other 'liminal' kinds of funding models among open access journals, where articles are being published but have no business plan?

JOHN:  "I must say that your contribution to Open Medicine has been part of what makes this journal so special. You've proven, among other things, how blog and journal can work hand in hand. As for the economics, I think Open Medicine presents a very interesting open access model of developing and launching the journal, while still seeking out a sustainable economic model that is likely to be made up of a number of parts, including volunteer support, library funding, donors and other agencies. To further this process of testing new models, we are pursuing, with the support of our library partners in PKP, namely Simon Fraser University led by Lynn Copeland, a more active and sustained role for open access journals among the library community. This could well take the form of a cooperative model, in which research libraries participate with journals and scholarly society publishers at a cost somewhat less than current subscription costs for participating journals (given the economies of open source software, library in-kind support, reduced transactional costs, etc.). We don't have any sterling instances of this yet, but given the number of libraries already hosting open access journals (Vanderbilt, Rutgers, UBC, etc.) and the support of SPARC and other agencies, it will not be long, I believe, before we enter the proof of concept stage with this model. What would be great is for a few journals and scholarly societies to come forward (after reading this blog perhaps) and say, yes, we'd like to give this idea a try." ...

Sunday, August 12, 2007

NIH funds for TA publication should shift to OA

Heather Morrison, NIH Public Access Policy: Is the Funding for an OA transition already there?  Imaginary Journal of Poetic Economics, August 12, 2007.  Excerpt:

...[T]he US National Institutes of Health already expends an estimated $30 million annually in direct costs for [toll-access] publication expenses, and provides for "indirect costs" which can be used to pay for such items as library subscriptions and site license fees.

It would make good economic sense for libraries, researchers, and the NIH, to consider redeploying some or all of these "indirect cost" funds currently spent on library subscriptions or site licensing fees, to support open access initiatives.

This makes good economic sense for the NIH. Every article that is published or self-archived for open access is then available for every researcher. Other NIH funded researchers will enjoy savings from grant funds that might have otherwise gone to subscription, interlibrary loan or pay per view fees. Researchers not funded by the NIH also benefit, both financially and in terms of greater access. This increases their capacity to forward our knowledge in the medical arena, which then benefits future NIH researchers and advances us all more quickly to the real goals: understanding, treating, and curing disease....

DFG funds to support OA journals

The German Research Society (Deutsche Forschungsgemeinschaft or DFG) has announced a funding program to launch new science journals, expand existing journals, and help print journals make the transition to electronic publication.  To be eligible for funding, the journals must meet the DFG guidelines for open access, peer review, and preservation.  (Thanks to the Informationsplattform Open Access.)

Update. See this comment by Tom Wilson:

The enlightened character of this development compares favourably with the still unresolved policy of the UK Research Councils, which seem continually to be running scared of actually making a decision....[T]he RCUK considers that there are only two forms of open access, "author pays" and "self-archiving". The notion that research funds, instead of supporting commercial publishers through "author payments", could go to the formation of new, collaborative, no fee, no subscription e-journals, is not on their agenda. The real reason for this, of course, is that the Research Councils fear offending government policy towards business - even if those businesses lie mainly outside the UK....

Update. Klaus Graf argues that the DFG guidelines only require removing price barriers, not removing permission barriers. Read the German original or Google's English.

Interview with Pat Brown

Lisa Junker, Into the Great Wide Open, Associations Now, August 2007.

...Patrick Brown, PhD, is one of the cofounders of the Public Library of Science, a leading voice on behalf of open access, in addition to his day job as a professor of biochemistry and biomedical researcher at the Howard Hughes Medical Institute and Stanford University School of Medicine. Brown recently discussed his strongly held views on open access, its benefits, and its future with Associations Now.

Associations Now: For our readers who may not be totally familiar with open access and the Public Library of Science, could you lay out the basics of what open access is?

Brown: The mechanisms are in place to allow ready access to anything on the internet, and, if the content is digital, allow it to be the substrate for all sorts of computational tools that add value to it. Now that those possibilities exist, the traditional business model is completely obsolete.

That model doesn't serve the purpose of scientific publication, which is to allow discoveries to be made, and made available for the benefit of the people who are supporting the research. If there's no economic reason and no technological reason why those results can't be made freely available for whatever useful purpose, then it's crazy not to be making every effort to make that transition. And the reason that it's not being made is because there's a vested financial interest in preserving the business model.

The definition of open access: Published information is made freely available through public repositories where there is no control by the publisher of who does what with the information, and with a license for use that only imposes a requirement that the original authors, creators, be properly acknowledged. Other than that, you can do whatever you want with it. I can put it in a database. I can republish it. I can post it on my own website. I can develop computer software for extracting information from published papers and presenting it in new formats or linking it to other kinds of data. You name it. It's information that should be viewed as a resource from which to create even more valuable, useful things—as opposed to private property that produces, by restricting access to it, revenue for publishers.

I understand that a lot of completely well meaning associations and groups are very nervous about the potential financial consequences of changing a business model that is still working for them—even if the purpose is inarguably a good one for the public benefit. There's anxiety about this. But I think there's absolutely no fundamental reason why that transition can't be made.

What are the primary audiences that you see being affected by open access?

...Any extent to which you can improve the access of the scientific-research community to that body of knowledge, you have improved their productivity and improved their ability to do their work. Simply being able to peek at papers one paper at a time, or to see only the older literature, falls short of that.

Think about the history of DNA sequence information. There was a decision made, almost casually, 30 years ago, that journals would require published DNA sequences to be put into a public repository—basically, because they got sick of printing page after page of DNA sequences. So it became the standard that all this stuff is in a public repository.

Well, that was an unbelievably fortuitous thing. All that sequence information is really available for people to analyze any way they want. There's a whole scientific field of people who are developing computational tools for finding and organizing information, comparing information and DNA sequences, et cetera. Without that, there would be no genome project—all the progress and breakthroughs that have come in genomics and molecular biology and genetics as a result of having this ability to compare sequences and analyze sequences and stuff like that....

With all these benefits, why would an author not choose to go with an open-access journal?

...[Funding agencies should] just say, "Look, you don't have to take our money, but if you do, our rule is that when you have results that are worth publishing, you have to make them available to everyone, without restricted use. You don't like that, no one's forcing you to do it, just don't take our money." Of course, people would scream and fuss, but I doubt many of them would say, "OK, then we won't take your money."

Another thing that the funding agencies need to do is to take the financial burden of that decision away from the authors and say, "We will cover the cost of publication in any legitimate, peer-reviewed, scientific journal out of funds that are separate from your research budget." ...[Scientists] should not have to bear the burden of doing something that's good for the scientific community on their own shoulders.

Do you envision a tipping point when the market will shift and there will be much more open access and much less traditional publishing? How do you see that happening?

...[I]t's analogous to when journals went online. The tipping point was reached when there was enough material online that, even if the exact paper that you would have preferred to read wasn't in an online journal, you could find enough of the stuff you wanted that it wasn't worth the hassle to go to the library. As soon as that happens, anyone who's not online is headed for the dustbin.

If you have a critical mass of information that is open access, it's a hell of a lot easier to just go to one repository to get it, and if that repository—because the stuff is really open access—is providing all sorts of tools and links for finding information and integrating it with other things and so forth, you're going to go there first. There'll be more and more explaining to do for the publishers to justify not participating in this process once people realize how much benefit they gain from it.

Once there's enough stuff in the open-access space, the non-open-access stuff is going to be marginalized....

What can you tell association publishers to show them that this transition can be sustainable?

There's a bunch of issues there. Number one, a lot of societies that make that claim — I would encourage people to look at their Form 990s. I get great enjoyment out of reading the Form 990s of scientific societies that talk about how important it is to preserve the income from their journals to do all these wonderful things they do, when, very often, the wonderful things they do, taken in aggregate, don't add up to the cost of their chief executive officer.

But let's just take that at face value—that their only motivation is to do good for the world and for science and for their community. One of the questions is, how important are those things that you're trying to fund with profits in your journal, compared to the good that you do for your mission through publishing itself and making access as freely available as possible?

Then there's the issue: Is there a financially sustainable open-access business model? Even if we didn't have a working model for it, I think you'd have to make the argument [that there is]. You're just talking about having to recover the cost of publishing through a different route. If we, or the scientific community, aren't smart enough to be able to do something that we all agree is good, that basically transfers money from the same pockets to pay the cost of publication through a different route, then we're not nearly as smart as we give ourselves credit for....

Wisconsin launches an OA publishing fund

The University of Wisconsin at Madison has launched a Library Fund for Open Access Publishing.  (Thanks to Heidi Marleau.)  From the site:

The UW-Madison Libraries have established a fund to support open-access publication fees and digital publishing by faculty and academic staff....

With the establishment of the Office of Scholarly Communication and Publishing (OSCP), UW-Madison librarians will begin publicizing the availability of funding support for open-access publishing as a part of their outreach and liaison activities.

    The Open Access Fund will:

  • Enable the library to pay publication fees for articles that have been accepted for publication by an established open-access journal. There are now more than 2000 open-access journals serving the international academic community. Many open-access publishers do not require a publication fee, but some of the best known, e.g., Public Library of Science and BioMed Central, have substantial fees that may discourage authors from submitting their articles for publication.
  • Provide seed money and/or matching funds for the publication of open-access books, conference proceedings, and new electronic journals. Large collaborative projects (such as the development of a new journal) will require a contribution from publishing partners as well as the approval of the General Library System Director.

    Guidelines for publishing or open-access support:

  • Proposed projects must be initiated by UW-Madison faculty/academic staff.
  • Publications must allow free access via the Internet for a substantial portion of the published content within six months of the first publication.
  • Projects with cost sharing and/or a high degree of cost-effectiveness will be preferred.
  • Publications should be relevant to UW’s research, teaching, and outreach missions.
  • Publications that require one-time financial support will be preferred over publishing projects that are likely to require ongoing support.
  • The fund will not be used to support staffing or capital equipment.


Comments.

  • Kudos to UW-Madison. This will help faculty publish in fee-based OA journals even when they have no funding or are funded by an agency that does not permit grant money to be used for this purpose.  It supports the rise of an OA system of peer-review providers.  To judge from the examples, it also helps subsidize UW-based OA journals, reducing the costs for authors elsewhere.  And if Wisconsin enlarges the fund with money saved from dropped journal subscriptions (whether through university decisions to cancel subscriptions or publisher decisions to convert to OA), then it will help redirect existing money from toll-access publishing to open-access publishing. 
  • Wisconsin joins the University of Amsterdam, Nottingham University, and Texas A&M in supporting such a fund.  If there are others that I’m overlooking, I’d like to hear about them.
  • I hope that UW-Madison complements this fund with a policy to require OA archiving in its institutional repository, whether the articles were first published in an OA or a TA journal.

Not anti-publisher but anti-FUD

Peter Murray-Rust, Open data: are licenses needed?  A Scientist and the Web, August 11, 2007.

…I was asked yesterday to summarise for a reporter why I had issues with certain publishers (I’ll post when the report appears). What I am trying to do on this blog at the moment is (a) to find out what the current situations for data access and re-use ARE and (b) then to highlight the cases which I and others think are unsatisfactory for modern data-driven research. I am not “anti-publisher” or “anti-capitalist”, but I am “anti-fuzz” and “anti-FUD”. I try to be relatively fair and I have lauded two publishers whose policies are now clear to me. Sometimes the discourse here seems tedious and repetitive - but that’s the way it is at present.

Since I am a physical scientist and a programmer I often see things in a literal and algorithmic way. If “open access” is defined in a declaration, and everyone in the publishing industry knows about that declaration, then I assume by default that the words have a logical constraint or enablement on the content. But that is clearly not true. Various publishers (and I am not rehashing their words today) assume that “open access” can be used in whatever way they choose to define it. Perhaps. But it isn’t generally helpful. Similarly others assume that copyright and licences are linked in some manner that is obvious to them but not to me. So, it seems that clear copyright and clear licences are going to have to be part of the future. “Data are not copyrightable” is a simple algorithm but (a) not everyone agrees what data are and (b) some people (especially Europeans) think it doesn’t apply in some cases.

I should also stress that when we use robots to read the literature (as we are now doing) we have to have clear licences. A robot is generally less smart than an adult human and needs telling clearly what it can and cannot do. If that clarity is missing, then default assumptions will be detrimental to some or all of the parties….
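One concrete way publishers can give a robot that clarity is machine-readable license metadata, such as the rel="license" link convention Creative Commons pages use. The sketch below shows the default-deny behaviour Murray-Rust describes: the robot re-uses content only when it finds a license it recognizes, and treats silence as refusal. The whitelist and the sample HTML are illustrative, not any publisher's actual markup.

```python
# Sketch of a cautious harvesting robot: it re-uses a page's content only if
# the page declares a recognized license via a rel="license" link.
# The whitelist and sample HTML are illustrative, not a real policy.
from html.parser import HTMLParser

OPEN_LICENSES = {"https://creativecommons.org/licenses/by/4.0/"}

class LicenseFinder(HTMLParser):
    """Collects the href of every <link>/<a> tag marked rel="license"."""
    def __init__(self):
        super().__init__()
        self.licenses = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag in ("link", "a") and a.get("rel") == "license" and "href" in a:
            self.licenses.append(a["href"])

def may_reuse(html: str) -> bool:
    """Default-deny: re-use only with an explicit, recognized license."""
    finder = LicenseFinder()
    finder.feed(html)
    return any(href in OPEN_LICENSES for href in finder.licenses)

licensed = ('<html><head><link rel="license" '
            'href="https://creativecommons.org/licenses/by/4.0/"/></head></html>')
unlicensed = "<html><head><title>No license stated</title></head></html>"
print(may_reuse(licensed))    # True
print(may_reuse(unlicensed))  # False
```

The point of the sketch is the default: where a human reader might shrug and read on, the robot's safe assumption when clarity is missing is "no permission", which is exactly why unclear licensing costs all parties.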