Open Access News

News from the open access movement


Friday, February 17, 2006

Time for an OA mandate at NIH

Dorothea Salo, Spaghetti that didn't stick, Caveat Lector, February 16, 2006. Excerpt:
Open Access News spread the word today that the first Report on the NIH Public Access Policy...is out. Compliance rate? A desperately pathetic 3.8%. Three point eight percent of the literature that was eligible for archiving under this policy actually got archived. You begin to see what repository rats are up against? The NIH did its level best to communicate the policy to researchers, and they’re decently competent at outreach. As far as I know, publishers didn’t spread much FUD among researchers. Even so, a big fat nothing happened, because the policy had no teeth and researchers don’t understand and don’t care about the economics or socioinformatics of publishing. I part ways with Stevan Harnad on a lot, but he’s dead right about one thing at least: if researchers don’t have to provide open access, they mostly won’t. I can cajole and jolly and educate and reason with them all I want, but I won’t have nearly the impact of a policy with teeth. We can’t coddle researchers on this; it’s tantamount to coddling Elseviley Verlag. Fortunately, it looks as though the NIH policy is likely to sprout teeth. Because of that, I’m actually not at all saddened that this particular spaghetti-strand didn’t stick when thrown at the wall. We now have cogent evidence that “voluntary” open-access policies aren’t worth spit. That removes a fairly big pillar that Elseviley Verlag likes to hide behind.

Rise of free digital content, fall of priced printed content

Sergey Dmitriev, Have We Seen the Back of the Hard Copy Reader? St. Petersburg Times (Russia, not Florida), February 14, 2006. (Thanks to LIS News.) Excerpt:
All of this means that the market of e-content (that at present basically means mobile content) is one of high growth potential. The hard-copy format is doomed to fall in its share of the market. The Internet University of Informational Technologies gives a few of its courses in the traditional way, despite their complete accessibility on the web site. The university’s administration believes that free publication on the web in no way influences book sales, because those who read books and those who read books on a computer are different categories....Sooner or later open source will settle down in the mobile world. Motorola released a mobile with Linux, and Nokia is already selling a Linux-based internet tablet. On such platforms the installation of DRM systems is pointless....In due course much can be expected from the much-hyped Google project called ‘Print.’ It can therefore be said that there is plenty of similar, accessible content. Moving the text to pocket computers and mobile phones by oneself is less and less complicated. A shift in our system of values is now taking shape. When the generation that now downloads illegal music from the internet and uses open source software grows older, it will more likely change the law than its way of thinking....This means that content in itself is not recognized as a good; from a commercial point of view, it is of interest to the market only as a contextual part of a more complicated service. Such thinking, and its corresponding business model, is clearly visible in such companies as Google: on one hand their services are impossible to copy; on the other, for end users they are free.

Google's book scans are not of archival quality

Jim Jacobs, Thoughts on Google Book Search, Diglet, February 16, 2006. Excerpt:
Yesterday, I went to the Stanford EE Computer Systems Colloquium to hear Daniel Clancy, the Engineering Director for the Google Book Search Project....Clancy mentioned that Google was NOT going for archival quality (indeed COULD not) in their scans and was ok with skipped pages, missing content and less than perfect OCR -- he mentioned that the OCR process AVERAGED one word error per page of every book scanned! The key point that I took away from this is that the Google book project IS NOT an alternative to library/archive/archival/preservation scans....When I asked if there would be links to libraries on ALL results pages, he hemmed and hawed a bit and wouldn't say one way or the other. He mentioned the difference between the publisher-supplied content and the library-supplied content and seemed to hint that the publisher-supplied content is subject to stricter licensing agreements....92% of the world's books are not generating revenues for copyright holders or publishers!...Someone asked what had surprised him the most since he started. One thing he was surprised about was that about 70% of the book project use was coming from India.

The Authors Guild gets one right

From the Authors Guild February 14 press release, Trademark Dilution Revision Act Would Weaken Protections for Free Expression:
A bill that would drop express protection for "noncommercial use" of a trademark and would weaken the protections for those who use trademarks in news commentary will be considered by the Senate Judiciary Committee on Thursday. The legislation has already passed the House....Trademarks, including business names, brands, and slogans, are unavoidable and proliferating in daily life. Writers of fiction and nonfiction inevitably incorporate trademarks into their work, sometimes to comment on the particular business using the trademark, but frequently the use is merely incidental to the nonfiction or fiction writer's story ("Tom went to a McDonald's, had a Coke, and waited for the Harley to arrive."). Just as fair use provisions of copyright law permit writers to make certain uses of copyrighted works in their own works, so do fair use and related provisions of trademark law permit writers to use trademarks in their works....The new law would weaken these protections, exposing writers to greater potential liability for their use of trademarks. This would needlessly chill expression.

Comment. Imagine writing up your research on the safety of Vioxx, the security of a new Microsoft browser, or the lawfulness of Google's Library Project, and finding that you have to use the ™ symbol or even pay royalties just to name the product.

Full-text search indexing helps copyright holders, should be lawful

Jonathan Kerry-Tyerman, No Analog Analogue: Searchable Digital Archives and Amazon's Unprecedented Search Inside the Book Program, Stanford Technology Law Review, February 2006. (Thanks to Ray Corrigan.)
Abstract: This paper begins with an overview of Amazon's prior experiments with e-books, the way in which the Search Inside the Book database is created, and how that database manifests itself to the Amazon user. Part II analyzes the Search Inside the Book program under current copyright law and concludes that the program does infringe copyrights in the indexed works. Part III argues that programs like Search Inside the Book, though infringing, actually serve the purposes of copyright law, and should not create liability for the providers of such programs. Finally, part IV applies the fair use doctrine to Search Inside the Book, assuming that the existing copy-protection measures are improved as indicated and ultimately finding this unconventional program protected as fair use.

More on the webcasting treaty

James Boyle, More rights are wrong for webcasters, Financial Times, February 17, 2006. Excerpt:
I teach intellectual property law....Are we doing a good job of writing [IP] rules? The answer is no. Three tendencies stand out. First and most lamentably, intellectual property laws are created without any empirical evidence that they are necessary or that they will help rather than hurt. Second, the policymaking process has failed to keep track of the increasing importance of intellectual property rights to everything from freedom of expression and communications policy to economic development or access to educational materials. We still make law as though it were just a deal brokered between industry groups....The public interest in competition, access, free speech and vigorous technological markets takes a back seat. What matters is making the big boys happy. Finally, communications networks are increasingly built around intellectual property rules, as law regulates technology more and more directly; not always to good effect. The World Intellectual Property Organisation has now managed to combine all three lamentable tendencies at once. The Broadcasting and Webcasting Treaty, currently being debated in Geneva, is an IP hat trick.

DARE aims to triple OA repository deposits

The Dutch DARE project has set itself the goal of depositing 100,000 full-text eprints in the DARE network of OA repositories in the year-long period from October 1, 2005, to October 1, 2006. This would triple the number of items on deposit in the national repository network. From yesterday's announcement:
Dubbed 'hunDAREd thousand’, a new major project was launched on 1 October 2005 in which all Dutch universities, the Netherlands Organization for Scientific Research (NWO) and the Royal Netherlands Academy of Arts and Sciences (KNAW) intend adding 100,000 full-text documents to the DARE (‘digital research’) archives. The project runs until 1 October 2006. By that time the DARE partners hope to have a total of 150,000 academic publications, dissertations, pre-prints, datasets and other research material available for all on www.DAREnet.nl. This will make the documents searchable on other search engines as well. The institutions involved are participating in the project in their own ways. Various universities are focusing on certain faculties or, at the request of their Boards, are providing more researchers for Cream of Science. Others hope to stimulate the actual supply of material, encouraged by their chancellors. Researchers’ own websites are being combed and journals published by research institutes and schools are being included. With this added effort the universities want to improve the structure of and streamline the process of supplying research results. To do so, a link is being established with Metis, the research information system used by all universities.

In the combined harvest of 100,000 new articles, doctoral theses are a common focus. Every year about 2500 of them, about five percent of the country’s entire academic output, are published. DAREnet is making these theses visible under their own heading, ‘Promise of Science’....The creation of a virtual showcase for universities using doctoral theses not only helps shape their profiles as well as those of the graduates in question, it also often enhances access to previously elusive research results. Promise of Science is therefore an extension to Cream of Science, the partial collection already in place. Alongside ‘established’ researchers, extra attention is now being given to ‘up and coming’ research talent.

E-Theses in Europe

The presentations from the JISC-SURF-CURL International Workshop on e-Theses (Amsterdam, January 19-20, 2006), are now online. Also see the responses to the workshop questionnaire, and Neil Jacobs' report on the questionnaires and workshop in the January Ariadne. SURF reports that the workshop gave rise to a European Taskforce for E-Theses, though it gives few details. The taskforce doesn't seem to have a web site yet.

Negotiating licenses and copyright

Two of the three presentations from the SPARC/ACRL Forum at the ALA Midwinter Meeting, Authors and Authority: Perspectives on Negotiating Licenses and Copyright (San Antonio, January 21, 2006), are now online. (The third presentation used no slides, so there is nothing to post online.)

OA to 100 years of Biochemical Journal

100 years of biosciences research captured in digital archive, a press release from JISC, February 17, 2006. Excerpt:

Attended by more than 100 invited guests from the world of the biosciences – including two Nobel laureates - a ceremony was held last night at the British Library at which Professor Sir Philip Cohen, President of the Biochemical Society, presented the digital archive of 100 years of the society’s journal to Lynne Brindley, the British Library’s chief executive. The [open access] archive represents the entire back archive of the journal [Biochemical Journal] of the Biochemical Society which celebrates its centenary this year. Speaking at the event, Sir Philip thanked JISC and the Wellcome Trust for their funding which has not only paid for digitisation of the journal but will also ensure that the archive, hosted by PubMedCentral, will be openly available to all in perpetuity. Digitisation of the journal is part of a major collaborative programme of digitisation called the ‘Medical journals backfiles digitisation project’, a partnership between JISC, the Wellcome Trust and the US National Library of Medicine which will see the digitisation of nearly 1.7 million pages of complete backfiles from important and historically significant British and American medical journals....Robert Kiley, Head of Systems Strategy at the Wellcome Library, spoke of the functionality of the digital archive, containing 392 volumes and 1,340 issues, which will enable researchers, lecturers and students to access fully searchable text and graphics from 1996 in HTML and PDF formats, full internal and external linking and multi-media adjuncts such as moving images and 3-D structures. He also reported that, even prior to its launch, and with no publicity, article downloads from the archive had already reached 948,000.

The ‘Medical journals backfiles digitisation project’, of which this journal is a part, is itself one strand of the wider £16m JISC programme which will see the digitisation of major scholarly resources, including 18th century parliamentary papers, historical population reports from 1801 – 1920, 3,000 hours of newsreel from the archives of ITN and Reuters Television, and 19th century newspapers and archival sound recordings from the British Library. View further information on the [JISC] digitisation programme.

More on McAfee's OA economics textbook

As noted on OAN last November, Preston McAfee has been developing an Open Access, Open Source economics textbook for some time. He has frozen version 1.5 and made it available via personal website, institutional repository, and print-on-demand.

Thursday, February 16, 2006

NLM Board of Regents recommends strengthening the NIH policy

The NLM Board of Regents (BOR) met on February 7-8 to discuss the November 2005 recommendations from the Public Access Working Group for strengthening the NIH public-access policy. The BOR sent a letter to NIH Director Elias Zerhouni on February 8, summarizing its own recommendations. The letter is not yet online. Excerpt:
The report of the November 15 Working Group meeting reveals that the current rate of participation in the voluntary Policy is very low (less than 4%). Since there is evidence that the submission system is relatively easy to use and that the majority of NIH-funded researchers appear to know about the policy, technical difficulties or lack of awareness do not appear to be primary reasons for non-compliance.

Based on this information and the opinions expressed by the Working Group members, the Board has concluded that the NIH Policy cannot achieve its stated goals unless deposit of manuscripts in PubMed Central becomes mandatory. We favor public release of NIH-funded articles in PubMed Central no later than 6 months after publication, although some flexibility may be needed for journals published less frequently than bimonthly. We were pleased that most of the publishers on the Working Group indicated an interest in depositing the final published version of articles in PubMed Central on behalf of NIH-funded authors. The Board agrees that this would be highly desirable. The Board encourages NIH and NLM to develop a careful plan for transitioning to a mandatory policy. It will be important to provide clear guidance and a reasonable timetable, to minimize burden on NIH-funded researchers and grantee institutions, and also to continue to work with publishers to make it easy for them to submit articles on behalf of their NIH-supported authors. The Working Group's next meeting is scheduled for April 10. I [BOR chair, Thomas Detre] would be happy to engage the Group in assisting with transition planning, if that would be helpful.

Comment. This is important. When Congress first asked NIH to develop an OA policy (July 2004), it asked the agency to mandate OA and limit embargoes to six months. When NIH chose instead (September 2004, May 2005) to request OA without requiring it, and to permit embargoes up to 12 months, it found that it couldn't get even 4% of its grantees to comply with the request. Examining the compliance data, the Public Access Working Group recommended (November 2005) strengthening the policy, and now the NLM Board of Regents joins the recommendation (February 2006). Both recommendations are merely advisory, but the burden has clearly shifted to the NIH either to strengthen the policy or justify continuing with a weakened policy that doesn't meet its own goals. We're one step closer to an OA mandate for the world's largest funder of medical research.

An alternative to DRM for non-OA content

Daniel A. Nagy, DRM beyond copyright enforcement – alternative models for content distribution, INDICARE, February 15, 2006. Abstract:
In this article, we propose an alternative content distribution framework, which provides the necessary incentives for creating digital content without resorting to copyright enforcement. The proposed business model relies on peer-to-peer digital payment for which technical solutions already exist. Existing DRM technologies may actually be recycled for the purposes of the proposed business model, while removing the incentive misalignments currently plaguing the industry.

February D-Lib

The February issue of D-Lib Magazine is now online. Here are the OA-related articles.

  • Bonita Wilson, Unrestricted Access. An editorial noting the success of OA journals, JEP and Ariadne. Excerpt: "Although D-Lib Magazine, the Journal of Electronic Publishing and Ariadne each serve a particular audience, they are complementary in the topics they cover, and they face the same challenges as they strive to provide unrestricted access to the content they disseminate. Few could argue with the goal of timely and open access to high quality scholarly information. New business models are emerging to ensure this goal is met and that such access will continue far into the future. However, it is too early to know whether these models will prove sustainable over the long term."

  • Titia van der Werf-Davelaar, Facilitating Scholarly Communication in African Studies. Abstract: "Web publishing and its technical possibilities, as well as the open access movement that has accompanied it, have resulted in a number of tendencies with mixed implications for scholarly communication. This article examines the impact of these changes in the field of the African studies, where the North-South divide in scientific publishing poses an additional challenge to the issues at stake. It looks at several initiatives taken by the Africanists community in the Netherlands to bridge the divide, in particular the establishment of a digital platform for African studies. It concludes that these initiatives are all geared towards redressing the balance and establishing open scholarly communication on an equal footing, but that true open access can only be achieved if practiced both ways (by North and South) and not at the expense of academic quality standards. In addition it requires the active commitment of each and every individual scholar. This commitment still needs to grow in Africanist circles."

  • Henry Jerez and three co-authors, ADL-R: The First Instance of a CORDRA Registry. Abstract: "The Advanced Distributed Learning Registry (ADL-R) is a newly operational registration system for distributed e-learning content in the U.S. military. It is the first instance of a registry-based approach to repository federation resulting from the Content Object Repository Discovery and Registration/Resolution Architecture (CORDRA) project. This article will provide a brief overview of CORDRA and detailed information on ADL-R. A subsequent article in this month's issue of D-Lib will describe FeDCOR, which uses the same approach to federate DSpace repositories."

  • Giridhar Manepalli and two co-authors, FeDCOR: An Institutional CORDRA Registry. Abstract: "FeDCOR (Federation of DSpace using CORDRA) is a registry-based federation system for DSpace instances. It is based on the CORDRA model. The first article in this issue of D-Lib Magazine describes the Advanced Distributed Learning-Registry (ADL-R) [1], which is the first operational CORDRA registry, and also includes an introduction to CORDRA. That introduction, or other prior knowledge of the CORDRA effort, is recommended for the best understanding of this article, which builds on that base to describe in detail the FeDCOR approach."

  • William Y. Arms and five co-authors, A Research Library Based on the Historical Collections of the Internet Archive. No abstract. From the introduction: "As libraries change, scholarship is also changing. For centuries, academic researchers have combed through the collections of libraries, museums, and archives to analyze and synthesize the information buried within them. As digital libraries expand, we can look forward to a day when humanities and social science scholars replace much of this tedious manual effort with computer programs that act as their agents. This article describes a library that is being built for such research. The collections are some ten billion Web pages from the historical collections of the Internet Archive. Initially, the primary users are social scientists."

  • Esther Hoorn and Maurits van der Graaf, Copyright Issues in Open Access Research Journals. Abstract: "This article presents results of a survey undertaken as part of a series of work packages under a joint initiative by JISC and SURF to explore the attitudes of authors in the UK and the Netherlands towards Open Access. The Open Access environment has created a number of entirely new copyright models, which stand in contrast to the traditional academic journals in which the copyright has to be transferred from the author(s) to the journal publisher. The following emerging copyright models in OA journals were identified: [1] a model in which the author keeps the copyright: this was preferred by nearly half of the respondents, [2] two models in which the author shares the copyright (with Creative Commons licences): these were preferred by nearly a third of the respondents, [3] a model in which the author transfers only the exploitation rights to the journal publisher: this was preferred by a small minority. These and other results seem to reflect a desire on the part of academics to change the balance of rights within copyright between authors and publishers in scholarly communication journals. Libraries and academic institutes are already taking part in the scholarly communication copyright debate and could use these results to align their positions with the academics' views."

OA in Turkey

Today at OA Librarian, Ilkay Holt has two postings on OA in Turkey (first, second). Joint excerpt:
The Congress of Informatics Technologies IV, Academic Informatics 2006, took place on February 9th - 11th at Pamukkale University in Denizli, Turkey. It was a very successful conference with a variety of subjects in separate sessions from e-learning to open access. In the e-library sessions, the ones on open access and institutional repositories were organized around creating awareness among information professionals about open access and its benefits. At the end of the e-library sessions, the Berlin Declaration was accepted and it was decided that a leading committee on open access and institutional repositories would provide research institutions with necessary information. This committee will be formed with participants from the Turkish Librarians' Association and the Turkish Academic Network and Information Center. After the conference, a press release was distributed covering these decisions and the benefit of open access....

[S]ome major developments on open access [are taking] place in Turkey. One of them is the establishment of the Open Access and Institutional Repositories Working Group under the Anatolian University Libraries Consortium (ANKOS). ANKOS is the strongest consortial body in Turkey and, with the establishment of this group, it aims to be a guiding institution for university libraries in meeting their information needs on open access and institutional repositories. Its mission, goal, and tasks will be available in English very soon. The group's first activity was to prepare a handout defining open access and libraries' role in it, and to provide the main resources available through the Internet on a web page.

Google VP explains company policy in China

Google has posted the Congressional testimony of Elliot Schrage, the company's VP for Global Communications and Public Relations, explaining why Google decided to cooperate with Chinese censorship demands.

(PS: I'm not blogging this story in depth; it's voluminous and slightly off-topic for OAN. But I'm following it personally and will occasionally post a key document or development.)

Google Earth as a platform for OA geospatial data

Declan Butler, Virtual globes: The web-wide world, Nature, February 15, 2006. Google Earth is becoming a platform for OA geospatial data. Butler explores how scientists are using GE and how --because GE is free, fun, and spectacular-- this science is reaching the public.

Also see the accompanying commentary on using GE for humanitarian relief after natural disasters, Nature's editorial on the exciting potential for 'virtual globe' software, and Butler's blog posting describing the cluster of related articles.

Journal editor resigns to protest high price

In an open letter dated yesterday, Max Steuer resigned as an editor of Emerald's Journal of Economic Studies in part because of its high subscription price. (Thanks to Ted Bergstrom.) Excerpt from his open letter:
You may want to know that I resigned from editing the Journal of Economic Studies on 6 January 2006. Shortly before that date it was suggested to me that the financial policy of the journal is inconsistent with the culture and practices of the academic community. It was careless of me not to look into this before taking on the job. I simply assumed that the fees charged and other aspects of policy were roughly in line with academic conventions. This turns out not to be the case. On the 6th of January I met with a representative of Emerald Publications to discuss the position. I wanted to be sure of the position, and if possible to effect a change in policy. It was clear that the pricing policy was and is very different from that of many well-known economics journals. In particular, the current price of £6,000 plus VAT for six copies is far out of line. It was also clear from our discussion that no change in policy was to be forthcoming. As we know, the contributors and referees of academic journals are on the whole not paid and regard taking on work, particularly refereeing, as part of being members of a scholarly community. I feel badly at having asked many people to devote time to the journal....The policy of the Journal of Economic Studies is not determined by the Board of the journal, but by the owners....Board members should consider their positions.

Threats to the scientific commons

Richard R. Nelson, The Market Economy, And The Scientific Commons, text of a talk given at the University of Michigan School of Law, January 26, 2006. (Thanks to John Wilbanks.) Nelson does not discuss OA to literature or data, but focuses on threats to the scientific commons from patents, exclusive licenses, and high licensing fees. Excerpt:
[I]t is widely recognized that the power of market stimulated and guided invention and innovation often is dependent on the strength of the science base from which they draw....This science base largely is the product of publicly funded research, and the knowledge produced by that research is largely open and available for potential innovators to use. That is, the market part of the Capitalist engine rests on a publicly supported scientific commons. The message of this essay is that the scientific commons is becoming privatized. While the privatization of the scientific commons up to now has been relatively limited, there are real dangers that unless halted soon important portions of future scientific knowledge will be private property and fall outside the public domain, and that could be bad news for both the future progress of science, and for technological progress. The erosion of the scientific commons will not be easy to stop. Here I want to call the alarm, and to suggest a strategy that has some promise....

An associated belief or ideal ["that until recently has served well to protect the scientific commons"] is that the results of scientific research are and should be published and otherwise laid open for all to use and evaluate. As Robert Merton (1973) argued, the spirit of science is ‘communitarian’ regarding access to the knowledge and technique it creates. All scientists are free to test the results of their fellows and to find them valid or not supported, and to build on these results in their own work. Because the results of scientific research are laid in the public domain for testing and further development, the bulk of scientific knowledge accepted by the community is reliable (as John Ziman (1978) has emphasized) and scientific knowledge is cumulative. These are basic reasons why the scientific enterprise has been so effective as an engine of discovery. And economists often have argued that keeping science open is the most effective policy for enabling the public to draw practical benefits from it. My argument in this essay is that the part of the theory about good science that stresses the value of open science is basically correct, but is in danger of being forgotten, or denied....The case for open scientific knowledge clearly needs to be reconstructed recognizing explicitly that much of scientific research in fact is oriented towards providing knowledge useful for the solution of practical problems, that the applications of new scientific findings often are broadly predictable, and that this is why control over scientific findings in some cases is financially valuable property. I think there is a case for keeping basic scientific knowledge open, even under these conditions. To privatize basic knowledge is a danger both for the advance of science, and for the advance of technology....In Section II, I discuss the rise and erosion of the idea that public support of open science is warranted because the expected returns are high but the areas of return are so uncertain that market mechanisms will not suffice....

I believe the key to assuring that a large portion of what comes out of future scientific research will be placed in the commons is staunch defense of the commons by universities. Universities almost certainly will continue to do the bulk of basic scientific research. If they have policies of laying their research results largely open, most of science will continue to be in the commons. However, universities are not in general supporting the idea of a scientific commons, except in terms of their own rights to do research. In the era since Bayh-Dole, universities have become a major part of the problem, avidly defending their rights to patent their research results, and license as they choose....The argument that if an exclusive license is not given, no one will try to advance, seems particularly dubious for research tools of wide application, or for findings that appear to open up possibilities for new research attacks on diseases where a successful remedy clearly would find a large market....Universities will not give up the right to earn as much as they can from the patents they hold unless public policy pushes them hard in that direction. I see the key as reforming Bayh-Dole. The objective here, it seems to me, is not to eliminate university patenting, but to establish a presumption that university research results, patented or not, should as a general rule be made available to all that want to use them at very low transaction costs, and reasonable financial costs. This would not be to foreclose exclusive or narrow licensing in those circumstances where this is necessary to gain effective technology transfer. Rather, it would be to establish the presumption that such cases are the exception rather than the rule.

OA repositories are not peer-reviewed publications

Stevan Harnad, OA IRs are not peer-reviewed publications: They are access-providers, Open Access Archivangelism, February 16, 2006. Excerpt:
On Wed, 13 Feb 2006, Sarah Kaufman wrote in JISC-REPOSITORIES:
...having spoken to academics within this institution, it has become apparent that potential depositors may be wary of depositing into a digital repository as they fear that a repository that includes pre-prints may not appear 'credible'. Has anyone else dealt with this sort of concern, and how you responded to those that have voiced this concern? Do any repositories exclude items that have not gone through the peer-review process? If you accept items that have not gone through the peer-review process, do you apply any forms of quality control on the item?

The following may perhaps save people a lot of time that will otherwise be wasted re-inventing this superfluous wheel:

  1. The right way to make the distinction between published, peer-reviewed material and unpublished material is the classical way: by tagging it as such.
  2. The IR softwares have tags for peer-reviewed articles as well as for unrefereed preprints.
  3. The scholarly/scientific community is quite aware of this distinction; it has already been dealing with it for years in the paper medium, in the form of published articles versus unpublished drafts.
  4. An IR is not a publication venue -- it is a means of providing access to published -- and, if the author wishes, unpublished -- work.
  5. Any user who wishes to reserve their time and reading to peer-reviewed, published work can do so; they need only note the tags (is it "peer-reviewed"? is it "published"? what journal is it published in?).
  6. Disciplines differ in the degree to which they use pre-refereeing preprints: physics relies heavily on them, biology less. This is a choice for researchers to make, both as authors (deciding what to deposit) and as users (deciding what to read).
  7. This decision cannot and should not be made a priori by IR managers. An IR deposit is not a publication, any more than a mailed first draft on paper is. It is a decision by an author to provide, and by a user to use, a document.
  8. The most absurd thing of all would be to institute an IR-level system of "quality" control: Leave that to the peer specialists and the journals. IRs are just access-providers.
  9. It can and should, however, be decided whether an IR is for research output only (documents and data, whether pre- or post-peer-review) or it is also for non-research output (e.g., teaching materials). Some IRs that are sectored by subject matter will also want to decide what disciplines they are catering for.
  10. The right thing to tell naive researchers who have never self-archived or never used an OA IR is that an OA IR is neither a publication nor a library catalogue of publications: It is a means for researchers to maximize access to their research output, both before and after peer-reviewed publication.
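
(PS: A purely hypothetical sketch of the tagging Harnad describes in points 1-5, with field names invented for the example and not taken from EPrints, DSpace, or any other repository software. The point is that a reader who wants only peer-reviewed, published work can filter on the tags herself, so the repository never needs its own layer of quality control.)

```python
# Purely hypothetical sketch (field names invented, not the schema of any real
# IR software): deposits carry simple status tags, and readers filter on them.
deposits = [
    {"title": "An example article", "refereed": True,
     "published_in": "Journal of Examples"},
    {"title": "An example preprint", "refereed": False,
     "published_in": None},
]

# A reader who wants only peer-reviewed, published work simply notes the tags.
for d in deposits:
    if d["refereed"] and d["published_in"]:
        print("%s -- published in %s" % (d["title"], d["published_in"]))
```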

NIH report to Congress

The NIH progress report to Congress on the public-access policy is now online (dated January 2006). It's a scanned image, so I can't cut and paste an excerpt and I don't have time to rekey the important parts. Here are the highlights.

The report gives the number of articles deposited in PubMed Central under the policy from its launch on May 2, 2005, to December 31, 2005 (1,636), the total number of articles covered by the policy that should have been deposited in the same period (43,000), and the embargo periods requested by authors (60% authorized release immediately upon publication, 23% requested embargoes of 10-12 months, and 17% requested something in between). That makes the compliance rate a miserable 3.8%. "Lack of awareness does not appear to be the primary reason for the low submission rate."

PMC usage increases as its size increases.

NIH sees no evidence that its public-access policy "has had any impact on peer review".

The cost of handling submissions and administering the policy was $1 million for fiscal 2005. If the compliance rate grows to 50%, the cost would grow to $2 million/year. If the compliance rate were 100% (65,000 articles/year), the cost would be $3.5 million/year.

The report describes the NIH's outreach efforts to educate stakeholders about the policy: NIH staff, grantees, grantee institutions, and the journals where grantees publish their work.

Finally, the report describes the November 15, 2005, decision of the Public Access Working Group (PAWG). Ten out of 11 members wanted grantees to deposit the final, published versions of their articles. Nine of 11 voted to mandate deposit and public access. Eight of 11 voted to shorten the permissible delay to six months, with some flexibility for rare exceptions. (The minutes of the PAWG meeting are also online.) The report says nothing about the PAWG recommendations except that "[t]he NLM Board of Regents will consider the opinions of the Working Group at its next meeting."

Update. The NLM Board of Regents meeting mentioned in the final sentence took place on February 7-8, 2006. In the meeting, the Board endorsed the PAWG recommendations. For details, see my blog posting later on February 16, above.


Wednesday, February 15, 2006

OA repositories and related projects in the UK

Neil Jacobs, Digital Repositories in UK universities and colleges, FreePint, February 16, 2006. (Thanks to Garrett Eastman.) Excerpt:
Building on a previous development programme (Focus on Access to Institutional Resources - FAIR), the current Digital Repositories development programme [from JISC] consists of some 25 projects that are exploring the role and operation of repositories. Many of these are concerned with how repositories can help academic researchers both do and share their work more effectively. Open access is a key driver and demands are growing for the outputs of publicly-funded research to be freely available on the web....Repositories have a key role to play, since they both enable open access, and help universities and colleges manage the intellectual output of their researchers....In terms of active development, work is underway to help universities set up and populate repositories (Sherpa), to establish a Scottish research repository infrastructure (IRI Scotland), and to investigate the questions of different versions of academic papers (Versions).

In the rest of the article, Jacobs describes each of the other projects in the Digital Repositories Programme.

Update. Also see the JISC press release on Jacobs' article, February 16, 2006.

DRM and academic publishing

The February 5 issue of Groklaw has two articles on DRM and academic publishing:
  1. Roy Bixler, Digital Copyright Issues in Academic Publishing
  2. Bruce Barton, The tension between DRM and academic publishing.

The articles are at the same URL. You'll have to scroll down for Barton's; there's no internal link. (Thanks to Ross Scaife.)

Excerpt from Bixler:

Academic publishing is a more interesting case because the market dynamic is different from commercial publishing. Academic publishing generally serves niche markets which are inherently unprofitable. The mission of academic publishing tends to focus on the dissemination of knowledge instead of on pleasing shareholders....Michael Jensen of the National Academies Press says that they have considered the issues of DRM and have decided not to use it in their publications because they would prefer to maximise availability of their content, they do not want to lock it down and also they do not want to deal with the customer service issues that may come with DRM. Jensen also says that they started putting books on their Web site for free reading/browsing in 1994, have more than 3,500 books online and now have a significant amount of traffic at 1.25 million hits per month. In the past, they have implemented watermarking on their downloadable PDF (Portable Document Format) files but abandoned that practice 8 months ago since they have found so few issues with online copyright infringement that it was not worth the trouble....According to the AAUP, university presses are subsidised and on average make about 85% of their revenues on sales. Given this, it is easy to understand why some university presses with uncertain subsidies are less enthusiastic about the idea of easily available copies which current technology enables. They can ill afford any significant loss to their already pinched revenues. But, at this point, any loss due to unrestricted digital copies is hypothetical....Ultimately, since it costs money to edit and produce print works, the questions are about business models. At the same time, the customers of academic works value the ability of free access to print works and would frown on any technological restrictions which make this more difficult. If free copying is available, will current business models still work? If not, can an alternate model compatible with free copying be found? If DRM is inevitable, then can it at least be made minimally intrusive and user-friendly?

Comment. Bixler seems to be thinking of academic books, not academic journals. Journal publishing is extremely profitable. Likewise, his romantic claim that "The mission of academic publishing tends to focus on the dissemination of knowledge instead of on pleasing shareholders" is more accurate for university presses that publish books than for commercial behemoths that publish journals. Finally, apart from the NAP, Bixler seems to be unaware of open access, which dispenses with DRM. We can look for ways to make DRM minimally intrusive, but I'd rather look for ways to promote OA and leave DRM behind.

Excerpt from Barton:

With respect to DRM, what strikes me as both interesting and a challenge for university presses is the tension between the mission they serve and the business model under which they operate. Their mission, of course, is the certification and dissemination of scholarship. The business model comprises a number of things, most notably, the direct recovery of costs from readers. The tension follows from the most common cost recovery strategy: by restricting access to scholarship to only those readers who have paid for access, presses limit distribution and therefore, potentially, dissemination. I say "potentially" because in some disciplines I imagine presses reach the 200 people in the world capable of or interested in reading the most arcane of their publications. (Libraries purchase access for the communities they serve. Access nearly always implies that someone has paid for it.)...All would be well if purchasing power were unlimited. It isn't. And consequently scholarship is not thoroughly nor, one should note, equitably distributed. To the extent that this is true, university presses are failing their mission....[I]n the electronic world, both notification and second-copy distribution costs are dropping dramatically. Let's assume for a moment that universities, scholarly societies, or other sources of funding were to pay for certification (managed peer review) and first copy costs. Then there would be no significant costs remaining and no need for DRM as a means of extracting payment in exchange for access. In effect, these funding sources are already paying for the production of scholarship. And compared to those costs, the cost of publication is tiny. There is certainly a precedent for this approach to publishing: a portion of research grants routinely go to paying for the page charges commonly assessed by scientific journal publishers. Moreover, as Steve Izma points out, the same funding sources are paying much of the DRM fees. It seems like madness to suffer the transaction costs involved in this cost recovery model. The way out of this? I do not expect to see it coming from within the university press or the library communities. Budgetary expectations are too entrenched. I think that it is more likely that we will see a new generation of scholars organizing peer review and publication amongst themselves and deciding for their peers that this counts towards tenure and promotion. And they will teach their graduate students where to look for the best scholarship (as their teachers taught them). They will publish to whomever can find them and the good stuff by virtue of its citation network will rise to the top of Google's hit list (assuming Google doesn't make you pay to get there).

Comment. Barton too seems unaware of open access, though he comes very close to rediscovering OA journals. Friends of OA: we still have a lot of educating to do!

Advice for providing OA to data

Tom Coates, Native to a Web of Data, a large slide presentation on the future of the web. (Thanks to Richard Ackerman.) Much of Coates' presentation can be taken as advice for those providing OA to data (a small illustrative sketch follows the list):
  1. Look to add value to the Aggregate Web of data
  2. Build for normal users, developers and machines
  3. Start designing with data, not with pages
  4. Identify your first order objects and make them addressable
  5. Use readable, reliable and hackable URLs
  6. Correlate with external identifier schemes
  7. Build list views and batch manipulation interfaces
  8. Create parallel data services using standards
  9. Make your data as discoverable as possible
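
Purely as an illustration (not taken from Coates' slides), here is a minimal sketch of what several of these points can look like in practice: addressable first-order objects, readable and hackable URLs, a list view, and a parallel machine-readable service alongside the human-readable one. All of the records, URLs, and field names below are invented for the example.

```python
# Illustrative sketch only: a tiny read-only service following several of Coates'
# suggestions -- first-order objects (here, eprints) with readable, hackable URLs,
# a list view, and a parallel machine-readable (JSON) view of the same records.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# First-order objects keyed by stable, human-readable identifiers.
EPRINTS = {
    "2006/smith-oa-citation-impact": {
        "title": "Open access and citation impact",
        "year": 2006,
        "status": "peer-reviewed",
    },
    "2005/jones-repository-growth": {
        "title": "Measuring repository growth",
        "year": 2005,
        "status": "preprint",
    },
}

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        # /eprints            -> list view (batch discovery)
        # /eprints/<id>       -> one object, HTML for people
        # /eprints/<id>.json  -> same object, JSON for machines
        path = self.path.strip("/")
        if path == "eprints":
            body, ctype = json.dumps(sorted(EPRINTS)), "application/json"
        elif path.startswith("eprints/"):
            key = path[len("eprints/"):]
            as_json = key.endswith(".json")
            if as_json:
                key = key[: -len(".json")]
            record = EPRINTS.get(key)
            if record is None:
                self.send_error(404)
                return
            if as_json:
                body, ctype = json.dumps(record), "application/json"
            else:
                body = "<h1>%s</h1><p>%s, %s</p>" % (
                    record["title"], record["year"], record["status"])
                ctype = "text/html"
        else:
            self.send_error(404)
            return
        data = body.encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", ctype)
        self.send_header("Content-Length", str(len(data)))
        self.end_headers()
        self.wfile.write(data)

if __name__ == "__main__":
    # Serves http://localhost:8000/eprints and the per-object URLs listed above.
    HTTPServer(("localhost", 8000), Handler).serve_forever()
```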

More on why Google Library will help authors and publishers

Cory Doctorow, Why Publishing Should Send Fruit-Baskets to Google, BoingBoing, February 14, 2006. Excerpt:
Google's new Book Search promises to save writers' and publishers' asses by putting their books into the index of works that are visible to searchers who get all their information from the Internet. In response, publishers and writers are suing Google, claiming that this ass-saving is in fact a copyright violation. When you look a little closer, though, you see that the writer/publisher objections to Google amount to nothing more than rent-seeking: an attempt to use legal threats to milk Google for some of the money it will make by providing this vital service to us ink-stained scribblers. Opponents of Google Book Search (GBS) argue that publishers should have been consulted before their works were scanned, but it's in the nature of fair use that it does not require permission -- that's what a fair use is, a use you make without permission. They argue that GBS should pay some money to publishers because anyone who makes money off a book should kick some back -- but no one comes after carpenters for a slice of bookshelf revenue. Ford doesn't get money from Nokia every time they sell a cigarette-lighter phone-charger. The mere fact of making money isn't enough to warrant owing something to the company that made the product you're improving....The reality is that the biggest threat to book-writers and publishers is that their works are simply invisible to people who get all their information from the Internet. Google Book Search makes our books visible to those people. In so doing, Google will save our asses from oblivion. Instead of sending legal threats to Google, I think that writers and publishers should be sending them fruit-baskets and thank-you notes....GBS puts books on a near-equal footing with other information resources, the ones that are currently kicking the hell out of us. When a customer performs a Google search, she can get results, right there on her screen, from real, actual books, books that can often be purchased with a single click. This is our single best hope for extending our industry's lifespan for a decade or two.

Research libraries shouldn't fear OA

Ross Atkinson, Introduction for the Break-Out Sessions: Six Key Challenges for the Future of Collection Development. A talk at the Janus Conference on Research Library Collections, Cornell University, October 10, 2005. Atkinson is the Associate University Librarian at Cornell. (Thanks to William Walsh.) Excerpt:
Collections attract scholars, graduate students, government support, donor funding --and add prestige to the institution. This rationale for collection building --the collection as institutional capital-- is a primary motivation, even though it is seldom specifically discussed. One point we must bear in mind with respect to this rationale, however, is that it entails or implies the existence of a separate collection at each institution which can in effect compete with all others. The new environment into which we are now moving, on the other hand, is likely to be increasingly characterized by a much more unified collection to which all users would have access. Indeed, what perhaps so fascinates us and unnerves us about open access, I think, is that it might serve as a first, decisive step in the direction of a more unified, less institutionally based collection. While there is no question whatsoever that open access represents a supremely valuable trend ideologically --perhaps the ultimate aim of all collection services-- libraries continue to wrestle with its implementation and implications, including its effect on institutional identity. However, such a concern about identity, if I am correct in sensing it, is a red herring --because of what we might call the “axiom of non-equivalence.” By this I mean the trivially simple fact that individual libraries are not the same, nor will they ever be. Each has vastly different resources --not only financial, but also human and creative resources, including different visions and values. The fact is, therefore, that all scholarly publishing could convert to open access tomorrow --every scholarly publication could be made openly accessible-- and still, the accessibility, the collection service, the ability of the user to find, understand, use and apply the individual object, would vary enormously from one institution to the other. Any morbid fear we might harbor, therefore, of becoming mirror images of each other as we move toward a more unified collection is unfounded, and we cannot allow it to deter us from moving in that direction, if we decide that direction is in the best interest of our user communities....

We speak often and rightly of a crisis in scholarly communication. That crisis is not a matter of egregiously priced science journals; as disastrous as such excessive pricing is, it is really only a symptom of the so-called crisis. Nor is the crisis simply a result of the fact that each player on the horizontal line is trying to use the information object for a different purpose --for that has always been the case, probably back to antiquity. No, the crisis is rather a result of the fact that there is now a level of technology available to each player on the line, such that each player can assert its will and compete with other players much more effectively. What any player on the horizontal line can do is therefore now heavily contingent upon what other players can and want to do....[P]ublishers are...obviously competing among themselves --as are libraries. What is perhaps most different about libraries, however, is that they have some difficulty acknowledging and dealing with that competition. They may even pretend sometimes that no such competition takes place. They focus instead with intensity on the horizontal line --publishers, the Evil Empire, vendor effectiveness-- perhaps in order to avoid taking the vertical line into account....Speaking personally, what scares me about the brilliant, trail-blazing, revolutionary arrangement the Google 5, and especially Michigan, have made with Google, is not the effect of that arrangement on the horizontal line. Such a service, if it can be effected, can indeed only benefit the movement of scholarly information from writer to reader. What scares me is rather the effect of the arrangement on the vertical line --on research libraries’ relationships with each other. I am frankly frightened that I will not be able to provide users at Cornell with a level of collection service that will be competitive with the collection service that Michigan will be able to provide its users, once its entire print collection is in digital form. And I think many research libraries are concerned about this --although, again, we are loath to discuss it.

Code of conduct for ... Google and Yahoo in China

The Electronic Frontier Foundation (EFF) has written an open letter to members of Congress proposing A Code of Conduct for Internet Companies in Authoritarian Regimes. The proposal wouldn't ban doing business in authoritarian regimes and probably wouldn't even ban the controversial practices by Yahoo and Google now making news. But it would ask companies to retain as little identifiable data on users as possible; to notify users when removing or hiding web sites; when forced to suppress information within a country, to make the same information available elsewhere; to document government censorship requests and the laws, if any, that require compliance; to keep these records even if they cannot be made public until after a regime change; to avoid "actively and knowingly providing services that facilitate censorship or repression"; and to offer users encryption and circumvention technologies. This is a very good start.

Spanish National Research Council signs Berlin Declaration

The Spanish National Research Council (Consejo Superior de Investigaciones Científicas, CSIC) has signed the Berlin Declaration on Open Access to Knowledge.

(PS: This is big, and could mean that we'll soon see a proposal to mandate OA to publicly-funded research in Spain.)


Tuesday, February 14, 2006

OA for democracy and the knowledge economy

Becky Hogge, The year of free culture? Open Democracy, February 14, 2006. Excerpt:
As the UK government reviews patent and copyright law to boost Britain's creative economy,...democratic access to knowledge ought to benefit too....Three separate reviews, two conducted by the British government, and one by a leading thinktank, will examine current patent and copyright law, and their findings will have direct implications for the democratic future of the information society....What has this got to do with technology or democracy? The dissemination of intellectual property, or access to knowledge, is one of the key pillars of democracy. As information courses ever more rapidly through the internet, barriers to access are gradually reduced....If Britain is to come out top in the knowledge economy, it needs to understand new ways of managing intellectual property that have emerged with the information society, existing under the controversial umbrella term "free culture". One such development is the Gnu project's general public licence (or "copyleft"), which nurtures collaborative authorship of software within the open source movement. More recently, Lawrence Lessig's Creative Commons licences have shown how relaxing key areas of traditional copyright can democratise creativity in a new remix culture, facilitated by the wider availability of digital sound and moving image production tools. Within the scientific community, the open access publishing movement is exploring how best to disseminate scientific knowledge in the internet age. And new proposals to decouple research and development from production in the pharmaceutical industry, inspired by the success of the Sanger Institute in mapping the human genome, are gaining gradual support. Will the British government take note of these innovations in their reviews?...What is crucial now is that defenders of the public good vested in the democratic dissemination of information step forward to make their voices heard.

More on the journals pricing crisis

Mallory Bowman and Chris Brown, Libraries face debt: Journal subscriptions may be in jeopardy, Louisville Cardinal Online, February 14, 2006. Excerpt:
University of Louisville students may soon lose access to a number of scholarly journals provided through the library system if funds cannot soon be obtained. Rising costs of subscriptions for both hard copies and electronic versions of scholarly journals and research databases have created budget woes for the library, which is now $500,000 - $600,000 behind in subscription payments, said U of L Dean of Libraries Hannelore Rader. While no subscriptions to scholarly journals have been permanently deactivated, if funds are not identified soon, cuts will have to be made, she said....“This year we thought, ‘Oh my God, can we get any more money?’” she said. “In order to pay for these in years to come, our base budget needs at least another million dollars.”...John Drees, director of U of L’s Office of Communication and Marketing, said the university is underfunded by $52 million compared to its average benchmark institutions....[Rader] said part of the problem is that she knows the exact cost of a subscription only when a bill hits her desk. “We really have no idea how much these journals are going to cost us from year to year,” she said. “Inflation for these journals is usually 10 to 20 percent every year, and we don’t know how much they are going to cost us until we get the invoices.”...Rader said the library’s materials budget was $7 million for the 2005-2006 fiscal year, and that the budget this year wouldn’t cover all the subscriptions that the library holds. She explained that large databases, especially those based outside of the United States, have a monopoly on many of the scholarly journals....Mary Beth Thomson, associate dean for Collections and Technical Services at University of Kentucky Libraries, said her school’s libraries are facing the same situation. “This is a problem that’s not unique to U of L or UK. It’s not unique to the state of Kentucky. It’s a nationwide problem,” Thomson said. “The fact is that the cost of scholarly communication is increasing, and our budgets cannot keep up.”...Rader said the U of L and UK are working together to ensure that at least one of the institutions has access to certain journals. “We’re trying to do something, [like] keeping one copy in the state,” she said. Students and faculty will have limited access to journal articles provided through interlibrary loan, however, since Rader says copyright laws strictly limit the number of articles that can be shared between schools.

Openness as a response to globalization

Hans van Ginkel, Toward a smarter information superhighway, Asahi Shimbun, February 14, 2006. Van Ginkel is Under-Secretary-General of the United Nations and the Rector of the United Nations University. Excerpt:
Universities...are increasingly recognizing that the real expansion today in international knowledge exchange does not relate to physical mobility of individuals, but rather to the mobility of knowledge itself. Openness and sharing knowledge have always been the hallmarks of the most successful universities around the world....MIT offers a rather interesting model of an educational institution responsive to globalizing trends. In particular, I would like to refer to their work with free online learning through the OpenCourseWare program. Many universities use distance and online learning as an important measure to increase student numbers....This represented a revolutionary step forward since up until then the trend had been for universities to restrict access to their knowledge behind password-controlled learning management systems. It is MIT's boldness in promoting this open approach that I believe contributes to its success and high ranking among international educational institutions. This same kind of openness is highly relevant to Japanese universities and would contribute to their climbing up the international university rankings. Hence, I am delighted to see that a number of Japanese universities have decided to work together to promote the idea of OpenCourseWare in Japan. The Japan OCW Alliance includes Keio University, Kyoto University, Osaka University, Tokyo Institute of Technology, Tokyo University and Waseda University. This alliance was launched in May 2005, so it is relatively young, but has already placed a large amount of educational content online. It represents an important showcase of quality educational materials and, as professor Yuichiro Anzai, president of Keio University, has remarked, the Alliance is indicative of a new leadership role for Japanese universities. This view is echoed by President Kazuo Oike of Kyoto University who views the Japan OCW Alliance as contributing to the accumulation of intellectual capital on the World Wide Web. Moreover, as President Hiroshi Komiyama of the University of Tokyo remarked, "our goal is to give back to society the fruits of our educational activities. We would also like to assist those who are eager to learn by themselves, and those who would like to take part in a dynamic creative activity."...We at UNU share the same vision. At the 2002 Johannesburg World Summit on Sustainable Development, I called for the creation of a Global Learning Space to support science-based education through various projects, including the U.N. Water Virtual Learning Center, the Global Virtual University and the Asia Pacific Initiative. The latter includes close collaboration with Keio University and a network of universities in Southeast Asia and the Pacific to develop and run educational programs in the region on issues of sustainable development. More recently, at the World Summit on the Information Society held in Tunis in November 2005, I called for an "Information Society Open to All" where the provision of open educational resources would represent a core component....The way forward in this globalizing world, I would argue, is to promote openness within the education system.

Sharing knowledge works better than billions of bucks

Si Chen, When Billions Aren't Enough, Open Source Strategies, January 21, 2006. (Thanks to PhotoSydney via Jean-Claude Bradley.) Excerpt:
Our favorite anti-open source article, "Winning the Linux Wars", suggested that Microsoft partners should be "Playing the R&D card" by emphasizing that "Microsoft invests north of $6 billion a year on R&D. There is nobody in the Linux world that does that." Well, Merck (MRK) invests about $4 billion a year in R&D. Bristol-Myers (BMY) $669 million. Eli Lilly & Co. (LLY) $2.7 billion. Pfizer (PFE) $1.8 billion. Sanofi-Aventis (SNY) a whopping $10.2 billion, or nearly half of its $20.5 billion in revenues. Together, that's about $19.5 billion a year in research and development. Apparently, though, that's not enough. This Friday (January 20, 2006), The Wall Street Journal's "Science Journal" ran an article entitled "In Switch, Scientists Share Data to Develop Useful Drug Therapies" [PS: blogged here 1/2/06] which pointed out that there is a "crisis in 'translational science,' or turning basic discoveries into therapies," and that only twenty new drugs were approved by the US Food and Drug Administration in 2005. One billion a drug, approximately. More importantly, the article points out some interesting trends: [1] The pace of basic biomedical research is outstripping the pace of translational research. In other words, we're learning about genetics and biology faster than we're able to make drugs based on that knowledge. [2] As a result, foundation grants have not produced concrete results of cures for illnesses. [3] In response, the foundations are now shifting funds from basic research into therapies, taking over a role once left to industry....What is the unique collaboration? Sharing knowledge. The foundation which sponsored the research is apparently requiring that the scientists it funds "share results in real time," rather than keep their discoveries proprietary. As a result, it has made the scientists feel more accountable for their work and therefore become more engaged in curing diseases.

More evidence that OA increases submissions

Sara Schroter, Importance of free access to research articles on decision to submit to the BMJ: a survey of authors, BMJ, February 14, 2006. NB: "This is version 2 of the paper. In this version we have clarified the role of the BMJ in publishing studies carried out by its staff."
Abstract. Objectives. To determine whether free access to research articles on bmj.com is an important factor in authors’ decisions on whether to submit to the BMJ, whether the introduction of access controls to part of the BMJ’s content has influenced authors’ perceptions of the journal, and whether the introduction of further access controls would influence authors’ perceptions.

Design. Cross sectional electronic survey.

Participants. Authors of research articles published in the BMJ.

Results. 211/415 (51%) eligible authors responded. Three quarters (159/211) said the fact that all readers would have free access to their paper on bmj.com was very important or important to their decision to submit to BMJ. Over half (111/211) said closure of free access to research articles would make them slightly less likely to submit research articles to the BMJ in the future, 14% (29/211) said they would be much less likely to submit, and 34% (71/211) said it would not influence their decision. Authors were equally divided in their opinion as to whether the closure of access to parts of the journal since January 2005 had affected their view of the BMJ; 40% (84/211) said it had, 38% (80/211) said it had not. In contrast, 67% (141/211) said their view of the BMJ would change if it closed access to research articles. Authors’ comments largely focused on disappointment with such a regressive step in the era of open access publishing, loss of a distinctive feature of the BMJ, a perceived reduction in the journal’s usefulness as a resource and global influence, restricted readership, less attractive to publish in, and the negative impact on the journal’s image.

Conclusions. Authors value free access to research articles and consider this an important factor in deciding whether to submit to the BMJ. Closing access to research articles would have a negative effect on authors’ perceptions of the journal and their likeliness to submit.

Version 1 of the paper was published on January 9, 2006, and blogged here on January 10.

Profile of Thomas Krichel

Heather Morrison, Thomas Krichel: a man with ideas, and drive! OA Librarian, February 13, 2006. Another installment in Heather's celebration of librarians working for OA. Excerpt:
Money doesn't make the world go round. Ideas do! So said Thomas Krichel, Assistant Professor, Palmer School of Library and Information Studies, Long Island University, at the First E-LIS Workshop....May I add: it is people with ideas - and the drive and determination to see their ideas realized - that really make the world go round! People like Thomas Krichel - E-LIS team member, early open access pioneer, and founder of the world's second largest archive (after arXiv), RePEc. Thomas is recognized twice on Peter Suber's Open Access Timeline. On February 1, 1993, Thomas launched the Working Papers in Economics (WoPEc), with the deposit of an open access working paper (not his own), the first in economics. On May 12, 1997, Thomas launched RePEc, Research Papers in Economics, which as of today holds over 362,000 items of interest, 261,000 of which are available online....WoPEc and RePEc both emerge from traditional practices in economics, which has a strong working papers culture. The distributed archives approach also reflects practice in the field. This model works so well because it fits the discipline, rather than the other way around....The American Economic Association has been collecting information about working papers for years, but their collection was not as comprehensive as RePEc's. Now, information about RePEc is being added directly to the Association's EconLit. Why would a volunteer-based organization like RePEc choose to give away their work to a profit-making publisher for free? Because, says Thomas Krichel, this works to the advantage of both: EconLit is more valuable, and placing your work in RePEc is the best way to ensure your working papers are included in EconLit, which enhances the success of RePEc....Thomas' advice on what students and librarians should be learning for the future: open source software!

India digitized 149 academic libraries

Rediff India Abroad reports that India has digitized 149 university libraries. The very brief article doesn't say whether the digitized content will become OA. (Thanks to LIS News.)

OA distortions from Elsevier CEO

Elsevier CEO Erik Engstrom gave a presentation on Open Access at the Reed Elsevier Investor Seminar 2005 on November 21, 2005. (Scroll to p. 39 of the seminar report.) Because the slides are not easy to cut and paste, I'll link to William Walsh's excerpts, since he's already done the work. (Thanks, William.)

Comment. I see no signs of animosity in the talk. But I do see several misunderstandings: (1) Engstrom frequently refers to the "author pays" model, even though this is an inaccurate label for the business model he has in mind. (2) He repeats the inaccurate observation that OA journal launches peaked in 2001 and have declined since. (3) On author attitudes, he cites the CIBER 2005 report, which seems to have asked authors whether they'd be willing to pay a fee out of pocket, and does not cite the Key Perspectives reports, which better understand the way that OA journals actually operate. (4) He relies on the discredited Cornell calculation, which assumes that all OA journals charge author-side fees and that all fees would be paid by universities. (5) He relies on the discredited assumption that OA journals exclude indigent authors, unaware that most OA journals charge no fees at all and most of the others waive them in cases of economic hardship. (6) He cites the Kaufman-Wills report for its retracted conclusion about weak peer review, but seems unaware that the same report also showed that fewer than half of OA journals charge processing fees, and that a greater percentage of non-OA journals charge author-side fees than OA journals.
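A minimal back-of-the-envelope sketch (in Python, with purely hypothetical figures that do not come from the Cornell calculation, the Kaufman-Wills report, or any other study mentioned above) shows why assumption (4) matters: an estimate built on "every OA journal charges a fee and the university pays all of them" comes out several times larger than one using more realistic shares.

```python
# Hypothetical illustration only: none of these figures come from the Cornell
# calculation, the Kaufman-Wills report, or any other study cited above.
articles_per_year = 1000   # hypothetical annual article output of one university
average_fee = 1500         # hypothetical average author-side fee, in USD

# Discredited assumption: every OA journal charges a fee and the university pays every fee.
naive_estimate = articles_per_year * average_fee

# More realistic (still hypothetical) assumptions: under half of OA journals
# charge any fee, and some fees are paid by funders rather than the university.
share_charging_fees = 0.45
share_paid_by_university = 0.60
adjusted_estimate = (articles_per_year * share_charging_fees
                     * share_paid_by_university * average_fee)

print(f"Estimate under the flawed assumption: ${naive_estimate:,.0f}")    # $1,500,000
print(f"Estimate under adjusted assumptions:  ${adjusted_estimate:,.0f}") # $405,000
```

The exact numbers are beside the point; the gap between the two estimates is the distortion at issue.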

OA to data in the UK

Liz Lyon, Digital Libraries and e-Research: new horizons, new challenges? A presentation at the 8th Bielefeld conference, Academic Library and Information Services - New Paradigms for the Digital Age (Bielefeld, February 7-9, 2006).

One university's experience trying to fill its IR

Carol Hixson, If We Build It, Will They Come (Eventually)? Scholarly Communication and Institutional Repositories, The Serials Librarian, 50, 1/2 (2006). Prepublication.
Abstract: Specialists in serials have been dealing with the effects of an imbalance in the scholarly communication process for some time. The increase in scholarly output coupled with the decreasing ability of libraries to provide access to that output due to spiraling journal costs has created tensions for libraries and their communities. By advocating and providing a means to provide open access to scholarly output, institutional repositories have been promoted as one strategy for redefining the scholarly communication model. Since January 2003, the University of Oregon libraries have been exploring this approach. The paper will discuss the challenges and opportunities that such repositories face and examine their effectiveness in changing the nature of scholarly communication. This will be done primarily through a case study of the experience of the University of Oregon Libraries.

Results of a publisher survey on access issues

John Cox, Access to Scholarly Literature: Publishing for an Extended Readership, The Serials Librarian, 50, 1/2 (2006). Prepublication.
Abstract: Access to scholarship and research has become controversial. It is described in apocalyptic terms as 'open'-or good and moral, or 'toll-gated'-with the life-blood of the system ebbing away. The real world is more complex. Publishing is not a homogenous activity, because it reflects the varied needs of scholars. This paper will be based on evidence from surveys and from published inquiries. It will pose, and attempt to answer, some questions about the future of publishing scholarly information, including open access, in the context of what publishers are actually doing. It will describe the challenge that faces publishers and librarians in meeting both scholarly and societal needs.

Two new documents from the RepoMMan project

JISC's RepoMMan project (Repository Metadata and Management) has released two new documents: (1) a new draft of its experience and advice to help newcomers use Fedora, and (2) a management report covering the first six months of the project, June 2005 - January 2006.

OA champion receives ACRL award

Ray English, Director of Libraries at Oberlin College and champion of OA, has received the ACRL Academic/Research Librarian of the Year Award. From today's announcement:
“Ray English is an influential librarian,” said award committee chair Les Canterbury. “He is a leader in various organizations on state and national levels including the Oberlin Group of Liberal Arts Colleges, OhioLINK, ACRL, and other units of the American Library Association. Under his direction, Oberlin College has led a Mellon Foundation initiative involving six academic libraries that's designed to attract a more diverse population to the library profession through undergraduate internships.” “English's greatest impact as a librarian, perhaps, and the area of his work that stands out to the selection committee, is his advocacy for open access to the results of scholarly research. The breadth and depth of his knowledge of issues related to dissemination of scholarly output, and his commitment to access to information, led to his leadership role in information policy-setting arenas. He has been a primary leader of the ACRL scholarly communications program, has been active in the Scholarly Publishing and Academic Resources Coalition (SPARC), and has fostered close cooperation on scholarly communications issues among ACRL, SPARC, and the Association of Research Libraries. In addition, and on a larger stage, he has influenced, as an expert contributor, national policy on public access to federally-funded research, including the recent National Institutes of Health Public Access Policy.”

PS: Ray is also one of the most active and effective members of the Open Access Working Group. Congratulations, Ray!

Update. Also see Rani Molla, Librarian Ray English Wins National Award, Oberlin Review, February 24, 2006.

Tools to find online journal articles

Mary Ellen Bates, Finding Articles Online, Search Engine Watch, February 14, 2006. A review of some tools for finding OA and non-OA journal articles online, including Google Scholar, Scirus, PubMed, CiteSeer, SMEALsearch, and OAIster. At the end of her article she briefly mentions OA journals.

Happy birthday, BOAI

Today is the fourth anniversary of the Budapest Open Access Initiative (BOAI), the first initiative to call for OA journals and OA archives as complementary strategies, the first to be accompanied by significant funding, the first to use the term open access, and the first of the major public statements (along with the Bethesda and Berlin statements) to form the consensus definition of OA that now structures the OA movement. The BOAI emerged from a December 2001 meeting in Budapest convened by the Open Society Institute, which committed $3 million to implement the BOAI vision. The BOAI was officially issued on February 14, 2002. (Disclosure: I helped draft the BOAI and receive support from OSI.)

I wish I had time to review the last four years of OA activity and show the influence of the BOAI. I don't, but I can offer these pieces: Open Access in 2005, Open Access in 2004, and Open Access in 2003. I didn't write an OA review for 2002, but I did review OA archiving activity in the first six months after the BOAI launched. And for the details missing from these reviews, there's always my timeline.

Happy birthday, BOAI, and Happy Valentine's Day.


Monday, February 13, 2006

Google Scholar for OA and non-OA content

Frederick J. Friend, Google Scholar: Potentially Good for Users of Academic Information, Journal of Electronic Publishing, Winter 2006.
Abstract: Use of the Google search engine is commonplace amongst all sectors of the academic community. The development of the specialist Google Scholar search service will benefit the academic community in bringing to their attention content more relevant to their needs. The vast number of Web sites containing potentially relevant information requires a search engine ranging over many millions of Web sites but with the ability to target very specific types of information. The Google Scholar service has the potential to grow if it develops close contacts with both providers and users of academic information. Use of Google Scholar will benefit the authors and managers of open access content, but there are opportunities for all types of academic content providers in the way Google Scholar is set up. Google Scholar will face competition and have to keep pace with user expectations and technological developments.

More on OA books supported by ads

Paula Berinstein, Ad-Supported Free Books Arrive, Information Today NewsBreaks, February 13, 2006. Excerpt:
Perhaps information really does want to be free. Citing the desire to create new revenue streams for authors, mega-publisher HarperCollins has announced the first free Web-based, ad-supported, full-text business book. Go It Alone! The Secret to Building a Successful Business on Your Own by Bruce Judson is now available on the author’s Web site, where an affiliate link to Amazon, not the publisher, can also be found. Not only can the book be read at the site, but it can also be searched. HarperCollins Publishers is calling the project a test of a new business model. Some self-published authors also offer ad-supported books online, but HarperCollins’ move is the first by a major publisher. For now, the project is limited to the one book, with publisher and author sharing the advertising revenue. The author’s contract was specially amended to accommodate the arrangement. Company spokesperson Erin Crum said: “We are exploring how online advertising programs can add value for publishers and authors. The results will be measured by the income generated through ads, number of page views and visitors to the site, and by sales of books from the site. If successful, this kind of digital product might be a new format that supplements the paperback edition.”...HarperCollins has previously indicated the desire to control its own digital assets rather than let others scan and store them. (Until now, the company has sent its books to others for digitizing, such as Amazon, which produces Search Inside the Book pages for its online bookstore.) Nevertheless, Go It Alone! is hosted not on HarperCollins’ servers but on the author’s, which are under the control of a Web site company. The company says that they will evaluate where future free books should be served from on a case-by-case basis....On Judson’s site, the book, displayed in HTML rather than as a PDF, is flanked by clickable Yahoo! text ads. Because the content of these contextual ads reflects that of the page on which they appear, some of them may not be attractive to readers....Ad-supported digital books spark the same fears in publishers and authors as other types of digital content. If they are pirated and widely distributed without the ads, revenue will be lost.

Comparing Hive, Eprints, and DSpace

Simon Pockley has made a table comparing the functions of Hive, Eprints, and DSpace. He welcomes comments and suggestions.

Maintaining IRs is a key part of the future role of research libraries

Richard Ackerman, Is the research library obsolete? Science Library Pad, February 12, 2006. Excerpt:
Research libraries, on the other hand, don't play any of [the] roles [played by public libraries or undergrad-focused academic libraries]. There is no public to serve. There is no community meeting place role. There are no confused or desperate undergrads to help. So shouldn't a research library just [1] digitize and index all of its current (out of copyright) paper holdings, and then send the paper into storage in some climate-controlled cave somewhere, [2] provide good licensed access to the necessary publisher websites for its researchers, [3] close down[?]

Does anyone disagree that the traditional role of a research library, that of providing local convenient access to scientific publications, is erased by the presence of publisher websites on the Internet? That being the case, what value is left for research libraries to add? Researchers don't need (or want) the guidance or handholding that undergrads require. Is there anything left for the research library other than inventing new roles for itself? I can only see three roles that make sense: [1] institutional repository for pre-prints and post-prints of the research organization's publications, [2] data repository for the research conducted at the organization, [3] providing advanced (data/publication/information/discovery/etc.) tools that integrate into the researcher's workflow....To put it more concisely, either your research library becomes part of the E-Science Cyberinfrastructure, or it gets paved over.

Bailey's introduction to OA

Charles W. Bailey, Jr., What Is Open Access? A preprint of a chapter forthcoming in Neil Jacobs (ed.), Open Access: Key Strategic, Technical and Economic Aspects, Chandos Publishing, 2006. From Charles' blog description of it:
This chapter provides a brief overview of open access (around 4,800 words). It examines the three base definitions of open access; notes other key OA statements; defines and discusses self-archiving, self-archiving strategies (author Websites, disciplinary archives, institutional-unit archives, and institutional repositories), and self-archiving copyright practices; and defines and discusses open access journals and the major types of OA publishers (born-OA publishers, conventional publishers, and non-traditional publishers).

Update. See the HTML edition of Charles' chapter, which has more live links than the PDF edition. Also see his 2/20/06 blog posting explaining the differences between this chapter and his other introductory treatments of OA.

Google Scholar as a citation index

Alireza Noruzi, Google Scholar: the new generation of citation indexes, LIBRI 55, 4 (2005), pp. 170-180. Self-archived February 11, 2006.
Abstract: Google Scholar provides a new method of locating potentially relevant articles on a given subject by identifying subsequent articles that cite a previously published article. An important feature of Google Scholar is that researchers can use it to trace interconnections among authors citing articles on the same topic and to determine the frequency with which others cite a specific article, as it has a "cited by" feature. This study begins with an overview of how to use Google Scholar for citation analysis and identifies advanced search techniques not well documented by Google Scholar. This study also compares the citation counts provided by Web of Science and Google Scholar for articles in the field of "Webometrics." It makes several suggestions for improving Google Scholar. Finally, it concludes that Google Scholar provides a free alternative or complement to other citation indexes.
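For readers curious about the mechanics of such a comparison, here is a minimal sketch in Python. The article labels and citation counts are invented for illustration and are not data from Noruzi's study; the idea is simply to pair the count reported by Web of Science with the count from Google Scholar's "cited by" feature for each article and see how the two indexes differ.

```python
# Hypothetical example of comparing citation counts from two indexes.
# The article labels and counts below are invented for illustration.
counts = {
    "Article A": {"web_of_science": 12, "google_scholar": 19},
    "Article B": {"web_of_science": 4, "google_scholar": 7},
    "Article C": {"web_of_science": 30, "google_scholar": 26},
}

for title, c in counts.items():
    diff = c["google_scholar"] - c["web_of_science"]
    print(f"{title}: WoS={c['web_of_science']}, GS={c['google_scholar']}, difference={diff:+d}")

higher_in_gs = sum(1 for c in counts.values()
                   if c["google_scholar"] > c["web_of_science"])
print(f"{higher_in_gs} of {len(counts)} articles have higher Google Scholar counts")
```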

Patenting facts of nature

Lori Andrews, The Patent Office as Thought Police, Chronicle of Higher Education, February 17, 2006 (accessible only to subscribers). Excerpt:
The boundaries of academic freedom may be vastly circumscribed by the U.S. Supreme Court this term in a case that is not even on most universities' radar. Laboratory Corporation of America Holdings v. Metabolite Laboratories Inc. is not a traditional case of academic freedom involving professors as parties and raising First Amendment concerns. In fact, nobody from a university is a party in this commercial dispute, a patent case between two for-profit laboratories. But at the heart of the case is the essence of campus life: the freedom to think and publish. The saga began when researchers from Columbia University and the University of Colorado Health Sciences Center developed a test to measure the level of homocysteine, an amino acid, in a patient's body. In research on thousands of people, the investigators learned that a high level of homocysteine is correlated with a vitamin deficiency: low levels of cobalamin or folate. Other tests for homocysteine existed and were used for a variety of medical disorders. But considering theirs to be an improvement, the researchers applied for a patent. In their application, they also claimed that, because they were the first to recognize that a high level of homocysteine is connected to a vitamin deficiency, they should be allowed to patent that basic physiological fact. Thus they would be owed a royalty anytime anyone used any test for homocysteine and concluded that an elevated level signified a vitamin deficiency. They received U.S. Patent No. 4,940,658 — known as the '658 patent — and later licensed it to Metabolite Laboratories....[A]fter LabCorp published an article stating that high homocysteine levels might indicate a vitamin deficiency that could be treated by vitamins, Metabolite sued LabCorp for patent infringement and breach of contract, and was awarded more than $5-million in damages. LabCorp appealed to the U.S. Court of Appeals for the Federal Circuit, which hears all patent appeals. Astonishingly, it held that LabCorp had induced doctors to infringe the patent by publishing the biological fact that high homocysteine levels indicate vitamin deficiency. The court also ruled that the doctors had directly infringed the patent by merely thinking about the physiological relationship....By considering publishing and thinking about a law of nature to be actionable under patent law, the Federal Circuit court has severely threatened academic freedom. Professors everywhere should be concerned about the case, and how the Supreme Court will rule on LabCorp's appeal. The decision has set off a rush to the patent office to assert ownership over other scientific facts and methods of scientific and medical inquiry....Upholding the '658 patent would discourage the sharing of scientific information through publication.

Sunday, February 12, 2006

Migrating journals to Open Journal Systems (OJS)

Sukhdev Singh, Indian biomedical journals at NIC - issues for migration to OJS, a slide presentation delivered at the National Workshop on Journal Publishing in India (Bangalore, February 10-11, 2006).

OA journals from MedKnow

D.K. Sahu, MedKnow Publications: open dissemination of research, a slide presentation delivered at the National Workshop on Journal Publishing in India (Bangalore, February 10-11, 2006).

Anthropology journal converts to OA

From the front page of the Paleoanthropology Society web site:
The online journal PaleoAnthropology is now being published jointly by the Society and the University of Pennsylvania Museum. As a result, beginning immediately, the journal will be accessible free of charge to everyone, including non-members of the Paleoanthropology Society.

From the Society's Spring 2006 message to members:

Dues for 2006 have been set at $20....Our situation has been complicated by the fact that the Society’s journal PaleoAnthropology will no longer be published by Penn Press [and will be OA]. Student and regular subscriptions through Penn Press had included both access to the journal and Society dues. For those who made payments for 2006 subscriptions to Penn Press, $20 will be credited to dues; those who paid $30 will be contacted concerning a $10 refund.

Here are a few blogger comments. First from John Hawks:

I think this is super cool, and I can't overstate the importance of this in my decision about where to send research. Nothing galls me more than having to stick that "subscription only" reminder next to things I link to, because I know there are a lot of interested people who can't get them.

This from Kambiz Kamrani:

When news is posted at 1:03am on Sunday mornings, you know it's important... and if you know what's really important to me (you should by now)... open access makes the list....This is extremely good news for the open access movement, and for anthropology in general. Now published works in PaleoAnthropology will be freely accessible to the public. I wonder if someone has told Open Access News of the good news?

This from Duane Smith:

I'm very glad to see this for two reasons. First, I personally like having access to these articles. But perhaps more important, I think having them publicly available through open access will go a little way to defuse the nonsense about the cultic nature of science. While I'm not so naive as to think that such articles will be widely read, I do think that scientists should make their works more readily available to the public. At a minimum, the public might come to recognize the difference between a work of science and a piece of religious or political propaganda.

Bush budget cuts will decrease access to environmental research

Bush Axing Libraries While Pushing For More Research, a press release from the Public Employees for Environmental Responsibility (PEER), February 10, 2006. (Thanks to LIS News.) Excerpt:
Under President Bush’s proposed budget, the U.S. Environmental Protection Agency is slated to shut down its network of libraries that serve its own scientists as well as the public....In addition to the libraries, the agency will pull the plug on its electronic catalog, which tracks tens of thousands of unique documents and research studies that are available nowhere else....At the same time, President Bush is proposing to significantly increase EPA research funding for topics such as nanotechnology, air pollution and drinking water system security....“How are EPA scientists supposed to engage in cutting edge research when they cannot find what the agency has already done?” asked PEER Executive Director Jeff Ruch, noting that EPA Administrator Stephen Johnson is moving to implement the proposed cuts as soon as possible. “The President’s plan will not make us more competitive if we have to spend half our time re-inventing the wheel....Access to information is one of the best tools we have for protecting the environment....[C]losing the Environmental Protection Agency libraries actually threatens to subtract from the sum total of human knowledge.”

PS: Also see the OMBWatch report, Dismantling the Public's Right to Know: EPA's Systematic Weakening of the Toxic Release Inventory, December 1, 2005, or this blogged excerpt.

ACRL/ARL Institute on Scholarly Communication

ACRL seeks applicants for the first Scholarly Communication Institute, co-sponsored with the Association of Research Libraries, to be held in Los Angeles, July 12-14, 2006. The deadline for applications is March 1, 2006. Acceptance to the Scholarly Communication program is limited to 100 individuals, and selection is on a competitive basis. Read details about the program, as well as the online application form, instructions, and FAQ.
Participants will work with experts in the field to understand how to better engage faculty at their institution around the crisis in the systems of scholarly communication. You will also learn about the emergence of new models for scholarly communication as well as strategies for creating systemic change. These will include:
* Faculty activism (e.g. editorial board control, author rights, copyright management, and self-archiving)
* New publishing models
* Digital repositories
* Legislative and policy advocacy