Open Access News

News from the open access movement

Saturday, March 04, 2006

Oslo self-archiving policy

The University of Oslo has adopted a strong self-archiving policy and registered it in the Registry of Open Access Repository Material Archiving Policies (ROARMAP). Excerpt from its policy:
All researchers must deposit their metadata (for articles) in FRIDA, UiO's research documentation system. It is as yet only compulsory to deposit metadata, not full-text. DUO is UiO's full-text institutional repository. In the near future DUO and FRIDA will be integrated so that UiO researchers can self-archive their full-texts (postprints) through FRIDA. The metadata and full-text will be transferred to DUO. Submission of theses will shortly be mandatory: From 2007 it will be compulsory for all postgraduate students to submit their theses electronically. DUO - Digital publishing at UiO - has been developed by the University Center for Information Technology (USIT) and the Oslo University Library (UBO) as a system for the net-based archiving of publications by UiO authors. It provides support for depositing, archiving and searching in diverse formats. Today DUO includes electronic versions of theses, special research papers, etc. and a growing number of scientific publications. The goal is to archive the whole spectrum of the research output (journal articles, books, reports, series, research documents, videos, etc.) published by UiO authors.

Last call for bids to run UK PMC

Kim Thomas, UK PubMed goes out to tender, Information World Review, March 6, 2006.
Medical research funding body the Wellcome Trust has invited tenders to find a supplier to host, manage and develop a UK version of PubMed Central, the US National Library of Medicine's free database of medical research papers. Potential suppliers have been asked to express interest by 8 March. The Wellcome Trust and its UK partners, which include the Department of Health, Cancer Research UK and the British Heart Foundation, hope to make a decision by late July. The Wellcome Trust was expecting applications from a mix of suppliers, said Robert Kiley, head of systems strategy at the Wellcome Library, including publishers and experts at hosting large databases or running manuscript submission systems. The UK version of PubMed Central would be launched early in 2007, said Kiley, and consist of three systems: a mirror of data already held in the US version (500,000 articles), a manuscript submission and tracking system, and an authenticated login enabling people to deposit articles. Kiley said the trust wanted to make publicly accessible the research it funded, to evaluate its impact and ensure the long-term digital preservation of research papers. "All papers will be held in a standard XML format, thus ensuring that the record of biomedicine is preserved, irrespective of changes in software and hardware platforms," Kiley said. From October 2006, Wellcome Trust grant holders must ensure peer-reviewed articles are deposited in the UK or US version of PubMed Central within six months, either by themselves or the journal publisher. "If the publisher cannot accept these conditions, grant-holders will have to publish elsewhere," Kiley said. "Our preferred route is open access."

Another consortial OA repository

The HELIN Library Consortium is launching a consortial OA repository. From its announcement (undated but apparently today):
The HELIN Library Consortium, made up of ten academic libraries in Rhode Island and Massachusetts - including Bryant University - has received funding to create a digital repository. The grant was received from the Davis Educational Foundation, established by Stanton and Elisabeth Davis after his retirement as chairman of Shaw's Supermarkets, Inc. The goal of the Digital Repository will be to preserve and archive historical and current materials held by each member institution, and make these materials accessible to students, faculty and staff at all of the HELIN Library Consortium schools to enhance teaching and learning. The Digital Repository will use ProQuest's Digital Commons platform as the technology for storing, searching and accessing the digital materials, including papers, images, and videos created by the students and faculties of the HELIN member institutions....The HELIN Consortium includes nearly all the academic institutions in the state of Rhode Island, including both public and private colleges, and its digital repository would reflect that same broad inclusiveness.

More on the need for OA to European geodata

Bull_[UK], Damn the EU! Bull's Rambles, March 3, 2006.
They are at it again, it seems all the EU wants to do is suck every penny they can out of us, this time they want to pass a directive that ensures we pay twice for data, first with our taxes and second to view the data, see for more information....Now I'm usually against US policy, but this time I have to say they are far ahead of us in Europe, by allowing US and non-US citizens to access high resolution data of their country for free they shame the rest of the world, and show our governments to be money grabbing and selfish.

OA inside and outside libraries

Dorothea Salo, Open Access Outside Libraries, Caveat Lector, March 3, 2006. Excerpt:
Stevan Harnad posted a glowing encomium for Poynder’s article to the JISC-REPOSITORIES mailing list. This is no surprise. What surprised and gratified me were two cogent, politely critical responses, which Harnad has as yet not answered....Let me be clear: what Poynder said about librarian open-access efforts was offensive and shortsighted, and he could do worse than apologize for it on his weblog. Harnad’s enthusiastic response to Poynder tars him with the same brush, and he could stand to apologize also. But just for fun, let us play out a Harnadian scenario, in which libraries use IRs for their own projects and OA happens somewhere else entirely....Harnad’s own vision (based on his email, which is repeated at this post to his blog) appears to be small departmental faculty-spurred and faculty-owned fiefdoms, which would then be aggregated at the university level for harvesting and dissemination purposes. This would indeed carry some advantages: faculty evangelizing their colleagues is the best OA marketing there is. Also, such a fiefdom often requires consulting departmental administration, which is a sensible opening to lobby for a mandate.

My first question is this: If faculty cannot even drag themselves to deposit material into IRs where the library has done all the tech work for them up-front, how will they be convinced to start them? It is assuredly technically simple to do, but the complexity of the technical process is not and has never been the problem. The complexity of the social process is the problem, and I fail to see how Harnad’s proposal solves it....My second question concerns coverage. To achieve his stated goal of 100% OA to the peer-reviewed journal literature via departmental repositories, Harnad will have to convince every department and research unit on every college and university campus everywhere containing faculty who publish in the peer-reviewed journal literature to open a repository....This is a tall order....My third question concerns unnecessary duplication of effort. Though the technical and staff requirements of setting up and maintaining a digital repository are quite small, they are not zero. What is Harnad’s justification for duplicating the necessary effort and machinery across thousands of departments?...Still, I entirely favor Harnad pursuing this angle; there may be advantages to it I am not considering, and I stand to learn a lot about outreach to the faculty at MPOW should he succeed....[T]he rising tide would float my boat as well as Harnad’s; I should be insane to object....My sense still is that OA needs academic libraries and academic librarians.

Digesting biomedical blogs

Postgenomic tracks the biomedical papers being discussed by bloggers, identifies the most-discussed papers and journals, and shows what kinds of researchers are discussing what kinds of papers. (Thanks to Richard Ackerman.) Excerpt:
Postgenomic collates posts from life science blogs and then does useful and interesting things with that data. For example, you can see which papers are being cited most often by neurologists, or which stories are being heavily linked to by bioinformaticians. It's sort of like a hot papers meeting with the entire biomed blogging community. Sort of.

Science 2.0. Postgenomic's primary purpose is to act as an open access repository of literature reviews and conference reports. A review in our case is an analysis of - or a piece of useful information concerning - a scientific paper. Feel free to look through the papers cited by life science bloggers recently or to browse the reviews already collected. If you own a blog, you can mark any of your posts as a review using some very simple HTML code. If you don't own a blog, you can submit reviews directly.

Comment. As serious blogging grows in biomedicine, the usefulness of this site will grow exponentially. Long term, its usefulness could become both a cause and effect of that growth in blogging. Every field should have an equivalent.

"The OA problem is not Preservation tomorrow, but Provision today."

Stevan Harnad, preservation vs. Preservation, Open Access Archivangelism, March 3, 2006. Excerpt:

This is perhaps a good juncture at which to make it explicit that there is "small-p preservation" and "large-P Preservation." Of course GNU Eprints, like everyone else (including ArXiv since way back in 1991) is doing small-p preservation, and will continue to do so: Open Access is for the sake of immediate access, today, tomorrow, and into the future -- and this, in turn, is for the sake of maximising immediate usage and impact, today, tomorrow, and into the future. Hence small-p preservation is a necessary means to that end.  But big-P Preservation, in contrast, is Preservation as an end in itself...So it is absurd to imagine...that Eprints is either oblivious to small-p preservation or that its contents are one bit more or less likely to vanish tomorrow than any other digital contents that are being conscientiously preserved and migrated and upgraded today, keeping up with the ongoing developments in the means of preservation....Why is it so important to make it crystal clear that Eprints and OA are not for Preservation projects? that their primary motivation is not to ensure the longevity of digital contents (even though Eprints and OA do provide longevity, and do keep up with whatever developments occur in the means of long-term preservation of their contents)?  Because OA's target contents are 85% missing! The pressing problem of absent content cannot be its Preservation!...What has been (and continues to be) lost for the 85% of annual OA target content that has not been (and is not being) self-archived, is access, usage, and impact. That is the true motivation for Eprints and OA self-archiving....[T]hat content will never be self-archived by its authors for the sake of Preservation, because it need not be: its Preservation is already in other hands than its authors (or its authors' institutions), as it always was, and for the foreseeable future will continue to be. 
The mission of authors and their institutions was not, is not, and should not have to be the Preservation of their own published journal article output....The OA problem, in other words, is not Preservation tomorrow, but Provision today. Hitching today's Provision problem to tomorrow's Preservation problem is yet another recipe for prolonging the non-Provision of 85% of OA's target content.

Profile of LOCKSS

Editors' Interview with Victoria Reich, Director, LOCKSS Program, RLG DigiNews, February 15, 2006. This is a useful, detailed overview of how LOCKSS works and how it's being used. I excerpt only the parts most relevant to OA:
In addition to subscription and open access electronic journals, LOCKSS Alliance members are collecting and preserving government documents, electronic theses and dissertations, websites, in-house image collections, and, soon, books and blogs....Through the manifest page mechanism, publishers give permission to authorized libraries to preserve their content. The use and access restrictions of subscription-based content are governed by the original license agreement. If a publisher's terms and conditions are fairly stable across customer bases, we urge them to put rights and restrictions on the LOCKSS publisher manifest page so this information is bundled with and preserved with the content. A manifest page is not always required. The LOCKSS system ingests content from websites that support OAI-PMH with permission but without the need for a manifest page full of links. For open access publishers, we encourage use of an appropriate Creative Commons license. This machine-readable license is then preserved with the content, making clear current and future rights. Also, in response to community requests, we are currently working on methods to preserve blogs. We expect that in most cases this will be done via RSS. Other ingest mechanisms may also be implemented, such as Google SiteMaps.
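The "machine-readable license" Reich mentions is typically expressed as a rel="license" link in the page's HTML, which a harvester can detect without human help. As a rough sketch of the idea (the page content below is invented for illustration, not taken from LOCKSS or any real publisher):

```python
from html.parser import HTMLParser

class LicenseFinder(HTMLParser):
    """Collect the href of any <a> or <link> tag marked rel="license"."""
    def __init__(self):
        super().__init__()
        self.licenses = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag in ("a", "link") and "license" in (a.get("rel") or "").split():
            self.licenses.append(a.get("href"))

# A hypothetical open-access article page embedding a CC license link.
page = """
<html><body>
  <p>Some article text.</p>
  <a rel="license" href="https://creativecommons.org/licenses/by/2.5/">CC BY 2.5</a>
</body></html>
"""

finder = LicenseFinder()
finder.feed(page)
print(finder.licenses)  # the machine-readable license(s) found on the page
```

Because the license travels inside the document itself, an archive that preserves the page preserves the rights statement with it.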

Canadian public domain registry

Canada will soon have an OA Public Domain Registry. From yesterday's press release:
Access Copyright, The Canadian Copyright Licensing Agency and Creative Commons Canada, in partnership with Creative Commons Corporation in the US, today announced the development of a Canadian public domain registry. The ground-breaking project – the most comprehensive of its kind in Canada – will create an online, globally searchable [and OA] catalogue of published works that are in the Canadian public domain. “Canada has a rich cultural heritage of literature, music and fine art that is in the public domain just waiting to be freely enjoyed,” said Marcus Bornfreund of Creative Commons Canada, a non-profit organization that works in collaboration with Creative Commons US. “The problem until now was that there was no easy way to identify whether or not works are in the public domain. This registry will change that.” There is currently no one place where information about the public domain is collected. The registry will make published works in the Canadian public domain easily identifiable and accessible in an online catalogue. The project will develop in two stages – first, a comprehensive registry of works by Canadian creators that are in the public domain will be established. Eventually, the reach of the registry will expand to include the published works of creators from other countries. The public domain registry will be a non-profit project and freely accessible to the public online....The Wikimedia Foundation, developers of the popular online encyclopedia Wikipedia, will supply software that will allow the public to contribute information to the registry. "The public domain is our shared cultural heritage, and the best ground for the great new ideas of the future," said Wikipedia founder Jimmy Wales. "Without access to the public domain, we are cut off from our past, and therefore cut ourselves off from our future."...Individuals will be able to use the registry to determine whether a published work is in the public domain. 
The registry will also link to digital versions of the work, and provide information about where a paper copy of the work can be purchased. “Quick and easy access to legally available content is vital as we move further into the digital age,” said Roanie Levy, Access Copyright’s Director of Legal and External Affairs. “The public domain registry has limitless possibilities and will place Canadian cultural content at the leading edge of the public domain.”

Comment. Because legislators (in Canada and elsewhere) have frequently extended the term of copyright, it's non-trivial to figure out when a work is in the public domain. If the new registry is comprehensive, it will be useful for lifting this burden from ordinary users who don't want to risk liability, err on the side of non-use, or spend time and money tracking down copyright holders when it's no longer necessary. By adding links to OA versions of PD works, it will be even more useful. It should bring about one other consequence of great importance: heightened public awareness of the value of the public domain and the urgency of protecting it from further encroachments.

Blogjects for OA

Julian Bleecker, A Manifesto for Networked Objects — Cohabiting with Pigeons, Arphids and Aibos in the Internet of Things, an undated preprint. (Thanks to Cory Doctorow via Ray Corrigan.) Excerpt:
The Internet of Things has evolved into a nascent conceptual framework for understanding how physical objects, once networked and imbued with informatic capabilities, will occupy space and occupy themselves in a world in which things were once quite passive. This paper describes the Internet of Things as more than a world of RFID tags and networked sensors. Once “Things” are connected to the Internet, they can only but become enrolled as active, worldly participants by knitting together, facilitating and contributing to networks of social exchange and discourse, and rearranging the rules of occupancy and patterns of mobility within the physical world. “Things” in the pervasive Internet, will become first-class citizens with which we will interact and communicate. Things will have to be taken into account as they assume the role of socially relevant actors and strong-willed agents that create social capital and reconfigure the ways in which we live within and move about physical space. To distinguish the instrumental character of “things” connected to the Internet from “things” participating within the Internet of social networks, I use the neologism “Blogject” — ‘objects that blog.’...[B]logjects in the near-future will participate in the whole meaning-making apparatus that is now the social web, and that is becoming the "Internet of Things." The most peculiar characteristic of Blogjects is that they participate in the exchange of ideas. Blogjects don’t just publish, they circulate conversations. Not with some sort of artificial intelligence engine or other speculative high-tech wizardry. 
Blogjects become first-class a-list producers of conversations in the same way that human bloggers do — by starting, maintaining and being critical attractors in conversations around topics that have relevance and meaning to others who have a stake in that discussion....A Blogject can start a conversation with something as simple as an aggregation of levels of pollutants in groundwater. If this conversation is maintained and made consequential through hourly RSS feeds and visualizations of that same routine data, this Blogject is going to get some trackback.

Comment. I didn't blog this article because it's about OA. I blogged it for what the underlying idea can do for OA. I see this potential falling into two categories.

  1. Blogjects can help monitor OA progress. Imagine OA repositories blogging their deposits -- either through raw tallies, showing their growth, or through metadata about the new articles themselves, showing what's going in. Imagine OA repository registries or directories (like OpenDOAR and ROAR) blogging the world growth in repository deposits, broken down by country and discipline. Imagine OA journals blogging their new TOCs. Imagine book-digitizing projects blogging their progress. Few of these feeds would be directly interesting to human readers. But they could be inputs to software monitors or feed aggregators that give human readers an instant and up-to-date overview, something like OA by the numbers but not limited to numbers, not confined to a small number of sources, and not slowed by the need for human updates. (Of course the same technology could tell us about non-OA research, but here I'm focusing on what it could do for OA.)

  2. Blogjects can help disseminate OA data about anything. Some scientists want to keep their data to themselves until they publish, but others are willing to (and sometimes compelled to) share their data in real time as it is generated. For scientists in the second camp, blogjects could feed data to all who care to plug in and process it. Imagine blogjects feeding weather and pollution data, geospatial data, census data, crime data, economic data. Because blogject feeds automatically contain their own histories, they lend themselves just as much to longitudinal studies as to instantaneous sampling. In many of these areas we have the equivalent today -- but for the XML mark-up and standardization of the feeds, which permit RSS, mash-ups, aggregations, trackbacks, pings, and the other information connectors introduced with the blogosphere. One of the many network nodes monitoring a blogject data stream could be a LOCKSS system, capturing it for long-term preservation. Blogjects could not only accelerate research, but inform actions (weather prediction, pollution control, investment) and improve policy-making in all areas that depend on real-time open-access data feeds.
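To make the repository-blogject idea concrete: a repository could expose its deposit tally as an ordinary RSS 2.0 feed that any aggregator already understands. A minimal sketch (the repository name, counts, and field choices are invented for illustration):

```python
import xml.etree.ElementTree as ET

def deposit_item(repo, count, timestamp):
    """Build one RSS 2.0 <item> announcing a repository's deposit tally."""
    item = ET.Element("item")
    ET.SubElement(item, "title").text = f"{repo}: {count} total deposits"
    ET.SubElement(item, "pubDate").text = timestamp
    return item

def build_feed(items):
    """Wrap the items in a minimal RSS 2.0 envelope."""
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = "Repository deposit tallies"
    for it in items:
        channel.append(it)
    return ET.tostring(rss, encoding="unicode")

# Hypothetical hourly readings from one repository.
feed = build_feed([
    deposit_item("ExampleRepo", 1204, "Fri, 03 Mar 2006 12:00:00 GMT"),
    deposit_item("ExampleRepo", 1210, "Fri, 03 Mar 2006 13:00:00 GMT"),
])
print(feed)
```

Because each item is timestamped, the feed carries its own history -- which is exactly what makes such streams usable for longitudinal studies as well as instantaneous sampling.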

New alliance promotes OpenDocument Format worldwide

The mission of the new ODF Alliance is to promote the use of the OpenDocument Format (ODF) worldwide, especially for government documents. From yesterday's press release:
As documents and services are increasingly transformed from paper to electronic form, there is growing recognition that governments and their constituents may not be able to access, retrieve and use critical records, information and documents in the future. A broad cross-section of associations, academic institutions, industry and related groups today joined together to form the OpenDocument Format Alliance (ODF Alliance), an organization dedicated to promoting open solutions to this problem. As technologies rapidly evolve, documents are created by public sector agencies using different applications that may not be compatible with one another today, let alone into the future. Through the use of a truly open standard file format that can be implemented by numerous and varied applications, the Alliance seeks to enable governments and their constituents to use, access and store critical documents, records and information both today and in the future, independent of the applications or enterprise platforms used for their creation or future access. Specifically, the ODF Alliance supports the use of the OpenDocument Format (ODF), an open XML-based collection of office document formats, including text, presentation and spreadsheet formats. ODF, the only established open standard document format, enables the retrieval of information and exchange of documents between different applications, agencies and/or business partners in a platform and application independent way. "With a broad cross section of support, the ODF Alliance will work to enable governments around the world to have greater control over and direct management of their documents, now and forever," said Ken Wasch, President of the Software & Information Industry Association, a leading member of the Alliance and the principal trade association of the software and digital content industry. 
"There's no doubt that the momentum of ODF is gaining traction worldwide as more people every day are discovering that it's a better way to preserve and access documents."...The Alliance is building support globally for use and recognition of ODF, and all organizations that share its goals are welcomed to join the effort. Organizations can join [here]....In recent months, jurisdictions such as the State of Massachusetts in the United States and others around the world are leading the way, by embracing open document formats. According to recent press reports, 13 nations globally are considering adoption of the OpenDocument format.
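Part of what makes ODF attractive for long-term government records is its transparency: an ODF file is an ordinary ZIP archive whose document text lives in an XML member named content.xml, so it stays readable without the application that created it. A rough sketch of a round trip (a deliberately stripped-down file, not a fully conformant ODF package):

```python
import io
import zipfile
import xml.etree.ElementTree as ET

# Real ODF namespaces for the office and text vocabularies.
OFFICE = "urn:oasis:names:tc:opendocument:xmlns:office:1.0"
TEXT = "urn:oasis:names:tc:opendocument:xmlns:text:1.0"

def make_minimal_odt(paragraphs):
    """Pack paragraphs into a stripped-down ODF text document, in memory."""
    doc = ET.Element(f"{{{OFFICE}}}document-content")
    body = ET.SubElement(doc, f"{{{OFFICE}}}body")
    text = ET.SubElement(body, f"{{{OFFICE}}}text")
    for p in paragraphs:
        ET.SubElement(text, f"{{{TEXT}}}p").text = p
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w") as z:
        z.writestr("mimetype", "application/vnd.oasis.opendocument.text")
        z.writestr("content.xml", ET.tostring(doc, encoding="unicode"))
    return buf.getvalue()

def extract_text(odt_bytes):
    """Recover the paragraphs from content.xml -- no office suite required."""
    with zipfile.ZipFile(io.BytesIO(odt_bytes)) as z:
        root = ET.fromstring(z.read("content.xml"))
    return [p.text for p in root.iter(f"{{{TEXT}}}p")]

odt = make_minimal_odt(["A public record.", "Readable decades from now."])
print(extract_text(odt))  # the paragraphs survive the round trip
```

Any tool that can unzip a file and parse XML can read the document, which is the platform independence the Alliance is arguing for.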

Friday, March 03, 2006

Theoretical Economics -- new OA jnl begins publishing

As noted last fall, Theoretical Economics is a peer-reviewed, open access journal, published by the Society for Economic Theory. The debut issue is now online. Ted Bergstrom and R. Preston McAfee are members of the executive board of the journal.

Spreading the word about what researchers need to know about OA

The March issue of Informed Librarian Online highlights my article, Six Things That Researchers Need to Know About Open Access (from SOAN for February 2006), as an Editor's Pick. (Thanks, ILO.)

SLA statement on EPA library closings

SLA, AALL, ALA, and ARL are in strong opposition to the Bush Administration proposal to close the network of libraries and information centers operating within the U.S. Environmental Protection Agency (EPA). A joint letter from the library associations was sent to Rep. Charles H. Taylor and Rep. Norman D. Dicks, chair and ranking member, respectively, of the House Appropriations Committee, Subcommittee on Interior, Environment, and Related Agencies. Excerpt:
It is especially disappointing that these budget cuts to the EPA Libraries follow the good faith efforts of their staff over the past few years to analyze their network and develop a vision for centralized and coordinated services to improve their operations. The findings, recommendations and action plans in three important reports—Business Case for Information Services: EPA’s Regional Libraries and Centers (January 2004), Transforming EPA Libraries: Creating a national capacity for information management and retrieval (June 2004) and EPA Library Network: Challenges for FY 2007 and Beyond (November 2005)—have been all but ignored by the Agency. They provide new models for information services to reduce duplication, cut costs and improve the national capability of the EPA. If implemented, these proposed new models would align resources with an optimal structure for leveraging investments, staff and vision.
[Thanks to Michael Knee, University at Albany, SUNY.]

Declaration on the freedom of research

On February 18, the World Congress for the Freedom of Research released its Declaration. It emphasizes freedom from political and religious constraints and endorses stem-cell research and therapeutic cloning, but doesn't raise access issues.

The OA Biodiversity Heritage Library

Eight international natural history libraries are working together on an OA Biodiversity Heritage Library. From yesterday's announcement:
Eight of the world’s major natural history and botanical libraries are working together to develop a strategy and operational plan to digitize the published literature on biodiversity that they jointly hold and make it freely accessible to all on the Web. The project, the Biodiversity Heritage Library, will establish a major corpus of publications drawn from each of their collections. Much of the published literature is rare or has limited global distribution and is available in only a few select libraries. From a scholarly perspective, these collections are of exceptional value because the domain of systematic biology depends --more than any other science-- on historic literature.

The eight participating libraries have more than 2 million volumes of biodiversity literature collected over 200 years to support scientists and students throughout the world. Until now, this body of biodiversity knowledge effectively has been unavailable for wide use in a broad range of applications, including research, education, taxonomic study, biodiversity conservation, protected area management, disease control and maintenance of diverse ecosystems. The participants are:

• Smithsonian Institution Libraries and the National Museum of Natural History, Washington
• American Museum of Natural History, New York
• Harvard Museum of Comparative Zoology, Cambridge, Mass.
• Harvard University Botany Libraries, Cambridge, Mass.
• Missouri Botanical Garden, St. Louis
• The Natural History Museum, London
• The New York Botanical Garden, Bronx, N.Y.
• Royal Botanic Gardens, Kew, Surrey

The participating organizations will work with the global taxonomic community, rights holders and other interested parties to ensure that their collective historical heritage is available to all who seek it. Web-based access to the collections will provide a substantial benefit to people living and working in the developing world who have not been able to benefit from the information held in these collections.

New OA journal in environmental studies

The Institute of Physics (IOP) is launching a new OA journal, Environmental Research Letters. Excerpt from its advertisement for an executive editor:
IOP is launching a new journal, Environmental Research Letters (ERL), and companion website focused on environmental studies and science. The journal will feature short, focused articles on genuine advances in environmental understanding, and interdisciplinary pieces that transform our knowledge of the linked social, technical and environmental systems in which we live. The website will feature specially written and commissioned content as well as selected highlights from the journal, and other information of interest to the environmental science community. The whole site will be available free as a service to the community and the journal will be the first Open Access journal in the field.

Excerpt from the journal site:

The 2006 article charge for publishing in ERL is $1750. You should send your payment to Institute of Physics Publishing, and you may pay in UK pounds sterling (£1000), Euros (EUR1450) or US dollars. We send invoices for payment after articles are accepted for publication....Members of the Institute of Physics pay $1500 (£850; EUR1250). Institute of Physics Publishing is currently in the process of seeking sponsors from research foundations and funding agencies to cover the article charge for publishing in ERL.

OA repositories for government documents

Kristin E. Martin, Moving into the Digital Age: A Conceptual Model for a Publications Repository, Internet Reference Services Quarterly, 11, 2 (2006) prepublication.
Abstract: Libraries have developed methods and processes for collecting, processing, providing access to, and preserving paper publications. As publishing shifts from printed to digital form, libraries need to rethink and revise systems to handle the ever-increasing flow of digital publications. These challenges are being addressed by the State Library of North Carolina as it attempts to manage digital state government publications, which have characteristics similar to most gray literature now produced on the Internet. This article discusses changes in cataloging and workflow designed to accommodate the increased amount of digital information and presents a model for managing digital publications in the context of state government publications. The conceptual model covers six aspects of a publication's management: creation, collection, description, storage, access, and preservation.

French publishers may sue Google

VNU Staff, French PA may sue Google, Information World Review, March 3, 2006. Excerpt:
The French Publishers Association and several publishers are likely to file a lawsuit against Google during the Paris Book Fair later this month for digitising hundreds of French books without permission...France would be the first European country to take action against Google, following in the footsteps of the Association of American Publishers...The French titles that Google is scanning--the most recent of which were published in the 1970s--come from the University of Michigan under an agreement with the search engine. The university has 500,000 French language titles. Google has yet to respond to the French Publishers Association's (Syndicat National de l'Edition) demand to withdraw the titles. It has offered an opt-out arrangement, but this is unacceptable to the SNE, which accuses the company of infringing copyright law.

Comment. Fair use law in the US probably favors Google but I have no idea how Google might fare under French law. Note, however, that when it looked as if Google wouldn't digitize French books, it was savaged by the country's president and national librarian, and when it looked as if it would digitize French books after all, it was threatened by French publishers. I'm glad that publishers and governments don't always see eye to eye, but maybe French people should tell their publishers that they're not acting in the interests of French readers or French authors.

Google Scholar indexing the British Library document service

Mark Chillingworth, Google Scholar becomes direct link to British Library, Information World Review, March 2, 2006. Excerpt:
The British Library (BL) became a directly linked resource for scientific and academic information on the Google Scholar search engine following a deal between the two parties today. Search results in Google Scholar will now feature – if the article is in the national collection – a BL Direct tag alongside the cache and citation links. The Google Scholar search engine will now match its results against the holdings of the British Library Direct document delivery service which provides researchers, students and academics with electronic scans of journal articles. A direct link will appear in the results giving users the option of instantly purchasing knowledge online. Greater integration between the BL Direct and Google Scholar services allows Google to search BL collections, and order-form fields containing bibliographic details will already be filled out, making the e-commerce experience faster and easier. The British Library is describing the service as a "discovery-to-desktop-delivery process". "Linking up with partners like Google is the way forward," said Matt Pfleger, BL Head of Sales and Marketing, adding, "They get richer search results and content and we get to make our collections more easily available."...This is the second deal the BL has inked with a major online services provider in the last four months....In December 2005 the library announced a deal with the software giant which sees 25 million pages of out of copyright book titles digitised and placed on MSN BookSearch, an online book searching service to rival Google Book Search, also launched last year.

Expanding the Amedeo Challenge

Until now, the Amedeo Challenge, which pays physicians to write OA medical textbooks, has only recruited large sponsors to underwrite the costs. But Vaughan Bell thought the idea would spread faster if it also solicited small sponsors. Excerpt from his post to Mind Hacks:
I've emailed [Bernd Sebastian] Kamps to suggest a small donations system for individuals to donate towards a 'running bounty' for any book of their choice. I'd happily donate 50 euros knowing that it would contribute towards the development of a high-quality, open-access psychiatry, neurology or neuropsychology textbook. If you'd like the opportunity to do something similar, contact the project and suggest the same. It seems like many small donations could create large bounties in a relatively small amount of time.

UPDATE: Good news! I just got the following back from Bernd Sebastian Kamps:

Thank you for your suggestion: Your idea is brilliant (I had never thought of asking for small contributions). We'll open the PayPal account next week and by the end of the month, everything should be in place. I'll keep you informed.

Comment. This will not only raise more money for OA medical books but help create books that respond to unmet needs. For example, it could be very helpful to the world of orphan diseases.

Calling on the ACS to open up access at least to developing countries

George Porter, RSC Provides Free Online Access to Developing Countries -- Whither ACS? STLQ, March 2, 2006. George calls on the American Chemical Society (ACS) to follow the lead of the Royal Society of Chemistry (RSC) and provide its journals free to developing countries. Excerpt:
Kudos to RSC for taking this commendable step. RSC will not lose much, if any, potential revenue, but will surely generate goodwill and good science, the latter of which is a core mission of the society. In fact, the core mission of scientific societies (RSC, ACS, AIP, APS, IoP, AGU, etc.) is to promote the advancement of their science and its understanding by society at large. Society publishing programs can easily be understood as arising from this core mission. From the RSC charter:
The RSC's original Charter was granted in 1848. The RSC's Royal Charter, granted in 1980, states that: "The object for which the Society is constituted is the general advancement of chemical science and its application and for that purpose: 1. to foster and encourage the growth and application of such science by the dissemination of chemical knowledge;..."

I cannot fathom how anyone can justify this heritage and trust being held hostage by a profit motive, by a desire to extract continuing revenue at the cost of advancing the science which is supposed to be the fundamental purpose of the organization....The challenge to ACS to live up to their charter and begin to provide access to a portion of the world's chemical heritage which they control is apparent....

SEC. 2. That the objects of the incorporation shall be to encourage in the broadest and most liberal manner the advancement of chemistry in all its branches; the promotion of research in chemical science and industry; the improvement of the qualifications and usefulness of chemists through high standards of professional ethics, education, and attainments; the increase and diffusion of chemical knowledge; and by its meetings, professional contacts, reports, papers, discussions, and publications, to promote scientific interests and inquiry, thereby fostering public welfare and education, aiding the development of our country’s industries, and adding to the material prosperity and happiness of our people.

OA on German radio

On March 7, Germany's WDR radio will broadcast an hour-long program on OA, Freier Zugang zum Wissen, by Marcus Schwandner. (Thanks to medinfo.)

New details on the OA European Digital Library

Yesterday The European Library (TEL) told us much more about the plans for its forthcoming OA European Digital Library. It published the results of an online consultation (begun September 30, 2005), the comments received during the consultation period, an FAQ, and a press release. Excerpt from the press release:
The European Commission’s plan to promote digital access to Europe’s heritage is rapidly taking shape. At least six million books, documents and other cultural works will be made available to anyone with a Web connection through the European Digital Library over the next five years. In order to boost European digitisation efforts, the Commission will co-fund the creation of a Europe-wide network of digitisation centres. The Commission will also address, in a series of policy documents, the issue of the appropriate framework for intellectual property rights protection in the context of digital libraries....By the end of 2006, the European Digital Library should encompass full collaboration among the national libraries in the EU. In the years thereafter, this collaboration is to be expanded to archives and museums. Two million books, films, photographs, manuscripts, and other cultural works will be accessible through the European Digital Library by 2008. This figure will grow to at least six million by 2010, but is expected to be much higher as, by then, potentially every library, archive and museum in Europe will be able to link its digital content to the European Digital Library. This European Digital Library is a flagship project of the Commission’s overall strategy to boost the digital economy, the i2010 strategy.

The Conference of European National Libraries (CENL), which sponsors TEL, issued its own press release. Excerpt:

The Conference of European National Librarians, CENL, welcomes with enthusiasm the European Commission’s plan to promote digital access to Europe’s heritage through the European Digital Library by co-funding the creation of a Europe-wide network of digitisation centres and by addressing the issue of the appropriate framework for intellectual property rights protection in the context of digital libraries. CENL shares the vision of a European Digital Library and has been working towards this goal by creating TEL, The European Library, which received European Community funding in its early stage. CENL is honoured and pleased by the Commission’s intention to build the European Digital Library upon the TEL-infrastructure. The European Library is the webservice of the 45 CENL members providing access to the catalogue records and digital collections of currently 15 European National Libraries. By the end of 2006 the number of European National Libraries fully participating in the service will be enlarged by the ten new member states of the European Union which prepare to join The European Library in the European Community funded project TEL-ME-MOR, and by all remaining EU and EFTA member states....CENL especially welcomes the European Commission’s plans to expand this collaboration among the national libraries to archives and museums to create a true European Digital Library. CENL will embrace any moves to deepen existing contacts to groups of European archives and museums and to discuss the roadmap towards a European Digital Library.

Articles on OA in new Open Source Yearbook

The Open Source Jahrbuch 2006 has four articles on OA in science, all in German. (Thanks to Klaus Graf.)

Making OA journals more visible through library catalogs

Anna Hood and Mykie Howard, Adding Value to the Catalog in an Open Access World, The Serials Librarian, 50, 3/4 (2006) prepublication.
Abstract: As serial prices rise exponentially and budgets plummet, serials aficionados can still increase access to information by adding bibliographic records for open access journals to library catalogs. Anna Hood shared procedures and practices on how she is adding value to her library's catalog by adding such records. Bridging the gap between users and open access journals in such a manner allows for true openness. An open access journal is not truly “open” unless we take the time to unrestrict and make them available to all.

OA encyclopedia getting closer to its fund-raising goal

Stanford Encyclopedia of Philosophy Receives Support from the California Digital Library, a press release dated February 2006. Excerpt:
The Stanford Encyclopedia of Philosophy (SEP) is pleased to announce sponsorship by the California Digital Library (CDL) on behalf of the ten campuses of the University of California. This represents a full commitment by all of the libraries in the UC system, eight of whose campuses offer degrees in philosophy....Edward Zalta, Principal Editor for the SEP, notes “We are very grateful to receive significant support from the CDL and the University of California. With this pledge, the SEP tops the $1 million mark in commitments from libraries worldwide. This cooperative effort ensures that philosophical scholarship is widely available and accessible to all. The broad-based support from institutions (even some without degrees in philosophy) demonstrates that the SEP is valuable for students and faculty of diverse subjects such as mathematics, religion, law, medicine, and science.” Tom Sanville, spokesperson for the International Coalition of Library Consortia (ICOLC), notes that “With libraries world wide working collaboratively we are creating and preserving open access to a vast body of philosophical scholarship. This pioneering effort illuminates a unique and revolutionary approach to achieve open access. It’s a technique we hope the library and academic communities can utilize again.”

Comment. The SEP is an authoritative OA encyclopedia. For a couple of years now it's been pursuing an unusual but promising method to build an endowment so that it can cover its costs without having to abandon OA. I hope it succeeds: the SEP deserves to survive and endowments are a perfect fit with the needs of OA.

Separating the OA mission of IRs from their other missions

Stevan Harnad, Time for a Digital Divide: DL top-down, OA bottom-up, Open Access Archivangelism, March 2, 2006. Excerpt:
Richard Poynder, the astute, eloquent chronicler of scholarly communication in the online era has done it again, with a shrewd, original and insightful review of the short history of the institutional repository movement. His conclusions are surprising, but (I think) very apt. His analysis, among other things, goes some way toward explaining why on earth a "Repository Comparison" such as the one Rachel Heery cites [by Thom Hickey]..., would have left the first and most widely used Institutional Repository (IR) software (GNU Eprints) out of its comparison.  The answer is simple: Eprints is and always was very determinedly focused on the specific goal of 100% Open Access (OA), as soon as possible; it can of course also do everything that the other IR softwares can do (and vice versa!), but Eprints is focused on a very particular and urgent agenda: generating 100% OA to each institution's own research article output. Those who prefer leisurely fussing with the curation/preservation of arbitrary digital contents of any and every description will of course have plenty to keep them busy for decades to come. Eprints, in contrast, has an immediate, already-overdue mission to fulfil, and it is becoming clearer and clearer that -- with some prominent and invaluable exceptions -- the library community has found other rows to hoe.  Richard has accordingly proposed that it might be time for a parting of paths between the Generic Digital Curation/Preservation IR movement and the OA IR movement, and he might be right. One has a diffuse, divergent goal, the other a focused, convergent -- and urgent and immediately reachable -- goal, a goal that might now be hamstrung if it is subordinated to or subsumed under the diffuse, divergent goal of the other....I think the optimal strategy is latent in Richard Poynder's very timely and perspicacious article. 
We should especially recommend using Eprints modularly, at the departmental level, via computing-services and/or library support, along the lines CalTech are doing it with CODA:  Instead of building one monster-archive, Dspace/Fedora style, and then partitioning it top-down into "communities", CalTech have made natural and effective use of the OAI interoperability to create lots of Eprints modules, all harvested and integrated bottom-up into CODA. The rationale to be stressed in this is that this easy, light modularity can be used to get OA-specific archives going even in institutions that are slogging away at their own monster-archive, in parallel.
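The bottom-up aggregation described above rests on OAI-PMH, the interoperability protocol that lets many small departmental Eprints archives expose their metadata for harvesting into a central collection. As a rough illustration, here is a minimal Python sketch of parsing an OAI-PMH ListRecords response; the endpoint URL pattern is standard, but the sample response and identifiers are invented for illustration:

```python
# Sketch of OAI-PMH harvesting, the layer that lets departmental
# archives be aggregated bottom-up into a central collection.
import urllib.request
import xml.etree.ElementTree as ET

OAI = "{http://www.openarchives.org/OAI/2.0/}"
DC = "{http://purl.org/dc/elements/1.1/}"

def parse_list_records(xml_text):
    """Extract (identifier, title) pairs from a ListRecords response."""
    root = ET.fromstring(xml_text)
    records = []
    for rec in root.iter(OAI + "record"):
        ident = rec.find(OAI + "header/" + OAI + "identifier").text
        title = rec.find(".//" + DC + "title")
        records.append((ident, title.text if title is not None else None))
    return records

def harvest(base_url):
    """Fetch one page of Dublin Core records from an OAI-PMH endpoint."""
    url = base_url + "?verb=ListRecords&metadataPrefix=oai_dc"
    with urllib.request.urlopen(url) as resp:
        return parse_list_records(resp.read())

# Stand-in for what a departmental archive's endpoint might return:
SAMPLE = """<OAI-PMH xmlns="http://www.openarchives.org/OAI/2.0/">
  <ListRecords>
    <record>
      <header><identifier>oai:eprints.example:1</identifier></header>
      <metadata>
        <oai_dc:dc xmlns:oai_dc="http://www.openarchives.org/OAI/2.0/oai_dc/"
                   xmlns:dc="http://purl.org/dc/elements/1.1/">
          <dc:title>Sample preprint</dc:title>
        </oai_dc:dc>
      </metadata>
    </record>
  </ListRecords>
</OAI-PMH>"""

print(parse_list_records(SAMPLE))  # → [('oai:eprints.example:1', 'Sample preprint')]
```

A real harvester would also follow OAI-PMH resumptionTokens to page through large archives, which is how a central service can sweep up many modular archives without any top-down partitioning.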

More on the low compliance rate with the NIH policy

Yesterday the Publishing Research Consortium announced a study of NIH grantees, what they understood about the NIH public access policy, and why they were not complying with it in greater numbers. From the announcement:
Scientific, technical and medical publishers called today for an increase in communications to science and medical authors in light of a new survey that finds low understanding of the National Institutes of Health’s (NIH) Public Access Policy for posting peer-reviewed articles to PubMed Central (PMC), NIH’s online database. The online survey, conducted in January of this year by the Publishing Research Consortium (PRC), shows that although most authors are aware of the NIH policy, many authors do not post on PMC because they do not understand the process, nor do they identify clear benefits for posting their work. Of the NIH-funded authors who responded to the survey, 15% have never heard of the policy and a further 23% have heard of the policy, but know nothing about it. The survey found awareness of NIH-funded authors is only marginally higher than of all life sciences and medicine authors. “As publishers, we are committed to working with the NIH in improving dissemination of and enhancing access to scientific and medical research,” said Robert Campbell, Chairman of the PRC. “Publishers remain willing and prepared to work with the National Library of Medicine to advance the goals of the NIH’s Public Access Policy as currently construed, and to aid the NIH in facilitating voluntary compliance by NIH-funded authors.” The PRC survey also revealed authors have limited understanding of the benefits of the NIH policy for the scientific research community, the public or existing journals. However, approximately 42% of survey respondents reported that they intend to post in the future and just 3% responded that they are not planning to post.

Comment. Those who want to strengthen the NIH policy and those who want to keep it weak or even repeal it agree that the compliance rate is dismally low. Just last month the NIH reported to Congress that the rate was below 4%. I interpret the PRC report as an attempt to boost voluntary compliance and head off mounting pressure on the NIH to adopt an OA mandate. This pressure is coming from the NIH's own Public Access Working Group and the NLM Board of Regents, not to mention the original directive from Congress and the pending CURES Act. Whether or not the NIH adopts a mandate, I support the call for greater outreach to grantees and education about the policy. (I was going to link to a 12/9/05 message I posted to the SSP list, showing that I support this kind of outreach, but it seems to have disappeared from the archive.) However, I strongly support a mandate and do not believe that the NIH policy can meet its goals or the goals of Congress without one.

Thursday, March 02, 2006

Comparing four packages of OAI/OA repository software

Thom Hickey, Repository Comparison, Outgoing, March 1, 2006. (Thanks to Rachel Heery.) A comparison of four software packages for OAI-compliant repositories: CONTENTdm, WikiD, DSpace, and Fedora. (Why not Eprints?) The first two are both from OCLC, like Hickey himself.

March SOAN

I just mailed the March issue of the SPARC Open Access Newsletter. This issue takes a close look at recent steps to strengthen the NIH policy and three trends that could cause collateral damage to OA: the webcasting treaty, the opposition to network neutrality, and the end of free email. The Top Stories section takes a brief look at Hindawi's simultaneous conversion of 13 subscription journals to OA, new support for OA to data, OA commitments in Spain, the launch of Open J-Gate, and several recent awards that honor and recognize work on OA.

Mapping the OA movement onto the OS movement

Glyn Moody, Parallel universes: open access and open source, February 22, 2006. Excerpt:
[In] scholarly publishing...advocates of free (as in both beer and freedom) online access to research papers are still fighting the battles that open source won years ago. At stake is nothing less than control of academia's treasure-house of knowledge. The parallels between this movement - what has come to be known as “open access” – and open source are striking. For both, the ultimate wellspring is the Internet, and the new economics of sharing that it enabled. Just as the early code for the Internet was a kind of proto-open source, so the early documentation – the RFCs – offered an example of proto-open access. And for both their practitioners, it is recognition – not recompense – that drives them to participate.

Like all great movements, open access has its visionary – the RMS figure - who constantly evangelizes the core ideas and ideals. In 1976, the Hungarian-born cognitive scientist Stevan Harnad founded a scholarly print journal that offered what he called “open peer commentary,” using an approach remarkably close to the open source development process....Harnad has long had an ambitious vision of a new kind of scholarly sharing (rather as RMS does with code): one of his early papers is entitled “Post-Gutenberg Galaxy: The Fourth Revolution in the Means of Production of Knowledge”, while a later one is called bluntly: “A Subversive Proposal for Electronic Publishing.” Meanwhile, the aims of the person who could be considered open access's Linus to Harnad's RMS, Paul Ginsparg, a professor of physics, computing and information science at Cornell University, were more modest. At the beginning of the 1990s, Ginsparg wanted a quick and dirty solution to the problem of putting high-energy physics preprints (early versions of papers) online. As it turns out, he set up what became the arXiv preprint repository on 16 August, 1991 – nine days before Linus made his fateful “I'm doing a (free) operating system (just a hobby, won't be big and professional like gnu) for 386(486) AT clones” posting. ...Beyond self-archiving - later termed “green” open access by Harnad – lies publishing in fully open online journals (“gold” open access). The first open access magazine publisher, BioMed Central – a kind of Red Hat of the field – appeared in 1999. In 2001 the Public Library of Science (PLoS) was launched; PLoS is a major publishing initiative inspired by the examples of arXiv, the public genomics databases and open source software, and which was funded by the Gordon and Betty Moore Foundation (to the tune of $9 million over five years).
Just as free software gained the alternative name “open source” at the Freeware Summit in 1998, so free open scholarship (FOS), as it was called until then by the main newsletter that covered it - written by Peter Suber, professor of philosophy at Earlham College - was renamed “open access” as part of the Budapest Open Access Initiative in December 2001. Suber's newsletter turned into Open Access News and became one of the earliest blogs; it remains the definitive record of the open access movement, and Suber has become its semi-official chronicler (the Eric Raymond of open access - without the guns).

Comment. There's more, but read the whole thing. We all know large and small parallels between the two movements, not to mention actual overlap, but Moody is remarkably specific about matching up counterparts. A thumbnail of his map:

  • Harnad → Stallman
  • Ginsparg → Torvalds
  • Suber → Raymond
  • BOAI → Freeware Summit
  • Soros → O'Reilly
  • BioMed Central → Red Hat
  • Elsevier → Microsoft

Update. Also see Glyn's blog posting following up the article.

More on Freeload Press

Freeload Press, which publishes OA textbooks supported by ads, is expanding its line. From today's announcement:
Following a highly successful beta launch, Freeload Press, Inc. (“FP”) announced today the expansion of its unique free textbook initiative. The company plans on providing one million free downloads of college e-textbooks and study aids--all supported by commercial sponsors. Staffed by experienced academicians, entrepreneurs and former textbook publishers, FP developed a breakthrough model providing students with free e-book versions of textbooks. Students download free e-books from FP’s distribution web site. The e-books, which are in Adobe’s PDF format, contain sponsors’ advertising or marketing messages. Corporate sponsors, among them FedEx Kinko’s, pay FP to place unobtrusive messages in e-books at natural breaks in the text and layout. Paperback versions of the e-books, with advertising, can also be purchased for 65% less than comparable textbooks for the same courses. Halfway through the beta launch season, FP’s model has been used for 75 courses in over a dozen colleges, including U of Texas, Georgia Tech and U-Mass Lowell. The company estimates its beta list of books will save students over $500,000 in textbook costs this year. College instructors at over 350 colleges are registered and reviewing FP’s offerings for next year’s courses.... “The student survey data is trending at more than 97% approval rate of the model,” said Ivancevich, “and the instructor approval rating is north of 80%--which is well beyond the original forecast we conducted to launch the business. Instructors are more concerned with the high price of textbooks than we anticipated.”

Disease tracking: Another area where OA is a matter of life and death

Glyn Moody, There's No INSTEDD without Open Access, Open..., March 1, 2006. Excerpt:
An interesting news story: Larry Brilliant, newly-appointed head of the philanthropic foundation, wants to set up a dedicated search engine that will spot incipient disease outbreaks. The planned name is INSTEDD: International Networked System for Total Early Disease Detection - a reference to the fact that it represents an alternative option to just waiting for cataclysmic infections - like pandemics - to happen. According to the article:
Brilliant wants to expand an existing web crawler run by the Canadian government. The Global Public Health Intelligence Network monitors about 20,000 Web sites in seven languages, searching for terms that could warn of an outbreak.

What's interesting about this - apart from the novel idea of spotting outbreaks around the physical world by scanning the information shadow they leave in the digital cyberworld - is that to work it depends critically on having free access to as much information and as many scientific and medical reports as possible. Indeed, this seems a clear case where it could be claimed that not providing open access in relevant areas - and the range of subjects that are relevant is vast - is actually endangering the lives of millions of people. Something for publishers and their lawyers to think about, perhaps.
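The core of such a system is simple term matching over harvested text, scaled up across thousands of sources and languages. As a toy illustration (the term list and sample snippet are invented; the real GPHIN system is far more sophisticated), the matching step might look like:

```python
# Toy version of the keyword-scanning approach described above: scan
# page text for outbreak-related terms. Terms and snippet are illustrative.
OUTBREAK_TERMS = ["outbreak", "epidemic", "unexplained fever", "quarantine"]

def flag_page(text, terms=OUTBREAK_TERMS):
    """Return the outbreak-related terms that appear in a page's text."""
    lowered = text.lower()
    return [t for t in terms if t in lowered]

# In a real crawler the text would come from fetched web pages; here a
# stand-in news snippet shows the matching step.
snippet = "Officials report an unexplained fever outbreak in the region."
print(flag_page(snippet))  # → ['outbreak', 'unexplained fever']
```

The point of the example is that the matching itself is trivial; the hard part, and the part that depends on open access, is being allowed to crawl and read the source documents in the first place.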

Comment. Exactly. As I've said before, the more knowledge matters, the more OA to that knowledge matters.

More on integrating text and data

Gregory A. Petsko, Let's get our priorities straight, Genome Biology, February 1, 2006 (accessible only to subscribers). Excerpt:
I always thought that the most important thing in any scientific paper was supposed to be the data and how they were obtained. Everything else is window-dressing, because it's filtered through the lens of subjectivity. The background, the discussion - these are somebody's opinions. If the experiments have been done carefully and analyzed thoroughly, the data are the only facts in the paper, the only thing that can be trusted. They're what I want to read and understand. The people who obtained the data have the right to tell me what they think it all means, and I often find their opinions useful, but I also have the right to decide for myself. Yes, I can still do that if I dig out the supplementary material, but I shouldn't have to dig. If our priorities are straight, the methods and the data should be the centerpiece. And in the modern era, there's no reason not to put them there.

Thanks to Alf Eaton for pointing to Petsko's piece, for this excerpt, and for setting up an example (based on OJS) of what Petsko might have had in mind. Excerpt from Eaton's description:

[In this example,] as much of the data as possible is embedded directly in the article, so that when the page is saved, the media should be saved with it. The 'supplementary data' will be linked to directly from the page as well, ideally with a useful rel attribute so that it can be automatically fetched along with the rest of the page. This should make things much easier, rather than having to go to a separate page and click links to see each microscopy movie, or open and scroll around tiny pop-up windows for every table or figure.
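The machine-readable linking Eaton describes can be sketched with Python's standard-library HTML parser; the rel value "supplementary-data" here is an assumption for illustration, not a value the example prescribes:

```python
# Sketch of a client that auto-discovers supplementary data: collect the
# hrefs of links whose rel attribute marks them as supplementary, so they
# can be fetched along with the rest of the page.
from html.parser import HTMLParser

class SupplementaryLinkFinder(HTMLParser):
    """Collect hrefs of <a> tags whose rel marks supplementary data."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "a" and "supplementary-data" in (a.get("rel") or ""):
            self.links.append(a.get("href"))

# A hypothetical article fragment mixing a rel-tagged data link with an
# ordinary link:
article_html = """
<p>Results are shown in <a rel="supplementary-data"
   href="movie1.mov">microscopy movie 1</a> and
   <a href="refs.html">the references</a>.</p>
"""

finder = SupplementaryLinkFinder()
finder.feed(article_html)
print(finder.links)  # → ['movie1.mov']
```

A saving or harvesting tool could then fetch each collected href automatically, which is exactly the "fetched along with the rest of the page" behavior the excerpt asks for.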

Another Swiss research institution signs the Berlin Declaration

The Swiss Federal Institute of Technology Zurich (Eidgenössische Technische Hochschule Zürich, or ETH Zurich) has signed the Berlin Declaration on Open Access to Knowledge.

This signature is the 150th for the Berlin Declaration.

Three new OA journals hosted by BMC

Three new independent, Open Access journals debuted early this week, hosted by BioMed Central, bringing the total of such titles to 85.

Algorithms for Molecular Biology - Fulltext v1+ (2006+); ISSN: 1748-7188.

Orphanet Journal of Rare Diseases - Fulltext v1+ (2006+); ISSN: 1750-1172.

Radiation Oncology - Fulltext v1+ (2006+); ISSN: 1748-717X.

In the near term, all three titles will be mirrored at PubMed Central, one of the international sites which provide mirror and archive services for Open Access content from BioMed Central. Other such repositories include the e-Depot of Koninklijke Bibliotheek (KB), the National Library of the Netherlands, Potsdam University in Germany, and INIST in France. BioMed Central is also a participant in LOCKSS (Lots of Copies Keep Stuff Safe).

Wednesday, March 01, 2006

Drug company assures researchers of access to their own data

Last week Jamie Boyle argued that copyright law is so backward that banal developments, like letting policy be informed by empirical evidence, count as progress.

Here's an example from the world of access. Procter & Gamble is making news for the progressive idea of guaranteeing its researchers access to their own data.

Calicut Medical College organizes and expands its set of OA journals

The Calicut Medical College in Kerala, India, has launched a portal for its three OA journals. Two of the journals --the Calicut Medical Journal and the Journal of Orthopaedics-- are several years old but were not previously showcased together at a single portal. The third, Pulmon: The Journal of Respiratory Sciences, is a recent convert from toll-access. Soon its back issues will be OA at the new portal. Dr. Johnson Francis, Associate Professor of Cardiology at Calicut, tells me that the school plans to convert two other non-OA journals to OA shortly and add them to the portal. (Kudos to Calicut!)

More on RSC Archives for Africa initiative

The Royal Society of Chemistry issued a press release about this new development. The free access is for developing countries, many of which are in Africa, although former Soviet states Tajikistan, Uzbekistan, and the Kyrgyz Republic are also included. The program applies to the RSC archives (1841-1996). However, as Hareg Tadesse notes, "... the RSC is only one publisher of Chemistry Journals. And some of my key papers were not from the RSC. Therefore, I would like to call on all publishers of Chemistry Journals to follow the lead of the RSC to support young Chemists like me with their archives so that we can bring the benefits of Chemistry to our great continent."

RSC gives free online access to African researchers

Africa given free journal access, BBC News, March 1, 2006. An unsigned news story. Excerpt:
The Royal Society of Chemistry (RSC) has announced that it will give Africa free access to its journal archives. A total of 1.5 million pages and 250,000 articles will be available electronically to African scientists. At the launch of the initiative, an Ethiopian researcher called on other chemistry journals to open their archives free-of-charge, too. Hareg Tadesse said: "It is not about only me, and only Africa - the whole of the developing world needs supporting." The Archives for Africa scheme was launched on Tuesday at the House of Commons. "Access to scientific information is an essential ingredient for the establishment of a sustainable science base," said Dr Simon Campbell, the president of the RSC. "We believe that free access to the RSC Archive will make a major contribution towards building scientific capacity, which African leaders have stated is essential for social and economic development." The decision to open access to the journals followed the recent G8 meeting in Gleneagles, Scotland, which highlighted the need for capacity building in developing countries in science and technology. Hareg Tadesse, a chemist from Ethiopia, who attended the launch, believes her research has benefited from access to the journals. "In order to do my work, I needed to know the results from previous research. And it was really hard in Addis for me to get hold of the right papers. This is where the archives are going to be so useful," she explained....However, to aid African research, she said more scientific journals needed to give free access to their papers. "I would like to call on all publishers of chemistry journals to follow the lead of the RSC to support young chemists like me with their archives so that we can bring the benefits of chemistry to our great continent," she added.

Frustrations filling an IR

Dorothea Salo, Throw your own spaghetti, Caveat Lector, February 27, 2006. Excerpt:
Some guest librarians are coming in on Friday. They’re interested in institutional repositories, which means me. So I got to spend an hour this morning....answering their questions about IRs. I’ve done this for a number of librarians now. Surveys, exploratory interviews, random email, you name it. The same question invariably comes up: “How do I get stuff for the IR?” I am a nice person. (Well, I like to think I am. I recognize that opinions differ on this point, however.) I like to share things. I especially like to share the answers to tough questions. If I had the answer for this one? The magic bullet? The never-fail recipe for How To Attract Content to a Digital Repository? I would have shared it already. Truly. I haven’t shared it, therefore I don’t have it. QED. Stevan Harnad has an answer....My paraphrase of his answer is “Make IR deposit mandatory!” Sure, that’s a magic-bullet answer --if you have that kind of power. I don’t, and neither do the librarians asking me this question....

Sometimes they’re asking, “How can I fill the IR without extra librarian work?” Simple answer: you can’t. That’s what most libraries with IRs have been trying to do, and it flat-out does not work. I have an obvious bias toward this answer, so don’t trust me --go out and try to find a counterexample. Either the librarians are going to be depositing stuff, or they’re going to be marketing to faculty so that faculty deposit stuff, or (most likely) both....

Sometimes they’re asking “How can I make faculty deposit?” Same answer: you can’t. You don’t control faculty behavior. That leaves you some choices: you can lobby the people who do control faculty behavior, you can dangle carrots in front of faculty, you can take it out of faculty hands, you can build on what faculty are already doing, or you can hope for serendipity. I’m doing all of these things, to varying degrees, and if you look at the (sparse, admittedly) literature, I think you’ll find that most suggested strategies fall into one of these areas....

Sometimes they’re asking “How do I justify my IR’s existence, if it’s not attracting stuff?” I’ll tell you: I don’t know. My job hangs off this question, and I still don’t know what the right answer is. For master’s and Ph.D. institutions, electronic theses and dissertations may be the right answer. For some institutions, Special Collections has the answer....I’ll tell you this, though: if you’re trying to answer this question now, you’re almost certainly too soon. “Word is starting to get out,” someone kindly said to me today....I’ve been here seven and a half months; word is just starting to get out, and seeds I planted months ago are just starting to sprout. If you’re expecting immediate results, you shouldn’t be in this business; it takes patience, fortitude, and persistence....What all this grumpiness amounts to is, I am still throwing spaghetti at the wall like a mad thing. I don’t know and can’t tell you which strands will stick where you are. If you want to do this, throw your own spaghetti, and then let’s get together and talk about what stuck.

Blog tracking OA contributions by Caltech researchers

Open Access Authoring @ Caltech was officially announced today. In a nutshell, this is a documentary blog rather than an advocacy blog. The blog has more than 100 entries to date documenting various OA activities:

* authoring articles and/or books;

* serving on OA journal editorial boards;

* refereeing for OA journals.

Content management journal converts to OA

The Rockley Report, a quarterly journal on content management, has converted to OA.

Keeping our eye on the OA mission of IRs

Richard Poynder, Institutional Repositories, and a little experiment, Open and Shut, March 1, 2006. This is another of Richard's elegant, detailed, and comprehensive articles --and almost impossible to excerpt. Read the whole thing. On his experiment:
While I am keen to continue writing about OA, and in a way that will enable me to maximise the number of people who can read what I write, it would clearly be helpful if I could earn a little money from that writing too! To this end I have decided to try a little experiment: to self-publish some of my articles about OA via my blog, and then invite readers to pay to read them. No one has to pay to read them, but those who find some value in doing so, and feel they would like to help, can do so --on a strictly voluntary basis. I am publishing the first such article today. This looks at the history of the institutional repository, and its relation to the OA movement....

In the substantive part of the paper, Poynder traces the history of IRs, disentangling the many different problems they have been proposed to solve, from the affordability of journal literature and digital preservation to the impact of research. He covers a lot of ground but focuses on the contributions of Raym Crow, Paul Ginsparg, Stevan Harnad, Clifford Lynch, the Open Archives Initiative (OAI), Budapest Open Access Initiative (BOAI), the NIH public-access policy, the draft RCUK policy, and the OA mandates in place at a handful of universities around the world. He concludes that the OA mission for IRs is being lost in a confusing welter of other missions, some more attractive to non-research stakeholders and all of them more expensive and more difficult to achieve. "The danger, therefore, is that unless the self-archiving movement puts some clear blue water between itself and the mists of confusion enveloping the institutional repository it could end up shipwrecked."

More on Wikipedia v. Encyclopedia Britannica

In the March issue of Information Today, Paula Berinstein picks up the comparison of Wikipedia and the Encyclopedia Britannica where Nature left off in December.

Another misleading study of author attitudes toward OA journals

Sara Schroter and Leanne Tite, Open access publishing and author-pays business models: a survey of authors' knowledge and perceptions, Journal of the Royal Society of Medicine, 99 (2006) 141-148. Abstract:

Objectives: We aimed to assess journal authors' current knowledge and perceptions of open access and author-pays publishing.

Design: An electronic survey.

Setting: Authors of research papers submitted to BMJ, Archives of Disease in Childhood, and Journal of Medical Genetics in 2004.

Main outcome measures: Familiarity with and perceptions of open access and author-pays publishing.

Results: 468/1113 (42%) responded. Prior to definitions being provided, 47% (222/468) and 38% (176/468) reported they were familiar with the terms `open access' and `author-pays' publishing, respectively. Some who did not at first recognize the terms did claim to recognize them when they were defined. Only 10% (49/468) had submitted to an author-pays journal. Compared with non-open access subscription-based journals, 35% agreed that open access author-pays journals have a greater capacity to publish more content, making it easier to get published, 27% thought they had lower impact factors, 31% thought they had faster and more timely publications, and 46% agreed that people will think anyone can pay to get published. 55% (256/468) thought they would not continue to submit to their respective journal if it became open access and charged, largely because of the reputation of the journals. Half (54%, 255/468) said open access has `no impact' or was `low priority' in their submission decisions. Two-thirds (66%, 308/468) said they would prefer to submit to a non-open access subscription-based journal than an open access author-pays journal. Over half thought they would have to make a contribution or pay the full cost of an author charge (56%, 262/468).

Conclusions: The survey yielded useful information about respondents' knowledge and perceptions of these publishing models. Authors have limited familiarity with the concept of open-access publishing and surrounding issues. Currently, open access policies have little impact on authors' decision of where to submit papers.

Comment. Also see the similar studies by Schroter and Tite in BMJ for January 2005 and February 2006. Schroter and Tite write in the current article that "the term 'author-pays' reflects the shift in the cost of publishing from the reader to the author. In reality, though, it is the authors' funding body that is expected to cover the costs on the authors' behalf. Some author-pays journals waive fees in cases of author economic hardship." However, they used the term "author-pays" without these qualifications when interviewing subjects. They did define the term during interviews but their definition (in Box 1 of the article) doesn't mention that the fees are usually paid by funding agencies or waived. They acknowledge that 35% of their interview subjects had never heard the term before. Those subjects probably answered interview questions under the false and harmful impression that the fees were to be paid by authors out of pocket. Many interview subjects who had heard the term before probably had the same false idea of its meaning and implications. It also appears Schroter and Tite, while eliciting the useful information that their interview subjects believed that OA journals charging author-side fees compromised peer review, did nothing to correct this false belief before continuing with the interview. Nor did Schroter and Tite apparently inform their subjects that fewer than half of all OA journals charge any author-side fees and that a greater percentage of subscription-based journals than OA journals charge such fees. In short, the study is based on interviews with subjects who had critical misconceptions about OA journals, some of which were inculcated or subtly ratified by the authors. Either the authors should recast the study as showing the effects of these misleading beliefs or redo the interviews.

JRSM converts to OA for its research articles

Kamran Abbasi, Open access for the JRSM, Journal of the Royal Society of Medicine, 99 (2006) 101. Excerpt:
This is an historic issue of the JRSM. All material in the research and the original articles sections on our website can be read for free --from this issue onwards and also back issues online. In addition, all other articles will be free to access 3 years after their publication date --an agreement with PubMed Central that creates a back archive of The Royal Society of Medicine's flagship publication that, by early 2007, will reach back to 1809....Many of you will be aware of the open access debate that has been raging among authors, readers, and publishers of medical journals. Definitions of open access vary but range from one extreme of all journal content being freely available over the internet --and all authors and publishers waiving copyright-- to the other extreme of journals still charging for content but allowing authors to post articles on institutional websites. While many advocates of open access find anything less than full open access repulsive, there is an emerging consensus that for any journal to legitimately claim to be an open access publication the original research articles should be freely available on the journal's website from the moment of publication. This is exactly what the JRSM will now be doing....[W]e will make research articles free online from this point forwards and backwards. This column, one other selected article from the current issue, and articles older than three years will also be free to access online. The remainder of the content will be behind access controls. Our ambition is that this initiative will allow the JRSM to champion the best principles of science while ensuring that we can derive enough revenue to safeguard the journal's future.

Importantly, the work of JRSM authors will suddenly be open to the whole world and will help us attract even higher quality articles for the benefit of readers. The complexities of this debate are thrashed out by Sara Schroter and colleagues who study the views of authors about open access [p 141] [PS: this is OA], Jeffrey Aronson who remains highly sceptical about how the world of open access can be funded [p 103] [PS: not OA], and Richard Smith who begins a series of extracts from his forthcoming book on the trouble with medical journals [p 115] [PS: not OA]. Finally, in another first for JRSM, this month's research paper by Brent Caldwell and others is a fast track publication --published just over 6 weeks after it was submitted [p 132] [PS: OA]. It demonstrates a link between celecoxib and myocardial infarction, a message that underlines the importance of open access to research findings.

Comment. (1) Kudos to JRSM for taking this step. (2) I don't know any OA proponents who "find anything less than full open access repulsive". None of the major public definitions of OA calls for authors and publishers to waive copyright. It's unusual for a journal to misrepresent and oversimplify OA and its proponents when acknowledging enough common ground to join them. Is it possible that JRSM's deliberations would have been easier if it hadn't been responding to a demonized stereotype?

Tuesday, February 28, 2006

Lessig on OA

David Kushner, Uncommon Law, IEEE Spectrum, March 2006 (accessible only to subscribers). An interview with Lawrence Lessig. Excerpt:
Since first being released in 2002, Creative Commons licenses have been used more than 50 million times, and the rate of adoption is growing. With the launch this year [PS: it was last year, January 2005] of an offshoot, the Science Commons, Lessig hopes to expand into the world of research. IEEE Spectrum contributing editor David Kushner talked to Lessig about his plans....

[DK] How does a Creative Commons license benefit the copyright holder?

[LL] Well, a lot of copyright holders benefit primarily by having their work made accessible and encouraging others to build on their work. The clearest example of that is the world of scholarship. As a scholar, I'm interested in people reading my article. I don't get paid when people copy the article. I don't get paid by journals that distribute the article. We support the open-access publishing movement in the context of scholarship, especially through our Science Commons project, because this is perfectly consistent with the desires of the author, which are basically to spread his or her work.

Blackboard users gain access to ResearchNow

Berkeley Electronic Press (bepress) and Blackboard have struck a deal allowing Blackboard users to search the bepress portal ResearchNow and incorporate the results into their Blackboard projects. From today's announcement:
Blackboard Inc. and The Berkeley Electronic Press (bepress) today announced the launch of the ResearchNow Blackboard Building Block, a new tool that integrates with the Blackboard Learning System, enabling online access to tens of thousands of scholarly materials. The ResearchNow Content Building Block makes it possible for educators to search directly in Blackboard for access to more than 85,000 journal articles, working papers, institutional repository materials, theses and dissertations hosted within ResearchNow, The Berkeley Electronic Press's innovative database of scholarly information. Relevant resources may be easily selected and incorporated into the Blackboard Academic Suite. ResearchNow is a collection of academic materials drawing from several primary sources: the roster of peer-reviewed, Berkeley Electronic Press journals (27 and counting), bepress-hosted subject matter repositories such as the bepress Legal Repository and COBRA: The Collection of Biostatistics Research Archive, and all working papers, preprints and other "grey literature" content from institutional repositories hosted by bepress that have opted for inclusion. More than 50 schools -- including the University of California system, Boston College, Cornell, and the University of Nebraska, as well as major universities in Europe and Australia -- use the bepress platform for their institutional repositories. The bepress repository platform has been co-marketed with ProQuest Information & Learning since 2004 as Digital Commons. Collectively, ResearchNow materials have been downloaded more than 3 million times in the past year.

Comment. Bepress offers three degrees of access to ResearchNow, two forms of paid access and its famous quasi-open access. Today's announcement doesn't say what level of access Blackboard users will get.

Update (3/2/06). I just heard from Greg Tanenbaum, President of Bepress. In answer to my question about the level of access that Blackboard users will get, he writes, "It mirrors the existing ResearchNow setup. All content that is freely available to the world (e.g., IR working papers, reports, etc.) will be free to Blackboard instructors and students. Our own journal content will be free under the quasi-open access policy." (Thanks, Greg.)

Australia may charge fees for viewing OA pages in school

Simon Hayes, Copyright makes web a turn-off, Australian IT, February 28, 2006. Excerpt:
Schools have warned they will have to turn off the internet if a move by the nation's copyright collection society forces them to pay a fee every time a teacher instructs students to browse a website. Teachers said students in rural areas would bear the brunt of cuts if the Copyright Agency was successful in adding internet browsing charges to the $31 million in photocopying fees it rakes in from schools. The agency calculates the total due by randomly sampling schools each year for materials they copy, and extrapolating the results...."If it turned out we'd have to pay them, we'd turn the internet off in schools," the council's national copyright director Delia Browne said. "We couldn't afford it; it would not be sustainable. How on earth are we going to deliver education in the 21st century? How are taxpayers going to afford this?" The move has teachers up in arms, with some warning "ludicrous" charges for using websites would increase the gap between haves and have-nots.

Comment. Australian members of Parliament: This is insane. Charging a fee for browsing, especially open-access sites, especially in school, will undermine education and research. And you plan to divvy up the collected fees among copyright holders who refuse to provide open access? Hello?

Book-scanning and a forthcoming IR at Hawaii-Manoa

Blaine Tolentino, Hamilton library begins digitizing endeavor, Kaleo, February 27, 2006. Excerpt:
Universities across the United States are considering digitizing classic works for their libraries; the University of Hawai'i at Manoa's Hamilton Library is no exception. UHM [University of Hawaii at Manoa] has already made efforts to digitize unique documents such as old Hawaiian language newspapers and images from the Charlot collection....Library officials at colleges like the University of Texas at Austin have made preliminary efforts over the last year to employ the Open Content Alliance, an organization attempting a mass-book digitization. Both MSN and Yahoo! have announced that they are willing to participate in providing books online. Despite the perception that this is a new field, book digitization has been occurring for more than 10 years. UTOPIA, an online program offering a broad range of content to the general public, has provided sources online, including the Gutenberg Bible....Columbia University and the University of California are among higher education institutions interested in providing free, digital access to major public works....According to Rutter, books from 1850-1923 are those that should be digitized soonest. Books from this period of time, referred to as the "brittle book period," are exempt from the 50-year copyright period; therefore, there would be no copyright infringement. Cornell University has made efforts to digitize agriculture literature specifically from this time. "The books printed at that time [have a] lot of pulp and acid in it, so they're crumbling a fair amount faster," Rutter said. "The books from that time will not be available, so digitization makes sense."...UHM's dissertations are currently digitized using Proquest, a licensed database that has moved from using microfilm as a resource to using online versions of the documents. Currently UHM is looking at an institutional repository that would hold all of the data produced on campus in a safe and accessible place.

More on OA to medical research for lay readers

Ray Corrigan has posted a draft book chapter to his blog on the subject of OA to medical information. He welcomes comments. Excerpt:
In 1998 the British Medical Journal (BMJ), based on the principle of facilitating free and unrestricted access to scientific information, decided to make the entire contents of the journal freely available on the Internet. By January 2005, due to a drop in income, the journal partly reversed that decision, making some of the contents accessible online only to paying subscribers, though many elements of the journal such as a selection of research articles remained freely available at [the journal's website]. In February 2006, the BMJ published the results of a survey ‘To determine whether free access to research articles on [its website] is an important factor in authors' decisions on whether to submit to the BMJ, whether the introduction of access controls to part of the BMJ's content has influenced authors' perceptions of the journal, and whether the introduction of further access controls would influence authors' perceptions.’ It was a relatively small survey with a little over 200 authors participating but the results suggested free online access was important to a large majority (75%) of them, so the publishers agreed to retain their partial open access policy for the time being. Other important medical journals, like The Lancet, only provide online access to paying subscribers....So is putting complex personal healthcare decisions in the hands of the individual a good idea?...What about if I have a bit more time to do some research and find out a bit more about say an ongoing chronic condition? A friend of mine with a hip complaint went to great lengths to research his condition and ended up impressing his doctor with the depth of his knowledge on the subject. But supposing the materials he read had not been as freely available as they had been and he had to pay The Lancet, the BMJ and hundreds of other sources a hefty fee for each article he read, would he have had the ability to make the decisions he did about his treatment?
Doctors could justifiably claim that most lay people are insufficiently well trained to understand even the language of medics or the reliability of the sources, especially on the Internet, from which we might derive much of this medical ‘information.’ And if the truly reliable peer reviewed sources like the BMJ do gradually move towards a subscription only service, where is the average patient going to get access to important medical information required to make informed healthcare choices?

PubChem keeps growing

Chemists using Wikipedia

Jon Evans, Information free-for-all, Chemistry World, February 24, 2006. Excerpt:
The online encyclopaedia Wikipedia could become the main source of chemical information in 5–10 years, according to a professional chemist who contributes to the site....a recent study in Nature found that the accuracy of scientific entries in Wikipedia was not far behind those in Encyclopaedia Britannica. This finding is supported by regular contributor Martin Walker, assistant professor of organic chemistry at the State University of New York at Potsdam, US. Many of the chemistry entries are now reasonably accurate, he said, but you have to know where to look. ‘A general rule on Wikipedia is that an article that has been heavily edited and around for a long time is usually pretty good,’ Walker told Chemistry World, ‘if it hasn’t, it may be flawed.’ The accuracy of Wikipedia’s entries will continue to improve as contributors begin to organise themselves and take responsibility for certain subjects, he said. Chemistry content on the site is coordinated through two so-called ‘wikiprojects’, said Walker: chemistry and chemicals. Entries can still suffer from poor English and deliberate vandalism, but these problems are gradually being resolved and can be pinpointed quickly. ‘Try vandalising something like hydrochloric acid and see how long it takes someone to fix it,’ said Walker. Many of the Wikipedia contributors are quite young, but Walker estimates that there are around 10 PhD-qualified chemistry contributors, as well as several knowledgeable graduate and undergraduate chemists. More professional chemists should get involved, urged Walker. ‘We have come a long way, but there is still a huge amount to be done,’ he said. Walker will speak about his Wikipedia involvement at the American Chemical Society national meeting in March. ‘[Wikipedia] will become for information what Google is for searching,’ he predicted.

Strong defense for publicly-published OA journal

Steve Gibb, Opposition Heavy To US Environmental Health Agency Plan To Privatise Open Access Journal, IPWatch, February 28, 2006. Excerpt:
The director of the US National Institute of Environmental Health Sciences (NIEHS) is facing overwhelming opposition to a plan to privatise an open-access environmental science journal NIEHS publishes, according to public comments on the plan. NIEHS last September proposed privatizing Environmental Health Perspectives (EHP), a free, online monthly that publishes information on major toxics like dioxin, mercury and lead and distributes the data free to developing countries. The privatization plan came after a budgetary review, and NIEHS’ new director suggested the funding could be re-directed toward research. But the privatization plan is drawing broad domestic criticism from academics, state health agencies, and many US Environmental Protection Agency officials, who fear they would lose access to critical data that helps agency scientists set toxic risk limits and other policies affecting vulnerable populations, according to comments obtained under the Freedom of Information Act. There also is a lengthy set of international comments expressing opposition to the plan as well. In addition to comments from Taiwan, Israel, Argentina, New Zealand and India, over 35 Chinese scientists emphasized the value of the information to them in addressing the country’s environmental challenges and praised EHP’s quarterly Chinese edition. “Don’t let the world fall down into black fog,” a Taiwanese commenter says. NIEHS director David Schwartz said in a recent interview that he will decide about the privatization option in the next three to six months. “We will be exploring all the opportunities and options available to make this the strongest environmental science journal that is accessible to the widest audience.” If he opts for privatization, he likely would establish a formal procedure soliciting bids, reviewing them, and deciding on open-access policies for the archives.
About 94 percent of over 330 public respondents opposed privatization of the journal outright, saying it would be a “disadvantage” or “strong disadvantage.” In addition, just over four percent support privatization only if EHP’s content remains free online, according to an analysis of the comments. Less than two percent of the comments supported unconditional privatization.

Library attitudes toward the big deal

Karla Hahn, The State of the Large Publisher Bundle: Findings from an ARL Member Survey, ARL Bimonthly Report, April 2006. Excerpt:
As Frazier noted, a key feature of journal bundling is strict limits on a library’s ability to cancel titles. This restriction of cancellations creates challenges for budget management, for collection management, and for the marketplace of scholarly journals. As library budgets are locked into large bundles, cancellation pressure on unbundled titles increases and funds to acquire new journals outside of bundles are squeezed out. In addition, the size of the largest bundles has been growing. There is a history of significant consolidation among the largest publishers in recent years, reducing libraries’ choices among journal bundles. In the three years since ARL last surveyed its members regarding e-journal subscriptions, several of the largest players have merged: Academic Press with Elsevier, and Kluwer Academic with Springer. The consequence of these mergers is that a short list of publisher bundles accounts for substantial and growing proportions of library budgets. A short list of negotiations increasingly determines sales terms and access to hundreds of titles and consumes the majority of a research library’s journals budget....Experience with past surveys has shown how difficult it is to obtain comparable price information on journal bundles. Because of nondisclosure clauses, the diversity in the structure of pricing models between publishers and among customers, varying practices for allocating costs within and between institutions, and occasional funding from sources outside the library budget, it is nearly impossible to construct a rubric for reporting pricing that is not unbearably burdensome for libraries to use....There is evidence in the survey responses that, by and large, publishers are encouraging this movement from print to electronic journals.
However, savings incentives are generally reported to be modest or even nonexistent....The picture is more mixed with regard to assessments of publishers’ archiving arrangements to protect content in the case of business failure or other loss of content on the publishers’ side. A strong majority of libraries (71% of contracts) reported that they had investigated a publisher’s archiving plans. However, for those who checked, the publisher’s arrangements were found satisfactory by only 60% of the respondents. While a 40% dissatisfaction rate represents a minority assessment, it is a substantial customer base to be expressing “no confidence”. It may be that until publishers increase customer confidence in their archiving arrangements, substantial numbers of libraries will continue to retain print subscriptions out of concerns about long-term access to electronic journals....Libraries reported an average satisfaction rating of 3.4 (on a 5-point scale) for the pricing of their first contract with any given publisher. They reported somewhat lower satisfaction with consecutive contracts—down to 3.25 for current contracts—suggesting a perception that, as contracts are renegotiated, perceptions of advantageous pricing are weakening....There is no doubt that large commercial publishers’ bundles are a substantial part of research library collections. It is also clear that significant changes in library collections are underway. Cancellation projects are common....Cancellation of bundled titles has been effectively limited in recent years. Publishers’ archiving arrangements are unsatisfactory to at least a substantial minority of the community. Satisfaction with bundle pricing is decreasing through successive negotiations. This survey documented that journal bundles have already enjoyed substantial protection from cancellation.
With the majority of respondents reporting recent cancellation projects, the inescapable conclusion is that other segments of research library collections have been reduced to a greater extent in compensation for the protection afforded to bundles. This should be of concern to the library community and to publishers without the market power to gain similar protection for their titles....If libraries could eliminate nondisclosure clauses, obtain more generous cancellation terms, and achieve better price structures, satisfaction with bundles would likely increase.

Monday, February 27, 2006

Clearing the patent thicket for stem-cell research

Merrill Goozner, Innovation in Biomedicine: Can Stem Cell Research Lead the Way to Affordability? PLoS Medicine, May 2006. Excerpt:
The IP system may be contributing to the slowdown [of biomedical innovation]. The current innovation system encourages researchers to patent and commercialize discoveries that in an earlier era were considered basic science insights. This has led to an active market in the building blocks of further research, which can be anything from a genetic sequence or a cell receptor to the reagents needed to culture cells. This proliferation of basic science patents has raised the bar --what economists call transaction costs-- for other researchers who want access to those research tools. While many researchers, especially in academia, find ways around patent restrictions, and many companies have no trouble executing license agreements, there are cases where “patent thickets” have discouraged other researchers from pursuing similar or subsequent lines of inquiry. The stem cell field, which is still years away from its first approved therapy, has already experienced patent thicket problems....A recent survey by the United Kingdom Stem Cell Initiative identified nearly 18,000 stem cell patents issued around the world since 1994, with two-thirds issued in the US. The Washington-based law firm of Sterne Kessler Goldstein and Fox has warned clients that “any company or research institution that plans to develop stem cells for therapeutic purposes may face a number of blocking patents and applications that will require licenses, if available”....CIRM [California Institute for Regenerative Medicine] and other stem cell funders can become catalysts for cutting through this patent thicket. They can require that all grant recipients agree to donate the exclusive license to any insights, materials, and technologies that they patent to a common patent pool supervised by a new, nonprofit organization set up for that purpose. A patent pool serves as a one-stop shop where investigators can obtain no-cost or low-cost licenses for subsequent research.

Launch of Open J-Gate

Today marks the launch of Open J-Gate, an OA journal portal from Informatics India. From the site:
Open J-Gate is an electronic gateway to global journal literature in the open access domain. Launched in 2006, Open J-Gate is the contribution of Informatics (India) Ltd to promote [OA]. Open J-Gate provides seamless access to millions of journal articles available online. Open J-Gate is also a database of journal literature, indexed from 3000+ open access journals, with links to full text at publisher sites. Open J-Gate Features and Benefits: [1] Portal with the largest number of e-journals. Open J-Gate indexes articles from 3000+ academic, research and industry journals. More than 1500 of them are peer-reviewed scholarly journals. [2] Links to one million+ open access articles. This number is growing, with 300,000+ new articles added every year. Full-text links are regularly validated. [3] Constant updating. The Open J-Gate site is updated every day. [4] Well-designed journal classification. All journals are classified in a three-level hierarchical system to provide for better relevancy in search results. [5] Table of Contents (TOC) browsing. Users can browse the TOC of the latest issue and back issues. [6] Easy-to-use search functionalities. The database allows various search options for the user’s convenience. The subscriber can search by Title, Author, Abstract, Authors’ Address/Institution, Keywords, etc.

GBIF recommends OA to biodiversity data

On January 16, 2006, the Governing Board of the Global Biodiversity Information Facility (GBIF) adopted a Recommendation On Open Access To Biodiversity Data. Excerpt:
The Global Biodiversity Information Facility (GBIF) Governing Board -- representing 47 countries, 31 international organizations and the Secretariat of the Convention on Biological Diversity -- hereby recommends that research councils, other funding agencies and private foundations:
  • Promote that proposals for funding for biodiversity research include a plan for the maintenance and sharing of the digital biodiversity data generated in proposed projects;
  • Promote that species and specimen level data and associated metadata that are generated in funded projects are made publicly available through mechanisms cooperating with GBIF, within a specified period after completion of the supported research.

Rationale: Many research projects generate biodiversity data sets that are relevant for the wider scientific community, government natural resource managers, policy makers, and the public. Because the marginal cost of sharing data is now small compared to the full cost of the research that generates it, it is wise to allow for further shared use of these data to benefit the widest possible range of users....

Two of the goals of GBIF are to bring together data for multiple uses, and to find incentives and mechanisms to make data freely available as quickly and effectively as possible. These goals underlie the recommendations made here....The advantages of free and open data sharing have been documented (Arzberger et al. 2004) and brought together in the collaborative Conservation Commons:

  • Sharing data is good scientific practice and is necessary for the advancement of science, public awareness and education;
  • Expanded access to data sources could impressively increase the value to taxpayers of the more than $650 billion spent annually by governments on all research disciplines....
  • The openness of science stimulates and facilitates creativity;
  • Open access to data enables greater accountability to funding sources as quality, reliability, productivity and use of data are enhanced with public utilization and review.

Requirements for open access to data...signal the importance of data sharing to science and to decision-making, as well as to the long-term benefits to society and the environment, while respecting the right of scientists to publish on their data before releasing it for use by others.

Comment. Also see last year's GBIF Statement on Free and Open Access to Data (formulated 12/04, revised 1/05, released 3/05, issued 10/05).

Chemistry World opens debate on OA in chemistry

Chemistry World, the Royal Society of Chemistry's news magazine, posed the following question for its "Your views..." column in the February 2006 issue 3(2):27 (subscription required):
How should chemists respond to open access publishing?
All three respondents qualified their responses, noting a suspected potential for lower standards and, most commonly, the difficulty of maintaining a high level of editing, review and presentation while attempts are made to lower the cost of distribution. [Thanks to Dana Roth, Caltech, for bringing this to my attention.]

More on whether Google Library violates copyright

A webcast of the presentations at the AEI-Brookings Institute symposium, The Google Copyright Controversy: Implications of Digitizing the World's Libraries (Washington, D.C., February 24, 2006), is now online. Also see Rob Capriccioso's summary in today's issue of Inside Higher Ed.

Proposing an OA wiki for mechanics

Zhigang Suo, Wikipedia and Applied Mechanics, Applied Mechanics News, February 25, 2006. Excerpt:
Technology is now available to start a wiki on mechanics (wikimechanics) to document in a useful way everything known about mechanics. I mean everything: from everyday experience to esoteric theories, and everything in between. It should also have an exhaustive collection of pictures and data, all properly hyperlinked. I also mean useful. How do we catalog everyday experience to make it useful for serious decisions? How about an open-source finite element code, with links to a materials database? What if Ashby's Materials Selector becomes an open-access, user-enriched, and ad-supported repository? Many open-source wiki engines are available today; our wiki need not be part of Wikipedia. The authors of the wiki could be from the entire international community of applied mechanics – professors, students, engineers, and amateurs. They would also be the users. Along the way, we'll figure out how to assign credits to individual authors in such a collaborative effort. This wiki would co-evolve with the subject of mechanics: they would influence each other.

INFLIBNET's institutional repository

Yatrik Patel, J.K. Vijayakumar, and T.A.V. Murthy, Institutional digital repositories/E-Archives: INFLIBNET’s initiative in India. In M.G. Sreekumar (ed.), Proceedings 7th MANLIBNET Annual National Convention, Kozhikode (India), 2006, pp. 312-318. (Thanks to SEPW.)
Abstract: The technological advances of today make it possible to think in terms of storing all the knowledge of the human race in digital form, and several organizations worldwide are experimenting with less expensive ways to create institutional repositories. For long-term preservation of our knowledge base and cultures, we have to find an economical way to save digital content for future generations. INFLIBNET (Information and Library Network Centre) decided to adopt DSpace for its institutional repository and to archive its publications, conference proceedings, lecture notes, etc. DSpace, jointly designed by MIT and HP, is a groundbreaking digital library system that captures, stores, indexes, preserves and redistributes the intellectual output of a university’s research faculty in digital formats. This paper narrates the practical experiences and provides an overview of INFLIBNET’s institutional repository and dArchive-India, developed for the Indian academic and research community to archive their intellectual work.

Institutional repositories in India

Anup Kumar Das, B.K. Sen, and Chaitali Dutta, Collection development in digital information repositories in India, Vishwabharat@TDIL, 17 (2005): pp. 91-96. Self-archived February 22, 2006. (Thanks to SEPW.)
Abstract: The institutional repository (IR) is a contemporary concept that captures the institutional research output and other relevant documents in digital form and makes them available to users through the Internet and intranets. IRs have already started emerging in India. This study highlights the importance of IRs, delineates the scope and methodology, and presents the findings. Most of the repositories are using open source repository software such as DSpace, Greenstone Digital Library Software and GNU EPrints. It is observed that documents such as theses and dissertations, seminar papers, journal articles, etc., are found most often in the repositories. Some of the problems of the repositories have been highlighted and suggestions offered.

Sunday, February 26, 2006

How hosting OA/OAI repositories will change libraries

Leo Waaijers, From Libraries to 'Libratories', First Monday, December 2005. (I check FM for OA-related articles but somehow overlooked this one; I thank Kimmo Kuusela for the reminder.)
Abstract: While the eighties of the last century were a time of local automation for libraries and the nineties the decade in which libraries embraced the Internet and the Web, now is the age in which the big search engines and institutional repositories are gaining a firm footing. This heralds a new era in both the evolution of scholarly communication and its agencies themselves, i.e. the libraries. Until now libraries and publishers have developed a digital variant of existing processes and products, i.e. catalogues posted on the Web, scanned copies of articles, e–mail notification about acquisitions or expired lending periods, or traditional journals in a digital jacket. However, the new OAI repositories and services based upon them have given rise to entirely new processes and products, libraries transforming themselves into partners in setting up virtual learning environments, building an institution’s digital showcase, maintaining academics’ personal Web sites, designing refereed portals and — further into the future — taking part in organising virtual research environments or collaboratories. Libraries are set to metamorphose into ‘libratories’, an imaginary word to express their combined functions of library, repository and collaboratory. In such environments scholarly communication will be liberated from its current copyright bridle while its coverage will be both broader — including primary data, audiovisuals and dynamic models — and deeper, with cross–disciplinary analyses of methodologies and applications of instruments. Universities will make it compulsory to store in their institutional repositories the results of research conducted within their walls for purposes of academic reporting, review committees, and other modes of clarification and explanation. Big search engines will provide access to this profusion of information and organise its mass customization.

From the body of the paper:

Although we may not know exactly what the future information needs will be of the academic community, i.e. students, teachers and researchers, to me one thing is certain: open access to state–of–the–art knowledge is crucial in order for both research and learning environments to succeed. Limited access, be it the result of either technical or juridical implications, impedes solid growth in the human knowledge base. Put another way, there is no point going to great pains to overcome technological obstacles facing ICT only to come up against the legal copyright barriers. An interesting example here is the Elsevier content stored in the e–Depot of the Netherlands National Library: the costly technological infrastructure required for guaranteeing long–term access to this material is renowned. But in order to enjoy this access one has to travel to the library in The Hague and then possibly stand in a queue, as only one person at a time is allowed access — a replica of the situation in the paper era....Authors have to be convinced that depositing is in their own interest. In doing so it is most important to demystify the issue. For example, they need to be told that current research shows that open access publishing increases the number of citations and hence impact factors, that the Romeo site proves that publishers are gradually giving in on copyrights, that experiences of authors, who formulate their own copyright statements, teach us that publishers accept them, that parallel publishing on the Internet stimulates sales of a book’s paper version — and so on and so forth. A project like the Netherlands’ Cream of Science demonstrates that it is possible to overcome the hurdles and make top authors, even Nobel Prize winners, enthusiastic about placing their work in repositories. It has also shown that so–called objections sometimes amount to no more than librarians’ perceptions of author viewpoints. 
And that it is occasionally impossible to publish the complete oeuvre of an author simply because his or her publications have become lost. This in itself constitutes another powerful argument for depositing materials in repositories.

More on Perfect 10 and the Google Library project

There's now a Slashdot thread on whether the Perfect 10 decision will hurt the Google Library project.

Kansas may provide OA to court records

A bill in the Kansas State legislature would provide OA to state court records, with the costs to be paid by a small increase in docket fees. The bill has passed the Senate and moved to the House.

PS: This is much better than the plan proposed by the Kansas Supreme Court, which would charge citizens $2 for every online record search.

New Information Commons wiki covers OA issues

The Canadian Library Association has launched the Information Commons Wiki. IC Wiki will cover OA issues and already has a strong page on Canadian OA Initiatives.

(Thanks to Olivier Charbonneau for the wiki and to Heather Morrison for the alert that it was online and ready for use.)

Google Scholar citation counts nearly as good as Thomson's

Léo Charbonneau, Google Scholar service matches Thomson ISI citation index, University Affairs, March 2006. (Thanks to Dean Giustini.) Excerpt:
The free Google Scholar service does as good a job as Thomson ISI’s science citation index for performing citation counts and could be used as a cheap substitute for the costly Thomson service, says a University of British Columbia professor. Thomson’s citation databases are accessible through the company’s Web of Science portal only by subscription, which can cost a university tens of thousands of dollars a year. Daniel Pauly, director of the Fisheries Centre at UBC, and Konstantinos Stergiou, of Aristotle University of Thessaloniki in Greece, compared the two methods using 114 papers from 11 disciplines published between 1925 and 2004. For papers published before 1990, the authors found that the citation counts were proportional. In other words, if Thomson ISI found that a particular paper was cited 10 times as often as another, Google Scholar found the same ratio. However, for these older papers, the actual citation counts with Google were about half those of Thomson. But for papers published from 1990 on, not only were the citation counts proportional, the actual number of citations was nearly the same. The result is “very surprising,” said Dr. Pauly. “I didn’t expect Google to perform so well. . . . I expected some vague proportionality, but I did not expect that it would be roughly one to one.”
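The proportionality check Pauly and Stergiou describe can be sketched in a few lines of code. Note that the citation counts below are invented for illustration (the study's actual data are not reproduced here); only the pattern — a roughly 0.5 Google-to-ISI ratio before 1990 and a roughly 1:1 ratio after — mirrors the reported finding:

```python
# Hypothetical illustration of the Pauly/Stergiou comparison: each paper has a
# Thomson ISI count and a Google Scholar count, and we compare the average
# Google/ISI ratio within each era.

papers = [
    # (year, isi_citations, gs_citations) -- invented numbers for illustration
    (1950, 200, 95),
    (1970, 40, 21),
    (1985, 120, 62),
    (1995, 80, 78),
    (2000, 30, 31),
    (2003, 150, 148),
]

def mean_ratio(rows):
    """Average Google Scholar / ISI citation ratio for a set of papers."""
    ratios = [gs / isi for _, isi, gs in rows]
    return sum(ratios) / len(ratios)

pre_1990 = [p for p in papers if p[0] < 1990]
post_1990 = [p for p in papers if p[0] >= 1990]

print(f"pre-1990 GS/ISI ratio:  {mean_ratio(pre_1990):.2f}")   # roughly 0.5
print(f"post-1990 GS/ISI ratio: {mean_ratio(post_1990):.2f}")  # roughly 1.0
```

Proportionality here means the ratio is roughly constant within an era, so either source ranks papers the same way; for post-1990 papers the counts themselves also nearly coincide.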

Update. See Gary Price's short review of Pauly's article on ResourceShelf, February 25, 2006.

Update (2/27/06). Anurag Acharya, the man behind Google Scholar, responded to a few questions from Dean Giustini about the Daniel Pauly article. Excerpt:

[Google Scholar presents] the same index and the same ranking [to all users regardless of location]....We cannot share update information, but our long-term goal is to update every day.

Mandating OA to data, and then accommodating it

Bela Tiwari, Dawn Field, and Jason Snape, Public repositories need serious funding, Nature, February 23, 2006. A letter to the editor. Excerpt:
We support the suggestion made by Carlos Santos and colleagues in Correspondence (Nature 438, 738; 2005) that data associated with peer-reviewed articles should be submitted to recognized, public repositories wherever possible. We suggest that attaining this goal requires the support of national and international funding bodies that are willing both to implement data policies and to fund efforts to create community-driven standards and public repositories. For funding bodies, supporting a data policy can be expensive. Current policy for the UK Natural Environment Research Council (NERC) states that all data generated by projects funded under the environmental-genomics programme and the post-genomics and proteomics programme must be submitted to a suitable public repository, when one is available. NERC puts approximately 12% of the funds from each of these programmes towards data management and training through the establishment of the NERC Environmental Bioinformatics Centre, which facilitates 'omic' data management by developing data standards, software, databases, bioinformatics workstations and courses, and delivering these to the community, as well as hosting digital data for cases where suitable public repositories do not already exist. We argue that putting the tools and facilities in place to enable good data management is an area worth investing in. As well as addressing the aims of integrated, long-term data storage and access, this investment would minimize duplication of effort, facilitate uptake and sharing of data and maximize the potential for comparative analyses.