Open Access News

News from the open access movement


Saturday, December 20, 2008

IFLA call for papers on OA

The International Federation of Library Associations and Institutions has issued a call for papers for its 2009 conference (Milan, August 23-27, 2009). The theme for the Library Theory and Research Section is "Research into open access". Abstracts are due by December 31, 2008. Submissions can be made in the IFLA languages (Arabic, Chinese, English, French, German, Russian and Spanish) and Italian.

Open Access (OA) has become the subject of much discussion amongst researchers, academics, librarians, university administrators, funding agencies, government officials, and publishers. Although OA has become a topic of considerable interest, with a growing body of work exploring the impact of OA on scholarly research and communication for various disciplines, surprisingly very little rigorous research has been conducted into or about OA itself.

At its session in Milan, the IFLA Library Theory and Research Section (LTR) will focus on research that explores the reality of providing OA. The intention of the session is to provide a forum for library professionals to critically discuss key issues related to developing, managing and sustaining OA across the world. Papers may address issues such as challenges and barriers, the realities of financial and institutional support, policy and planning or principles involved in matters of OA development, management and sustainability. Papers discussing these issues from the perspective of different disciplines or contexts are welcome. ...

New OA social science journal

The Journal of Alternative Perspectives in the Social Sciences is a new peer-reviewed OA journal published by the Guild of Independent Scholars. The inaugural issue is now available. Authors retain copyright to their work. (Thanks to John Reidelbach.)

New issue of ScieCom Info

The new issue of ScieCom Info is now online.

Google pulls plug on Palimpsest project

Alexis Madrigal, Google Shutters Its Science Data Service, Wired Science, December 18, 2008.  Excerpt:

Google will shutter its highly-anticipated scientific data service in January without even officially launching the product, the company said in an e-mail to its beta testers.

Once nicknamed Palimpsests, but more recently going by the staid name, Google Research Datasets, the service was going to offer scientists a way to store the massive amounts of data generated in an increasing number of fields. About 30 datasets — mostly tests — had already been uploaded to the site.

The dream appears to have fallen prey to belt-tightening at Silicon Valley's most innovative company.

"As you know, Google is a company that promotes experimentation with innovative new products and services. At the same time, we have to carefully balance that with ensuring that our resources are used in the most effective possible way to bring maximum value to our users," wrote Robert Tansley of Google on behalf of the Google Research Datasets team to its internal testers.

"It has been a difficult decision, but we have decided not to continue work on Google Research Datasets, but to instead focus our efforts on other activities such as Google Scholar, our Research Programs, and publishing papers about research here at Google," he wrote.

Axing this scientific project could be another sign of incipient frugality at Google....

"The Space Telescope Science Institute has had a long positive relationship with Google that started with our partnership in GoogleSky in early 2006," said astrophysicist Alberto Conti of STSI. "We were looking forward to Google's commitment to helping the astronomical community with the data deluge...."

And Conti noted, other companies may step up to help scientists manage their information.  "Amazon is doing exactly the opposite and they might actually fill the void," he said.

Google representatives did not respond immediately to request for comment.

PS:  Also see our past posts on the Palimpsest project and our post on the ongoing Amazon alternative, Public Data Sets.

Update.  See Antony Williams' comments.


Friday, December 19, 2008

JISC launches a YouTube channel

JISC has launched a YouTube channel. Some of the videos already available discuss OA projects, such as the First World War Poetry Archive (see OAN post) and The Cabinet Papers, 1915-1977 (see OAN post). (Thanks to Klaus Graf.)

Open archives in Spain

Fernanda Peset and Antonia Ferrer, Implantación de la Open Archives Initiative en España, Information Research, December 2008. English abstract:

Introduction. The article surveys the situation in Spain of initiatives related to the Open Archives Initiative Protocol for Metadata Harvesting (OAI-PMH), which is used for the harvesting and dissemination of document metadata. Digital libraries based on this protocol are experiencing worldwide development, with institutional repositories built mainly by academic organizations.

Method. International inventories were consulted and their data were cross-checked through two types of surveys.

Analysis. The state of implementation of OAI-PMH in Spain is examined.

Conclusions. The situation in 2006 was encouraging in terms of the number of projects, but less so in terms of the amount of stored data. It is necessary to remedy the omissions and errors in the official international registers, and coordination measures are required in Spain to make the investments profitable. Finally, the lack of studies such as the one presented here hinders understanding of this area and therefore the application of indicators to assess the dissemination of science through open archives.
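
For readers new to the protocol: an OAI-PMH harvest is just an HTTP GET with a "verb" parameter, returning XML records that a harvester or portal can aggregate. Here is a minimal sketch in Python; the repository base URL is hypothetical, and a real harvester would also follow resumption tokens to page through large result sets.

    # Minimal OAI-PMH harvest, for illustration only.
    # The base URL is hypothetical; any OAI-PMH repository exposes the same
    # verbs (Identify, ListRecords, etc.) at its own endpoint.
    import urllib.request
    import xml.etree.ElementTree as ET

    BASE_URL = "https://repository.example.edu/oai"  # hypothetical endpoint
    request_url = BASE_URL + "?verb=ListRecords&metadataPrefix=oai_dc"

    ns = {
        "oai": "http://www.openarchives.org/OAI/2.0/",
        "dc": "http://purl.org/dc/elements/1.1/",
    }

    # Fetch one page of records and print the Dublin Core titles.
    with urllib.request.urlopen(request_url) as response:
        tree = ET.parse(response)

    for record in tree.iterfind(".//oai:record", ns):
        title = record.find(".//dc:title", ns)
        if title is not None:
            print(title.text)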

Thoughts on CC and the public domain

Mike Linksvayer, CC6+, Mike Linksvayer's blog, December 17, 2008.
... Retroactive copyright extension cripples the public domain, but there are relatively unexplored options for increasing the effective size of the public domain — instruments to increase certainty and findability of works in the public domain, to enable works not in the public domain to be effectively as close as possible, and to keep facts in the public domain. [Creative Commons] is pursuing all three projects, worldwide. I don’t think any other organization is placed to tackle all of these thorny problems comprehensively. ...

Study on licensing practices for OERs

ccLearn, What status for “open”? An examination of the licensing policies of open educational organizations and projects, report to the Hewlett Foundation, December 15, 2008. See also the summary on the Creative Commons blog.

SHERPA's Christmas card

SHERPA has posted a tongue-in-cheek repository-themed Christmas card.

Comment. A twelve-year embargo? That's awfully long.

Interview with Internet Archive's Brewster Kahle

Amy Van Vechten, Know It All, FLYP, December 12-22, 2008. A text/multimedia/video interview with Brewster Kahle of the Internet Archive. (Thanks to ResourceShelf.)

Obama's NOAA nominee and OA

U.S. President-elect Barack Obama has selected his nominee to head the National Oceanic and Atmospheric Administration, marine biologist Jane Lubchenco. (Thanks to Subbiah Arunachalam.)

Lubchenco has been involved in the debate about OA, and especially about open data, most notably during her tenure as president of the International Council for Science. At the World Summit on the Information Society in 2003, Lubchenco highlighted the importance of access to scientific data, but stopped short of calling for OA. But an editorial in Science, co-authored by Lubchenco and Shuichi Iwata, seemed to go further:

The World Summit on the Information Society ... provides an unprecedented opportunity for the scientific community to promote the importance of open access to scientific knowledge to world leaders ...

Although no one appears to be strongly opposed to the principle of open and equitable access to scientific data and knowledge, that value can easily be relegated to a secondary position relative to short-term commercial interests. Hence, it is crucial that the science community continue to promote the societal benefit of widely shared scientific knowledge. ...

P.S. NOAA is part of the Department of Commerce, one of the 11 departments and agencies that were covered by the proposed Federal Research Public Access Act, which would have required public access to research funded by those departments and agencies.

More on OA at Harvard

Corydon Ireland, Fair shows progress of humanities in digital world, Harvard University Gazette, December 18, 2008.

... In some cases, digital resources at Harvard have been accelerated or expanded because of a [Faculty of Arts and Sciences] vote in February on open access to scholarly literature. It requires Harvard faculty members to submit an electronic version of scholarly articles to the Provost’s office for an online repository, where they will be available free.

In June, Harvard plans to “open the repository to the world,” said Amy Brand, program manager in the Harvard University Library’s new Office for Scholarly Communication. “The nice thing about a digital archive, as opposed to a physical archive, is that we can make it visible and accessible to the wider world.”

Her office’s DASH repository — as in, Digital Access to Scholarship at Harvard — is now only available to members of the Harvard community as a beta test, she said.

But by this summer, DASH will be accessible to anyone worldwide through Google Scholar and other online indexing services.

In May, Harvard Law School (HLS) instituted a parallel open-access requirement for its faculty scholarship.

Harvard University Press (HUP) is also exploring open access, said Daniel Lee, director of digital content development. Early next year it will publish the first online issue of the Journal of Legal Analysis, a peer-reviewed, open-access journal sponsored by HLS. ...

Want to use the Dewey Decimal system in your IR? Get a license

Dorothea Salo calls attention to a mailing list thread where a DSpace user learns that the Dewey Decimal Classification is owned by OCLC and requires a license for use.

Videos from Free Culture 2008

A collection of video interviews from Free Culture 2008 (Berkeley, October 11-12, 2008) is now online. At least the interview with Lawrence Lessig mentions OA.

See also our past posts on the conference: 1, 2, 3, 4, 5.

Year in review from Internet Archive

The Internet Archive has posted its year-end summary:

... Highlights of the Year

  • One million searchable, downloadable books
  • 150 billion web pages
  • 250,000 audio recordings
  • 135,000 videos
  • 796 new collections
  • 23 million Book Records on the new OpenLibrary.org site
  • Scan-on-Demand partnership with Boston Public Library
  • 182 employees in 18 facilities in 5 countries scanning 1,000 books a day
  • 145 libraries contributing to the Open Content Alliance
  • 140,000 images and videos on the new partnership with NASA NASAimages.org ...

Google scanning at Mexico's National University

Gonzalo Clemente Lara Pacheco, Libros de la UNAM a través de Google, Crítica Bibliotecológica, December 2008. English abstract:
Within the framework of the agreement signed by UNAM and Google in 2007 to digitize the National University's collection of books, there needs to be a means of promoting UNAM's Biblioteca Digital (BiDi-UNAM), which has been digitizing its collection of bibliographic holdings dating from 1950 to the present. The early experiences of BiDi-UNAM, the use of its technical resources and staff, and its policy of open-access e-book production should be considered as a case study for the new digitization project (the 2007 agreement).

ALA paper to Obama transition calls for OA to publicly-funded research

American Library Association, Opening the "Window to a Larger World": Libraries' Role in Changing America, report to the Obama transition team, December 17, 2008.

In addition to sections on funding for library programs (including for OA projects) and copyright, see especially the section on government information:

  • Require all federal agencies to implement open access policies that emphasize the value of public access to their documents, procedures and other information; hold agencies accountable if they do not meet the “openness” standards and do not provide long-term access to their documents, especially in digital formats;
  • Promote transparency and openness in government, by supporting Government Printing Office (GPO), USA.gov and others that produce and provide access to government information; assure feasibility of access by including library representatives in planning and design of information products and online services; ...
  • In conjunction with federal agencies, including the Office of Management and Budget (OMB), GPO, National Archives and Records Administration (NARA) and the Library of Congress, and librarians serving the public, develop long-term plans for ongoing permanent public access of government information including “born digital” and legacy paper documents;
  • Increase funding and develop long-term plans to maintain digital repositories for research and discovery of all types;
  • Insist that all federally funded research reports be publicly accessible (ex. NIH Public Access Policy) ...

Presentations from SPARC repositories meeting

The presentations from the SPARC Digital Repositories Meeting (Baltimore, November 17-18, 2008) are now online.

See also our past posts on the meeting: 1, 2, 3, 4, 5, 6.

Nuclear medicine journal shortens its delayed OA embargo

The Journal of Nuclear Medicine, which previously provided OA to its content after a 12 month delay, has shortened the moving wall to 6 months. See the September press release from the Society of Nuclear Medicine. (Thanks to HealthTech Wire.)

DRIVER comment on the EU green paper

DRIVER has released its comment on the EU green paper, Copyright in the Knowledge Economy.  (Thanks to Birgit Schmidt.)  Excerpt:

...(19) Should the scientific and research community enter into licensing schemes with publishers in order to increase access to works for teaching or research purposes? Are there examples of successful licensing schemes enabling online use of works for teaching or research purposes?...

DRIVER’s mission is to expand its content base with high quality Open Access research output, including textual research papers and other scholarly publications. The DRIVER consortium therefore sees a number of reasons why the scientific and research community should aim to increase their influence on the development and adoption of alternative licensing schemes.

Over the last years several licensing schemes have been developed which provide free access to copyrighted work for readers....

DRIVER recommends European harmonization and promotion of the above licenses (MIT, NIH, CC, SURF,...) in order to raise awareness about the rights of the author and strengthen his position further....

DRIVER only endorses [hybrid OA journal] models when there is an obvious pay-off on the side of the [reduced] subscription costs. A “hybrid model”, where article processing fees would be paid on top of subscription fees, is not acceptable....

One forward-looking result of the [PEER study] project may be to establish a consistent interface between an increasing number of publishers and repositories. This would enable licensing schemes between publishers and institutions which include immediate or embargoed deposit in institutional repositories.

A strong connection between current research information systems (CRIS) and the deposit of peer reviewed manuscripts in repositories would also facilitate reporting and evaluation procedures as it provides access to the works and their metadata as well as usage and citation information. Incentives in the form of a greater citation impact and other bibliometric advantages for Open Access publications could be rewarded via the connection between IRs and CRIS systems, as voluntary deposit (without mandates or policies) is not scalable.

On October 20th, the European Commission announced the start of an Open Access Pilot for research funded in the FP7 programme, in which researchers from different disciplines are presented with the clause of mandatory self-archiving. DRIVER fully endorses this initiative, since it mandates self-archiving and further diffuses the development of repositories. DRIVER could even take up a role as a ‘motor’ for implementation, also through strategic relations with SPARC....

MacArthur Foundation adopts an OA mandate

The MacArthur Foundation adopted a research access policy, which took effect on September 18, 2008.  (Thanks to Donna Okubo.)  Excerpt:

...The Foundation's policy is to ensure that the Grant Work Product furthers charitable purposes and benefits the public. To that end, the Foundation seeks prompt and broad dissemination of the Grant Work Product at minimal cost or, when justified, at a reasonable cost.

The Foundation encourages openness in research and freedom of access to underlying data by persons with a serious interest in the research. Grantees are also encouraged to explore opportunities to use existing and emerging internet distribution models and, when appropriate, open access journals, Creative Commons license or similar mechanisms that result in broad access for the interested field and public.

The Foundation recognizes there may be circumstances where limited or delayed dissemination of Grant Work Product or limited access to data may be appropriate to protect legitimate interests of the grantee, other funders, principal investigators or participants in research studies. Such circumstances will be evaluated on a case-by-case basis.

Intellectual property rights (including copyright and patent rights) should not be used to limit or deny access to the Grant Work Product, to result in exclusive use of such Grant Work Product, or to create revenue that is not used for charitable purposes. While copyright to the Grant Work Product will ordinarily remain with the grantee, the Foundation will require that it be granted a no-cost assignable license to use or publish the Grant Work Product. The Foundation will exercise the license only if the grantee does not or cannot provide for broad and prompt dissemination consistent with this Policy. The Foundation may forego a license if the Foundation is reasonably satisfied that other appropriate arrangements will be implemented that will assure prompt public dissemination of the Grant Work Product.

Comment.  Kudos to all involved.  While the Foundation uses the language of encouragement, the policy operates more like a mandate with case-by-case exceptions.  I like the enforcement mechanism:  a license for each Grant Work Product, which the Foundation will use when grantees do not provide sufficiently open or early access on their own.  I also like the way the policy applies to "Grant Work Products" without restriction --hence, covering data as well as publications.

Moore Foundation adopts an open data mandate

The Gordon and Betty Moore Foundation adopted an open data policy on September 18, 2008.  (Thanks to Donna Okubo.)  Excerpt:

The Gordon and Betty Moore Foundation’s (GBMF) goals of scientific advancement, environmental conservation, and healthcare improvement will best be served through a culture of open access to data. It is our philosophy that:

  • All data used in or developed in whole or in part by GBMF funded projects (and that can be shared in a manner consistent with applicable laws) will be made widely available and freely shared as soon as possible[1]. If data used in GBMF-funded projects are owned by an additional party other than the grantee, GBMF does not require it to be released, but the grantee will use its best efforts to encourage the data owners to make it openly and freely available.
  • Data are shared with full and proper attribution to the data provider.
  • Data developed in whole or in part by GBMF grant funding are the property of the grantee unless otherwise specified. The grantee may protect its property through patent, copyright and/or other intellectual property protection instruments, except that it may not impede the effective access and use of the data by the public.
  • GBMF is not responsible for any liabilities associated with errors in the data or misrepresentations or misinterpretations of publicly available data.
  • GBMF supports grant funding for costs associated with data sharing and open access publication of scientific findings, where appropriate.
  • GBMF and prospective grantees will jointly develop a Data Management and Sharing Plan prior to the finalization of a grant agreement.

The Data Sharing Philosophy applies to all activities that are financially supported in whole or in part by GBMF....

As part of the GBMF grant development process, potential grantees are required to develop a Data Management and Sharing Plan (the Plan) with their GBMF program officer....

[1].  Examples of when data should be released: For data created for scientific and environmental conservation purposes, six months from the time of collection, defined as the time when data enters an electronic database....Date of acceptance for publication of the main findings....For DNA sequence data, public release (as defined by submission to an appropriate public database) must occur not more than six months after “completion” (defined specifically in Grant) of the DNA sequence determination.

Comment.  Kudos for this strong policy.  Note that it only applies to data, not to peer-reviewed articles (except for the offer to pay publication fees).  But I hope the Foundation will consider extending it to cover peer-reviewed articles as well.  Note too that it does not require the data to be assigned to the public domain, as Science Commons would.  While I support the SC approach, the Moore approach is a reasonable second-best:  letting grantees hold whatever IP rights in their data the law allows, but not letting them use those rights to impede effective public access.  That may take some refinement in practice.  For example, does an attribution license impede effective public access for a collaborative, public dataset with thousands of contributors?

EU funding for OA projects in 2009

The EU's Information and Communication Technologies Policy Support Programme (ICT PSP) has released its Draft Work Programme 2009.  If the EC approves the draft in January, then it should open a call for proposals from January 29 to June 2, 2009.  According to the draft, one thread of the new funding program is devoted to OA.  Excerpt:

Objective 2.4: Open access to scientific information

Funding instrument: Pilot B. It is intended to support several actions.

Focus and outcomes:

Within the framework of the actions on scientific information initiated by the Communication on scientific information in the digital age the objective is to improve the spread of European research results. This objective is sought not only in the context of the Digital Libraries Initiative, but also within the 7th Framework Programme for Research and Technological Development (FP7).

The term "scientific/scholarly content" refers to the results of scientists' or scholars' research work in the EU Member States or other countries participating in the programme, which in the traditional publishing paradigm have been published as articles in learned journals, papers, conference proceedings, monographs or books.

In this context pilot B actions will be funded to carry out conclusive experiments with open access to digital libraries of scientific/scholarly content, including experiments exploring new paradigms for peer reviewing, rendering, querying and linking scientific/scholarly content and (optionally) the related underlying datasets.

Conditions and characteristics:

  • The quality and quantity of the digital content (and related metadata) to be effectively contributed to the pilot by each content provider, as well as the criteria for its selection, must be clearly identified. The consortium and its members must ensure the necessary availability of the content to be contributed to the pilot. In particular, the input content should not depend on proprietary third-party rights or any other constraints, which would limit its use for the execution of the pilot.
  • The consortium and its members must agree on the necessary licensing or clearing arrangements for any Intellectual Property Rights (IPR) arising from the pilot to ensure wider use and dissemination of the project output.
  • The actual content should be accessible and retrievable at item level. Projects dealing only with catalogues of content will not be funded. The results of the pilot must be accessible by the target users beyond the end of the pilot.
  • The consortium must include content providers. It should also include or involve explicitly in the project different types of relevant stakeholders, i.e. academic community, libraries, institutional repositories, scientific publishers and the funding bodies.
  • The users, i.e. researchers, and their needs, also beyond the consortium participants, must be clearly identified. Proposers must present an analysis of demand based as much as possible on quantified evidence. The users and their needs should also be at the centre of the proposed approach.
  • The issues addressed and the way to tackle them should have a European dimension, i.e. they should impact on a large number of users in the largest possible number of EU countries.
  • Proposers should demonstrate that the underlying content constitutes the critical mass necessary to make a significant impact in terms of increasing access and use in the concerned area or that the experiments exploring new paradigms can have a considerable impact on the future development of the scientific information area.
  • Specific and realistic quantified indicators should be provided to measure the envisaged improvements in availability, access and use at different stages in the pilot lifetime.
  • A clear exploitation plan should be presented to ensure the sustainability of the proposed solutions, i.e. their capability of developing and surviving without Community funding after the end of the project. Sustainability comprises both economic and organisational aspects.
  • A clear dissemination plan should be established to ensure optimal use of the pilot results, also beyond the participants in the pilot.

Expected impact

Open access to more scientific/scholarly content and/or the development of new ways to review, render, query and link scientific/scholarly content.


Thursday, December 18, 2008

Year in review from Fedora

Fedora Commons has posted two year-end summaries on the progress of the free and open repository software project:

Will the "database right" block OA archiving?

A German court has ruled that journal editors may acquire a copyright in the collections they assemble, even if they don't hold copyrights on the individual articles.  Hence, if an OA repository contains copies of many or all of the same articles, arranged in the same systematic or methodical way that the journal arranged them, then it could violate the editor's right.

Thanks to Klaus Graf for his blog post on the decision, in German or Google's English.  Thanks as well to Sebastian Krujatz (specialist in German IP law at the Max Planck Institute for Intellectual Property, Competition and Tax Law) for helping me make my short summary accurate.

Comment.  This decision shouldn't stop routine self-archiving in Germany.  But it could be an obstacle as funder and university policies drive the rates of OA archiving toward 100%.  To use the terms of US law, it's another good reason to oppose the "database right", which gives quasi-copyright protection to the labor of collecting information together, even when the information is in the public domain or under copyright to someone else.

OA was a hot topic in 2008

The Chronicle of Higher Education's blog post on the Harvard OA mandate (February 12, 2008) was Number 3 of its Top 10 posts of 2008, based on reader traffic.

New OA database on trans-Atlantic slave trade

Voyages: The Trans-Atlantic Slave Trade Database is a new OA database sponsored by Emory University, the U.S. National Endowment for the Humanities, and the W.E.B. DuBois Institute at Harvard University. See the December 5 press release from Emory or coverage by the Associated Press. (Thanks to ResourceShelf.)

The site provides data on almost 35,000 trans-Atlantic slave-trading voyages, maps, images, data on some individual Africans transported, and educational resources. The bulk of the dataset comes from data-compilation projects of the 1990s, first published on CD-ROM in 1999.

For more information on the project, see this site from Emory's Metascholar Initiative and this post by Elisabeth Grant on the American Historical Association's blog.

See also our past post on the project.

eIFL comment on the EU green paper

eIFL.net has released its comment on the EU green paper, Copyright in the Knowledge Economy.  Excerpt:

...(19) Should the scientific and research community enter into licensing schemes with publishers in order to increase access to works for teaching or research purposes? Are there examples of successful licensing schemes enabling online use of works for teaching or research purposes?

Article 5(3)n of [Directive 2001/29/EC, known as the Information Society Directive] permits copying for illustration for teaching or scientific research, so a licence for this purpose is not necessary.

Universities and other institutions expend huge resources in terms of time and cost entering into licensing schemes with commercial publishers to access electronic works. Dissatisfaction with pricing models, standard publisher licences and the unequal position of the parties in licence negotiations has led towards other ways of disseminating research, science and educational materials to the public. The scientific and research community should continue to develop the two complementary strategies suggested by the Budapest Open Access Initiative: self-archiving (depositing refereed journal articles in open electronic archives) and open access (OA) journals. Open access journals use copyright and other mechanisms to ensure permanent open access to published articles. Because price is a barrier to access, OA journals don’t charge subscription or access fees. Open Access has already permanently changed the field of scholarly communication. It is under discussion by governments; some publishers, including subscription-based publishers, are experimenting with and adopting the OA model; and it is mandated by funding bodies and universities throughout the world. We believe that this is the best route to increase access to works.

Licences used for OA works provide examples of successful licensing schemes enabling online use of works for teaching, research purposes and much more. Open Access works are licensed to the public for a wide range of uses including to read, download, copy, distribute, print, search, or link to the full texts of the articles. One example of a licensing scheme that supports the ability to disseminate and reuse works is Creative Commons, ported to fifty countries including eighteen EU countries.

eIFL encourages the Commission to support the broad goals of open access and to encourage the adoption of open content licences to maximise the visibility and reuse of research outputs for the benefit of all. We believe that wider dissemination of knowledge contributes to more inclusive and cohesive societies, fostering equality of opportunities in line with the priorities of the forthcoming renewed Social Agenda....

The new way to report research output

The first version of the 2007 Research Output report from the University of Pretoria listed the faculty publications but didn't link to OA editions in UPSpace, the Pretoria IR.  That's fixed now, thanks to the UP Library Services and the Department of Research Support.  From Monica Hammes:

...A team of 19 staff members and students under the leadership of Elsabe Olivier upgraded the research data by [linking article citations to their full text in the institutional repository,] adding the URLs of available articles, correcting metadata and adding additional records.

56% of the articles in the 2007 Research Report now link directly to the full text adding to the visibility of this very important component of the University's research output. The project clearly indicates that collaboration with the faculties and the Department of Research Support can lead to more effective and less cumbersome research reporting. In this way the repository becomes a valuable part of the research infrastructure.

This project was the main goal of the Library's Open Scholarship programme for 2008. Read more about the Open Scholarship programme and share your views on the openUP wiki....

More on the OA impact advantage

Michael Norris, Charles Oppenheim, Fytton Rowland, The citation advantage of open-access articles, Journal of the American Society for Information Science and Technology, July 9, 2008.  (Thanks to Repository News, esp. because I check the new issues of JASIST and somehow missed this article when it came out.) 

Abstract:   Four subjects - ecology, applied mathematics, sociology, and economics - were selected to assess whether there is a citation advantage between journal articles that have an open-access (OA) version on the Internet compared to those articles that are exclusively toll access (TA). Citations were counted using the Web of Science, and the OA status of articles was determined by searching OAIster, OpenDOAR, Google, and Google Scholar. Of a sample of 4,633 articles examined, 2,280 (49%) were OA and had a mean citation count of 9.04 whereas the mean for TA articles was 5.76. There appears to be a clear citation advantage for those articles that are OA as opposed to those that are TA. This advantage, however, varies between disciplines, with sociology having the highest citation advantage, but the lowest number of OA articles, from the sample taken, and ecology having the highest individual citation count for OA articles, but the smallest citation advantage. Tests of correlation or association between OA status and a number of variables were generally found to be weak or inconsistent. The cause of this citation advantage has not been determined.
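
For a rough sense of scale, those sample means imply that the OA articles drew about 57% more citations on average than the TA articles, though the authors stress that the advantage varies by discipline and that causation has not been established. A quick back-of-the-envelope reading:

    \frac{\bar{c}_{OA}}{\bar{c}_{TA}} = \frac{9.04}{5.76} \approx 1.57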

Expanding OA for Indian agricultural research

G. Aneeja and Sridhar Gutam, Prospects of Open Access to Indian Agricultural Research: a case study of ICAR, a presentation at the 8th Indian Science Communication Congress (ISCC-2008), December 10-14, 2008, Chennai, India.

Abstract:   Historically, agricultural research and education in India have been in the public domain. The Indian Council of Agricultural Research (ICAR) was established as an apex organization for coordinating research among institutions and promoting agricultural research in the country, and funds for public-sector research institutes were channeled through ICAR from the central government. Journal publishing for the dissemination of research output in India has long been primarily a publicly funded activity, carried out mostly by government agencies. In agricultural research, the journals are published by ICAR and by the respective professional societies, many of which receive partial financial assistance from ICAR. Each agricultural discipline has at least one professional society, some disciplines have more than one, and each society publishes a peer-reviewed research journal. Though many of these journals are covered by international indexing services such as CAB Abstracts, their full-text database services are very poor, and many are not even in the ISI Master Journal List for impact factor or science citation index analysis. The main objective of an author is greater impact, visibility and readership for his or her article. These journals publish quality articles after a stringent peer-review process, but the time lag from submission to publication of an article, or to production of an issue, is long. There are instances where articles sent for review were not returned for various reasons. The infrastructure for publishing online is also not available for these journals. Recently, a portal has started providing free online access to some journals published by professional societies. Under the National Agriculture Innovation Project (NAIP), ICAR is investing a considerable amount in making some non-free online journals available along with all open access journals. Now the time has come to think about wider reach, without restrictions, along the lines of Open Access and Open Archiving of research publications. This paper discusses the prospects of open access for the ICAR organization. Open source software is available for transforming traditional journals into open access journals and for establishing open online repositories for archiving research articles, which can be harvested by search engines and made available to users, thereby making agricultural research knowledge available to everyone online. This will increase the visibility of research output, eventually help these journals earn a good impact factor, and further support the target community as well as the extension system directly, without any time lapse.

Seizing the opportunity to solve the serials crisis

Glenn S. McGuigan and Robert D. Russell, The Business of Academic Publishing: A Strategic Analysis of the Academic Journal Publishing Industry and its Impact on the Future of Scholarly Publishing, Electronic Journal of Academic and Special Librarianship, Winter 2008.

Abstract:   Academic libraries cannot pay the regularly escalating subscription prices for scholarly journals.  These libraries face a crisis that has continued for many years revealing a commercial system that supports a business model that has become unsustainable.  This paper examines the “serials crisis,” as it has come to be known, and the economics of the academic journal publishing industry.  By identifying trends within the industry, an analysis of the industry is undertaken using elements of the five forces framework developed by Michael Porter.  Prescriptions are offered concerning what can be done and what should be done to address this problem.

From the conclusion:

An analysis of the academic publishing industry indicates that the industry presents both threats and opportunities for academic libraries.  Within the current business model, bargaining power of academic libraries as buyers is weak.  Similarly, the bargaining power of faculty/scholars as suppliers of intellectual property is weak.  The industry is highly concentrated with three for-profit publishers controlling the distribution of many journals including the largest and most prestigious.  These factors contribute to an industry environment where the commercial publishers are able to increase prices due to the lack of alternative sources for the distribution of intellectual content held within academic journals.

The approach of analyzing the industry through a business perspective is important so that a clearer understanding of the industry landscape can be drawn.  This project will hopefully contribute to the public discourse that is taking place regarding the current business model of academic publishing and scholarly communication.  Based upon this analysis, the business model is no longer sustainable.  The authors are hopeful, however, that change in the academic journal industry business model is possible, but it will not take place unless academic libraries pursue strategies similar to those outlined here.  This includes the creation of large coalitions or consortia to aggressively negotiate with the journal publishers as a buyer group as well as the facilitation of alternative methods of scholarly publishing through OA initiatives such as those advocated by SPARC.  What is critical is that academic libraries must act and use technology to begin the process of change immediately.  The “serials crisis” has created an opportunity for change.  In an analysis of the scientific and academic publishing industry, the Wellcome Trust, which funds many research activities, emphasizes the fact that the existence of this crisis does not mean that change will happen:

The existence of the means to create significant change does not mean that change will occur.  The fact that electronic media exist has implications for the market.  It is up to the players in the market to decide how they will use the means at their disposal.  The dominance of the commercial publishers will be challenged only if the other players use the opportunities available to them.

Wednesday, December 17, 2008

Update to geodata portal

Go-Geo!, the JISC-funded OA database of geospatial data, has been updated.

... Our new international gateway is available via a link from the home page to other international geoportals. Alternatively, browse records describing international geospatial data created by UK academics.

My Go-Geo! now allows registered users of the service to bookmark their favourite web resources, manage their metadata news subscriptions and directly access all metadata records listed within their private institutional nodes.

The metadata editor tool has a new name – GeoDoc – and an easily recognisable logo. To complement our new international gateway, users can now use GeoDoc to document their international geospatial datasets. ...

See also our past post on Go-Geo!.

Flickr Commons growing; New York Public Library joins

The New York Public Library has become the most recent participant in the Flickr Commons, with an initial set of 1,300 public domain images. See the announcement or the blog post at Flickr.

We previously posted on participation in Flickr Commons by the Library of Congress (1, 2, 3), the Powerhouse Museum, the Smithsonian Institution and the Brooklyn Museum, the George Eastman House and the Bibliothèque de Toulouse. We also mentioned that the Biblioteca de Arte-Fundação Calouste Gulbenkian was posting selections on Flickr, but not as part of the Flickr Commons and not tagged as being in the public domain. Since then, the Gulbenkian has joined Flickr Commons (although the images are still tagged with a CC license, not as being in the public domain). Other subsequent participants:

Collaborative Drug Discovery integrates ChemSpider

Collaborative Drug Discovery has integrated the ChemSpider database. See the blog posts by Collaborative Drug Discovery or by ChemSpider.

See also our past posts on Collaborative Drug Discovery or ChemSpider.

Update to OA dictionary file for chemistry

Adam Azman has released a new version of his Chemistry Dictionary for Word Processors. With data from ChemSpider, the dictionary has grown from 18,000 words to more than 100,000. The dictionary is licensed under the Creative Commons Attribution License and is compatible with Microsoft Office and OpenOffice.org.

See also our post on the earlier release.

More on the St Gallen digitization project

Jessica Dacey, Cataloguing the Middle Ages in cyberspace, swissinfo.ch, November 14, 2008. On the St. Gallen Abbey digitization project and the Digital Abbey Library of St. Gallen.

See also our past posts on the project: 1, 2.

Self-archiving in dermatology

Robert Dellavalle, et al., Self-archiving dermatology articles, Journal of the American Academy of Dermatology 59(6), 2008. Self-archived December 16, 2008. Abstract:
Discusses the merits of depositing medical journal articles in open repositories.

Wilbanks presentation on data sharing

John Wilbanks, Data Sharing and Science: Legal, Normative, and Social Issues, presented at the American Geophysical Union Fall Meeting (San Francisco, December 15-19, 2008).

See also his notes on the conference.

See also our past post of the abstract.

New OA portal of health resources

Health Sciences Online is a recently-launched OA portal of OA health resources. From the about page:

... HSO provides free, online linkages to a comprehensive collection of top-quality courses and references in medicine, public health, pharmacy, dentistry, nursing, basic sciences, and other health sciences disciplines. These materials are donated, hosted, and maintained by our distinguished content partners ...

HSO is a sieve that includes world-class materials (currently numbering >50,000 resources), hand-selected by clinicians and other experts from already-existing reliable sources and resource collections. This includes medical specialty societies, accredited continuing education organizations, governments, and top-ranked universities ...

Founding collaborators for this site include [the U.S. Centers for Disease Control and Prevention], World Bank, the American College of Preventive Medicine, and the University of British Columbia. HSO is non-profit; funding has been obtained from the Canadian and British Columbian governments, the World Health Organization, NATO’s Science for Peace Program, the Annenberg Physician Training Program, the Ulrich and Ruth Frank Foundation for International Health, and other generous and committed individuals. Health providers and scientists, in training and in practice, have donated thousands of hours, identifying and making materials accessible for HSO users. ...

See also the press release, or the press release by the search technology provider, Vivisimo.

Paul Allen on the Allen Brain Atlas

Paul Allen, Piece of mind, The Economist, November 19, 2008. An op-ed. (Thanks to Gloria Tavera.)

The mystery of how the brain works is the most compelling question in science. ...

So six years ago I brought together a group of leading neuroscientists to find the basis for an approach that could advance the entire field of brain research. It was clear there needed to be a comprehensive database of information on where genes are turned on (or expressed) in the mouse brain—a map, or atlas, of the brain’s frontiers that would provide more encyclopedic information than any individual lab could afford to generate.

It seemed achievable. With the help of several noted researchers, I founded the Allen Institute for Brain Science in 2003 to undertake this project. Three years later, the institute had completed an atlas of gene expression in the mouse brain. ...

They released it to the public. Over the internet. Free. ...

Today we have many scientists using the atlas for their research into Alzheimer’s, bipolar disorders, Down’s syndrome, Parkinson’s, fragile x mental retardation and epilepsy. The atlas is also giving scientists insight into alcoholism, obesity, sleep, hearing and memory.

The greatest testament to what we did was that researchers of spinal-cord diseases, trauma and disorders approached the institute and asked us to create a spinal-cord atlas, which is now close to completion. We will launch the first phase of a human-brain atlas, a four-year project, in 2010. ...

Clearly the model of providing a freely accessible database is a successful one. In a sense, we have challenged other researchers to offer greater access to their findings. Will they take the challenge? My bet is that over the next 18 months we are going to see more open access and more collaboration. ...

See also our past posts on the Allen Brain Atlas.

Let's do for energy literature what we do for medical literature

David Wojick, We need a National Library of Energy to rapidly deploy transformative energy innovations, OSTI Blog, December 15, 2008.  Excerpt:

...The goal of energy transformation can take a lesson from America's rapid deployment of innovative medical technologies. It may take a long time to get a new drug or device approved, but once this happens the deployment is very rapid. America's spectacular success in fielding new medical technologies is anchored in the innovative Web resources of the National Library of Medicine. For a modest $350 million a year NLM supplies vast amounts of innovation information to America's scientists, doctors and consumers.

The need to know about energy innovations is equally vast, but there is no engine of change comparable to NLM. Scientists, engineers, product manufacturers, retailers, consumers, even children, have a need to know about new energy technologies. Everyone from scientist to second grader, because energy technology is far more pervasive than medical technology. Information is also an antidote to hype. But there is no National Library of Energy to do for energy innovation what NLM does for medicine.

The Department of Energy does have a miniature version of NLM, in its Office of Scientific and Technical Information or OSTI. In fact OSTI has been a pioneer in developing new ways to deliver energy information via the Web. But OSTI's budget is a modest $9 million, compared to NLM's $350 million. So DOE does just 3% of what NLM does.

What we need is a National Library of Energy, at a scale comparable to the National Library of Medicine, doing the job of deploying energy innovation.

Comment.  Hear, hear.  I made a related argument in SOAN last month.  America urgently needs green energy, green technology, and green jobs.  To maximize our chances of finding effective solutions quickly, we need an ambitious federal program of green research.  To maximize the beneficial impact of that research, we need OA. 

Portugal launches central repository to harvest national IRs

Portugal's publicly-funded Repositório Científico de Acesso Aberto de Portugal (RCAAP) was officially launched yesterday at the 3ª Conferência sobre o Acesso Livre ao Conhecimento (Minho, December 15-16, 2008).  RCAAP harvests content from 10 institutional repositories from around the country.

For more details, see the RCAAP about page, in Portuguese or Google's English, or yesterday's article in Tek, in Portuguese or Google's English.  Also see Ricardo Vidal's blog post (in English).  Excerpt:

...[RCAAP], which is currently indexing over 13,091 documents from 10 repositories, has been announced as a project funded by the Knowledge Society Agency (UMIC) and will be technically maintained by the National Scientific Computations Foundation (FCCN)....

The 10 repositories that are currently contributing to this main centralized repository are mainly university DSpace-based repositories or similar....

TA journal article + Wikipedia summary

Declan Butler, Publish in Wikipedia or perish, Nature News, December 16, 2008.  Free for a week before it moves behind a paywall (like all Nature News stories).  Excerpt:

Wikipedia, meet RNA. Anyone submitting to a section of the journal RNA Biology [published by Landes Bioscience] will, in the future, be required to also submit a Wikipedia page that summarizes the work. The journal will then peer review the page before publishing it in Wikipedia.

The initiative is a collaboration between the journal and the RNA family database (Rfam) consortium led by the UK Wellcome Trust Sanger Institute.... "The novelty is that for the first time it creates a link between Wikipedia and traditional journal publishing, with its peer-review element," says Alex Bateman, who co-heads the Rfam database. The aim, Bateman says, is to boost the quality of the scientific content on Wikipedia while using the entries to update the Sanger database....

The first paper scheduled is "A Survey of Nematode SmY RNAs"; its corresponding Wikipedia summary can be found here.

The goal is to encourage more scientists who work on RNA to get involved in creating and updating public data on RNA families, while being rewarded by the traditional method of a citable publication, says Sean Eddy, a computational biologist at the Janelia Farm Research Campus of the Howard Hughes Medical Institute in Ashburn, Virginia, and a co-author of the nematode article....

Comments

  • Very interesting.  This policy helps the journal (by spreading the word about new work) and Wikipedia (by adding high-quality contributions written for a lay audience).  It uses the fact that Wikipedia aspires to be the "sum of all human knowledge" --ambitious but still just a summary-- and does not allow original research.
  • This policy doesn't replace OA archiving for the peer-reviewed manuscript, and it doesn't have to.  A journal could easily do both.  Unfortunately, however, Landes Bioscience (publisher of RNA Biology) does not do both.  It does not permit postprint archiving.  OA Wikipedia summaries are a step up from OA abstracts, which most TA journals provide nowadays, but not much more than that.  They're a step up because Wikipedia summaries are longer than abstracts and open to user edits.  But access to summaries is no substitute for access to full texts --which, btw, is the problem with some publisher-suggested alternatives (1, 2, 3) to the NIH policy.
  • Compare the RNA Biology policy with the Emerald Asset (Accessible Scholarship Shared in an Electronic Environment) pilot project launched by Emerald in January 2007.  Emerald offered to waive the publication fees at its hybrid OA journals for authors willing to write "a summary of their research findings highlighting their practical application".  This not only gave readers the benefit of an OA summary, but gave them the benefit of the OA full-text article as well.  At the same time, it removed the publication fee, which lowers author uptake at most hybrid OA journals.

Update (1/13/09). Also see Kent Anderson's comment and the discussion it generated on the Scholarly Kitchen blog.


Tuesday, December 16, 2008

OA proposal for Obama administration now in top 12

The proposal to require OA for publicly-funded research has climbed to the 12th spot on Obama CTO, the unofficial web site collecting recommendations for the Obama administration, up one rank from last week.

The OA idea was posted to the site on November 15, and broke into the top 25 on December 4.  Keep spreading the word:  ranks are determined by user votes.

Introducing the Open Knowledge Commons

A webcast of Maura Marx's talk at Harvard today, The Open Knowledge Commons, is now available for downloading.  From the blurb:

This talk with Maura Marx, Executive Director of the Open Knowledge Commons, will introduce the OKC, a new organization born out of the Open Content Alliance and dedicated to advocacy for and development of an open digital library of human knowledge. In funding start-up operations for the Knowledge Commons the Alfred P. Sloan Foundation recognized the need to broaden the community's effectiveness beyond the research and development phase of mass digitization and to create a sustainable open digital library for scholarly and public use. This will be a very interactive session; comments and feedback on the OKC agenda will be encouraged and welcomed.

Maura most recently founded the Digital Library Program at the Boston Public Library and was instrumental in securing the Library's support of Open Content principles. She began her career in Europe doing development work in cultural heritage institutions and later worked in the U.S. technology sector before coming to libraries and the open content movement. She holds a B.A. in German from the University of Notre Dame, an M.A. in Italian from Middlebury College and an M.S.L.I.S. from Simmons College.

September 11 museum implements CC licensing

The National September 11 Memorial & Museum launched its Artists Registry on December 5. The Registry is an OA virtual gallery which allows artists to upload their work created in response to 9/11. The registry implements Creative Commons licenses, allowing artists to choose a CC license for the work they upload. See the press release. (Thanks to Creative Commons.)

50 years of French parliamentary debates go OA

On December 9, the Archives of the French National Assembly posted 350,000 pages of parliamentary records dating back to 1958. See the press release. (Thanks to europa-eu-audience.)

Norwegian report recommends OA to PSI

Petter Bae Brandtzæg and Marika Lüders, eCitizen2.0: The ordinary citizen as a supplier of public-sector information, report for the Norwegian Ministry of Government Administration and Reform, undated but recent. See also the November 4 press release. (Thanks to europa-eu-audience.) From the executive summary:

... This report offers a survey of national and international trends, in addition to empirical facts regarding how people today use new services for spreading and sharing information. The results indicate that central principles of state information policy will have to be modified. ... The public sector and eGov need to a greater extent to take as their point of departure the fact that the ordinary citizen herself is capable of acting as a supplier of public-sector information and communication.

An important problem, however, is the lack of openness and access to public-sector data. Openness and easy access to public-sector data are essential if these are to be re-usable and be used in other contexts. ...

Therefore, the authorities should take the following trends into account: ...

3. A culture of sharing among citizens
4. Collective intelligence and the knowledge of the masses ...
8. Greater openness and access to information
9. eGovgeeks who develop user-generated information services based on data from the public sector in combination with other information and services ...

There are clear indications that the production and consumption of user-generated content is bound to increase. It may therefore be appropriate for the public sector to start to cooperate with private developers (i.e. eGovgeeks) of user-generated services in order to help ensure that their content and data will be of as high quality as possible for the general public.

  • Moving information and communication efforts from traditional information producers in the public sector to the general public could also have several other positive effects, such as: ...
  • More openness on the part of the authorities, because public-sector information – research results, accounts, map data and measurements - can be made more accessible to the individual citizen. ...

A sharper focus on the citizens themselves and participant information generators could be obtained by means of the following measures:

  • WikiNorge: Treating citizens as partners rather than as mere passive recipients of information: A radical proposal in this respect is that the state should set up Wiki.Norge.no; a «Wikipedia» for public-sector information, where citizens themselves could informally contribute and edit all imaginable sorts of public-sector information ...
  • Openness: Public-sector information must be made freely available and reusable on the Internet. The authorities must open up and make available their own information, so that citizens and private operators/developers (eGovgeeks) can utilise, publish and share such information in new forms and contexts. This will require common and/or standardised publishing solutions for national and local authorities. ...

Wilbanks video, audio, essay

Three new items from Science Commons' John Wilbanks: (Thanks to Science Commons.)

Willinsky on OA and PKP

John Willinsky, Open Access and the Public Knowledge Project, presented at The Scholarly and Public Quality of Research: Why Open Access Matters (Sydney, December 8, 2008). An audio recording. (Thanks to the Public Knowledge Project.)

NIH adds asthma data to dbGaP

NIH Expands Open-Access Dataset of Genetic and Clinical Data to Include Asthma, press release, December 15, 2008.

The National Heart, Lung, and Blood Institute (NHLBI) of the National Institutes of Health has expanded its collection of genetic and clinical data first made freely available to researchers worldwide last year, to include clinical and genetic information collected from three asthma research networks. ...

SHARP [SHARe-Asthma Resource Project] includes data on 2,332 people with asthma and 805 families whose DNA was tested for 1 million genetic variations. In addition, clinical data gathered during asthma clinical trials, such as lung function, allergy status, and respiratory symptoms are included in the database. In this way, SHARP will permit researchers to relate study participants' genetic variations to their clinical and laboratory test results, thereby enabling future discoveries of links between genes and health for asthma and other airway diseases. To protect the confidentiality of study participants who agreed to share their medical data, the database does not include any personal information. ...

The three asthma clinical research networks providing data are the Childhood Asthma Management Program (CAMP), the Childhood Asthma Research and Education Network (CARE), and the Asthma Clinical Research Network (ACRN) — all funded by NHLBI. For more than 10 years, these networks have been major sources of information about the best practices in asthma care, translating and developing new knowledge for patients and physicians. ...

SHARP data is accessed through dbGaP, or the database of Genotypes and Phenotypes, a Web-based resource for archiving and distributing data from genome-wide association studies (GWAS). ...

Individual-level data can be used only by authorized investigators who meet requirements for access outlined in the NIH GWAS policy. Researchers are prohibited from redistributing data or trying to determine the identity of participants. ...

See also our past posts on dbGaP.

LJ editorializes against Google settlement

Francine Fialkoff, Google Deal or Rip-Off? Librarians need to protect the public interest, Library Journal, December 15, 2008.  An editorial.  Excerpt:

One public access terminal per public library building. Institutional database subscriptions for academic and public libraries that secure once freely available material in a contractual lockbox, which librarians already know too well from costly e-journal and e-reference database deals. No remote access for public libraries without approval from the publisher/author Book Rights Registry, set up to administer the program. And no copying or pasting from that institutional database, though you can print pages for a fee. Of course, you can always purchase the book, too.

Those are just a few of the choice tidbits from the 200-page settlement in the Association of American Publishers (AAP) and Authors Guild three-year-old suit against Google, drawn from Jonathan Band's “Guide for the Perplexed: Libraries and the Google Library Project Settlement.” Band's report was commissioned by the American Library Association and the Association of Research Libraries....

[T]he suit was never about the public interest but about corporate interests, and librarians did not have much power at the bargaining table, no matter how hard those consulted pushed. While there are many provisions in the document that specify what libraries can and can't do and portend greater access, ultimately, it is the restrictions that scream out at us from the miasma of details.

Even the libraries that were initial partners (or those that become partners) in the Google scan plan don't fare well. They get a single digital copy of each book from their collection—mind you, they've paid for these books already—and can print a replacement copy only if a new copy isn't available at a “fair price.” They can allow faculty and students to “read, print, download, or otherwise use five pages of any book in its LDC” (library digital copy set) for books “not commercially available,” but they can't use the LDC for interlibrary loan or e-reserves. There are all kinds of potentially costly, nightmarish administrative minutiae, including a security plan and annual audits of usage and security.

The restrictions were obviously too much for one of the original five Google partners, Harvard University Library (HUL), which criticized the settlement. Robert Darnton, the HUL director, said the deal had “too many potential limitations on access to and use of books” for academia and public libraries and questioned what the price for access would be, given that “the subscription service will have no real competitors.” ...

Obviously, books are moving online, but librarians need to ensure that the principles of borrowing and sharing are expanded in a digital world, not circumscribed by contract law or available only to those who can pay....[L]ibrarians must do better than to acquiesce in an arrangement that relinquishes ownership of books online in favor of contractual provisions and for-pay schemes that subvert the ideals of the public library and academic inquiry.

Related.  Also see the comments of the Open Content Alliance, A Raw Deal for Libraries (blogged here 12/11/08).

More on the debate over the Google settlement

Andrea Foster, Google Settlement to Pay Lawyers Up Front, Authors Eventually:  Opinions on Settlement Differ, But Negotiations Cloaked in Secrecy by Non-Disclosure Agreements, Doing the Write Thing, undated but apparently December 15, 2008.  A detailed account.  Excerpt:

...Publishers, scholars, and authors, along with Google, have hailed the agreement as a boon to book lovers and an ingenious solution to the thorny issue of how to fairly compensate authors and publishers in the digital age. Books and periodicals are quickly moving online, and the public, it seems, expects to read it all for free.

But critics say the settlement is flawed. Some are troubled that those privy to the settlement negotiations are barred from disclosing any details. Many scholars are disappointed that Google failed to defend what they see as the legality of digitizing copyrighted books. Others worry that the agreement gives Google too much power. They fear that the company will monopolize commerce around digital books, or will hide certain books from its database if it fears their distribution would threaten the company's bottom line....

Boni, the Author's Guild lawyer, said the group made the right decision to settle the case rather than argue before a judge that Google was infringing on authors' copyrights. "We have a groundbreaking book-publishing deal the likes of which the world has never seen," he said. "We could never have gotten that had we brought this case to trial."

The Association of American Publishers, which filed a separate suit against Google for scanning copyrighted works, also applauded the settlement. Jeffrey P. Cunard, the lead lawyer for the publishing group, said that if his clients brought the case to trial it might have taken many years and many appeals before the suit would have been resolved. And it was never clear whether the publishers would have prevailed, he added....

Although Google, publishers, librarians, and authors pride themselves on supporting information sharing and free expression, the public will likely never know how the agreement was reached or what the litigants learned during the negotiations. That's because anyone who offered their advice and expertise to lawyers involved in settling the suit had to sign a statement saying they would keep mum about the discussions, which lasted for at least two years....

Ultimately, the librarians viewed the non-disclosure agreement as the bitter pill they had to swallow in order to achieve an outcome that would give academic institutions, public libraries, and computer users access to a vast array of out-of-print, in copyright books....

"We now have a structure which is going to allow everything that's ever been published to be available on market terms, online for individuals and educational institutions," said [Paul N. Courant, university librarian at the University of Michigan]. "This was unimaginable five years ago. It's astonishing." ...

Ideally, though, Courant and many other scholars would have preferred to see Google defend and win its controversial claim that scanning and digitizing copyrighted works and displaying snippets of the text online is allowable under copyright law's fair use exemption. Such a decision would have set a legal precedent that would have given other organizations the green light to digitize copyrighted works without fear of liability....

The settlement is non-exclusive, which means that Microsoft, Amazon, Yahoo, or any other technology company can digitize and sell in copyright, out-of-print books. But it's unlikely they will since Google will have a jump start in forging relationships with a lot of authors and publishers via the book registry, because of the millions of dollars involved in digitizing books, because of Google's access to stores of these books at university libraries, and because of the company's dominance in book digitization.

Microsoft got out of digitizing books in May of this year, cementing Google's role as the leading commercial enterprise digitizing books.  "This settlement ensures for the foreseeable future that Google will be the sole major outlet for digital access to out-of-print books," Vaidhyanathan, of the University of Virginia, said.

And remarkably, he added, it is being accomplished with the assent of librarians. "Google is going to open little bookstores in libraries all across America. It's a stunning and radical change for libraries in this country, and it's entirely dictated by Google's market power." ...

James Grimmelman, an associate professor at New York Law School, favors the settlement, but he has a number of concerns that he describes on his blog, The Laboratorium, and that he says a judge should address. For example, he says the agreement gives Google special treatment, or "most-favored nations" status since within 10 years after its approval no other group could receive more favorable economic treatment with publishers and authors than Google....

Grimmelman finds other flaws, too. He says Google could exclude a book from its database for purely editorial reasons and not inform the registry or the public. He says Google could collect private information about users and their reading habits....

Brewster Kahle, an entrepreneur and librarian who helped create the [Open Content Alliance], is disturbed by what he's heard and read about the agreement. He would have preferred to see Google offer payment to rights holders whose works the company already digitized without permission, and leave it at that.

The settlement, he says, creates "a system going forward that reinvents a new copyright office, new copyright laws, a new payment system and all revolving around a single monopoly for access to the collective books of humankind."

Assuming the settlement is adopted as is, Kahle imagines it would be relatively easy for a dictator, a politically-connected organization, or a financial powerhouse, to call up Google and have it remove a controversial book from its database or give it a low profile, effectively censoring what people read online....

Robert C. Darnton, Harvard University's library director, told his library staff members in an October letter that Harvard will allow Google to continue scanning out-of-copyright books but not in-copyright works.

"The concerns that kept us from participating in the settlement to date relate to provisions about how the public access and subscription services would work and what they would cost, what material would be included, and whether the ways in which digitized volumes are captured and shared would reduce their utility for research and education," the letter read. Darnton, through a spokesman, declined to elaborate on the letter.

More on subtracted-value in TA journals

Kenneth Anderson, Musing about Publishing on Paper versus Publishing On-line, Opinio Juris, December 16, 2008.  Excerpt:

...I am on the editorial board of an important specialist journal in its field - The Journal of Terrorism and Political Violence.  It is a very serious, peer reviewed, interdisciplinary journal - published, however, by Taylor and Francis, so it is not open access.  I don’t understand what the incentive is to continue to have a journal like this published by a for profit publishing company - no one gets paid to write for it, and the peer reviewing is likewise not paid, so it is hard for me to see what the supposed connection is between a for profit publisher and peer review.  It’s not that I don’t take peer review seriously - on the contrary, I think it is a far better arrangement than the law journal setup with student editors.  But I don’t understand the value added by for profit publishers, especially in light of the value subtracted, by not having the stuff available open access....

Comment.  Exactly.  TA journals add the value of peer review, but OA journals add the same value without making any subtractions.  Here's how I put it in an article from July 2007:

...[A]fter publishers add value through peer review and copy editing they feel financial pressure [from the subscription business model] to subtract value by imposing password barriers, locking files to prevent copying or cutting/pasting, freezing data into images, cutting good articles solely for length, and turning gifts into commodities which may not be further shared....

Bibliotheca Alexandrina's forthcoming OER repository

Wagdy Sawahel, Library of Alexandria goes online for science education, SciDev.Net, December 8, 2008.  (Thanks to Subbiah Arunachalam.)  Excerpt:

An online repository of thousands of scientists' lectures from around the world will be launched by the Bibliotheca Alexandrina in Egypt in January (2009).

The repository — known as the Supercourse — is an attempt to improve access to science education in developing countries by targeting a total of 100,000 Golden PowerPoint lectures from scientists worldwide within a year and one million in three years.

The library hopes that the repository will become a knowledge network in four main scientific disciplines — medicine, engineering, environment and agriculture — through a community of more than 55,000 scientists in 175 countries who are sharing their collective library of more than 3,400 lectures....

Good press for OA legal text in Australia

Simon Chester, A New Model for Legal Publishing, Slaw.ca, December 15, 2008.  Excerpt:

I came across...[an] enthusiastic review of a new book on the law of Bail [in Victoria, Australia]....Well what’s unusual about that, you might ask.

Legal publishing in jurisdictions like Victoria...relies on legal academics or practising lawyers who work hard to write a book, which their publishers will know have very limited sales potential....

What’s unique about Faris on Bail is that the author, Australian blogger Peter Faris...decided...that he could reach the audience for his book via the web and that a legal publisher was an intermediary he was prepared to forego.

Friday’s Business Section of The Australian has the story [Chris Merritt, Legal Affairs Editor, December 12, 2008]:

...Peter Faris QC is one of those delightful lawyers who loves stirring. Just when you think he might be about to slow down, he does something or says something that is guaranteed to make trouble for someone who needs it.

[H]is latest target is the legal publishing industry that makes a very nice living by flogging overpriced law books to a captive market.

The publishers will have kittens when they become aware what Faris has done: he has written and published a legal text on Victoria’s bail laws — and he is giving it away online as a free PDF download.

It’s a deal the publishing industry can’t hope to match, particularly as he plans to issue updates about once a month — also for free....


Monday, December 15, 2008

U. Huddersfield releases its book circulation data

Dave Pattern, Free book usage data from the University of Huddersfield, "Self-plagiarism is style", December 12, 2008.

I'm very proud to announce that Library Services at the University of Huddersfield has just done something that would have perhaps been unthinkable a few years ago: we've just released a major portion of our book circulation and recommendation data under an Open Data Commons/CC0 licence. In total, there's data for over 80,000 titles derived from a pool of just under 3 million circulation transactions spanning a 13 year period.

I would like to lay down a challenge to every other library in the world to consider doing the same. ...

[I]f just a small number of other libraries were to release their data as well, we'd be able to begin seeing the wider trends in borrowing.

The data we've released essentially comes in two big chunks:

  1. Circulation Data: This breaks down the loans by year, by academic school, and by individual academic courses. This data will primarily be of interest to other academic libraries. UK academic libraries may be able to directly compare borrowing by matching up their courses against ours (using the UCAS course codes).
  2. Recommendation Data: This is the data which drives the "people who borrowed this, also borrowed…" suggestions in our OPAC. This data had previously been exposed as a web service with a non-commercial licence, but is now freely available for you to download. We've also included data about the number of times the suggested title was borrowed before, at the same time, or afterwards. ...

All of the data is in XML format and, in the coming weeks, I'm intending to create a number of web services and APIs which can be used to fetch subsets of the data. ...

Now it's up to you to think about whether or not you can augment this with data from your own library. If you can't, I want to know what the barriers to sharing are. Then I want to know how we can break down those barriers.

I want you to imagine a world where a first year undergraduate psychology student can run a search on your OPAC and have the results ranked by the most popular titles as borrowed by their peers on similar courses around the globe.

I want you to imagine a book recommendation service that makes Amazon's look amateurish.

I want you to imagine a collection development tool that can tap into the latest borrowing trends at a regional, national and international level. ...

See also the comments by Richard Wallis:

... If you have 14 minutes to spend I would recommend viewing Dave’s slidecast from the recent TILE project meeting, where he announced this ...

Patrick Murray-John picked up on Dave’s announcement and within a couple of days produced an RDF-based view of this data – I recommend you download the Tabulator Firefox plug-in to help you navigate his data. ...
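The recommendation data Pattern describes can be derived from raw circulation records by counting co-borrowings. Here is a minimal Python sketch of that idea; the transaction records and field names below are made up for illustration and are not the actual structure of the Huddersfield XML release.

```python
from collections import defaultdict

# Hypothetical circulation transactions: (borrower_id, title_id) pairs.
# The real Huddersfield release is XML with its own schema; this only
# illustrates the idea behind "people who borrowed this, also borrowed...".
transactions = [
    ("u1", "t1"), ("u1", "t2"), ("u1", "t3"),
    ("u2", "t1"), ("u2", "t2"),
    ("u3", "t2"), ("u3", "t3"),
]

# Group titles by borrower.
titles_by_borrower = defaultdict(set)
for borrower, title in transactions:
    titles_by_borrower[borrower].add(title)

# Count how often each pair of titles was borrowed by the same person.
co_borrow = defaultdict(lambda: defaultdict(int))
for titles in titles_by_borrower.values():
    for a in titles:
        for b in titles:
            if a != b:
                co_borrow[a][b] += 1

def recommend(title, top_n=5):
    """Return the titles most often borrowed alongside `title`."""
    ranked = sorted(co_borrow[title].items(), key=lambda kv: -kv[1])
    return [t for t, _ in ranked[:top_n]]

print(recommend("t1"))  # e.g. ['t2', 't3']
```

In practice the same counting could be run over the released circulation data once it has been parsed from XML, with thresholds added so that rarely co-borrowed titles are not recommended.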

A letter from the future

Dave Bacon has posted a letter from the future, to be sent back in time:

Dear Respected Mathematician/Scientist/Researcher:

First of all let me tell you what an honor it is to write to you from the future. Your work is so important in my time that we have named the main theorem which you proved in your paper "Megamathematical functorial categories which nearly commute" after you. Yes, the JoeRandomName theorem is well known and used every day in my field. Thank you for thinking it up and proving it!

But I'm writing to you today, not because of this great piece of work. Instead I'm writing to you because of a related piece of work which you published. I believe this work would be useful in my own research. I've seen references to this paper which seem to indicate that it could be useful in my research. But, and here is the problem, it seems that I cannot get access to your work online or at my library. ...

Libraries as walled gardens

Barbara Fister, Renting Keys to Walled Gardens, ACRLog, December 14, 2008.

... Are academic libraries building collections for the future and for all to use, or are we content to simply rent access temporarily for a limited audience? If we won’t stand up for free and equitable access, who will?

To be sure, we’ve partnered with scholars to push for open access, particularly to STM research. But I’m baffled when libraries pay money to subscribe to commercial versions of public databases like PubMed, ERIC, and NCJRS Abstracts, teaching our students to use interfaces that we think are better, but which they can’t access once they graduate. Lifelong learning? Pfui. Free to all? Feh.

When did we decide libraries are no longer a commons but a go-between that rents temporary membership in publishers’ walled gardens? Did we even notice? ...

OA dataset from archaeology textbook

The Quantitative Archaeology Wiki has posted the dataset from the textbook Digging Numbers by Mike Fletcher and Gary Lock, along with a tutorial on using the statistical software R to analyze the data. (Thanks to Stefano Costa.)

Blogging competition for PLoS ONE birthday

Liz Allen, PLoS ONE @ Two - second birthday synchroblogging competition, Public Library of Science blog, December 9, 2008.

PLoS ONE turns two this December. For our community celebration we're going to run our second synchroblogging competition. ... This time, we've partnered up with our friends at researchblogging.org, the site that aggregates blogs about peer-reviewed research, to create this competition.

... On December 18, 2008, blog about a PLoS ONE paper past or present and use the icon provided by researchblogging.org so the link to your post appears on their site ...

We've assembled a small team of judges who will review all these posts and vote on a winner. They will be looking for well organized posts from qualified individuals that set the research in broader context and add to the debate around the paper by posing questions on it. ...

The winning post will have their entry cross posted on the PLoS ONE Blog and A Blog Around the Clock. They will also receive a bag of swag that includes: the new PLoS Water bottle (H2go brand) and a couple of cool PLoS t-shirts. ...

Report on the Library of Congress Flickr Commons project

Michelle Springer, et al., For the Common Good: The Library of Congress Flickr Pilot Project, report, October 30, 2008. A shorter report summary is also available. From the executive summary:

The Library of Congress, like many cultural heritage organizations, faces a number of challenges as it seeks to increase discovery and use of its collections. A major concern is making historical and special format materials easier to find in order to be useful for educational and other pursuits. At the same time, resources are limited to provide detailed descriptions and historical context for the many thousands of items in research collections. The Library also faces competition for the attention of an online community that has ever-expanding choices of where to pursue its interests.

One solution worth exploring is to participate directly in existing Web 2.0 communities that offer social networking functionality. Reaching out to unknown as well as known audiences can attract more people to comment, share, and interact with libraries. Taking collections to where people are already engaged in community conversations might also encourage visits to a library’s Web site where the full wealth of resources are available.

To begin to address these issues, ... [a] small team developed a pilot project that could be rapidly implemented with limited resources. The formal Library of Congress strategic goals to expand outreach and improve the user experience shaped the primary objectives ...

Once the popular photosharing Web site Flickr was selected as a venue that would meet the Library’s requirements, the pilot team contacted Flickr to discuss its available rights statement options--none of which was appropriate for the Library’s content. These discussions began the collaboration that resulted in the launch of The Commons, a designated area of Flickr where cultural heritage institutions can share photographs that have no known copyright restrictions to increase awareness of their collections. Flickr members are invited to engage with Commons collections by describing the items through tags or comments. A growing number of libraries, museums, and archives, intrigued by the possibilities of this model, have followed the Library’s lead and launched accounts within the Commons framework.

Two collections of historical photographs were made public on a Library account on the Flickr photosharing site in January 2008. The response from Flickr members and observers of the pilot was overwhelmingly positive and beneficial. The following statistics attest to the popularity and impact of the pilot: [Note: omitting statistics.]

This project significantly increased the reach of Library content and demonstrated the many kinds of creative interactions that are possible when people can access collections within their own Web communities. The contribution of additional information to thousands of photographs was invaluable. ...

Concerns about loss of control over content will continue to be discussed but can also be mitigated. ...

The Flickr team recommends that this experiment in Web 2.0 become an ongoing program with expanded involvement in Flickr Commons and other appropriate social networking opportunities for non-photographic collections. The benefits appear to far outweigh the costs and risks. ...

On the chemistry wikis

Antony Williams, The Confusion of ‘Pedias and the Dependence on Wikipedia, ChemSpider Blog, December 11, 2008. Describes and compares the chemistry functions on Wikipedia, WiChempedia, and Chempedia.

Anthropology and museology journal adopts delayed OA policy

Gradhiva, a French journal of anthropology and museology in publication since 1986, has gone online with the Revues.org platform. Published by the Musée du quai Branly, issues starting with 2005 are now online, with a 3-year moving wall for OA. (Thanks to CLEO.)

Update: Updated to clarify the policy is a 3-year moving wall, i.e. issues 3 years old or older will be OA.

OA index on archaeology and early history

DAPHNE (Données en Archéologie, Préhistoire et Histoire sur le NEt - Data in Archeology, Prehistory and History on the NEt) is a new OA index of metadata, launched in November 2008. See also the press release by INIST. (Thanks to Fabrizio Tinti.)

Subjects: Prehistory, Protohistory, Archaeology (from the earliest human evidence to the industrial era), Sciences of Antiquity (all aspects: religion, politics, society, philosophy, economics, art, daily life, etc.) and History until the year 1000 on all continents.

Data Types: Bibliographic records covering international literature and resulting from the indexing and analysis of various types of documents (journal articles, books, conference proceedings, exhibits, theses, journals, electronic resources, etc). Records include keywords and abstracts ...

DAPHNE pools together data from three databases: BAHR [Bulletin Analytique d’Histoire Romaine], FRANCIS (Prehistory and Protohistory, Art and Archaeology files and part of the History and Science of Religion), FRANTIQ-CCI [Fichiers de Recherche en sciences de l’Antiquité]. ...

Only 30% of UK DSpace repositories have working RSS feeds

Stuart Lewis and Les Carr have discovered that only 8 of the 26 UK DSpace repositories listed in ROAR have working RSS feeds.

Lewis and Carr are trying to aggregate all the world's repository feeds.  Lewis:

There has started to be a lot of talk about what services feeds could enable us to build, and I predict that we’ll start seeing some novel and exciting uses of these feeds over the coming year. Enable your feeds, expose yourself, and be part of it!

If you're a DSpace repository manager, Lewis' post explains how to activate the feed.
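For readers curious what a "working RSS feed" means in practice, here is a rough sketch of the kind of check involved: fetch each repository's feed URL and see whether it returns parseable RSS or Atom with at least one item. The URLs below are placeholders, not the feeds Lewis and Carr actually surveyed.

```python
import urllib.request
import xml.etree.ElementTree as ET

# Placeholder feed URLs; substitute the RSS/Atom URLs of real repositories.
FEED_URLS = [
    "https://repository.example.ac.uk/feed/rss_2.0/site",
    "https://eprints.example.org/cgi/latest_tool?output=RSS2",
]

def feed_has_items(url, timeout=10):
    """Return True if the URL serves parseable RSS/Atom with at least one entry."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            root = ET.fromstring(resp.read())
    except Exception:
        return False  # network error, HTTP error, or unparseable XML
    # RSS 2.0 uses <item>; Atom uses <entry> in its own namespace.
    items = root.findall(".//item") + root.findall(
        ".//{http://www.w3.org/2005/Atom}entry")
    return len(items) > 0

if __name__ == "__main__":
    working = [url for url in FEED_URLS if feed_has_items(url)]
    print(f"{len(working)} of {len(FEED_URLS)} feeds are working")
```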

Charging for access as penny-wise and pound-foolish

Interview with Andres Martin, new editor (January 2008) of the Journal of the American Academy of Child and Adolescent Psychiatry, Journal of the Canadian Academy of Child and Adolescent Psychiatry, November 2008.  Excerpt:

...[Normand Carrey]:  In a sense journals are becoming dinosaurs. With the internet everything is going electronic especially the open access journals. Can you tell me how you are addressing this challenge/opportunity?

[Andres Martin]:  ...As things now stand, full access to the Journal requires payment from non-subscribers. This policy may well prove penny-wise and pound-foolish, and is largely out of synch with other leading journals. As their experience has already demonstrated, the fees that would be lost to free downloads would be more than made up for by increased visibility, impact and circulation, and through compensating revenue from higher advertisement fees. At a minimum, the Journal could easily provide free online access in the developing world through the HINARI initiative of the World Health Organization....


Sunday, December 14, 2008

Repositories in India

Rekha Mittal and G. Mahesh, Digital libraries and repositories in India: an evaluative study, Program: electronic library & information systems 42(3), 2008. Only this abstract is OA, at least so far:

Purpose - The purpose of this research is to identify and evaluate the collections within digital libraries and repositories in India available in the public domain.

Design/methodology/approach - The digital libraries and repositories were identified through a study of the literature, as well as internet searching and browsing. The resulting digital libraries and repositories were explored to study their collections.

Findings - Use of open source software especially for the creation of institutional repositories is found to be common. However, major digital library initiatives such as the Digital Library of India use custom-made software. The collection size in most digital libraries and repositories is in a few hundreds.

Practical implications - The paper highlights the state of digital libraries and repositories in India in late 2007.

Originality/value - The paper is the first of its kind that attempts to identify and evaluate digital libraries and repositories in India. It also gives a comprehensive listing of digital libraries and institutional repositories in India available in the public domain.

Data mining from ClinicalTrials.gov

Xiaohong Cao, et al., Data mining of cancer vaccine trials: a bird's-eye view, Immunome Research, December 12, 2008. Abstract:

Background: A wealth of information on clinical trials has been provided by publicly accessible online registries. Information technology and data exchange standards enable rapid extraction, summarization, and visualization of information and derived knowledge from these data sets. Clinical trials data was extracted in the XML format from the National Library of Medicine ClinicalTrials.gov site. This data includes categories such as 'Summary of Purpose', 'Trial Sponsor', 'Phase of the Trial', 'Recruiting Status', and 'Location'. We focused on 645 clinical trials related to cancer vaccines. Additional facts on cancer types, including incidence and survival rates, were retrieved from the National Cancer Institute Surveillance data. ...

Conclusion: We have developed a data mining approach that enables rapid extraction of complex data from the major clinical trial repository. Summarization and visualization of these data represents a cost-effective means of making informed decisions about the future cancer vaccine clinical trials.
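As a rough illustration of the extraction step the authors describe, the sketch below parses locally downloaded ClinicalTrials.gov XML records and tallies trials by phase and recruiting status. The element names (`phase`, `overall_status`) are assumptions about the public XML export of the time; adjust them to whatever schema your downloaded files actually use.

```python
import glob
from collections import Counter
import xml.etree.ElementTree as ET

# Count trials by phase and recruiting status from locally downloaded
# ClinicalTrials.gov XML records (assumed here to be one *.xml file per trial).
phase_counts = Counter()
status_counts = Counter()

for path in glob.glob("trials/*.xml"):
    root = ET.parse(path).getroot()
    # Assumed element names; verify against the actual XML schema.
    phase = root.findtext("phase", default="Unknown")
    status = root.findtext("overall_status", default="Unknown")
    phase_counts[phase] += 1
    status_counts[status] += 1

print("Trials by phase:")
for phase, n in phase_counts.most_common():
    print(f"  {phase}: {n}")

print("Trials by recruiting status:")
for status, n in status_counts.most_common():
    print(f"  {status}: {n}")
```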

See also our past posts on ClinicalTrials.gov.

Journal of American History launches OA podcasts

The Journal of American History has launched an OA podcast feature. (Thanks to Fabrizio Tinti.)
... [W]e will present interviews with authors of articles in our journal. In the future, we hope also to bring you conversations with award-winning authors of books on American history. ...

New OA journal of medicine from a new OA publisher

The International Journal of Clinical and Experimental Medicine is a new peer-reviewed OA journal published by e-Century Publishing Corporation. The first issue was released in January 2008; see also the inaugural editorial. The publishing fee is $100/page, subject to discount and waiver. Authors retain copyright and articles are released under the Creative Commons Attribution License.

See also e-Century's other journals, some in publication and some forthcoming, all OA.

New release of online archive platform

ICA-AtoM (International Council on Archives - Access to Memory) is a free and open source platform for sharing metadata about archival collections, as well as digital objects of the archival items. A new version (1.0.4 beta) was released on November 17, 2008. (Thanks to Charles Bailey.)

Presentation on DSpace

Michele Kimpton of the DSpace Foundation has posted her presentation from the Coalition for Networked Information fall meeting (Washington, DC, December 8-9, 2008).

New OA journal of ecology and evolution

Ideas in Ecology and Evolution is a new peer-reviewed OA journal published by Queen's University. (Thanks to Bora Zivkovic.) See the inaugural editorial for information on some of the unique aspects of the journal, including this:
... Authors pay a submission fee (currently $400 [Canadian]) at the time of submission. From these funds, two referees are paid (currently $150 each). If the paper is accepted for publication, authors are charged an additional processing fee (currently $300) to cover handling and publication costs. ...

New OA journal on Alzheimer's disease

Alzheimer's Research & Therapy is a new peer-reviewed OA journal published by BioMed Central. See the announcement from December 12, 2008. The article processing charge is £950 (€1100, US$1395), subject to discounts and waiver. Articles are published under the Creative Commons Attribution License.

OA to geodata in the "system of systems"

The Group on Earth Observations is leading the development of a Global Earth Observation System of Systems.

... This ‘system of systems’ will proactively link together existing and planned observing systems around the world and support the development of new systems where gaps currently exist. It will promote common technical standards so that data from the thousands of different instruments can be combined into coherent data sets. The ‘GEOPortal’ [pilot launched in June 2008] offers a single Internet access point for users seeking data, imagery and analytical software packages relevant to all parts of the globe. It connects users to existing data bases and portals and provides reliable, up-to-date and user friendly information – vital for the work of decision makers, planners and emergency managers. ...

Data sharing is part of the plan for GEOSS. The 10-Year Implementation Plan, approved in 2005, lays out these principles:

... The societal benefits of Earth observations cannot be achieved without data sharing.

The following are GEOSS data sharing principles:

  • There will be full and open exchange of data, metadata, and products shared within GEOSS, recognizing relevant international instruments and national policies and legislation.
  • All shared data, metadata, and products will be made available with minimum time delay and at minimum cost.
  • All shared data, metadata, and products being free of charge or at no more than the cost of reproduction will be encouraged for research and education. ...

The drafting process is ongoing, with the most recent draft white paper and implementation guidelines on data sharing released in September 2008.

Comment. As noted, these guidelines are still being drafted. The proposed schedule has the guidelines presented for approval in 2010. The guidelines seem to recognize the value and urgency of removing barriers to access and re-use, but stop short of requiring OA. Maybe it's possible to strengthen the guidelines while they're still being written.

See also our previous post on a paper about GEOSS.

Norwegian agency adopts an OA mandate

The Norwegian Knowledge Centre for the Health Services (Nasjonalt kunnskapssenter for helsetjenesten, NoKC) adopted an Institutional Policy for Open Access to Scientific Publications on November 25, 2008.

Unfortunately the online version of the policy is an image-scan and I don't have time to rekey an excerpt.  However, it is in English.

Here's the gist of it:  All scientific publications by NoKC research staff "must" be deposited at the time of acceptance in Helsebiblioteket's Research Archive (HeRA), the new institutional repository launched by the NoKC health library (Helsebiblioteket).  HeRA contents are also retrievable through NORA (Norwegian Open Research Archives) and OAIster, as well as Google and Yahoo.  For each deposit, HeRA will release as much as it can as soon as it can.  For example, HeRA will respect publisher embargoes, but will release OA metadata during the period when the full-text may be embargoed. 

For more details, see the HeRA guidelines, which are in English and HTML.

The NoKC is an independent agency within Norway's Directorate of Health.  Its mission is to research the quality of the Norwegian health services.

Comment.  Kudos to all involved at NoKC.  I applaud the mandatory language, the immediate deposit requirement, and the dual deposit/release strategy.

Update (12/15/08).  Also see the NoKC press release (in Norwegian).  For some reason Google won't translate this doc into English.

Update (12/18/08).  NoKC signed the Berlin Declaration on Open Access to Knowledge on November 25.  (Thanks to Anja Lengenfelder.)


Bishops blocking OA again

Damian Thompson, ICEL pulls the plug on free Mass settings, Telegraph, December 5, 2008.  (Thanks to Gino D'Oca.)  Excerpt:

...ICEL - the official International Commission on English in the Liturgy - has forced an independent Catholic website to disable links that provided free musical settings of the new translation of the Mass. And this despite the fact that the translation, produced by ICEL at the Vatican's insistence, has already been approved by Rome....

PS:  Also see some similar recent incidents, such as Catholic Bishops blocking OA to English translations of public-domain Latin texts or requiring the use of a particular TA edition in mass.

Growth of OA in 2008

Heather Morrison, Dramatic Growth of Open Access: 2008 Early Annual Edition, Imaginary Journal of Poetic Economics, December 13, 2008.  Excerpt:

This is a special early annual edition of the Dramatic Growth of Open Access Series, to facilitate predictions and planning for the coming year. Annual figures are for a full year, Dec. 11, 2007 - Dec. 11, 2008.

Dramatic Growth of Open Access 2008 - Highlights

While at the local level, institutional repository coordinators report (accurately) that recruiting content is difficult and growth no doubt seems very slow, at a global level the growth rate of material in archives is absolutely phenomenal.

Scientific Commons

  • 24 million publications, 963 repositories
  • increase of 45% or 7 million publications in the past year
  • adding close to 150,000 publications per week

OAIster

  • 19 million items
  • 4.8 million items added in the past year, a 34% increase
  • 137 repositories added in the past year

Material searchable through a Canadian harvesting service, the CARL Metadata Harvester, increased by 67%.

PubMedCentral

  • over half a million fulltext items added, a 52% increase (based on ROAR data)
  • 538 journals participating; 413 provide immediate free access, 288 provide full open access
  • this quarter - 46 more journals participating in PMC, 38 more with free text immediately available, 31 more open access

The Directory of Open Access Journals

  • 3,781 journals
  • 781 journals added in the past year,
  • growth rate 2 titles per day

There have been some minor decreases in numbers. Highwire Free lost one completely free site; the CARL Metadata Harvester is harvesting one less repository. The PMC Free figures for full-text for recent entries are down slightly. The number of new DOAJ titles is low this month, at 47; this is more likely to reflect staffing fluctuations at DOAJ than anything else.

arXiv: 512,366 e-prints, 13% increase

RePEc: 555,000 items online (670,000 total), 25% increase in fulltext

E-LIS: 8,654 documents, 23% increase
59 Open Access Mandate policies (ROARMAP), 11 proposed policies

For comparison purposes, see the December 11, 2007 Interim Dramatic Growth of Open Access

Highlights: Last year I predicted that 15% of the world's peer-reviewed journals would be OA by the end of 2008. This number has been surpassed. Last year, the DOAJ growth rate was 1.4 titles per day; this year, it is over 2 titles per day. Last year, there were 40 OA policies; now, there are more than 50.

[PS:  Here omitting Heather's predictions for 2009.]

Open Data - data only Google Spreadsheet (for viewing)

Open Data - data plus growth Google Spreadsheet (for viewing)

Dataverse - for downloading
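As a quick check on the growth arithmetic in the excerpt, the small sketch below turns an annual gain into a percentage increase and a per-day rate, using the DOAJ figures quoted above (3,781 journals, 781 added during the year).

```python
# Quick arithmetic check on the annual growth figures quoted above.
# DOAJ numbers are from the excerpt: 3,781 journals at the end of the
# period, 781 added during the year (Dec. 11, 2007 - Dec. 11, 2008).
def growth(end_count, added, days=366):  # 2008 was a leap year
    start = end_count - added
    pct = 100.0 * added / start
    per_day = added / days
    return start, pct, per_day

start, pct, per_day = growth(3781, 781)
print(f"DOAJ: {start} -> 3781 journals "
      f"(+{pct:.0f}%, about {per_day:.1f} titles/day)")
# Prints roughly: DOAJ: 3000 -> 3781 journals (+26%, about 2.1 titles/day)
```

The per-day figure is consistent with the excerpt's claim of a growth rate of just over two titles per day.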

RSS feed for UKPMC

UKPMC has created an RSS feed of its new deposits.