Open Access News

News from the open access movement


Saturday, March 21, 2009

A legally remixable, open-source documentary

Open Source Cinema has released its first open-source documentary, RiP: A Remix Manifesto.   From the blurb:

RiP: A remix manifesto is an open source documentary about copyright and remix culture. Created over a period of six years, the film features the collaborative remix work of hundreds of people who have contributed to this website, helping to create the world's first open source documentary.

Kaltura, whose open-source technology helps power Open Source Cinema, calls RiP "the first legally remixable documentary about copyright and remix culture...."  No surprise, viewers haven't wasted any time in remixing it.

Building consensus for the MIT policy

Hal Abelson is the leading architect and entrepreneur behind the new OA mandate at MIT.  Here's an article he wrote late last year, showing the process of persuasion at work.  (Thanks to Jennifer Papin-Ramcharan.)  Remember that this process of persuasion eventually yielded a unanimous faculty vote.

Open Access Publishing: The Future of Scholarly Journal Publishing, MIT Faculty Newsletter, Nov/Dec 2008.  Excerpt:

There has been a growing perception over the past decade that the public, and the progress of scholarship in many disciplines, would be better served if peer-reviewed scholarly publications and data were distributed online so that they can be openly accessed and built upon, rather than through exclusive publishing agreements that restrict access and reuse. This perception has led to the emergence of policies that encourage or mandate open-access publishing, such as recent requirements by the National Institutes of Health, the European Research Council, and the Wellcome Trust.

At the same time, the continued shift towards an information economy has brought with it an increasing tendency to view scholarly writings through the lens of intellectual property, and there has been a concomitant heightening of concerns about copyright and licensing in academe. Alongside this has come an enormous increase in the cost of institutional subscriptions for scholarly journals in many disciplines, and an increasing imposition of licensing terms that restrict the reuse of scholarly works in teaching and research.
All this has placed significant stress on the historical system of scholarly journal publishing.

As faculty members at one of the world's premier academic institutions, we all have an enormous stake in how our scholarly contributions are published and disseminated.

In September 2008, Faculty Chair Bish Sanyal appointed an MIT Faculty Ad-hoc Committee on Open Access Publishing to coordinate a faculty-wide discussion of how our scholarly publications are and should be disseminated, with particular attention to the possibility of providing "open access" to those publications. The intent is for the faculty to discuss these issues in the fall and, if support for an open-access policy is established, for the Committee to draft a resolution to come before the faculty in the spring.

Committee members will be visiting departments over the coming weeks to explain the issues and receive feedback from the faculty....

The purpose of this article is to provide a basis for a broad discussion by outlining the situation facing the MIT community.

In the Committee's view:

  • MIT has a mission to further scholarship and to disseminate knowledge.
  • Historically, this mission of dissemination has been accomplished through a productive symbiosis: faculty write scholarly articles and give these articles to publishers, who then handle the academic review, production of the manuscript, dissemination of the journal, and advertisement of articles. The use of the term "give" here is intentional, because we mean the literal and complete transfer of copyright. Legally, the article becomes the publisher's property, and the terms of dissemination and further use are determined solely at the publisher's discretion.
  • This symbiosis has become increasingly unbalanced over the past two decades....
  • [A]t present, publication agreements are almost always negotiated between publishers and individual faculty members who typically pay only cursory attention to the details of contracts. To publish articles in scholarly journals, authors generally must sign publication agreements whose terms are set by the publishers. To move to a better system, we need processes through which faculty can play a role as a collective body, not just as individuals....

Following is a list that briefly summarizes some aspects of the present scholarly publishing system that can pose problems for some MIT faculty.

1. As scholars, many of us would like to make our work as widely available as possible, and the Internet facilitates this distribution at low cost. But this desire conflicts with the business models of some of the publishers who control the major journals in several fields and who limit access to those who pay fees.

2. Access restrictions and publication agreements may prohibit faculty members from distributing their own work even to students and colleagues. Authors might even be restricted from reusing figures and tables from their own articles. Legally, these writings are no longer theirs, once they have transferred copyright to a publisher.

3. Scholarly publishing has become subject to business consolidation, and monopoly pricing is becoming an increasing problem. The five largest journal publishers now account for over half of total market revenues. Over the past 15 years, the price of scholarly journals has grown roughly three times as fast as the Consumer Price Index, and library budgets, even at premier institutions, are straining under these pressures. One consequence has been a challenge to libraries' continued ability to purchase the books and monographs that are the major form of scholarly writing in some fields and to subscribe to new journals....

More on the MIT policy

MIT opens access to its research articles, CBC News, March 20, 2009.  Excerpt:

...In Canada, Athabasca University instituted a policy in November 2006 requesting that faculty, academic and professional staff deposit an electronic copy of any published research articles in its repository. However, it does not require the articles to be accessible online to people who search for them....

John Wilbanks, vice-president for science at Creative Commons, said the vast majority of publishers already allow researchers to distribute their articles online for free through online repositories like MIT's. But it's not always easy for authors to figure out what their rights are because publishers' copyright agreements vary.

"What this [the MIT policy] does is make it standard and simple," said Wilbanks, whose organization helps make it easier for people to share images, writings, databases, patents or other creations on the internet through tools such as specially crafted copyright licences.

He said many publishers are already learning to deal with the new reality of increasing open access because the motivation behind it for researchers is so strong — making articles more accessible increases the number of people who read and cite them, boosting their impact.

"And if you're a publisher and you're going to reject anyone who has to follow one of these things [open access policies]," he said, "you're turning down Harvard, MIT, Stanford — you're turning down some pretty important scholars."

On-demand digitization from OSTI

Tim Byrne, What’s in the OSTI Legacy Collection?, OSTIblog, March 20, 2009.

The [U.S. Department of Energy] Office of Scientific and Technical Information legacy collection contains an estimated one million technical reports representing six decades of energy research that is, for the most part, unavailable in electronic format. On average, OSTI receives close to two hundred requests each month to digitize specific reports, with the vast majority of the requests coming from DOE employees and contractors. The legacy collection represents an enormous investment in research and development from the Atomic Energy Commission, Energy Research and Development Administration and Department of Energy. With the growing tendency of many researchers to rely solely on research information available electronically, this incredibly valuable resource collection is often ignored. By not having electronic access to previous research, scientific advancement may be diminished and funds wasted duplicating what has already been done.

OSTI has recently implemented the Adopt-a-Doc program that allows the general public to pay for the digitization of a document of their choosing. Documents in need of digitization can be identified by searching the Energy Citations Database and clicking on the Materials available for digitization box on the Fielded Search window. This is proving to be a popular service. Unfortunately, with the level of digitization that OSTI can currently handle, it will take a very long time to digitize the entire legacy collection. ...

The OSTI legacy collection is very rich in research on all types of energy technologies. Although OSTI distributed much of the collection to libraries, laboratories, and contractors, access to this research by scientists and engineers is difficult at best. It is important for DOE to make this historic research collection easily accessible by digitizing the legacy collection and making it available on every researcher’s desktop.

See also our past posts on OSTI.

10-year review of the Astrophysics Data System

Michael J. Kurtz, et al., The Smithsonian/NASA Astrophysics Data System (ADS) Decennial Report, white paper submitted to the National Research Council Astronomy and Astrophysics Decadal Survey, self-archived March 18, 2009. Abstract:
Eight years after the ADS first appeared the last decadal survey wrote: "NASA's initiative for the Astrophysics Data System has vastly increased the accessibility of the scientific literature for astronomers. NASA deserves credit for this valuable initiative and is urged to continue it." Here we summarize some of the changes concerning the ADS which have occurred in the past ten years, and we describe the current status of the ADS. We then point out two areas where the ADS is building an improved capability which could benefit from a policy statement of support in the ASTRO2010 report. These are: The Semantic Interlinking of Astronomy Observations and Datasets and The Indexing of the Full Text of Astronomy Research Publications.
See also our past posts on the ADS.

Publishers collaborating with Scribd

Jason Kincaid, Major Book Publishers Start Turning To Scribd, TechCrunch, March 17, 2009. (Thanks to Michel Bauwens.)

Online document sharing site Scribd has announced that it has partnered with a number of major publishers, including Random House, Simon & Schuster, Workman Publishing Co., Berrett-Koehler, Thomas Nelson, and Manning Publications, to legally offer some of their content to Scribd’s community free of charge. Publishers have begun to add an array of content to Scribd’s library, including full-length novels as well as briefer teaser excerpts. ...

Scribd has actually been posting both full text and excerpts of books from some of these publishers for a few months now as they conducted trials. The fact that they’re now publicly endorsing the platform seems to indicate that they’re pleased with the results, and I won’t be surprised if we start seeing more publishers sign on. ...

Data-sharing advocate named Obama's health IT coordinator

HHS Names David Blumenthal As National Coordinator for Health Information Technology, press release, March 20, 2009.

The Department of Health and Human Services today announced the selection of David Blumenthal, M.D., M.P.P. as the Obama Administration’s choice for National Coordinator for Health Information Technology. As the National Coordinator, Dr. Blumenthal will lead the implementation of a nationwide interoperable, privacy-protected health information technology infrastructure as called for in the American Recovery and Reinvestment Act. ...

The American Recovery and Reinvestment Act includes a $19.5 billion investment in health information technology, which will save money, improve quality of care for patients, and make our health care system more efficient. Dr. Blumenthal will lead the effort at HHS to modernize the health care system by catalyzing the adoption of interoperable health information technology by 2014 ...

Comment. What's the OA connection? In 2002, Blumenthal was co-author of a Journal of the American Medical Association paper titled Data Withholding in Academic Genetics, studying the data-sharing behavior of geneticists, followed by a Science column, The Selfish Gene: Data Sharing and Withholding in Academic Genetics. From the latter:
A scientific world completely free of data withholding is probably unachievable and, indeed, may be undesirable. Some types of data withholding are necessary to ensure that investigators get the credit that they deserve and that partly motivates their effort. However, research and policy development should optimize scientific progress by identifying and combating the remediable causes of data withholding (such as those due to a lack of funds). Furthermore, when faced with decisions about sharing, scientists should consider seriously the broader impact of their decisions on the fundamental characteristics of the scientific enterprise and, when at all possible, err on the side of openness. Doing so may increase the likelihood that our system of science will function to its maximum capacity in the years to come.
What's that mean for health IT? A key question for a national health IT infrastructure will involve what patient information to make available for medical research and how. (See e.g. our recent post on the topic.) Blumenthal's background suggests he sees a value in data sharing, which may influence the way he handles that question.

Comparing OA searches for drug info

Maisha Kelly Freeman, et al., Google Scholar Versus PubMed in Locating Primary Literature to Answer Drug-Related Questions, Annals of Pharmacotherapy, March 3, 2009. (Thanks to Dean Giustini.)

Background: Google Scholar linked more visitors to biomedical journal Web sites than did PubMed after the database's initial release; however, its usefulness in locating primary literature articles is unknown.

Objective: To assess in both databases the availability of primary literature target articles; total number of citations; availability of free, full-text journal articles; and number of primary literature target articles retrieved by year within the first 100 citations of the search results.

Methods: Drug information question reviews published in The Annals of Pharmacotherapy Drug Information Rounds column served as targets to determine the retrieval ability of Google Scholar and PubMed searches. Reviews printed in this column from January 2006 to June 2007 were eligible for study inclusion. Articles were chosen if at least 2 key words of the printed article were included in the PubMed Medical Subject Heading (MeSH) database, and these terms were searched in both databases.

Results: Twenty-two of 33 (67%) eligible Drug Information Rounds articles met the inclusion criteria. The median number of primary literature articles used in each of these articles was 6.5 (IQR 4.8, 8.3; mean ± SD 8 ± 5.4). No significant differences were found for the mean number of target primary literature articles located within the first 100 citations in Google Scholar and PubMed searches (5.1 ± 3.9 vs 5.3 ± 3.3; p = 0.868). Google Scholar searches located more total results than PubMed (2211.6 ± 3999.5 vs 44.2 ± 47.4; p = 0.019). The availability of free, full-text journal articles per Drug Information Rounds article was similar between the databases (1.8 ± 1.7 vs 2.3 ± 1.7; p = 0.325). More primary literature articles published prior to 2000 were located with Google Scholar searches compared with PubMed (62.8% vs 34.9%; p = 0.017); however, no statistically significant differences between the databases were observed for articles published after 2000 (66.4% vs 77.1%; p = 0.074).

Conclusions: No significant differences were identified in the number of target primary literature articles located between databases. PubMed searches yielded fewer total citations than Google Scholar results; however, PubMed appears to be more specific than Google Scholar for locating relevant primary literature articles.
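
Comment.  The abstract reports its comparisons as means ± SD with p-values but does not name the statistical test used.  For readers who want to see how such a comparison can be recomputed from published summary statistics alone, here is a minimal sketch in Python.  It assumes an unpaired two-sample t-test (my assumption, not the paper's stated method), so the result only approximates the reported p = 0.868:

    # Recompute one comparison above from its summary statistics alone.
    # Assumes an unpaired two-sample t-test; the paper may have used a
    # paired or nonparametric test instead.
    from scipy import stats

    n = 22  # Drug Information Rounds articles meeting the inclusion criteria

    # Target articles located in the first 100 citations, mean and SD:
    # Google Scholar 5.1 +/- 3.9 vs PubMed 5.3 +/- 3.3.
    t_stat, p_value = stats.ttest_ind_from_stats(
        mean1=5.1, std1=3.9, nobs1=n,
        mean2=5.3, std2=3.3, nobs2=n,
    )
    print(f"t = {t_stat:.3f}, p = {p_value:.3f}")  # p comes out near 0.86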


Friday, March 20, 2009

More on the MIT policy

Two new reports from MIT on its two-day-old OA mandate:

(1) MIT faculty open access to their scholarly articles, MIT's official press release on the policy, March 20, 2009.  Excerpt:

In a move aimed at broadening access to MIT's research and scholarship, faculty at the Massachusetts Institute of Technology have unanimously voted to make their scholarly articles available to the public for free and open access on the Web.

The new policy, which was approved at an MIT faculty meeting on Wednesday, March 18 and took immediate effect, emphasizes MIT's commitment to disseminating the fruits of its research and scholarship as widely as possible.

"The vote is a signal to the world that we speak in a unified voice; that what we value is the free flow of ideas," said Bish Sinyal, chair of the MIT Faculty and the Ford International Professor of Urban Development and Planning....

MIT's policy is the first faculty-driven, university-wide initiative of its kind in the United States. While Harvard and Stanford universities have implemented open access mandates at some of their schools, MIT is the first to fully implement the policy university-wide as a result of a faculty vote. MIT's resolution is built on similar language adopted by the Harvard Faculty of Arts & Sciences in 2008.

"Scholarly publishing has so far been based purely on contracts between publishers and individual faculty authors," said Hal Abelson, the Class of 1922 Prof. of Electrical Engineering & Computer Science and chair of the Ad-Hoc Faculty Committee on Open Access Publishing. "In that system, faculty members and their institutions are powerless. This resolution changes that by creating a role in the publishing process for the faculty as a whole, not just as isolated individuals." ...

"Through this action, MIT faculty have shown great leadership in the promotion of free and open scholarly communication," said MIT Director of Libraries Ann Wolpert, who worked closely with Abelson and others to move the resolution forward. "In the quest for higher profits, publishers have lost sight of the values of the academy. This will allow authors to advance research and education by making their research available to the world." ...

(2) Natasha Plotkin, MIT Will Publish All Faculty Articles Free In Online Repository, The Tech (from MIT), March 20, 2009.  Excerpt:

...The ad hoc committee’s explanatory document states that the ability to opt out may be especially important for junior faculty “who do not want to jeopardize their ability to work with certain publishers.”

“Initially opt out will get used a fair amount,” said Harold Abelson, a professor in the Department of Electrical Engineering and Computer Science and a member of the ad hoc faculty committee that proposed the resolution....

The open access resolution hopes to address two problems with publishing in scientific journals: first, publishers often force faculty members to give up rights to their own articles; second, the same publishers charge exorbitant subscription fees to MIT for access to these articles.

Abelson said that part of the rationale behind the new policy is to leverage MIT’s power as an institution in negotiating with publishing companies to freely distribute their articles.

A document [PS:  not yet online] produced by the ad hoc faculty committee that proposed the resolution stated: “[T]he goal of disseminating research is best served by using the unified action of the faculty to enable individual faculty to distribute their scholarly writings freely,” which “is especially apposite in the face of increasing efforts by some commercial publishers to further close access to the scholarly literature they control.”

Faculty and Libraries staff say that problems in the publishing industry have worsened over the past two decades.

Ann J. Wolpert, Director of MIT Libraries, said, “[O]ver the last 15 years, much of scholarly publication has migrated from small societies and associations who had close relationships with researchers in their specific disciplines to a situation where those journals are owned by large international conglomerates — publicly owned in many cases — where the motivation for publishing is to return a margin to shareholders.”

This trend, said Wolpert, has created a situation in which “the way faculty teach and conduct research is greatly at odds with the business models of publishers … Publishers seek to maximize profits by exercising maximal control over [authors’] work, while authors seek to advance research and education.” ...

MIT will not have to spend money on creating the online repository: it will build on DSpace, the platform developed at MIT for collecting PhD dissertations online. Currently there are no plans to hire staff to support the effort to collect and disseminate articles. Current Libraries staff will assist faculty in negotiating contracts with publishers in light of the new policy.

Wolpert said she expects that workflows for submitting and publishing articles online will be implemented no later than June, but could be put in place earlier....

Translating the metadata of the World Digital Library

The Library of Congress is arranging for the metadata of the OA World Digital Library to be translated into seven languages.  The contract went to Lingotek.  For more detail, see the March 10 press release.

PS:  Read the press release.  Lingotek isn't just a group of translators.  It's a "collaborative translation platform" harnessing the power of a social network of translators.  Also see our past posts on the WDL.

Obama administration heard criticism of secret trade negotiations

James Love, Obama trade officials promise thorough review of transparency policies, Knowledge Ecology Notes, March 20, 2009.  Excerpt:

The following report was prepared by KEI, and reviewed by Daniel Sepulveda of USTR:

Obama Administration To Undertake Review of Transparency of Trade Negotiations

On Thursday, President Obama’s trade officials met with several civil society groups and promised a thorough review of the USTR policies regarding transparency. The review is expected to be completed within a few months. The process will include a meeting within a month to discuss initial specific proposals for openness and transparency. Citizens and NGOs are encouraged to think about the specific areas where openness and transparency can be enhanced and how. Among the specific proposals that will be evaluated are the following at the request of KEI:

1. Disclosure of all negotiating texts and policy papers
2. Disclosure of all meeting agendas (as soon as they are available), and participant lists....
4. Public consultations and comment periods....

This review will be focused on making the recent statements by President Obama on transparency concrete and effective in the area of trade negotiations....[T]houghtful discussions of the point at which communications with foreign governments should be disclosed and the extent of the disclosure required are more useful than broad high level statements on transparency.

The meeting was chaired by Daniel Sepulveda, a former Obama Senate aide who is now Assistant U.S. Trade Representative for Congressional Affairs....

Civil society participants included James Love, Judit Rius and Malini Aisola, of Knowledge Ecology International, Chris Murray of Consumers Union, Marcia Carroll of Essential Action and Eddan Katz of EFF (by phone).

KEI is very impressed with the USTR decision to undertake a review of USTR transparency efforts. They are taking this much further than simply reviewing policies on the Freedom of Information Act (FOIA), or recent controversies over the secrecy surrounding the Anti-Counterfeiting Trade Agreement (ACTA) negotiations....

The USTR welcomes submissions of those suggestions to Daniel_Sepulveda@ustr.eop.gov

Comment.  For some of the background, see our post from last week on the Obama administration's denial of FOIA requests to see the US documents on the ACTA negotiations.  Thanks to Daniel Sepulveda and the USTR for listening to public criticism and agreeing to this review.  Kudos to KEI and the other participating NGOs for arranging this welcome opportunity. 

A library in your pocket

Douglas MacMillan, Sony: Take That, Amazon! Business Week, March 20, 2009.  Excerpt:

Amazon Chief Executive Jeff Bezos won't tell you how many Kindles he's sold, but he's happy to share the number of e-book titles available on the device: 250,000, at last count.

With one fell swoop, a rival has made that library look small. On Mar. 19, Sony announced the addition of 500,000 titles to the collection of 100,000 e-books currently available to Sony Reader devices. Sony is giving away the books through a partnership with Google, which has already scanned and stored some 7 million books for its Google Book Search project. Neither partner disclosed details of the arrangement, but Google says that more of the public domain titles it has scanned will be available to Sony Readers in the future....

Comments

  • I was an adult before I could carry one recorded song in my pocket.  Today my daughters carry their favorite 10,000.  I could carry one book in my pocket, barely.  But --with an internet connection-- we can now carry hundreds of thousands, and we're speeding toward the day when --without an internet connection-- we could carry our favorite research library.  In a pocket.  I'm still digesting what that will mean.  Free online access to just about all you need will coexist, for those who want it, with free offline access to just about all you need.  I'm still stuck on the unexpectedness of that.  In January 2007, I got this far, but I'm still no further:
    Think about the difference between free online access to all the content you'll ever need and free offline access to all the content you'll ever need.  Online access has its advantages because the content will always be current and we'll be in real-time contact with other users who have the same access to the same content.  Offline access has its advantages because we won't always have connectivity, we won't always want connectivity, and lots of copies keeps stuff safe.  Because all our work for free online access will also help the cause of free offline access, we don't have to choose between the two.  We just have to start thinking about them.
  • Last December I noted that "there are many more gratis OA books online than print books in the average academic library."  Now your gateway to that literature can fit in your pocket.  Soon after, the literature itself can fit in your pocket, offline.  How will that change research? 

OA can improve research metrics

Stevan Harnad, The Need to Cross-Validate and Initialize Multiple Metrics Jointly Against Peer Ratings, Open Access Archivangelism, March 19, 2009.  Excerpt:

The Times Higher Education Supplement (THES) has reported the results of a study it commissioned from Evidence Ltd, which found that the ranking criteria for assessing and rewarding research performance in the UK Research Assessment Exercise (RAE) changed from RAE 2001 to RAE 2008. The result is that citations, which correlated highly with RAE 2001, correlated less highly with RAE 2008, so a number of universities whose citation counts had decreased were rewarded more in 2008, and a number of universities whose citation counts had increased were rewarded less....

(5) The dawning era of Open Access (free web access) to peer-reviewed research is providing a wealth of new metrics to be included, tested and assigned initial weights in the joint battery of metrics. These include download counts, citation and download growth and decay rates, hub and authority scores, interdisciplinarity scores, co-citations, tag counts, comment counts, link counts, data-usage, and many other openly accessible and measurable properties of the growth of knowledge in our evolving "Cognitive Commons."...
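
PS:  Among the metrics Harnad lists, "hub and authority scores" come from Kleinberg's HITS algorithm, which scores the nodes of a citation graph by mutual reinforcement: good hubs cite good authorities, and good authorities are cited by good hubs.  Here is a minimal sketch of the iteration in Python; the toy citation graph is invented for illustration:

    import math

    # Toy citation graph: an edge (a, b) means paper a cites paper b.
    edges = [
        ("A", "C"), ("A", "D"),
        ("B", "C"), ("B", "D"),
        ("C", "D"), ("E", "D"),
    ]
    nodes = {n for edge in edges for n in edge}
    hubs = {n: 1.0 for n in nodes}
    auths = {n: 1.0 for n in nodes}

    for _ in range(50):  # iterate to (approximate) convergence
        # A paper's authority score sums the hub scores of its citers.
        auths = {n: sum(hubs[a] for a, b in edges if b == n) for n in nodes}
        # A paper's hub score sums the authority scores of the papers it cites.
        hubs = {n: sum(auths[b] for a, b in edges if a == n) for n in nodes}
        # Normalize each score vector so the values stay bounded.
        for vec in (auths, hubs):
            norm = math.sqrt(sum(v * v for v in vec.values())) or 1.0
            for node in vec:
                vec[node] /= norm

    print(sorted(auths.items(), key=lambda kv: -kv[1]))  # top authorities first
    print(sorted(hubs.items(), key=lambda kv: -kv[1]))   # top hubs first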

Expert Group recommends OA for publicly-funded research in Europe

In early 2008, the European Commission assembled an Expert Group of 13 academics to rethink how the EU funds research and, in particular, "to undertake an evidence-based, ex-post evaluation of FP6."  As group chairman, it appointed Ernst Th. Rietschel, President of the Leibniz Association and Professor Emeritus at the University of Lübeck.  The group met six times between July 2008 and January 2009, and in February 2009 submitted its final report, Evaluation Of The Sixth Framework Programmes For Research And Technological Development 2002-2006.

From recommendation 5.2.3 (p. 60):

...The public accountability of the FP [Framework Programme] must be increased – not through audit control, but through clear procedures and access to information at all stages and, where appropriate, through open access to the research results obtained through the FP funding....


Report on the Ghana repository seminar

AAU holds seminar to improve access to African scholarly works, Modern Ghana, March 17, 2009.  A report on the Institutional Repository Advocacy Seminar (Accra, March 16-18).  Excerpt:

Heads of tertiary institutions and academic researchers in Africa are meeting in Accra to deliberate on providing regional and national database systems to improve the management of and access to African scholarly works.

The three-day international seminar organized by the Association of African Universities (AAU) in conjunction with other partners would also discuss policy guidelines that would clearly spell out the operations of such database systems, known as institutional repositories, in Africa.

Professor Goolam Mohammedbhai, Secretary General of AAU said at the opening of the seminar on Monday that the AAU in 1998 saw the need to create a common platform to manage and disseminate theses and dissertations of African Universities electronically.

He said the AAU therefore initiated a project dubbed the Database of African Theses and Dissertations (DATAD) programme to, among other objectives, facilitate the development of copyright procedures and regulations to protect intellectual property rights of African university graduates and researchers.

Since 2003, Prof. Mohammedbhai said, 14,723 records had been received on a similar database developed for the continent.

“There are over 700 registered users on the database with more than 60 institutions from 25 different countries and about 600 individuals from 65 countries using the database,” he added.

The Secretary-General therefore expressed the hope that with such facilities in every country, scholarly works in Africa would be further promoted.

Prof. Mohammedbhai also pledged AAU's commitment to technically support all countries that wished to establish national repository centres.

Mr Alex Tettey-Enyo, Minister of Education, in a speech read on his behalf...commended the initiative of AAU to create such database systems that would make African research publications more accessible both to Africans themselves and to other continents....

More on the pricing crisis

Caleb Fleming, Library to cut nearly $1 million, Collegiate Times, March 19, 2009.  Excerpt:

Virginia Tech's University Library plans to cancel nearly $900,000 worth of subscriptions in the 2009-2010 budget year.

In a letter to the university community, Eileen Hitchingham, dean of university libraries, wrote that potential budget cuts and anticipated inflation costs create a need for mass cancellations.

Hitchingham said that the university must cancel $500,000 worth of subscriptions to accommodate library budget cuts. They must also cover $400,000 worth of cuts to meet inflationary cost increases....

[Paul Metz, director of collection management for University Libraries] said that the university has been aggressive in recent years in obtaining "all title" deals from vendors such as Elsevier, John Wiley, Springer-Verlag, Blackwell and Taylor & Francis/Routledge. While the resources have been heavily used, Metz wrote, annual price increases exceeding inflation have demanded subscription cancellations be made.

ATA action center for communicating with Congress

The Alliance for Taxpayer Access has launched a legislative action center to make it easy for US citizens to contact their Congressional delegations about bills affecting OA. 

Right now the center hosts a user-modifiable ATA letter supporting the NIH policy and opposing the Conyers bill.  If you haven't already written to your House representative to oppose the bill, please take a moment to do so.  Unless you want to edit the default letter, it just takes a few clicks.  And please spread the word. 

More on privacy and medical data sharing

Alison Hall and Philippa Brice, Data sharing: privacy concerns and public attitudes, PHG Foundation, March 16, 2009.
How extensively should governments be able to share personal data without consent for the public good? This question has vexed many of those involved in medical research who have sought more effective means of carrying out large scale research studies without the need to take consent from every patient. ...

Not consulting the US Register of Copyrights

Andrew Albanese, Register of Copyrights Not Asked by Congress To Weigh in on Google Book Search?  Library Journal, March 19, 2009.  Excerpt:

Out of last Friday's all-day Columbia University conference on Google Book Search came this interesting little tidbit: ...Congress, well, just didn’t seem to care about the program.

“Most disturbing of all was [Register of Copyrights Mary Beth Peters'] admission that not one member of Congress has asked the Copyright Office to comment on the settlement," [Peter] Hirtle blogged, “even though it may fundamentally change how Americans can access and use copyrighted information.”

Certainly, that insight has to make one wonder how much Congress cares about the promotion of progress at the bedrock of copyright law. Last year, Congress failed to pass orphan works legislation but passed a draconian bill stiffening infringement penalties. And while sitting out the potentially momentous discussion over copyright as raised by Google Book Search, Congress is again considering the Fair Copyright in Research Works Act—controversial legislation that would bar public access to research funded by taxpayers, and would undo the NIH’s access policy, enacted last year.

Notably, Peters was also not asked to testify at last year's congressional hearing on the Fair Copyright in Research Works Act — but, curiously, a former Register of Copyrights, Ralph Oman, was asked, and did testify. Oman told lawmakers that public access policies, like the NIH’s, would harm publishers and gut copyright.

Comment.  Not only was Peters not asked to testify at last year's hearing on the Conyers bill, her office made it known that it didn't like the bill.  As Albanese reported at the time,

...[S]ources in the library community suggested to the LJ Academic Newswire that her absence was not without meaning: the Copyright Office is said not to be persuaded by publishers’ arguments regarding the NIH public access policy, and sees the recently introduced bill as unnecessary.

The rise of open video

Shay David, Industry Perspectives: The Promise of Open Source Video, Streaming Media, March 18, 2009.  Excerpt:

As we enter 2009 three things about online video become clear: (1) with over 17 billion videos watched in December 2008 in the U.S. alone...video is here to stay; (2) video exhibits classic long-tail distribution—while YouTube remains the dominant player, video is rapidly moving from destination sites to the rest of the web, with millions of sites streaming video as the new mode of communication; and (3) the conversation at all levels is shifting from the technological aspects to the value aspects: not how to build a player or convert between formats but, rather, how to foster audience engagement and monetize on these billions of streams.

These observations are underwritten by the trends that became apparent in the past several years, namely, the rapid reduction in the cost of delivery and the abundance of companies in this ecosystem. For anyone who is part of the video universe, the key question that remains open is what drives value in this brave new world....The short answer, I believe, is to focus on innovation — of formats, user experience, content, or delivery.

And here is where open source video enters the picture: It is a development methodology and distribution strategy that allows each company in the ecosystem to focus on what it does best, instead of replicating the efforts of others. Open source video, introduced to the market by companies such as Kaltura, is being adopted at every level of the ecosystem by industry leaders such as Akamai, Mozilla, and Wikipedia. Its premise is simple: Video is too important of a medium to be controlled by a single player. By espousing the principles of openness at all levels, including formats, technology, and content, and by collaborating in the development process, video can enjoy the force-multipliers that we have seen in other areas of open source software. The result is a better user experience, a reduction in the total cost of ownership, and a focus on innovative value-driven results....

There is a ton of activity around open source video today. Kaltura, a market-leading pioneer in the area of open source video, has partnered with the Participatory Culture Foundation (creators of the popular Miro software), with Wikipedia, and with the Mozilla Foundation, as well as 20 other organizations, and instigated the Open Video Alliance (OVA), an umbrella organization for stakeholders who share the vision and believe in the future of open source video. The OVA is centered on raising awareness and developing standards that promote open source video and coordinate members’ activities....

JISC briefing paper on CC licenses

JISC, Creative Commons Licences, briefing paper, March 9, 2009.
The purpose of this briefing paper is to provide JISC-funded projects, as well as content creators and others within higher (HE) and further education (FE) institutions, with information that can be used to make judgements as to when the use of open content licences, particularly CC licences, may be appropriate.
See also e.g. our past post on the briefing paper on CC licenses from the Digital Curation Centre.

Openness and the future of libraries

Nisha Doshi, Is Society Biased Against "Openness"? A PLoS Board Member’s Perspective on the Future of the Library in the Digital World, Public Library of Science, March 19, 2009. Notes on Cultural Agoraphobia and The Future of The Library (Cambridge, March 12, 2009).

... [James] Boyle challenged the stereotypical image of libraries as conservative, dusty places, arguing instead that they are extraordinary institutions which combine the fundamental roles of archiving, facilitating research and enabling public access to cultural material. Nevertheless, libraries developed in a world where the container for knowledge was the book: only one person at a time could read any one copy of a book, and libraries provided valuable repositories for these documents. In the age of digitised knowledge, books and papers can be read simultaneously by anyone on the planet – in theory, at least. What effect does this have on the notion of the library?

To address this question, we were asked to consider three trends. Firstly, according to Professor Boyle, attitudes towards information dissemination are characterised by a fear of openness. If asked to predict the outcomes of the World Wide Web in 1992, we would have easily predicted the risks of spam, illicit copying, and pornography, and we may have rejected this decentralised model in favour of a closed, carefully-controlled system. In contrast, he suggests, we would not have been able to imagine the blossoming of information and knowledge sharing which the Internet has enabled, nor the dramatic benefits it has brought in areas such as information layering, blogging, open source software, and whistleblowing. This fear of openness, or “cultural agoraphobia”, leads to an asymmetry in our perception of risks and benefits. As a result, we are too reluctant to embrace open systems and methods of production or distribution, and we are not yet able to differentiate effectively between appropriate and inappropriate developments in the sphere of “openness”.

The development of copyright formed Professor Boyle’s second trend: as the cost of copying has decreased to almost zero via the invention of the printing press, copying technologies, audio recording and the Internet, the perceived need for stronger protection has increased – when you needed a monk to copy a book, copyright wasn’t so much of an issue! The third trend which we were asked to contemplate relates to the contrast between the evidence-based nature of most policy-making (for example, in medicine or environmental science) in contrast with the anecdotal or philosophical basis of most intellectual property debates. ...

In a world of open access and digitisation, therefore, do libraries still have a role to play? Absolutely, concluded Boyle. We should increasingly look to digital libraries to provide “global access to everything”, we should invest in ensuring the stability of digital archives to preserve access to cultural objects in the face of evolving technologies, and we should develop novel ways to use and explore the information which is increasingly becoming available to us. ...

See also the podcast of the event.

Thursday, March 19, 2009

Open notebook mathematics

The Polymath1 project is an open collaborative mathematics project initiated on Timothy Gowers's blog and continued in further blog posts and comments and on a wiki. See also the project timeline. (Thanks to Entertaining Research.) And see this later post from Gowers:
Without anyone being particularly aware of it, a race has been taking place. Which would happen first: the comment count reaching 1000, or the discovery of a new proof of the density Hales-Jewett theorem for lines of length 3? Very satisfyingly, it appears that DHJ(3) has won. If this were a conventional way of producing mathematics, then it would be premature to make such an announcement — one would wait until the proof was completely written up with every single i dotted and every t crossed — but this is blog maths and we’re free to make up conventions as we go along. So I hereby state that I am basically sure that the problem is solved (though not in the way originally envisaged). ...
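
PS:  For readers outside combinatorics, here is the statement the project proved, in a standard formulation supplied for context (not quoted from the project).  A combinatorial line in [3]^n is the set of three points obtained from a template in ([3] ∪ {x})^n containing at least one wildcard x by substituting x = 1, 2, 3:

    % DHJ(3): the density Hales-Jewett theorem for lines of length 3.
    % Here $[3] = \{1,2,3\}$ and $[3]^n$ is the set of length-$n$ strings
    % over that alphabet.
    \begin{theorem}[DHJ(3)]
    For every $\delta > 0$ there exists $n_0 = n_0(\delta)$ such that for
    all $n \ge n_0$, every $A \subseteq [3]^n$ with $|A| \ge \delta \cdot 3^n$
    contains a combinatorial line.
    \end{theorem}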

DRIVER survey of repository managers

DRIVER is conducting a survey of European repository managers.

Considered the largest initiative of its kind in helping to enhance repository development worldwide, the aim of DRIVER is to create a portal through which researchers can find and access the wealth of research outputs currently stored in open access repositories across Europe.

At this point in the project's development, we wish to evaluate views of the European repository manager community about the current and potential usefulness of the new DRIVER portal. To this end we would be grateful if you would spare a few minutes to consider and respond to the questions below. ...

See also our past post on DRIVER's companion survey of researchers.

OAIster added to FirstSearch

OCLC's FirstSearch now includes OAIster.  See OCLC's press release from March 16.

This marriage was part of the plan when OCLC acquired OAIster back in January.  While the FirstSearch version of OAIster is behind a price wall, OCLC has promised that "OAIster will remain a permanently free, open access service."

Freeing 90 years of bird migration data

Alexis Madrigal, Open Data: Help Migratory Bird Observations Fly into the Digital Age, Wired Science, March 19, 2009.  (Thanks to Garrett Eastman.)   Excerpt:

The only complete dataset of bird migration patterns in North America is trapped in a basement — and it's going to take the power of crowdsourcing to free it.

Stored on 6 million note cards stretching back to the 1880s, the records of migratory birds were created by a network of thousands of volunteers who recorded birds' comings and goings, then carefully shipped their observations to the government.

All that irreproducible, paper-based data now sits in a basement in Virginia. Short on cash, a group of biologists is taking a page from NASA's citizen-participation playbook. The North American Bird Phenology Program is asking volunteers to transcribe all that paper into a digital database....

PS:  Also see the crowdsourcing project to make an OA transcription of handwritten records from the 1875 Norwegian census.

On publishing raw clinical trial data

Iain Hrynaszkiewicz and Douglas G. Altman, Towards agreement on best practice for publishing raw clinical trial data, Trials, editorial, March 18, 2009. Abstract:
Many research-funding agencies now require open access to the results of research they have funded, and some also require that researchers make available the raw data generated from that research. Similarly, the journal Trials aims to address inadequate reporting in randomised controlled trials, and in order to fulfil this objective, the journal is working with the scientific and publishing communities to try to establish best practice for publishing raw data from clinical trials in peer-reviewed biomedical journals. Common issues encountered when considering raw data for publication include patient privacy - unless explicit consent for publication is obtained - and ownership, but agreed-upon policies for tackling these concerns do not appear to be addressed in the guidance or mandates currently established. Potential next steps for journal editors and publishers, ethics committees, research-funding agencies, and researchers are proposed, and alternatives to journal publication, such as restricted access repositories, are outlined.
See also Iain Hrynaszkiewicz, Can we establish best practice for publishing raw clinical trial data?, BioMed Central Blog, March 18, 2009.

Milestones for University of Nebraska-Lincoln IR

Charles Bailey, Two Million Plus Downloads: University of Nebraska-Lincoln Institutional Repository, DigitalKoans, March 18, 2009.
As of today, the DigitalCommons@University of Nebraska-Lincoln repository contains over 31,000 documents, has had 2,132,581 full-text downloads to date, and 1,307,822 downloads in the past year ...

Forthcoming OA journal of chemistry

The Journal of Systems Chemistry is a forthcoming peer-reviewed OA journal published by Chemistry Central. See the March 19 announcement. There are currently no author-side fees. Authors retain copyright and articles are published under the Creative Commons Attribution License.

An OA book-scanning program from the Library of Congress

The Library of Congress describes its new "Digitizing American Imprints" Program in a 21-minute webcast from January 14, 2009.  (Thanks to ResourceShelf.)  From the blurb:

Deanna Marcum, Library of Congress Associate Librarian for Library Services; Doron Weber, Program Director of the Alfred P. Sloan Foundation; and officials of the Internet Archive participated in a news conference on the Library's "Digitizing American Imprints" program scanning "brittle books" too fragile for standard use, to preserve them, on an open-content basis, for future generations.

PS:  The new pilot project is the LOC's first mass-digitization effort for books, and aims to digitize 100,000 public-domain books for OA.  The occasion for the webcast was the digitization of the 25,000th book.  "Openness" is one of the project's four guiding principles --quality, quantity, openness, and leadership.  Also see the press release issued on the same date (which we blogged at the time).


Wednesday, March 18, 2009

MIT adopts a university-wide OA mandate

This afternoon, the MIT faculty unanimously adopted a university-wide OA mandate.  Here's the resolution the faculty approved (thanks to Hal Abelson, MIT professor of computer science and engineering, who chaired the committee to formulate it):

MIT Faculty Open-Access Policy

Passed by Unanimous Vote of the Faculty, March 18, 2009

The Faculty of the Massachusetts Institute of Technology is committed to disseminating the fruits of its research and scholarship as widely as possible. In keeping with that commitment, the Faculty adopts the following policy: Each Faculty member grants to the Massachusetts Institute of Technology nonexclusive permission to make available his or her scholarly articles and to exercise the copyright in those articles for the purpose of open dissemination. In legal terms, each Faculty member grants to MIT a nonexclusive, irrevocable, paid-up, worldwide license to exercise any and all rights under copyright relating to each of his or her scholarly articles, in any medium, provided that the articles are not sold for a profit, and to authorize others to do the same. The policy will apply to all scholarly articles written while the person is a member of the Faculty except for any articles completed before the adoption of this policy and any articles for which the Faculty member entered into an incompatible licensing or assignment agreement before the adoption of this policy. The Provost or Provost's designate will waive application of the policy for a particular article upon written notification by the author, who informs MIT of the reason.

To assist the Institute in distributing the scholarly articles, as of the date of publication, each Faculty member will make available an electronic copy of his or her final version of the article at no charge to a designated representative of the Provost's Office in appropriate formats (such as PDF) specified by the Provost's Office.

The Provost's Office will make the scholarly article available to the public in an open-access repository. The Office of the Provost, in consultation with the Faculty Committee on the Library System, will be responsible for interpreting this policy, resolving disputes concerning its interpretation and application, and recommending changes to the Faculty.

The policy is to take effect immediately; it will be reviewed after five years by the Faculty Policy Committee, with a report presented to the Faculty.

The Faculty calls upon the Faculty Committee on the Library System to develop and monitor a plan for a service or mechanism that would render compliance with the policy as convenient for the faculty as possible.

Comments

  • This is big.  Another of the world's great research universities has an OA mandate.  Unlike Harvard and Stanford, which now have OA mandates for some of their schools, the MIT mandate is university-wide.  Moreover, while the votes at Harvard and Stanford were unanimous within the relevant schools, and the Boston University vote was unanimous within the University Faculty Council, this is the first university-wide unanimous faculty vote. 
  • Like the Harvard policy, the MIT policy allows faculty to opt-out.  Faculty must make separate requests for separate works, and must give a reason. 
  • MIT developed DSpace (with Hewlett-Packard), of course, and already has an OA repository (also named DSpace) with over 30,000 deposits.
  • The new policy does not specify the method of deposit.  Faculty are not even asked to make direct deposits, merely to make their work available for deposit.  The Provost's Office will ensure that the works make it to the repository; one way such a deposit could be automated is sketched just after this list.  Note that a faculty committee has an essential role in the interpretation and implementation of the policy.
  • Kudos to Hal Abelson and the MIT faculty!
  • Also see our past posts anticipating this policy at MIT (1, 2), and our many posts on MIT's other OA projects, including its pioneering OpenCourseWare.
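
PS:  On the deposit mechanics the policy leaves open: DSpace supports the SWORD deposit protocol, so a proxy deposit made by the Provost's Office on an author's behalf could in principle be automated.  The sketch below is mine, not MIT's actual workflow; the endpoint URI, credentials, and package name are all illustrative, and a real client would first fetch the repository's SWORD service document to discover the correct deposit URI:

    import requests

    # Illustrative SWORD 1.x deposit to a DSpace collection.  The URI,
    # credentials, and package here are placeholders, not MIT's real values.
    DEPOSIT_URI = "https://dspace.example.edu/sword/deposit/1721.1/12345"

    with open("article-package.zip", "rb") as package:
        response = requests.post(
            DEPOSIT_URI,
            data=package,  # stream the zipped article package as the body
            headers={
                "Content-Type": "application/zip",
                "Content-Disposition": "filename=article-package.zip",
                # SWORD packaging identifier for a DSpace METS SIP.
                "X-Packaging": "http://purl.org/net/sword-types/METSDSpaceSIP",
                # The deposit is performed by a proxy on the author's behalf.
                "X-On-Behalf-Of": "author@example.edu",
            },
            auth=("proxy-depositor", "secret"),
        )

    response.raise_for_status()  # a successful deposit returns 201 Created
    print(response.status_code)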

Update (3/19/09).  Also see Hal Abelson's comments:

I chaired the committee that drafted the resolution and led faculty discussions on it throughout the fall. So I’m particularly gratified that the vote was unanimously in favor. In the words of MIT Faculty Chair Bish Sanyal, the vote is “a signal to the world that we speak in a unified voice; that what we value is the free flow of ideas.”

Our resolution was closely modeled on similar ones passed last February by Harvard’s Faculty of Arts and Sciences and by the Harvard Law School, also passed by unanimous vote. Stanford’s School of Education did the same, as did Harvard’s Kennedy School of Government just last Monday.

Harry [Lewis] blogged last month about the execrable “Fair Copyright in Research Works Act” introduced by Rep. Conyers of Michigan, which would repeal the National Institutes of Health mandate on open-access publishing and forbid government agencies from imposing similar mandates. This act is harmful to the progress of science and should be scuttled. Now that there are unanimous votes by faculty at world-leading institutions supporting open access, Rep. Conyers should recognize what everyone else does, and deflate his ill-conceived trial balloon.


The impact of "Roach Motel"

Dorothea Salo, A post-Roach-Motel world, Caveat Lector, March 17, 2009.

["Innkeeper at the Roach Motel"] is an open-access success story. Before it was even officially published, its preprint version had over four thousand item views. ... This is rather remarkable, considering that it is my first published journal article, it’s about a very constrained niche of librarianship, and I myself have not even four years behind me in the profession.

It gets better. The day it was published, Roach Motel had at least five citations in the literature (plus one student project, which I’m not counting). Two of them were in the same Library Trends issue in which Roach Motel itself appears. I don’t know much about citation patterns in library publishing, but five citations pre-publication also strikes me as rather remarkable! ...

What is unremarkable, given the slow pace of publishing, is that the day it was published, Roach Motel was obsolescent. Don’t get me wrong; it still needed publishing, because not everyone is a leader in this space (as in any space), and watching the discussion at the Repository Infrastructure conference in the UK via Twitter, I see the discourse could still do with a kick in the trousers. Even so, as I told Andrew Albanese in the interview I did for Library Journal’s article, we are moving into a post-Roach-Motel world, and this is a good thing.

Consider, for example, Oregon State University Libraries’ new mandate. Voilà, Roach Motel page 105 made a thing of the past: “Poor repository uptake among academic librarians invites faculty to charge their libraries with hypocrisy… Yet no tenure-track libraries with institutional repository programs have mandated deposit for their own staff’s published and presented materials.” ...

[The world is changing.] And in good ways. ...

And nobody will be happier than I when Roach Motel is completely obsolete.

New OA journal on pathogens

Gut Pathogens is a new peer-reviewed OA journal published by the International Society for Genomic and Evolutionary Microbiology and BioMed Central. The article-processing charge is £850/US$1190/€920, subject to discounts or waiver. Authors retain copyright and articles are published under the Creative Commons Attribution License. See the inaugural editorial, published February 3, 2009:

As the open access movement begins revolutionizing the publication of publicly funded biomedical research, it is time to ensure access to important scientific discoveries and opinions within and across the disciplines which have direct bearing on health and quality of life in resource poor settings and in countries of the world where library budgets are seriously shrinking. In this regard, the area of enteric infectious diseases and gut health has been of paramount significance. For example, the enteric infections that cause diarrhoeal diseases already constitute one of the top 10 causes of death and kill approximately 1.81 million people in the developing world (6.9% of total deaths), a figure which is more dramatic and alarming than the number of deaths associated with tuberculosis and malaria put together.

To this end, the International Society for Genomic and Evolutionary Microbiology (ISOGEM) in collaboration with BioMed Central Ltd. has launched Gut Pathogens with the aim of providing a high-quality forum for research on enteric infections of humans and animals. The journal led by three Editors-in-Chief and supported by a highly qualified and organized international Editorial Board publishes open access research articles of repute in areas of biology and the pathogenesis of bacterial, parasitic and viral infections of the gut including their diagnosis, epidemiology and clinical management. ...

Like all BioMed Central journals, Gut Pathogens shall be committed to the open access publication policy – the concept that has revolutionized research access to communities and that was recently endorsed by major funding agencies including the NIH and the Wellcome Trust. ...

OA archaeology and PLoS

Nisha Doshi, Open Access Archaeology and PLoS, Public Library of Science, March 17, 2009.

Starting work in open-access publishing at PLoS Medicine seemed a sensible step for me, having focused my Masters research on an evaluation of the quality and research dividends of an online archaeological database ... Yet, many of my colleagues in archaeology insisted that open access had no place in our discipline, arguing that authors would be unable to fund publication fees and traditional subscription-based journals were too central to assessment of academic merit. A quick search of PLoS journal archives reveals, however, that some archaeological scientists have already embraced the open-access model for dissemination of their research.

Perhaps most significant in terms of its contribution to the discipline is a paper by Houyuan Lu and colleagues from the Chinese Academy of Sciences, Beijing, published by PLoS ONE in February 2009. This study represents a valuable step forward in archaeobotanical analysis of common and foxtail millet – important early cultivars in northern China which have been hitherto difficult to identify at archaeological sites. ...

In addition to archaeobotany, research in archaeogenetics is already well represented amongst PLoS papers. [Note: omitting examples] ...

Open-access archaeological papers published by PLoS are not limited to archaeobotany and archaeogenetics. For example, William Banks, Francesco d’Errico and colleagues present archaeological and chronological data, coupled with high-resolution palaeoclimatic simulations, to argue that Neanderthal extinction did not result from climate change but from competition with anatomically modern humans. ...

Far from representing a dead end, this range of research hints that the open-access model has an important role to play in archaeology. Already, one major field unit (Oxford Archaeology) has committed to “making archaeological knowledge free to access” as part of its Open Archaeology programme and the Alexandria Archive Institute has developed an international facility for open access archiving of primary data from archaeology and related disciplines. Hopefully, the coming months and years will see increasing publication of archaeology in open-access journals such as PLoS ONE, to enable widespread dissemination of research and facilitate international and interdisciplinary access, dialogue and debate.

Google: Why we believe in sharing geodata

Dylan Lorimer, Why we believe in geospatial data sharing, Google Public Policy Blog, March 17, 2009.

In several recent posts, we've highlighted our ongoing efforts to partner with public sector organizations to add their map content to Google Maps and Google Earth. We undertake these partnerships because, by definition, organizations like local governments are the most authoritative source of geospatial data for their jurisdiction. But partnering with governments is a difficult mathematical equation. If you run the numbers for just the U.S. where there are many federal agencies with geospatial data, 50 state governments, some 3,000 counties and over 30,000 cities and towns, you quickly get an idea of the volume of relationships you'd have to develop and manage to add data from all governments to a service like Google Maps.

It's therefore no surprise that we at Google are very supportive of organizations that seek to streamline access to and simplify the sharing of geospatial data. One such organization is the National States Geographic Information Council (NSGIC), the association of U.S. state government GIS agencies. Among NSGIC members' objectives is coordinating the collection and sharing of data within their jurisdictions. Because of the efforts of many NSGIC members, we've managed to efficiently add aerial imagery and other datasets for entire states to our services. ...

Another NSGIC objective, shared by U.S. federal agencies and others, is producing nationwide datasets as part of a National Spatial Data Infrastructure, such as through the Imagery for the Nation program. We've joined others in the technology industry in endorsing such efforts. ...

We applaud the work of GIS agency managers and policymakers who are working, at all levels of government, to ensure that the public's investment in geospatial data is not only shared and thereby used across agencies and governments, but also made readily available to the public through free services like Google Maps. We look forward to collaborating with NSGIC and other organizations to advance such efforts in data sharing.

OA to collection of medical history images

Alexis Madrigal, Rare Trove of Army Medical Photos Heads to Flickr, Wired Science, March 17, 2009. (Thanks to Boing Boing.)

An Army archivist is undertaking a massive project to digitize and make public a unique collection of rare and sometimes startling military medical images, from the Civil War to Vietnam.

This previously unreported archive at the Army-run National Museum of Health and Medicine in Washington, D.C., contains 500,000 scans of unique images so far, with another 225,000 set to be digitized this year.

Mike Rhode, the museum's head archivist, is working to make tens of thousands of those images, which have been buried in the museum's archive, available on Flickr. Working after hours, his team has posted a curated selection of almost 800 photos on the service already.

"You pay taxes. These are your pictures," Rhode said. "You should be able to see them."

The collection includes images of injured veterans, medical treatments (like the hernia operation above), the first airplane crash investigation, and public health warnings about the dangers lice posed to World War II soldiers. ...

The organization that runs the NMHM — the Armed Forces Institute of Pathology, funded by the Department of Defense — hasn't signed off on Rhode's plan to bring medical history photos to the people. ...

Still, Rhode is continuing to push to get the photos, a precious resource, into the light of the internet.

"We have pictures from all types of military conflicts and all different types of medicine and issues in medicine," Rhode said. "We love the stuff that we're able to play with and want to bring it to everyone else in the world." ...

See also the project's blog.

More on a UK Research Data Service

Slides from An International Conference on the UK Research Data Service Feasibility Study (London, February 26, 2009) are now available. See also JISC, Managing UK research data for future use, press release, March 5, 2009.

Nearly 200 delegates gathered in London last week to debate how to capture and manage the UK's rich data resources for future use. Their focus was the findings of the eagerly anticipated feasibility study into a UK Research Data Service (UKRDS), a study funded under the [Higher Education Funding Council for England] shared services programme with additional funding from JISC, Research Libraries UK and the Russell Universities Group IT directors.

The study recommends that there is a need for a UKRDS, based on a ‘cooperative service model’, that would build on existing good practice in data management and fill gaps in provision. More than 700 researchers at four case study universities had been interviewed as part of the investigations.

'The UK is already well provided with an infrastructure on which to build', Jean Sykes, Librarian and Director of IT Services at the London School of Economics and chair of the UKRDS project board, told the conference. This includes several subject-specific national data centres and many JISC initiatives, such as the JANET network, the Information Environment, the Digital Curation Centre and tools developed under the Data Audit Framework projects.

Institutional repositories could fill gaps left by the lack of suitable subject repositories, Dr Malcolm Read, JISC executive secretary, told the conference. 'If we can’t create subject repositories in all disciplines, we need a horizontal vision across institutional and disciplinary repositories.' This should include subject practitioner advice on what to save and how. ...

An executive summary of the (undated) final report is also available. See also our past posts on the UKRDS.

Canadian membership in DOAJ

Heather Morrison, Growing Canadian membership in DOAJ!, The Imaginary Journal of Poetic Economics, March 15, 2009.
The list of Canadian members of the Directory of Open Access Journals is growing! As of today, there are 7 Canadian individual library members, 1 Canadian library consortium, and at least 1 individual member that I know of. ...

Upgrades to Open Library

Anand Chitipothu, Open Library software upgrade, ol-discuss list, March 16, 2009. (Thanks to Open Sesame.)

Open Library has been upgraded to new software with a lot of improvements.

  • Readable URLs: Title of the book/name of the author is now part of the URL ...
  • Better default cover images ...
  • Improved type system: With the improvements to the type system, it becomes easier to model more complex data like table of contents ...
  • More APIs ...
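
As a concrete illustration of the "More APIs" item, here is a minimal sketch that queries Open Library's books API for one ISBN and prints each returned record's title and URL. The endpoint and the jscmd=data response fields follow Open Library's public API documentation as it stands today; the ISBN is only an example, and the APIs available at the time of this 2009 announcement may have differed.

```python
import json
import urllib.request

# Query Open Library's books API for a single ISBN (example value only).
url = ("https://openlibrary.org/api/books"
       "?bibkeys=ISBN:0451526538&format=json&jscmd=data")

with urllib.request.urlopen(url) as response:
    records = json.load(response)

# The response maps each bibkey to a record; 'title' and 'url' are
# standard fields in the jscmd=data response.
for bibkey, record in records.items():
    print(bibkey, "->", record.get("title"), record.get("url"))
```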

Notes on Columbia OA conference

Open Access and Libraries Columbia University March 17, 2009, Book Calendar, March 17, 2009. Notes on Open Access and Libraries (New York, March 17, 2009).

Notes and audio on OA in religion

Bryan Bibb, Open-Access Scholarship: Audio Podcast, Hevel.org: A Chasing after Wind, March 16, 2009.

Last Friday, I gave a presentation at the southeastern regional meeting of the National Association of Baptist Professors of Religion on “Open-Access Journals and the Future of the Scholarly Community.”  I will be writing much more about this topic in the near future.

In addition to this post describing my presentation and the questions afterward, and this page with my outline notes, I have prepared an audio podcast of the talk itself. ...

Political support for OA in India

Sridhar Gutam, Political Support to Open Access, Gutam Sridhar on Open Access, March 17, 2009.

The [Communist Party of India (Marxist)] sets out the alternative path for the country in its Lok Sabha [lower house of the Parliament of India] Elections 2009 Manifesto. On Science and Technology, it has put forth the following points:

  1. ... Promoting free software and other such new technologies, which are free from monopoly ownership through copyrights or patents; “knowledge commons” should be promoted across disciplines, like biotechnology and drug discovery.

Leftists' Support for Free and Open Source

Dinesh Abrol, in his article published in People's Democracy Vol. XXXIII No. 4 (February 01, 2009), suggested the following safeguards to the Indian Parliament:

  1. Institutionalization of the practice of open source culture of knowledge protection and utilization in medical and agricultural research.
  2. Prohibition of exclusive licensing and insisting on the use & practice of non-exclusive licensing/licenses offered by Creative Commons to ensure the protection of public interest in publicly funded research. ...

All research universities in Alberta publish OA journals

Heather Morrison, Alberta: 100% of research universities publish open access journals, The Imaginary Journal of Poetic Economics, March 15, 2009.
Was just browsing DOAJ Canada and noticed that all 4 research universities in Alberta (University of Alberta, University of Calgary, University of Lethbridge, and Athabasca University) are publishing open access journals! Athabasca University is the early leader as the world's first university all-open-access publisher. Is Alberta the first jurisdiction to have OA publishing services available at every research university? ...

More on the new Harvard OA mandate

Niha S. Jain, HKS Allows Article Access, Harvard Crimson, March 18, 2009.  Excerpt:

As part of a University-wide trend to share intellectual property, Harvard Kennedy School faculty voted overwhelmingly last week to allow open access for all scholarly articles written by faculty.

The change—which took effect immediately after a vote at a faculty meeting last Tuesday—means that all faculty will permit the Kennedy School to distribute their articles through the DASH (Digital Access to Scholarship at Harvard) repository, an online database currently being developed by the Office for Scholarly Communication.

This makes the Kennedy School the third Harvard school to allow open access for its journals, following the Faculty of Arts and Sciences and Harvard Law School.

According to Doug L. Gavel, media relations manager at the Kennedy School, approximately 98 percent of faculty members who attended the meeting voted in favor of the new policy.

“As an academic, I’m very excited to be able to publish my articles broadly,” said Erich J. Muehlegger, an assistant professor at the Kennedy School....

“I understand the economic pressures the publishing houses are under. They have been losing money for a long time,” said Professor Matthew A. Baum. “But I think it’s unfortunate when intellectual property gets put under lock and key.”

Similarly, Professor Stuart M. Shieber said that while there was a worry that allowing open access might affect the business model of subscription-based journals, he personally felt there are no drawbacks to an open access program. He added that faculty members who would prefer that their articles not be accessed by the general public can opt out of the program for any article.

Professor and Director of the University Library Robert C. Darnton said he saw no immediate threat to scholarly journals. He said support for open access has been overwhelming because many faculty members believe that they have a responsibility to the larger public.

Associate Dean for Research Matthew L. Alper added that faculty are focused on the issue of the public good, with a desire to share their research quickly and cheaply, particularly in the developing world where library access is limited.

While many students were not aware of the policy change, they said they thought it was a positive development. Kennedy School students Daniel C.I. Bjorkegren and Dan M. Rakove said they thought making academic information available to the public was an important part of the school’s mission.

Darnton said he expects that more Harvard schools will adopt open access. “I feel that the momentum is building up so that Harvard’s example will spread far and wide,” he said.

Also see John Lauerman, Harvard Government School to Offer Access to Faculty Papers, Bloomberg News, March 17, 2009.  Excerpt:

...The John F. Kennedy School of Government voted March 10 to become the latest unit at the Cambridge, Massachusetts, university to approve posting faculty papers online, the school said yesterday in an e-mail. The only exemptions will be articles covered by exclusive contracts with publishers, the school said....

“[I]t’s the best policy anywhere,” said Peter Suber....“It shifts the default so Harvard faculty must make their work openly available unless they opt out. The default at most universities is the other way around: you have to choose open access and arrange for all the [permissions].” ...

The U.S. National Institutes of Health requires scientists who accept funding from the agency to allow open access to their research. The NIH mandate helps university scientists who want their works to be made public when they negotiate with publishers who may prefer to have exclusive rights to the work they print, Suber said.

“I’ve been scanning the horizon and haven’t seen a publisher flatly refuse” to publish work that NIH requires to be made public, Suber said in a telephone interview. “A lot of publishers are unhappy with the policy, but as far as I know they’re still publishing work by NIH-funded authors when it meets their standards.”

Open access to research and new ideas is vital to the development of poor countries, said Calestous Juma, a Kennedy School professor of the practice of international development. His main concern is that people be credited and compensated for ideas that lead to inventions, he said. Developing countries that gain free access to research must continue to respect intellectual property rights, he said.

“For society to be creative you need as much information as possible in the public domain,” he said. “At the same time, you have to provide incentive for innovation by allowing people to own what they create. The two are complementary rather than contradictory.”

More writers are insisting on free access to their own research, according to Juma, who said he is an editor at two journals.

“There have been times when authors have said they would like to publish with us but they want to make sure the articles are publicly available,” he said yesterday in a telephone interview. “We have made them available, and that hasn’t prevented libraries from buying the journals.”

Streamlining the production of OA webcasts

openEyA is a new tool to convert classroom lectures or conference presentations to OA video. 

From today's announcement:

The ICTP Science Dissemination Unit (SDU) [of the Abdus Salam International Centre for Theoretical Physics] is pleased to announce the release of an innovative automated recording system: openEyA, which integrates different technologies under free Ubuntu Linux OS. The project's goal is to automate as much as possible the production and post-production processes of academic webcasting in order to reduce costs....

We aim to help bridge the knowledge divide in developing countries.

From the web site:

openEyA...has these main features:

  • no dedicated human intervention during recording and post-processing (no need for an operator or editor); the automated audio-video-slides synchronization is done in a few minutes.
  • scalable architecture and portable (from one classroom to many);
  • no special requirements for the speaker or lecturer (no need to press buttons during the lecture or to wear a microphone, etc.);
  • high resolution of zoomable images...
  • low total cost of hardware implementation;
  • low-bandwidth friendly features (synchronized recordings are saved as zip files)....

Swedish archaeology journal converts to novel form of delayed OA

Fornvännen, the Swedish journal of archaeology published by the Royal Swedish Academy of Letters, has converted to a novel form of delayed OA. 

From the announcement by Martin Rundkvist, its managing editor:

For a bit more than a year, Fornvännen's first 100 years (1906-2005) have been freely available and searchable on-line....Now we've gone one step further and made the thing into an Open Access journal. The site's run of the journal is complete up to 6 months ago, and every issue will henceforth appear on-line half a year after it was distributed on paper....

Many thanks to my friends Gun Larsson and Kerstin Assarsson-Rizzi of the Library of the Academy of Letters who have been the driving forces behind our on-line move!

From the English version of the Fornvännen "about" page:

The [digital] Table of Contents and Abstracts will be published [online] simultaneously with the print edition at fornvannen.se, while the full text of the articles will be published electronically 6 months after the print edition has reached its subscribers. The articles will then be available free of charge. The Table of Contents and links to available articles in full text will be found at the webpage of the annual issue.

Comment.  Very few journals have been able to make OA digital editions and priced print editions coexist without some delay in the digital edition.  It's not impossible; Medknow does it.  But Fornvännen is trying a new form of delayed OA:  The full-text digital edition is OA from the moment it appears online, but it doesn't appear online until six months after the print edition.


Tuesday, March 17, 2009

New issue of D-Lib

The March/April 2009 issue of D-Lib is now available. See especially:
  • Mats Dahlström and Alen Doracic, Digitization Education: Courses Taken and Lessons Learned

    ... While there is much ado in [cultural heritage] digitization policy statements about making the material usable and re-usable, particularly the latter ideal is hardly ever realized, certainly not where we are talking about enabling research and education users (not to mention "deep sharing" with other memory institutions, cf. Seaman, 2004) to download rich archival material, choose segments from it and reuse them in new projects and contexts.

    We try to counter this by teaching the students open access, deep access and open source approaches, and we furthermore require the students to work with and to deliver their project materials on an open access basis, encouraging them to apply a Creative Commons license to their intellectual enhancements of the digitized material. ...

  • Ana Alice Baptista, Report on the 2nd Ibero-American Conference on Electronic Publishing in the Context of Scholarly Communication (CIPECC 2008)

Library responses to the NIH policy

Joseph Thomas is conducting a survey of library responses to the NIH policy.  See his call for participants (dated today):

...How are academic libraries responding [to the policy]?  The survey will provide information on what methods libraries are using to engage their campuses on a variety of scholarly communications issues, including the NIH Public Access Policy, and how successful they feel these efforts have been. Survey responses will inform a presentation to take place at the Annual Conference of the North American Serials Interest Group in June 2009.

The survey contains 25 questions and should take approximately 20 minutes to complete....

Please follow the link to take the survey: [here]....

The survey will remain open until Friday, May 15, 2009....

Irreproducible research

Peter Murray-Rust, Closed Data at Chemical Abstracts leads to Bad Science, A Scientist and the Web, March 17, 2009. Comments on Alan H. Lipkus, et al., Structural Diversity of Organic Chemistry. A Scaffold Analysis of the CAS Registry, Journal of Organic Chemistry, May 28, 2008.

... The data come - according to the authors - from a snapshot of the [Chemical Abstracts Service] registry in 2007. I believe the following to be facts, and offer to stand corrected by CAS: ...

  • CAS sells a licence to academia (SciFinder) to query their database. This does not allow re-use of the query results. Many institutions cannot afford the price.
  • There are strict conditions of use. I do not know what they are in detail but I am 100% certain that I cannot download and use a significant part of the database for research, and publish the results. Therefore I cannot - under any circumstances - attempt to replicate the work. If I attempted, I would expect to receive legal threats or worse. Certainly the University would be debarred from using CAS.

The results of the paper - such as they are - depend completely on selection of the data. ...

There are many data sources which are unique - satellite, climate, astronomical, etc. The curators of those work very hard to provide universal access. Here, by contrast, we have a situation where the only people who can work with a dataset are the people we pay to give us driblets of the data at extremely high prices. ...

But to use a monopoly to do unrefereeable bad science is not worthy of a learned society.

Nominations wanted for 2009 SPARC Europe Award

It's time to send SPARC Europe your nominations for its 2009 Award for Outstanding Achievements in Scholarly Communications.  From today's announcement:

SPARC Europe (Scholarly Publishing and Academic Resources Coalition), a leading organization of European research libraries, today announced the opening of nominations for the Fourth SPARC Europe Award for Outstanding Achievements in Scholarly Communications.  Launched in 2006, this annual Award recognises an individual or group within Europe that has made significant advances in our understanding of the issues surrounding scholarly communications and/or in developing practical means to address the problems with the current systems.  The First Award, in 2006, was presented to the Wellcome Trust, with the second in 2007 going to the SHERPA group and the third in 2008 to Leo Waaijers.

Nominations are open to all who have made major contributions in the field of scholarly communications, and the judging panel, formed from members of the SPARC Europe Board of Directors, particularly wishes to receive nominations for individuals or groups working in any of the following areas:

  • Research that helps illuminate the scholarly communications landscape
  • Advocacy for new models of scholarly communications
  • Development of new tools to aid scholarly communication (e.g. repository software)
  • Interesting new projects or products
  • Implementation of policies that promote new scholarly communication models.

Nominations may come from any part of the world, but nominees should work mainly within Europe.  (Self-nominations will not be accepted.)  Preference will be given to activity within the past two years. Nominations, together with a short (approximately 500 words) outline of the nominee’s work, should be sent to David Prosser, Director of SPARC Europe, no later than 20th April 2009. The Award will be presented at the CERN Workshop on Innovations in Scholarly Communication (OAI6), to be held in Geneva, Switzerland, 17-19 June 2009.

Crowdsourced annotation of a Flickr photo collection

Patrick Peccatte, Flickr et PhotosNormandie: une entreprise collective de redocumentarisation, Documentaliste - Sciences de l'information, 2009; self-archived March 12, 2009. English abstract:
PhotosNormandie began as a collection of 2,763 copyright-free photos of the Battle of Normandy and a private initiative to correct errors and enhance the quality of their descriptions. The choice: to use the features of the photo-sharing platform Flickr to improve the indexing of this [source].
See also the collection.

Italian thesis on OA

Claudio Marconi, Open Access e archivi aperti: nuove modalità di diffusione della letteratura scientifica, BA thesis, Università degli Studi di Roma “La Sapienza”, Facoltà di Scienze della Comunicazione, 2008; self-archived March 16, 2009. English abstract:
Knowledge, as the primary good of modern society, is a widely discussed topic, so much so that society is often defined in its terms as the "information society". But what about the particular form of knowledge on which much of today's well-being depends: scientific knowledge? Scholars need to publish. Academic careers depend on publishing, and without scientific publications it would not be possible for those who decide to undertake research to build anything: "we are like dwarfs on the shoulders of giants". Scholars, however, soon learn that traditional scientific publications, except for texts adopted as manuals, are by no means profitable. Equally, the resulting texts are not public but private. In the hands of publishers they become subject to restrictive policies aimed at maximizing profit, not at dissemination, as any scholar would like. Libraries are forced to buy back for their users what their colleagues have produced without any expectation except the broadest possible circulation. A third party, the publisher, becomes the filter and a real obstacle to the dissemination of scientific knowledge. The argument presented here is based on this paradox, in the belief that the technology and markets of speech affect the word itself, suggesting a review of recent developments that have affected the scientific communication system and discussing the movement for Open Access to scientific literature, whose goal is precisely the reconciliation between the cultural practices of scholars and the economy on which these practices are based.

African statement on access to knowledge

Representatives of the parliaments of 27 African countries and 4 intergovernmental organizations signed the Kigali Declaration on the Development of an Equitable Information Society in Africa at a recent conference in Kigali, Rwanda. (Thanks to the Association for Progressive Communications.)

... Acknowledging that equitable access to information is a right for all;

Recognising the vast inequities in equitable access to information, knowledge, and affordable communications across the continent, the uneven development to which this contributes, and the negative impact that the high cost of communications services has on the wider economy;

Considering the critical role that information and communication technology (ICT) can play in the economic growth and development of nations;

Acknowledging that the quality of a democracy is dependent on the rights of the citizenry to express themselves freely and to access information and knowledge in order to make informed decisions;

Realising the significant role that parliaments must play in promoting an equitable information society through the enactment of legislation, which ensures transparency, accountability, openness and effective oversight; ...

Call upon all African Parliaments to: ...

Urge the development of policies and initiate and enact laws that promote equitable access to information, communication and knowledge and provide for favourable institutional arrangements for the ICT sector. ...

OA to Finnish economics research

Päivi Kanerva, Open access -julkaiseminen Turun Kauppakorkeakoulussa, a Master's thesis approved by the University of Tampere on December 17, 2008.  (Thanks to Jyrki Ilva.) 

Read it in Finnish or Google's English.  The title in Google's English = Open Access Publication of the Turku School of Economics.  The Turku School of Economics is in Turku, Finland.

A larger view of free culture

Roger Lancefield, Free Culture versus Freetard, Ralpress, March 16, 2009.  (Thanks to Glyn Moody.)  Excerpt:

...Free culture isn’t about ripping off someone else’s “intellectual property”, neither is it about a generation of youngsters who are growing up with the expectation that expensively produced content should be available at zero cost. These are just temporary issues and concerns, thrown up by the inexorable shift from a world in which most content is permission-based, to one in which most content will be governed by copyleft-style licenses or else is released into the public domain. Free culture is epitomized by innovation and collaboration, it builds networks of people and content, it encourages and facilitates mutual help and support, it leads to the creation of many thousands of open and free collections of knowledge and media, it helps us reclaim data which by rights belongs to us rather than to government or corporations. Far from being all about obtaining the hard work of others for nothing, free culture is instead characterised by giving for nothing, it’s about contributing and collaborating without the expectation of financial reward....

Free culture would quickly run out of gas if people only ever made withdrawals instead of paying in....

Free culture is not primarily a political movement, it’s the natural result of mass ownership of myriad devices that can share data....

It is not a fad, it goes much deeper. Characterising it in narrow terms as a politically motivated cult, or as a commercially damaging movement is missing the big picture, for these things are not of its essence. It is first and foremost a technology-facilitated extension of our normal modes of behaviour, of our normal desires, and this is why it is inevitable, profound and unstoppable.

A plan for open access and open data in chemistry

Steven M. Bachrach, Chemistry publication - making the revolution, Journal of Cheminformatics, March 17, 2009.  (The DOI-based URL doesn't work at the moment.) 

Abstract (provisional):   The advent of the Internet has been the impetus for the Open Access movement, a movement focused on expanding access to information principally by reducing the costs of journals. I argue here that the Open Access movement has had little impact on the chemistry community and has taken our attention away from the real opportunity to revolutionize scientific communication. I propose a plan that both reduces the total cost of publishing chemistry and enriches the literature through incorporation of Open Data. By publishing lots of data, available for ready re-use by all scientists, we can radically change the way science is communicated and ultimately performed.

From the body of the paper:

Most OA journals operate on an author-pays model, and so it appears that the cost of publication simply is shifted from the reader to the author....

I offer a two-tier publication model that attempts to address both the ever-increasing cost spiral and to bring new technologies to bear on scientific communication....

Tier 1 publishing: We need to reduce the number of full-service, full-feature journals and the number of articles they publish. I believe that each of us has in their minds a hierarchy of journals – those in which we publish our best works and which we read religiously, those we peruse on occasion and those that we consider rubbish. Each of us should support only those top-tier outposts of science. We should publish in, peer-review, serve on editorial boards, and ask our libraries to purchase just these journals. For all others we should decline our services and our moneys. We should end up with perhaps 20% of the total number of current journals....

This is not to say that we can completely do without all of the rest of the literature. Clearly, the other quarter of the citations in the book are important too! This brings me to the second tier.

Tier 2 publishing: The remaining say 80% of the scientific literature should be “published” within institutional repositories (IR). I place quotations around the word “publish” to indicate that this effort is quite different in kind from what is carried out in Tier 1; Tier 1 publishing includes all of the components we commonly associate with the publishing endeavor (Chart 1): fulltime staff of editors and copy-editors, peer review of articles, typesetting, widespread dissemination and archiving. Institutional repositories will provide just the role of dissemination and archiving. Authors will write and typeset the articles themselves, and then deposit in the IR. Peer review will occur by way of the community interacting through Web 2.0 tools, such as forums, wikis, tagging and blogs, which originate within the IR or are linked into it....

Comments

  • I won't comment on Bachrach's two-tier proposal.  But I will comment on one of his main premises.  It's not true that "Most OA journals operate on an author-pays model....."  We've known since 2005 that it's not true, and I'm surprised that the claim made it through peer review at a BMC journal.  In fact, most OA journals charge no fees at all.
  • Bachrach's claim may be true about chemistry, but he hasn't cited any data to show it.  If anyone wants to do a manual count, go to the "for authors" section of the DOAJ and look at the chemistry journals.  Each one is tagged to indicate whether or not it charges a publication fee.   (This doesn't work if you look at the chemistry journals in the main section of the DOAJ.)

Update (3/17/09).  Bill Hooker has done the manual tallies of fee-based and no-fee OA journals in chemistry.  Looking at the full OA journals only, as opposed to hybrid OA journals, here are the results:  42% (= 38) charge publication fees, 49% (= 44) charge no fees, and 9% (= 8) are not classified one way or the other by the DOAJ.  (Thanks, Bill!)
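
To make the arithmetic behind those percentages explicit, here is a trivial tally sketch using Bill Hooker's counts from the update above:

```python
# Bill Hooker's tallies of full (non-hybrid) OA chemistry journals in the DOAJ.
counts = {
    "charge publication fees": 38,
    "charge no fees": 44,
    "not classified by the DOAJ": 8,
}

total = sum(counts.values())  # 90 journals in all
for label, n in counts.items():
    # Rounded to whole percents: 42%, 49%, and 9%, as reported above.
    print(f"{label}: {n} ({n / total:.0%})")
```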

University of Chicago requires electronic submission of theses and dissertations

The University of Chicago has stopped accepting paper copies of theses and dissertations and now requires electronic submission.  (Thanks to Charles Bailey.)

Comment.  But does Chicago require or even provide OA for these electronic theses and dissertations (ETDs)?  After studying many university ETD policies in 2006, I concluded:

In principle, universities could require electronic submission of the dissertation without requiring deposit in the institutional repository.  They could also require deposit in the repository without requiring OA.  But in practice, most universities don't draw these distinctions.  Most universities that encourage or require electronic submission also encourage or require OA.  What's remarkable is that for theses and dissertations, OA is not the hard step.  The hard step is encouraging or requiring electronic submission.  For dissertations that are born digital and submitted in digital form, OA is pretty much the default.  I needn't tell you that this is not at all the case with journal literature....

I believe that's still true.  However, Chicago appears to be one of the exceptions.  As far as I can tell (from ROAR and OpenDOAR) Chicago doesn't have an institutional repository, either for faculty research or for student ETDs.  The new page on ETDs says nothing about deposit in any other OA repository (such as the NDLTD, which functions as a universal or fall-back OA repository for ETDs).  The same page seems to limit ETD access to ProQuest UMI.  That's not necessarily a bar to OA.  ProQuest will provide OA, but it charges an author-side fee to do so and requires an affirmative request.  Chicago students would be much better off if the university launched a repository and deposited all new ETDs there, with reasonable exceptions and permissible embargoes, but without charge.  This is entirely compatible with maintaining a relationship with ProQuest.

Recommendations and working examples for the Obama administration

Erik Wilde, Eric C. Kansa, and Raymond Yee, Proposed Guideline Clarifications for American Recovery and Reinvestment Act of 2009, University of California School of Information, working paper, March 16, 2009.

Abstract:   The Initial Implementing Guidance for the American Recovery and Reinvestment Act of 2009 [from the Obama administration's Office of Management and Budget (OMB)] provides guidance for a feed-based information dissemination architecture. In this report, we suggest some improvements and refinements of the initial guidelines, in the hope of paving the path for a more transparent and useful feed-based architecture. This report is meant as a preliminary guide to how the current guidelines could be made more specific and provide better guidance for providers and consumers of Recovery Act spending information. It is by no means intended as a complete or final set of recommendations.

From the body of the report:

...The goal of [the OMB's feed-based] architecture is to make sure that spending can be openly and transparently tracked at the agencies receiving funds from the Recovery Act, and to make sure that data can be aggregated and re-published on the portal being set up at [Recovery.gov].

The feed-based publishing of data proposed by the Office of Management and Budget (OMB) represents an exciting first step in making government transparency on the Web a reality....However, as yet, the OMB guidelines for feed implementation (in terms of how to publish feeds and what to publish in them) remain underspecified and require some modification and expansion to be more effective. This document serves as a guide towards making feeds work better for their stated transparency and accountability goals....

Along with this report, [we have made available] a sample set of simulated datasets and sample files....The data used for the samples is mostly fictional and partly based on the information already available through the first recovery Web sites. All sample data can be found in feeds.xml, which conforms to feeds.xsd. Using feeds.xslt, the sample dataset has been converted into a set of "Web sites" and feeds, and the main purpose of this was to get some data to build on as a prototyping and experimentation platform....
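
To make the feed-based architecture concrete: if each agency publishes its Recovery Act spending items as a standard Atom feed, a consumer can aggregate them with a few lines of code. Below is a minimal sketch using the feedparser library; the agency URLs are hypothetical placeholders, and the report's actual feed vocabulary is richer than the generic Atom fields shown here.

```python
import feedparser  # third-party library: pip install feedparser

# Hypothetical agency feed URLs, standing in for real Recovery Act feeds.
agency_feeds = [
    "https://agency-a.example.gov/recovery/feed.atom",
    "https://agency-b.example.gov/recovery/feed.atom",
]

# Pull every entry from every agency feed into one pool.
entries = []
for url in agency_feeds:
    feed = feedparser.parse(url)
    entries.extend(feed.entries)

# Re-publish or inspect the aggregated entries, newest first.
# 'updated', 'title', and 'link' are standard Atom entry fields.
for entry in sorted(entries, key=lambda e: e.get("updated", ""), reverse=True):
    print(entry.get("updated", "?"), entry.get("title", "?"), entry.get("link", "?"))
```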

For more perspective, see the blog posts by the report co-authors Erik Wilde and Eric Kansa.

Supporting Australian repositories

The Council of Australian University Librarians (CAUL) has launched the CAUL Australian Institutional Repository Support Service (CAIRSS).  From yesterday's announcement:

Following discussions with the Department of Innovation (DIISR), the Council of Australian University Librarians (CAUL) Executive, and the ARROW Management Committee, CAUL is establishing the CAUL Australian Institutional Repository Support Service (CAIRSS), to provide support for all institutional repositories in Australian universities, regardless of the software being used. CAIRSS will be overseen by CAUL and is funded by the residue of the ARROW funds plus the annual Australasian Digital Theses Program (ADT) contribution, to provide support beyond the end of 2008 over a two- to three-year period.

CAUL has appointed the University of Southern Queensland, through the Australian Digital Futures Institute (ADFI) within the Division of Academic Information Services (DAIS) to undertake its new institutional repository support service. The service will commence officially on March 16, 2009.

PS:  For related initiatives see the OAD list of Services to support repository managers.

OpenStreetMap has 100,000 users

OpenStreetMap has passed the milestone of 100,000 registered users.  (Thanks to Glyn Moody.)

Gaming download numbers at an OA repository

Benjamin Edelman and Ian Larkin, Demographics, Career Concerns or Social Comparison: Who Games SSRN Download Counts? Harvard Business School Working Papers, February 19, 2009.  (Thanks to ResourceShelf.)

Abstract:   We use a unique database of every SSRN paper download over the course of seven years, along with detailed resume data on a random sample of SSRN authors, to examine the role of demographic factors, career concerns, and social comparisons on the commission of a particular type of gaming: the self-downloading of an author’s own SSRN working paper solely to inflate the paper’s reported download count. We find significant evidence that authors are more likely to inflate their papers’ download counts when a higher count greatly improves the visibility of a paper on the SSRN network. We also find limited evidence of gaming due to demographic factors and career concerns, and strong evidence of gaming driven by social comparisons with various peer groups. These results indicate the importance of including psychological factors in the study of deceptive behavior.


Monday, March 16, 2009

Harvard's Kennedy School adopts an OA mandate

Harvard's John F. Kennedy School of Government has adopted an OA mandate.  From today's announcement:

The faculty of the John F. Kennedy School of Government at Harvard University voted overwhelmingly last week to make all faculty members' scholarly articles publicly available online at no charge, providing for the widest possible dissemination of faculty research and scholarship. The historic vote adds Harvard Kennedy School to a growing list of faculties at the university to endorse the initiative.

"The scholarly articles authored by Harvard Kennedy School (HKS) faculty members enhance the understanding of many critical and urgent public policy issues, and by embracing open access we seek to maximize the avenues by which these ideas are shared," said Kennedy School Dean David T. Ellwood. "In the developing world especially, where access to expensive journals is rare, there is a pressing need for access to the latest policy advice and scholarship coming from HKS faculty."

Under the new policy, HKS will make articles authored by faculty members available in an open access online repository, the contents of which will be searchable through web tools such as Google Scholar. Authors will maintain the right to distribute articles on their own websites, and educators will have the right to freely provide the articles to students, so long as the materials are not used for profit. Faculty members will have the ability to "opt out" of the requirements of the new policy in the case of specific articles in which the policy proves to be incompatible with the obligations under a particular publisher's contract.

The vote at HKS follows a proposal by a university-wide committee aimed at encouraging wider dissemination of scholarly work. The Faculty of Arts and Sciences (FAS) and faculty at Harvard Law School (HLS) each voted last year to endorse legislation similar to that approved by the HKS faculty.

Stuart M. Shieber, the James O. Welch Jr. and Virginia B. Welch professor of computer science (FAS) and faculty director of the Office for Scholarly Communication, who introduced the legislation last year said, "I am delighted that the faculty of yet another school at Harvard has chosen to embrace open access and that momentum is spreading on campus. Now that the Office for Scholarly Communication is staffed, the DASH (Digital Access to Scholarship at Harvard) repository is in development, and we have experience working with both the Faculty of Arts and Sciences and the Law School, we look forward to working with the Harvard Kennedy School to implement their faculty’s important decision smoothly and efficiently."

Although other academic institutions have considered similar open access policies, none are considered as far-reaching as the one put forth at Harvard.

Comment.  This is the third Harvard OA mandate, after the mandates at the Faculty of Arts and Sciences and the Law School.  A fourth is in development at the Medical School.  It's also the third Harvard OA mandate adopted by faculty vote.  It follows the exemplary pattern of the FAS and Law School policies, requiring OA through the IR but offering the possibility of an opt-out (which applies only to OA, not to both OA and deposit).  Kudos to Stuart Shieber, the Office for Scholarly Communication, and all involved at the Kennedy School.

Update (3/17/09).  Also see Andrew Albanese's article in Library Journal:

...Notably, unlike the [Faculty of Arts and Sciences] vote, the [Kennedy School] faculty vote was not unanimous. Harvard spokesman Doug Gavel said he was not privy to the final vote tally, but that the vote passed overwhelmingly, with some 98 percent voting in favor. Kennedy School dean David T. Ellwood, however, is clearly in support of the mandate. “In the developing world especially, where access to expensive journals is rare, there is a pressing need for access to the latest policy advice and scholarship coming from [Kennedy School] faculty," he said in a statement.


OA advocates among LJ's Movers & Shakers 2009

Library Journal has named Dean Giustini and Dorothea Salo among the Movers & Shakers of 2009.

The citation on Dean doesn't mention his OA work.  But for details, see our past posts on him.

From the citation on Dorothea:

Cassandra of Open Access.  [Dorothea is the] digital repository librarian at the UW-Madison Library....“Dorothea is the Cassandra of open access,” says Laura Crossett, branch manager, Park County Library System, Meeteetse, WY, “the one speaking powerful, critical truths that most people would rather not hear.”

Take for example Salo's seminal and widely cited fall 2008 article for Library Trends, Innkeeper at the Roach Motel. “I became the 'Innkeeper at the Roach Motel,'” explains Salo, “when I used the phrase roach motel to explain to a faculty member that the repository didn't have document versioning; documents went in, but they didn't come out to be reworked.” The article builds on topics Salo often visits at her blog, Caveat Lector (“let the reader beware”), where she's similarly become known for pulling no punches.

“Open access starts at home,” says Salo, who sees the profession as “disastrously timid about supporting experimentation and the business models we think preferable, speaking truth to power, even just modeling the behaviors we want faculty to adopt.” Issuing a call to arms, she warns, “We can't just wring our hands about the serials crisis any longer. If we want results, we need to put our market power and our praxis where our mouth is.”

PS:  Congratulations to Dean and Dorothea. 

Clay Shirky on newspapers

Clay Shirky, Newspapers and Thinking the Unthinkable, Clay Shirky, March 13, 2009. 

This thoughtful piece is about newspapers, not scholarly journals.  But how far do the insights carry over?  Excerpt:

...It makes increasingly less sense even to talk about a publishing industry, because the core problem publishing solves — the incredible difficulty, complexity, and expense of making something available to the public — has stopped being a problem....

When someone demands to know how we are going to replace newspapers, they are really demanding to be told that we are not living through a revolution....

The newspaper people often note that newspapers benefit society as a whole. This is true, but irrelevant to the problem at hand; “You’re gonna miss us when we’re gone!” has never been much of a business model....

In craigslist’s gradual shift from ‘interesting if minor’ to ‘essential and transformative’, there is one possible answer to the question “If the old model is broken, what will work in its place?” The answer is: Nothing will work, but everything might. Now is the time for experiments, lots and lots of experiments, each of which will seem as minor at launch as craigslist did, as Wikipedia did, as octavo volumes did....

Society doesn’t need newspapers. What we need is journalism....

When we shift our attention from ’save newspapers’ to ’save society’, the imperative changes from ‘preserve the current institutions’ to ‘do whatever works.’ And what works today isn’t the same as what used to work....

PRC report confirms majority publisher support for green OA

The Publishing Research Consortium has released a new report by Sally Morris, Journal Authors’ Rights:  perception and reality (March 2009). 

From today's announcement:

Using re-analysis of the recently published ALPSP report Scholarly Publishing Practice 3 (which looks at the practice of 181 publishers, representing 75% of all articles), and a new survey of 1163 authors, the report compares what publishers actually allow authors to do with the different versions of their manuscript, and what they want to do and believe they are permitted to do.

For both the submitted and the accepted version of their manuscript, the majority of publishers’ agreements (as calculated by the number of articles they publish) allow authors to provide copies to colleagues, to incorporate into their own works, to post to a personal or departmental website or to an institutional repository, and to use in course packs;  just under 50% also permit posting to a subject repository.  However, far fewer authors think they can do any of these than are in fact allowed to do so.

The published PDF version is the version that authors would prefer to use for all the above purposes; again, publishers’ agreements exceed authors’ expectations for providing copies to colleagues, incorporating in subsequent work, and use in course packs.  However, the picture is turned on its head when it comes to self-archiving;  more than half of authors think that publishers allow them to deposit the final PDF, whereas under 10% of publishers actually permit this – probably because of serious concerns about the long-term impact on subscriptions.

Why do authors have such a poor understanding of publishers’ agreements?  The PRC concludes that publishers need to do much more to make sure that their terms are crystal clear, but also suggests that the ambiguous term ‘preprint’ may mislead authors, and should be dropped in favour of the recommended NISO terminology ["author's original" or "submitted manuscript under review" or "submitted version"].

Comments

  • This is a welcome report.  It's true that most publishers allow authors to deposit their peer-reviewed manuscripts in institutional repositories and it's true that this important fact is not widely known.  I included this fact in my short list of Six things that researchers need to know about open access (February 2006), and could support myself and half the OA journals in my field if I got a dollar for every time I've had to repeat it.  In the April issue of my newsletter I'm writing about common misunderstandings about OA, and I'll be repeating this fact once again, as the correction to an insidious error. 
  • I admit that in dark moments over the past seven or eight years I've thought that publishers did not want to publicize this fact, and that publishers who'd already agreed to permit green OA were more interested in rescinding their permission than in enlightening authors about the opportunity.  This report is welcome in part for the evidence that publishers want to enlighten authors.
  • One lesson from this fact:  At most publishers and most journals, the door is already open to green OA.  We are impeded more by authors who don't seize the opportunity than by publishers who don't give permission.  The primary role for funder and university OA policies is to nudge authors, who from inertia, busyness, or lack of familiarity with their options, are not already acting.  (A secondary role, at least at rights-retention or loophole-free mandates, like that at the NIH, is to secure permission at 100% of the journals where researchers publish, not just those already granting permission.)
  • If you look at figures 8 and 10 in the summary paper, you'll see that more than 70% of publishers surveyed for this study allow authors to deposit the "accepted version" of their manuscript in an institutional repository.  Unfortunately, I have to read this value off a bar chart and can't find the exact number anywhere in the text.  As of today, SHERPA reports that 51% of surveyed publishers allow authors to deposit their "postprint" ("final draft post-refereeing") in at least one kind of OA repository.  While PRC and SHERPA use slightly different terminology, these appear to be attempts to measure the same quantity.  The disparity needs some explanation.  The PRC study surveyed 400 publishers and got 203 usable responses (p. 5), and SHERPA has surveyed 545 publishers.  How much of the disparity, if any, is due to the publishers SHERPA surveyed which PRC did not?  How much, if any, is due to their use of slightly different terms? 

Update (3/16/09).  Also see Stevan Harnad's comments:

This report is welcome for strongly confirming what was already known from the Romeo directory of publisher self-archiving policies....

It is also quite correct that:

(1) Most publishers endorse only the immediate, unembargoed self-archiving of the author's refereed, revised, accepted final draft, not the publisher's proprietary PDF.
(2) Most publishers endorse immediate, unembargoed self-archiving only on the author's institutional website, not on a 3rd-party website, such as a central or subject-based repository.

Both of these limitations are just fine and in no way limit or compromise the provision of (Green, gratis) Open Access. What would-be users worldwide who do not have subscription access to the publisher's proprietary PDF urgently need today is access to the refereed research itself, and that is what depositing it into the author's Institutional repository provides.

Although the word "print" is somewhat misleading in the online era, because most eprints are not printed out at all, but consulted only in their online version, the preprint/postprint distinction is perfectly coherent: a preprint is any draft preceding the author's final, accepted, refereed version, and a postprint is any draft from the author's final, accepted refereed version onward (including the publisher's PDF). Preprint/postprint marks the essential OA distinction: There is no need to use the complicated NISO terminology instead.

The PRC Report is quite right that authors are still greatly under-informed about Open Access, Self-Archiving, and Rights. Universities need to master the essential information and then convey it to their researchers.

New OA toxicology resource

EPA Releases Comprehensive Database on Environmental Chemicals: Agency continues efforts to provide new level of transparency, press release, March 12, 2009. (Thanks to ResourceShelf.)

[The U.S. Environmental Protection Agency] has released a new online database that collects information on more than 500,000 man-made chemicals from over 200 public sources. The Aggregated Computational Toxicology Resource (ACToR) database allows access to hundreds of data sources in one place, providing a new level of transparency and easy access for environmental researchers, scientific journalists and the public. ...

Interview with Google on its book settlement

Andrew Albanese, In Wide-Ranging Interview, Google Talks Books, Library Journal, March 12, 2009.  Excerpt:

...[Our] wide-ranging interview with Dan Clancy, engineering director in charge of Google Book Search, and Tom Turvey, Google director of content partnerships, will run in the May 1 issue of Library Journal, but we wanted to provide some excerpts from the discussion, which ranged over pricing issues and access for libraries and consumers, the legal settlement’s impact on orphan works and the public domain, and how the program would actually translate from paper to practice once – or perhaps if – approved....

Of course, Clancy said Google was confident that their initial “fair use” claim would have prevailed—eventually. “We strongly believed in our fair use position,” he said, "but we didn’t start this project to win a court case on fair use. I can honestly say that the settlement was completely driven by what we felt was in the end better for everyone and in particular, users.” ...

If the price is not appropriate, Clancy noted, universities "can choose not to subscribe." He suggested that "the existence of a robust consumer model" in general drives prices down. "We think it has to be a price point where schools feel like it’s delivering value commensurate with the price they have to pay."

What are the other options? Certainly an online preview – "free preview of books is very much in the public interest," Clancy said – as well as use of interlibrary loan.

Turvey said that no pricing could be addressed until after the settlement but that Google was looking at various models for academic libraries based on FTE users....

More on the Google Book settlement

Here are some blog comments on the Columbia Law School conference, Google Books Settlement: What Will It Mean for the Long Term? (March 13, 2009).

From Peter Hirtle, Part 1:

...After an introduction from June Besek of Columbia that briefly summarized the settlement, the meeting kicked off with what was for me the highlight of the day: Mary Beth Peters, the Register of Copyrights, providing her view of the settlement. She started by noting that she had recommended against the Library of Congress participating in Google’s initial Library Partners program because she was not convinced that Google’s indexing of copyrighted books was a fair use. I believe she opened this way in order to establish that a settlement of the case might have been required (if Google's actions were indeed infringing). She is not opposed to the settlement in principle, but she is worried about the scope and the forward impact that the settlement may have. She noted that some critics (including Brewster Kahle and Robert Clarida) have suggested that the settlement may create new rules for the use of copyrighted works, and in effect institute new copyright legislation without having to go through the legislative process that, in theory at least, is supposed to balance private rewards with the greater public interest. Doing this is really hard – she cited the example of orphan works legislation, which still hasn’t passed. But it does ensure a level of public engagement that is different from that in the settlement.

She then posed and discussed a number of issues for which she does not have the answer:

  • How does this settlement affect orphan works? Will it make it harder or easier for orphan works legislation to pass?
  • The settlement, she suggested, is a compulsory license for the benefit of one company. Could Congress have enacted such a license for everyone?
  • Are there treaty obligations that are implicated? Probably yes, if this was legislation – but maybe not if a settlement.
  • The Copyright Office sponsored a three-year long study group tasked to examine the exemptions available to libraries, archives, and museums in order to make sure that they are compatible with the digital age. Does the Google settlement obviate its work?
  • What about foreign works?
  • What about journals? Will journals go to Congress for a legislative solution or will they sue Google, too?
  • What about libraries that are not part of the settlement?
  • Does the settlement represent the interests of all authors? One would assume that legislation would have acknowledged and addressed all types of copyright owners, but the parties in the litigation were not so constricted....

The Google settlement may make the careful balances found in copyright law (as well as the public procedures to change it) moot, replacing them with private contractual arrangements instead....

Most disturbing of all was Peters’s admission that not one member of Congress has asked the Copyright Office to comment on the settlement – even though it may fundamentally change how Americans can access and use copyrighted information....

This might be because the enormous sea-change that the Settlement represents has not sunk in with the public. For example, the Copyright Office was worried that there might be a huge rush to register works just prior to the Jan. 2009 deadline in order that they would be eligible to be included in the settlement. Only two publishers, however, sought to register their past publications. (To me, this is also more evidence of how we have to live with copyright rules that were designed to serve and protect a minuscule portion of the country’s creative output, but in the process throttle other types of creativity.) ...

Peters was followed by Randal C. Picker from the University of Chicago Law School who spoke on possible competition and anti-trust issues in the proposed settlement.

He opened with some hypothetical examples of fair and infringing uses, and then moved into what he says are the three key components of the Settlement: the creation of a collection of digital files; the scope of rights included in the license granted to Google through the settlement; and the creation of the Book Rights Registry and its relationship with rightsholders. We could, if we wanted, address competition in each of these three areas. Picker then looked at each in more detail....

Picker raced over the 3rd key element – the Registry – and postulated that it would be very tough to require a second, competing rightsholder registry.... So if there is going to be more competition, it will have to be in the licenses that the rightsholders provide to Google and others. And there are elements in the license terms that work against competition. Foremost among these is the “Most Favored Nations” clause, 3.8(a), which benefits Google only. In the settlement agreement, only Google gets to make use of orphan works....

Picker sees two possible solutions to this antitrust problem. First, he thinks other for-profit firms should be able to use orphan works under the same terms as Google (even though the Registry itself does not have the authorization of the rights holders to negotiate on their behalf). Second, he thinks that non-profits should be able to use orphan works without liability until such time as a rights holder steps forward....

And while both Peters and Picker identified elements in the settlement that they found troubling, neither suggested that the settlement should not be approved.

From Peter Hirtle, Part 2:

...The fear of what these outsiders (Google and the Registry) will do to traditional practices was echoed in several other presentations.  People just don't know what this will mean for publishers: trade, STM, and academic, or even for reproduction rights organizations.  There will be new relationships between authors and publishers because of the settlement, and the Registry will reign supreme.  There seemed to be a lot of concern from the audience that the Registry has no bylaws, no public means of selecting members, no guaranteed international participants (even though all international authors and publishers are swept into the settlement).  Adler's response: NY State law will govern it and ensure that the public interest is protected.  My analysis: the upcoming wars between authors and publishers are going to be fierce.  I wish I could get into the copyright termination business - with money on the table, there is now a reason for everyone to do it....

One interesting suggestion: that an institutional subscription model for the in-copyright stuff is in the works.... At one point during the day, the very interesting observation was made that there was no reason that this needed to be a class action - the aggrieved publishers could have pursued court action on their own and got a settlement.  Others pointed out that class actions are normally used for addressing past injustices - and not establishing a new business model.  Michael Boni, the lead counsel for the Authors Guild, dismissed both of these assertions.  He also maintained that the court had limited authority to alter the terms of the settlement - this is a take-it-or-leave-it deal....

Orphan works continued to loom as an issue, and several participants thought that we still needed orphan works legislation in spite of the settlement.

One other bit of news: Alex Macgillivray from Google said that Google would make public information on any title it decided it needed to pull from an institutional subscription (rather than just reporting such titles to the Registry, as the settlement demands)....

I sensed throughout the day a tremendous displeasure with the process used to reach the settlement agreement.  There were questions about whether litigation is an appropriate substitute for legislation; whether the settlement reflected the economic interests of the people at the table rather than the entire class; whether the Registry will operate to the benefit of all the members of the class or only certain types of authors and publishers; and what checks there might be on the greed of the participants.  (They have already taken for themselves the royalties due on orphan and public domain works.  Who knows what they will do with public and institutional pricing?) ...Most librarians would say...that the profession played no meaningful role in shaping the settlement....

Yes, the class may be too large and the mechanism too crude, but we created this problem when we abandoned formalities, lengthened copyrights, and started treating every copyrighted item in the world like it was a Disney movie.  Given this procrustean bed we have made for ourselves, the settlement may be our only way out.  Yes, Congress should create a compulsory license authorizing the use of out-of-print books - but don't hold your breath waiting for that.  In the interim, the settlement may be the best we can hope for - even though it has the potential to radically alter all of our worlds.

From Adam Hodgkin:

...Apparently one of the recurring themes in the conference was this mantra "What is good for Google is good for the USA." I am sure that it was said in jest/irony, but that must nevertheless have made the Google participants unhappy. Even if ironic, the comparison is wounding. Just now being compared to General Motors is nearly as bad as being compared to AIG, and is frankly worse than being compared to Microsoft (which would also be very unfair and unwelcome to Google, but the comparisons are coming...). The mantra is especially unfortunate, since it is far too close to the bone: the whole way the Google Book Search settlement is working out is far too US-centric, as though Detroit was the market, and the accessibility of digital books in the rest of the world was not a matter of importance to the US or to Google. General Motors has been building inefficient and slipshod cars which had limited appeal in the rest of the world and failed the ultimate tests of quality engineering and sustainability. Could Google fall into a similar trap of building too much, too wastefully, for local demand and national circumstance without full attention to all the factors which build quality, openness and sustainability? ...

Update (3/17/09).  Also see Paul Courant's notes on the meeting:

The most important single thing about the Google settlement, simultaneously its greatest achievement and among its most vexing features, is the treatment of orphaned works (in James Grimmelmann’s witticism, “zombie” works). The problem, as we all know, is that there are millions – no one quite knows how many – of works that may or may not be in copyright and for which the rightsholder(s) may or may not exist and may or may not be aware of their rights. Our ability to use these works is thus much compromised: we run the risk that a copyright holder will appear and claim damages. As we all know, Congress’s efforts to make it easier and safer to use orphaned works have failed. Moreover, the most recent draft legislation would have imposed difficult and costly burdens on a potential user by requiring the would-be user to make substantial efforts to find any potential but unknown rightsholder.

Along comes the Google settlement, which solves at least part of the problem, for Google and the Book Rights Registry, at one fell swoop. (Only part of the problem, because works that were not registered with the copyright office will likely not be in the settlement and yet may be just as orphaned as those that are registered.) Under the settlement, revenues generated by orphaned works will be held in escrow for five years, allowing time for a rightsholder to come forward. It’s a moving window; if the rightsholder comes forward in year 22, she gets revenues from year 17 on. Thus the products that Google sells to individuals and institutions can include, among other works, millions of orphans (zombies). Without the orphans, the great public benefit of the settlement – the ability to find and use much of the literature of the 20th century in digital form – would be much diminished.

At the same time, the disposition of the revenues attributed to orphaned works is one of my least favorite parts of the settlement. The unclaimed revenues go first to support the operations of the BRR and thereafter will be used for charitable purposes consistent with the interests of publishers and authors. As the head of a library that has lovingly cared for these works for decades, I find that the notion that the fruits of our labors (and those of many others in many libraries) redound to the benefit of entities that did not write, publish, or curate these works sticks a bit in my craw. So I hope that authors, publishers, the court, and the public will be vigilant in making sure the BRR does not squander the unclaimed revenues on mismanagement, high salaries, and the like. The “charitable purposes” should be an objective, not a remainder for unclaimed funds.

The settlement also gives Google and the BRR, and no one else, the right to use the orphaned works in this way. A number of commentators have noted problems that may arise from Google’s privileged position in this regard. But there is an obvious solution, one that was endorsed at the Columbia meeting by counsel for the Authors Guild, the AAP, and Google: Congress could pass a law giving anyone access to the same sort of scheme that Google and the BRR have under the Google Settlement. And they could pass some other law that makes it possible for people to responsibly use orphaned works, while preserving interests for the missing “parents” should they materialize. Jack Bernard and Susan Kornfield have proposed just such an architecture to “foster” these orphans. Google has also made a proposal that would be a huge improvement....
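
PS:  The escrow arithmetic Courant describes is easy to make concrete.  Here is a minimal sketch in Python (mine, not language from the settlement; the five-year moving window is the only detail taken from his description):

    def first_recoverable_year(claim_year, escrow_window=5):
        # First year whose revenues a rightsholder who comes forward in
        # claim_year can still recover, under a moving escrow window.
        # An illustration only; the settlement's actual mechanics are
        # more involved than this.
        return claim_year - escrow_window

    # Courant's example: come forward in year 22, recover revenues from year 17 on.
    assert first_recoverable_year(22) == 17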

Also see Peter Brantley's notes, and Norman Oder's two stories in Library Journal (1, 2).

New OA humanities journal

The Journal of the Northern Renaissance is a new peer-reviewed OA journal published by the Scottish Institute of Northern Renaissance Studies. (Thanks to Karen Baston.)

The inaugural issue was published on March 3, 2009. New issues will be published annually, with reviews published on a rolling basis.

Using the journal system's incentives to open data

Gavin Baker, Why not publish data?, A Journal of Insignificant Inquiry, March 16, 2009.

... Recently I was reading about efforts related to data sharing: technological infrastructure, curation, educating researchers, and the like. I was struck by the thought that most of the advocacy for data sharing boils down to an exhortation to stick it in a digital repository.

This seems a bit odd considering that much of what propels science is the pressure to publish (written) results (in journals, conferences, monographs, etc.). There is a hierarchy of venues in terms of prestige, which is in turn linked to research funding, promotion, public attention (media coverage, policy influence), etc.

Might the best way to get researchers to share data be to create a similar system for datasets? It might provide a compelling incentive.

Moreover, publishing might provide a compelling incentive to the related issue of data curation (making data understandable / usable to others, e.g. through formatting, annotation, etc.). ...

This doesn’t seem so outlandish to me. There are similar efforts to provide publication fora for materials which traditionally went unpublished (we might say undersupplied), such as negative results and experimental techniques. ...

There are other ways to skin the same cat. One option would be to build alternative systems for conferring recognition (e.g. awards, metrics for contributions to shared datasets, etc.). The other approach is to make data sharing a more enforceable part of other scientific endeavors, e.g. mandatory as a condition of research funding, mandatory as a condition of publication (of written results) in a journal, etc. I think multiple approaches will yield the best result. It seems to me that creating “journals” (or some other name) for “publishing” datasets could be a useful way to spur participation. ...

See also comments on the open-science list.

Robert Darnton and critics on the Google Book settlement

Google & Books: An Exchange, letters to the editor of the New York Review of Books in response to Robert Darnton's article, Google and the Future of Books, NYRB, February 12, 2009 (blogged here in January).  The new exchange includes Darnton's response.  Excerpt:

From Paul Courant:

...[Darnton's] utopian vision of a digital infrastructure for a new Republic of Letters makes the spirit soar. But his idea that congressional committees beholden to Hollywood might have implemented that vision is a utopian fantasy, while his description of what will happen as a result of Google's scanning of copyrighted works is a dystopian fantasy.

At the heart of Darnton's dystopia about the Google settlement is his view that "Google will enjoy what can only be called a monopoly...of access to information." But Google doesn't have anything like a monopoly over access to the information in the books that are subject to the terms of the settlement. For a start (and of stunning public benefit in itself), up to 20 percent of the content of the books will be freely available on the Internet, and all of the content will be indexed and searchable. Moreover, Google is required to provide the familiar "find it in a library" link for all books offered commercially as a result of the settlement. That is, if after reading 20 percent of a book a user wants more and finds the price of online access to be excessive, the reader will be shown a list of libraries (ordered by proximity) that have the book, and can visit one of those libraries or employ interlibrary loan. This greatly weakens the market power of Google's product. Indeed, it is much better than the current state of affairs, in which users of Google Book Search can read only snippets, not 20 percent of a book, when deciding whether they have found what they seek....

[T]he market for current scientific literature has little in common with that for out-of-print monographs. The production of scholarship in the sciences requires reliable and immediate access to the current literature. One cannot publish, nor get grants, without such access. The publishers know it, and they price accordingly. In contrast, because there are many ways of getting access to most of the books sold by Google under the settlement, rapacious pricing won't work. The settlement requires "find it in a library" and extensive free preview, as well as a free access terminal in every public library building in the country. These features could not be more different from the business practices employed by many publishers of scientific, technical, and medical journals....

Of course I would prefer the universal library, but I am pretty happy about the universal bookstore....

From Ann Kjellberg (and six other literary executors to major authors):

...[A] period of exclusive control by a literary estate after an author's death creates the opportunity, and the financial incentive, to assemble fully prepared editions, made by specialists often informed by the author's instructions. Once work enters the public domain, it can be published by anyone in any form, and the financing of editions requiring editorial care becomes, once again, at the pleasure of benevolent institutions rather than readers.

As those of us involved in protecting authors' rights know, once work has been digitized, it becomes very difficult to control. In many places, and of many authors, it can be said that only a lingering affection for the book as an object sustains sales, since an author's works are so widely available electronically. Mass digitization will profoundly affect the viability of our system of copyright, and we need to ask, as not merely a parenthetical matter, how we will support creative work and the open exchange of ideas in its wake.

From Darnton's response:

I wish I could be convinced by Paul Courant's arguments, because I share his conviction that we need a digital infrastructure to support the twenty-first-century world of learning. Why then have I contracted a bad case of dystopia?

First, I worry about commercializing the content of our research libraries. True, there seems to be no alternative, because we have failed to finance large-scale digitization with public money. Google has the means, the skill, and the audacity to create a gigantic database by digitizing millions of books. I sympathize with those who say it is better that this database should exist, even if we will have to pay for access to it, than if it had never been created. What worries me is the fact that Google has no competitors.

Second, the class action character of the suit brought against Google by the authors and publishers means that no other enterprise could mount a rival digitizing operation, even if it could afford to do so, without striking deals with the copyright owners book by book, a virtual impossibility. True, this hypothetical, rival entrepreneur could attempt to cut a deal with the Book Rights Registry...[but] the settlement prohibits it from offering a competitor better terms than those accorded to Google....

Monopolies tend to charge monopoly prices. I agree that the parallel between the pricing of digital and periodical materials isn't perfect, but it is instructive. If the readers of a library become so attached to Google's database that they cannot do without it, the library will find it extremely difficult to resist stiff increases in the price for subscribing to it. As happened when the publishers of periodicals forced up their prices, the library may feel compelled to cover the increased cost by buying fewer books....

Moreover, Google's agreements with a number of large research libraries to digitize their collections preclude those libraries from mounting their own shared database that could compete with Google. According to the agreements, each library retains its own copy ("library digital copy") of works in its collection digitized by Google. But it does so under severe restrictions on use that prevent it both from making the digital file available for reading by its own members and from pooling these copies with those of other participating libraries in order to make them accessible to readers, either at a price or free of charge. Only Google will have control of the vast database of the libraries' combined collections....

Far from disputing the value of that access [Google will provide under the settlement], my concern is to guarantee it by modifying the proposed settlement in a way that will prevent exorbitant pricing. If that cannot be done, we may all become victims of what Paul Courant and I both deplore: "The American practice of making public policy by private lawsuit," as he correctly put it. In the end, when my dystopian temper gets the better of me, I fear that this practice will damage the world of learning by favoring private profit over the public good.

Open raw data + tools = the new primary "literature" of science

Jan Velterop, Open wider, The Parachute, March 15, 2009.  Excerpt:

...[W]hat is presented in journal articles is mostly results derived from data. Interpretations and annotations of data. Seldom the data themselves. Journal publishing evolved in the past, when the physical reality of sharing actual raw data was nigh impossible, so almost every scientist had to rely on the interpretations as published in journals. But now that we can share the raw data (view Tim Berners-Lee's call for sharing raw data), and tools to manipulate those raw data become widely available, relying on journal articles may well take a back seat. And now that instant comment on data as well as on journal articles has become possible...review after publication is a reality of today (albeit not used all that widely yet)....

Interview with Rainer Kuhlen

Roland Detsch, Urheberrecht: „Je freier Information ist, desto mehr kann damit verdient werden“, Goethe Institut, March 2009.  An interview with Rainer Kuhlen.  Read it in German or Google's English.  (The title is:  Copyright:  The freer information is, the more can be earned.)

Also see our past posts on Kuhlen.

On IRs for data sharing

Luis Martinez Uribe, Research data into Fedora at Oxford, DataShare Blog, March 6, 2009.

... This post looks into some of the work carried out by my colleagues in the Library to explore ways of getting research data into Fedora. These efforts are recounted in the blog of Ben O'Steen, Oxford Research Archive Software Engineer.

Some months ago, Ben provided an exceptional account of the challenges encountered when ingesting a research dataset into Fedora. He described how he dealt with the modelling and storing of a phonetics dataset given to him on a DVD-R, containing around 600 audio files organized in a hierarchical structure.

In a more recent post Ben talks again about storing, curating and presenting research data. ...

This post also identifies a gap in institutional and departmental IT support for researchers needing to store tables of data, and suggests HBase as the kind of basic service that could be provided, both to avoid free-form tabular datasets and to educate researchers.
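
For readers who haven't met HBase, here is a minimal sketch of the kind of basic tabular-storage service Ben has in mind, written with the happybase Python client; the host, table name, column family, and values are all hypothetical:

    import happybase  # Python client for HBase; assumes a running HBase Thrift server

    connection = happybase.Connection('localhost')  # hypothetical host

    # One table per dataset; a single column family ('obs') holds the observations.
    connection.create_table('phonetics_measurements', {'obs': dict()})
    table = connection.table('phonetics_measurements')

    # Each row of the dataset becomes a keyed row of named columns.
    table.put(b'exp042-row0001', {b'obs:speaker': b'S17',
                                  b'obs:vowel': b'a',
                                  b'obs:f1_hz': b'742'})

    print(table.row(b'exp042-row0001')[b'obs:f1_hz'])  # b'742'

The gain over a free-form spreadsheet is that every value is addressable by row key and column name, which is what makes later curation and reuse tractable.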

All this work has been taking place in parallel with the scoping study I have been conducting over the last 15 months to establish the requirements for services to manage and curate data. ...

Presentations from library conference

The presentations from Upgrading the eLibrary: Enhanced Information Services Driven by Technology and Economics (Bielefeld, Germany, February 3-5, 2009) are now online. (Thanks to Charles Bailey.) Several are on OA.

Automated identification of copyright status of a work

Rufus Pollock, Computing Copyright (or Public Domain) Status of Cultural Works, miscellaneous factZ, March 12, 2009.

I’m working on an EU-funded project to look at the size and value of the Public Domain. This involves getting large datasets about cultural material and trying to answer questions like: How many of these items are in the public domain? What’s the difference in price and availability of public domain versus non public domain items? ...

Suppose we have data on cultural items such as books and recordings. For a given item we wish to:

  1. Identify the underlying work(s) that item contains.
  2. Identify the copyright status of that work, in particular whether it is Public Domain (PD)

Putting 1 and 2 together allows us to assign a ‘copyright status’ to a given item. ...

[D]etermining copyright status is, in theory, simple:

  1. Given information on an item match it to a work (or works).
  2. For each work, obtain relevant information, such as the date the work was first published (as an item) and the death dates of the author(s)
  3. Compute copyright status based on the copyright laws for your jurisdiction.

While copyright law is not always simple, step three is generally fairly straightforward ...

What is not so straightforward are the first two steps especially step 1. This is because most datasets give only a limited amount of information on the items they contain. ...
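
Comment.  Step 3 is indeed the easy part.  As a rough illustration (my own sketch, not Pollock's code), here is the core calculation for a simple "life plus 70 years" jurisdiction; a real determination must handle many more cases (corporate authorship, publication-based terms, wartime extensions, and so on):

    from datetime import date

    def is_public_domain(death_years, term_years=70, today=None):
        # Crude step-3 calculation for a 'life + N years' jurisdiction.
        # death_years: death year of each author, or None if living/untraced.
        # Returns True, False, or None when the status cannot be computed.
        today = today or date.today()
        if any(y is None for y in death_years):
            return None  # a living or untraced author leaves the status unknown
        # Copyright runs to the end of the Nth year after the last author's death.
        expiry_year = max(death_years) + term_years
        return today.year > expiry_year

    # A work whose last surviving author died in 1937 entered the
    # public domain on 1 January 2008 under life + 70.
    print(is_public_domain([1921, 1937]))  # True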

New OA pharmacy journal

Journal of Medicine Use in Developing Countries is a new peer-reviewed OA journal published by the Discipline of Social and Administrative Pharmacy of Universiti Sains Malaysia. (Thanks to Imran Masood.)

Sunday, March 15, 2009

A literature review limited to OA literature

Cathy S. Cavanaugh, Michael K. Barbour, and Tom Clark, Research and Practice in K-12 Online Learning: A Review of Open Access Literature, The International Review of Research in Open and Distance Learning, February 2009. 

Abstract:   The literature related to online learning programs for K-12 students dates to the mid-1990s and builds upon a century of research and practice from K-12 distance education. While K-12 online learning programs have evolved and grown over the past decade, the amount of published research on virtual schooling practice and policy is limited. The current literature includes practitioner reports and experimental and quasi-experimental studies, both published and unpublished. This paper reviews open access literature in K-12 online learning and reports on a structured content analysis of the documents. Themes in the literature include steady growth and a focus on the benefits, challenges, and broad effectiveness of K-12 online learning. In addition, newly developed standards for K-12 online learning are emerging in descriptions of effective practices.

Comment.  This is the first literature review I've seen which deliberately limits itself to OA literature.  Don't jump to conclusions about why the authors did it this way.  They do not believe "if it's not OA, then it's not worth reading".  They did not decide to review what was ready to hand because they lacked access to much of the TA literature.  (Some of the co-authors have published previous literature reviews focusing on the TA literature.)  They did not assume that OA literature and TA literature differ in the topics they cover or conclusions they draw, which one could only know by reviewing the TA literature as well.  From the body of the paper:

The decision to use only open access documents was made for two reasons. The initial search of literature revealed that individuals outside of the academy authored the majority of documents; thus, the authors may not have regular or free access to subscription-based publications. Also, because the authors were interested in presenting this paper to the practitioner community, we wanted to ensure that this audience was able to access the documents on which our metasynthesis was based....

One day soon we'll see another kind of literature review limited to OA literature:  one based on sophisticated text mining.  The authors will explain that only OA literature is technically and legally amenable to that kind of analysis.
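
To make that concrete: even a toy text-mining pass like the following (the corpus path and theme terms here are hypothetical) presupposes that the full texts can be lawfully fetched and read by machine, which is exactly what OA provides and TA often doesn't:

    import glob
    import re
    from collections import Counter

    # Count occurrences of a few theme terms across a folder of OA full texts.
    themes = {'benefits', 'challenges', 'effectiveness', 'standards'}
    counts = Counter()
    for path in glob.glob('oa_corpus/*.txt'):  # hypothetical local corpus
        with open(path, encoding='utf-8') as f:
            words = re.findall(r'[a-z]+', f.read().lower())
        counts.update(w for w in words if w in themes)

    print(counts.most_common())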

Update (3/18/09). See the blog post by co-author Michael Barbour, responding to mine. In the comment section I clarify and extend what I said here.

Tim Berners-Lee on open data

Tim Berners-Lee, The next Web of open, linked data, a 16:23 minute video of a TED talk, presented last month and posted this month.

More on OA at the CSIR labs

M. Sreelata, Key Indian research organisation goes open access, SciDev.Net, March 13, 2009.  Excerpt:

India's main publicly-funded scientific research agency, the Council of Scientific and Industrial Research (CSIR), has announced a set of measures to make its research publications open access.

Last month (6 February) Naresh Kumar, head of CSIR's Research and Development Planning Division wrote to the directors of CSIR's more than 40 laboratories with a list of directions for making CSIR-generated knowledge open access.

Each laboratory is asked to set up its own institutional open access repository compatible with the more than 1,000 repositories across the world. They are also asked to make their research findings available either by depositing them in such a repository or by publishing them in open access journals. CSIR journals are also requested to become open access.

The next step is to create awareness among CSIR scientists by holding in-house training and hosting a conference on open access later this month (24 March).

Samir K. Brahmachari, director-general of CSIR, says that the open access scheme won't be easy to implement as there are many technicalities involved, including the sheer number of articles. He says that CSIR publishes about 4,000 articles in over 21 journals annually.

The official decision to opt for open access publication was taken on Open Access Day last year (24 November). Two CSIR journals have already become open access.

Subbaiah Arunachalam, a Chennai-based information consultant who was involved in formulating the recommendations, says CSIR is the only scientific council in India to have taken such a policy decision....

PS:  Also see our post on the CSIR decision, and all our past posts on CSIR.