Open Access News

News from the open access movement


Saturday, February 07, 2009

Policy-making under the influence

James Boyle, Obama’s team must fight ‘cultural agoraphobia’, Financial Times, December 17, 2008.  (Access to the full text requires free registration.) 

Boyle asks you to think back 17 years and imagine that you knew nothing of the World Wide Web or its many applications and services.  If you saw the major arguments pro and con, would you have green-lighted the open web, open source software, or Wikipedia? 

In each case, the open option does not simply sound less plausible. It sounds – or it would have sounded 17 years ago – completely delusional....

My point is simple. We have a bias, a cognitive filter, that causes us to underestimate the benefits and overestimate the dangers of openness – call it cultural agoraphobia.

It is not that openness is always right. It is not. Often we need strong intellectual property rights, privacy controls, networks that demand authentication. Rather it is that we need a balance between open and closed, owned and free, and we are systematically likely to get the balance wrong....

And herein lies the lesson for the Obama administration. Think about the policy choices of the future by applying our assumptions to the choices of our past.

The web succeeded too quickly to be controlled. It conquered skepticism by existing....It spread too fast to think of taming it into the more mature, sedate “National Information Infrastructure” that the Clinton administration imagined....

Whenever throughout history we have opened a communications network or the franchise or literacy, reasonable people have worried about the consequences....

[While] openness is not always right...our prior experience seems to be that we are systematically better at seeing its dangers than its benefits....

Comment.  This is a real phenomenon and "cultural agoraphobia" is a good term for it.  I run across this systematic cognitive bias every day, in myself and others, even after 17 years of experience with the open web.  Of course not every objection to openness is an example of cognitive bias.  So the term will be more useful for anthropologists studying our culture, or historians looking back, than for advocates and activists.  We still have to answer objections, not just explain them.  But Boyle points out an important use for advocates and activists.  If we understand cultural agoraphobia, we can warn policy-makers, the citizens who watch them, and ourselves, against its effects.

Skidmore librarian would like to see an OA mandate

Ruth Copans on the future of Scribner Library, ScopeWeekly, February 5, 2009.  An interview with Ruth Copans, the College Librarian of Skidmore College.  Excerpt:

You've mentioned the 'Open Access' movement as one current trend about which you'd especially like to get input from the Skidmore community. How would you describe the movement and what challenges and opportunities does it pose to academic libraries?

In the 1980s, college administrators expressed concern that they were buying back the scholarly output of their faculty --- output they had already paid for with salaries, infrastructure, and staff. Librarians, faced with a 215 percent price increase in journals from 1986 through 2001, were forced to begin cancelling subscriptions. This created a groundswell of interest in the concept of Open Access, a movement to place peer reviewed scholarly literature on the Internet, making it available without cost and free of most copyright and licensing restrictions. In 2007 the Senate Appropriations Committee directed the NIH to require that its funded research be made publicly available on the Internet within 12 months of publication. The movement gained further momentum last February when the Harvard Faculty of Arts and Sciences voted to give the University a worldwide license to make each faculty member's scholarly articles available in a central database administered by the Harvard University Library.

Would you like to see the Skidmore faculty take a similar move as the Harvard faculty? What would this mean for Skidmore?

I would be thrilled if Skidmore faculty voted unanimously to support the Open Access movement by agreeing to submit their research to an Institutional Repository. Of course, we'd have to figure out the mechanism for developing and maintaining an Institutional Repository but the gesture would be very meaningful. If faculty everywhere did that, an extraordinary quantity of high quality research could be shared across the world. As Robert Darnton, Carl H. Pforzheimer University Professor and Director of the University Library, asserted in his opinion piece in the Harvard Crimson before the Harvard University FAS vote:  "In place of a closed, privileged, and costly system, it will help open up the world of learning to everyone who wants to learn. . . . " ...

UK task force recommends OA and CC licenses for PSI

The UK Power of Information Taskforce has released a beta version of its Report, February 1, 2009.  (Thanks to Michael Cross.)  Excerpt:

...The Taskforce ran a competition for innovative ideas to co-create public services.  ‘Show Us a Better Way’ was a success, receiving praise worldwide, and showed the potential for innovation that engages the general public....

Recommendation 4

Unlock innovation in leading public sector sites using a ‘backstage model’, a standing open online innovation space allowing the general public and staff to co-create information-based public services.  This capability should be a standard element of public information service design. The government should start by creating a live backstage service for DirectGov by end June 2009 or earlier....

Recommendation 9

The Ordnance Survey is fundamental to delivering the power of information for the economy and society.  The Taskforce has contributed to the Government’s Trading Funds Assessment.  This Assessment should be radical and fundamental. In particular:

· Basic geographic data such as electoral and administrative boundaries, the location of public buildings, etc should be available free of charge to all.

· There should be simple, free access to general mapping and address data for volumes of data up to moderately substantial levels.

· Voluntary and community organisations pursuing public policy objectives should benefit from straightforward standard provisions for ensuring access to geospatial data without constraints.

· Licensing conditions should be simplified and standardised across the board and, for all but the heaviest levels of use, should be on standard terms and conditions and should not depend on the intended use or the intended business model of the user....

Recommendation 10

(a)  Government should ensure that there is a uniform system of release and licensing applied across all public bodies; individual public bodies should not develop or vary the standard terms for their sector.
(b)  The system should be a creative commons style approach, using a highly permissive licensing scheme that is transparent, easy to understand and easy to use, modelled on the ‘Click Use’ licence, subject to the caveats below....

Recommendation 11

Public information should be available at marginal cost, which in practice means for free.  Exceptions to this rule should pass stringent tests to ensure that the national benefit is actually served by charging for information and thus limiting its reuse by exploiting the monopoly rights conferred by intellectual property regimes....

Recommendation 13

The Government should ensure that public information data sets are easy to find and use.  The government should create a place or places online where public information can be stored and maintained (a ‘repository’) or its location and characteristics listed (an online catalogue).  Prototypes should be running in 2009....

PS:  Also see our past posts on the Show Us a Better Way contest and the Power of Information Review, the predecessor to the current report.

Update (2/9/09).  Also see the comments of Tom Watson ("A Blueprint for Obama?") and Ellen Miller.

ACM recommends libre OA for government data

The Public Policy Committee of the Association for Computing Machinery has released its Recommendations on Open Government.  (Thanks to David Robinson.)  Excerpt:

Recommendations

  • Data published by the government should be in formats and approaches that promote analysis and reuse of that data.
  • Data republished by the government that has been received or stored in a machine-readable format (such as online regulatory filings) should preserve the machine-readability of that data.
  • Information should be posted so as to also be accessible to citizens with limitations and disabilities.
  • Citizens should be able to download complete datasets of regulatory, legislative or other information, or appropriately chosen subsets of that information, when it is published by government.
  • Citizens should be able to directly access government-published datasets using standard methods such as queries via an API (Application Programming Interface).
  • Government bodies publishing data online should always seek to publish using data formats that do not include executable content.
  • Published content should be digitally signed or include attestation of publication/creation date, authenticity, and integrity.
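The signing recommendation is the most technical of the list. The ACM doesn't prescribe a mechanism, but here is a minimal sketch of the idea, using Ed25519 keys from the third-party Python cryptography package; the dataset bytes and key handling are invented for illustration:

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Publisher side: sign the exact bytes of the dataset being released.
dataset = b"grant_id,amount\nA-101,50000\n"   # stand-in for a published dataset
private_key = Ed25519PrivateKey.generate()    # in practice, a long-lived agency key
signature = private_key.sign(dataset)

# Citizen side: verify a downloaded copy against the agency's public key.
public_key = private_key.public_key()         # would be published by the agency
try:
    public_key.verify(signature, dataset)
    print("dataset is authentic and intact")
except InvalidSignature:
    print("dataset was altered or did not come from the publisher")
```

A mirror or mashup site could then redistribute the data together with the signature, and users could still verify its provenance and integrity.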

In an earlier post, David Robinson gave a good example of the potential benefits:

...Ultimately, we all want information about bailout spending to be available in the most user-friendly way to the broadest range of citizens. But is a government monopoly on "presentations" of the data the best way to achieve that goal? Probably not. If Congress orders the federal bureaucracy to provide a web site for end users, then we will all have to live with the one web site they cook up. Regular citizens would have more and better options for learning about the bailout if Congress told the executive branch to provide the relevant data in a structured machine-readable format such as XML, so many sites can be made to analyze the data....

It would be a travesty to make government the only source for interaction with bailout data — the transparency equivalent of central planning. It would be better for everyone, and easier, to let a thousand mashups bloom....
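To make Robinson's point concrete: once data is published in a structured format, a third-party site needs only a few lines to analyze it however it likes. A toy Python sketch, with invented XML (no actual Recovery.gov schema is implied):

```python
import xml.etree.ElementTree as ET

# Invented example of what structured bailout data might look like.
FEED = """
<grants>
  <grant state="MI"><recipient>Acme Corp</recipient><amount>500000</amount></grant>
  <grant state="MI"><recipient>Motor City LLC</recipient><amount>250000</amount></grant>
  <grant state="OH"><recipient>Widgets Inc</recipient><amount>100000</amount></grant>
</grants>
"""

root = ET.fromstring(FEED)
# One possible "mashup": total spending per state.
totals = {}
for grant in root.findall("grant"):
    state = grant.get("state")
    totals[state] = totals.get(state, 0) + int(grant.findtext("amount"))
print(totals)  # {'MI': 750000, 'OH': 100000}
```

Each mashup can slice the same feed differently (by recipient, by region, over time), which is exactly the flexibility a single government-built presentation forecloses.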

Update (2/10/09).  More from the ACM:  Make Recovery.Gov Web 2.0 Friendly.

...Today we turn our attention to an obscure requirement of [The American Recovery and Reinvestment Act], which requires a website called “Recovery.gov” to house all of the grant data that would be generated from spending under the act. USACM sent a letter calling for the website’s requirements to include the ability to download complete data sets in machine-readable form.

The legislation specifies a number of requirements for the website, including this one dealing with data accessibility:

  • “(3) The website shall provide data on relevant economic, financial, grant, and contract information in user-friendly visual presentations to enhance public awareness of the use of funds made available in this Act.” (our emphasis)

This is clearly an important provision, but it misses a key element for the web 2.0 culture, namely the reuse of that data. Last week, USACM released its recommendations on enhancing open government, which recommended (among other things):

  • Data published by the government should be in formats and approaches that promote analysis and reuse of that data.
  • Data republished by the government that has been received or stored in a machine-readable format (such as online regulatory filings) should preserve the machine-readability of that data.

One important step that Congress could take toward making Recovery.gov more useful for the public is building these principles into the requirements specified by the legislation....


Friday, February 06, 2009

Against the primacy of IRs

Gavin Baker, Authors: I don’t care where you deposit, just do it, A Journal of Insignificant Inquiry, February 5, 2009.

... The advocates of institutional repositories usually frame themselves as being advocates of decentralization, but this isn’t terribly accurate. A recent blog post by Bernard Rentier, rector of the University of Liège, captures the arguments well: [Note: omitting excerpt.] ...

But we might as well say this instead:

If requiring deposit is left to universities, their repositories will only contain publications by their researchers. Since some researchers have multiple institutional affiliations, and since any given publication may be authored by researchers across multiple institutions, it is easy to see that researchers will ultimately have to deposit their publications in as many repositories as there are institutions involved in their research.

This trend seems to rest on the naive notion that, in the Internet era, it is somehow still necessary for researchers to conduct their work solely through the channels of a university.

It is understandable that universities may wish to host a complete collection of the research published by their faculty, but nowadays that can easily be accomplished by importing it automatically from the more complete collections of the distributed Web.

Recall also that universities are not the universal providers of all research output. There will always be independent scholars, as well as publications by authors in government, non-profits and think tanks, and corporations.

The OA philosophy is global. It cannot be reduced to a single university.

To be clear, I am not saying “just put it on the Web”. Appropriate metadata, interoperability, and preservation for long-term access matter; repositories (whether organized around research institutions, research funders, or research topics) provide these. But arguing for the primacy of institutional over funder repositories is little less naive than arguing the opposite. Maximum effectiveness requires coordination by all parties. ...

Update (2/6/09).  [Peter Suber:] For my views, see points #3 and #13 in my article from last week on OA policy options.

Update (2/7/09).  Also see Stevan Harnad's comments:

...Gavin Baker suggests that it does not matter where authors deposit their papers to make them Open Access (OA): in an Institutional Repository (IR) or a Central Repository (CR). Nor, more importantly, does it matter where authors' funders mandate that they should deposit them, because IR deposits can be harvested to CRs and vice versa. I point out that this apparent symmetry between IRs and CRs with respect to the harvestability from one to the other (in either direction) is irrelevant today because most of the target content for OA is not yet even being deposited at all, anywhere: In other words, authors are decidedly not "just doing it." Nor are institutions -- the universal providers of OA's target output, both funded and unfunded, across all fields -- "just mandating that authors do it."

Apart from the tiny number (about 30) that have already mandated deposit, institutions are the "slumbering giant" of OA, until they wake up and mandate the deposit of their own research output in their own IRs. Not all research output is funded, but all research output is institutional: Hence institutions are the universal providers of all OA's target content. Although not many funders mandate deposit either, the few that already do (about 30) can help wake the slumbering giant, because one funder mandate impinges on the research output of fundees at many different institutions.

But there is a fundamental underlying asymmetry governing where funders should mandate deposit: As Prof. Bernard Rentier (founder of EuropenScholar and Rector of U. Liège, one of the first universities to adopt an institutional deposit mandate) has recently stressed, convergent funder mandates that require deposit in the fundee's own IR will facilitate the adoption of deposit mandates by institutions (the slumbering giant), whereas divergent funder mandates that require CR deposit (or are indifferent between CR and IR deposit) will only capture the research they fund, while needlessly handicapping (or missing the opportunity to facilitate) efforts to get institutional deposit mandates adopted and complied with too. The optimal solution for both institutions and funders is therefore: "Deposit institutionally, harvest centrally".
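For readers unfamiliar with the plumbing: the "harvest centrally" half of the slogan is what the OAI-PMH protocol already provides. Here is a minimal Python sketch of a central service harvesting Dublin Core records from an institutional repository; the endpoint URL is a placeholder, but the verbs, parameters, and namespaces are the standard OAI-PMH ones:

```python
import urllib.parse
import urllib.request
import xml.etree.ElementTree as ET

OAI = "{http://www.openarchives.org/OAI/2.0/}"
DC = "{http://purl.org/dc/elements/1.1/}"

def harvest(base_url):
    """Yield (title, identifier) for every record exposed by an OAI-PMH endpoint."""
    params = {"verb": "ListRecords", "metadataPrefix": "oai_dc"}
    while True:
        url = base_url + "?" + urllib.parse.urlencode(params)
        tree = ET.parse(urllib.request.urlopen(url))
        for record in tree.iter(OAI + "record"):
            yield (record.findtext(".//" + DC + "title", default="(untitled)"),
                   record.findtext(".//" + DC + "identifier", default=""))
        # Large result sets are paged with a resumptionToken.
        token = tree.findtext(".//" + OAI + "resumptionToken")
        if not token:
            break
        params = {"verb": "ListRecords", "resumptionToken": token}

# Hypothetical institutional repository endpoint:
# for title, ident in harvest("https://repository.example.edu/oai"):
#     print(title, ident)
```

Since any compliant IR can be pulled into central collections this way, the dispute is over where mandates should require deposit, not over where readers will ultimately be able to search.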

Update (2/8/09).  Also see Heather Morrison's comments:

...For the searcher, the optimum will almost always be the disciplinary / subject archives....

Institutional archives are also necessary, for a number of reasons, two of which are listed here. First, the majority of disciplines do not have disciplinary archives, so institutional archives are needed to fill the gap. More will likely be developed, but given that many institutions now have archives, future disciplinary archives may well be developed within institutional archives. Second, institutional archives will provide functions and services to institutions beyond what disciplinary archives can do, such as showcasing the work of the university, its departments and faculty to potential donors, including the public and politicians for publicly funded institutions, and potential students....

Update (2/10/09).  [Gavin Baker:] See also my response to the responses.

Comments on the Conyers bill

Here are a few comments from the press and blogosphere on the re-introduction of the Conyers bill (a.k.a. Fair Copyright in Research Works Act, HR 801), which would overturn the OA policy at the NIH.

From Andrew Albanese in Library Journal:

...[W]hile near-universal opposition to the bill remains, changes in the political and bureaucratic landscape could mean a tougher battle in 2009 for those opposed to the bill.

Following last year's hearing, lawmakers all but ruled out action on the bill in 2008, saying the issues needed to be studied more. If the issues were studied more, however, it’s hard to tell. Not only was the bill reintroduced early in the 111th Congress by Congressman John Conyers (D-MI), it was introduced without revision, despite extensive criticism and comment, and broad opposition from a range of stakeholders, including librarians, Nobel Prize-winning researchers, and a coalition of law professors and copyright experts.

SPARC executive director Heather Joseph, who testified at a Congressional hearing on the bill last September, told the LJ Academic Newswire she was somewhat surprised the bill resurfaced so soon. “I am also a bit surprised that Mr. Conyers would introduce a bill that is so diametrically opposed to the new administration’s push for openness and transparency in public agencies,” she added....

From Mike Carroll at Carrollogos:

The NIH Public Access Policy is working. Although publishers have made vague assertions, claims that there are legal problems with the NIH policy have been discredited. Similarly, there is no evidence that the policy - with its allowance of an unduly long 12 month delay - has harmed scholarly communication in the biomedical sciences.

Indeed, it's really time to turn this conversation around. The United States' economy needs more than increased consumer spending to recover. We need to innovate, and innovation in basic research happens quicker and in more diverse directions in an open, networked environment. In a word, research should be linkable.

Wanna see? Do you have breast cancer or is there a woman in your life who does? Want to know more about the statistical risks? Thanks to the NIH Public Access Policy, I can simply suggest that you click here because your tax dollars supported the study.

Now that's just using the freedom to link to help quickly point you to an article or scientific letter you might want to read. But the real power of linkable science is that scientists would be able to use their computers to study the network of links to find otherwise hidden patterns in the research and to find otherwise hidden linkages between results in related but distinct fields of research or even in different disciplines. It's the power to process links that has made Google the leading search engine for the web. So why can't web technologies do for scientists what they do for web searchers looking to buy electronics or shoes? Because scientific information other than NIH funded research articles is not generally linkable! ...

So, Chairman Conyers, with all due respect, the policy question is not whether Congress should act to deny scientists and taxpayers access to research funded by NIH, but rather, why should NIH-funded research articles be the only articles reporting federally-funded research that scientists and taxpayers like me can link to?

From Gigi Sohn at Public Knowledge:

While there is lots of legislation that Public Knowledge disagrees with, we can often see the rationale for it. Not so with H.R. 801, the Fair Copyright in Research Works Act....This bill would have the effect of overturning the “open access” to research policy of the National Institutes of Health, which requires that research funded by the agency be made available for free in an online archive within 12 months of publication. The rationale for this policy is simple - if taxpayers are going to pay for research, taxpayers should get a return on their investment by getting free online access to that research. This is the same rationale driving some of the conditions on the stimulus bills being debated right now - government can, and in this case especially should, put public interest conditions on the money it spends.

So why would Rep. Conyers, considered a champion of the public interest, introduce this bill? News reports indicated that the Chairman was upset because House appropriators had seized what he believed to be a copyright issue from his Committee.

But open access doesn’t require any change to copyright law and it doesn’t diminish anybody’s copyrights....

Open access to government supported research is not only the proper response to ensure that taxpayers get a return on their investment, it is consistent with President Obama’s vision of greater public access to government information, more government transparency and increased accountability. Chairman Conyers and his House colleagues should not allow a jurisdictional dispute to undermine these critical goals.

From John Timmer at Ars Technica:

...Publishers...objected, viewing the policy as a threat to the value they could extract from copyrighted publications. Their objections, in part, eventually led to the introduction of a bill that would roll back the NIH policy, which allowed the House Judiciary Committee to hold hearings on it last year.

In reporting on that testimony, it was clear that there were issues that went well beyond any threat to the scientific publishing industry created by the mandatory open access policy. Both witnesses and members of Congress suggested a huge range of concerns: a general mistrust of government involvement in markets, worries about the general fight over intellectual property, and Congressional turf battles all featured. The hearings also made clear that many committee members had some significant gaps in their knowledge of scientific publishing....

The fact that the bill's cosponsors are all members of the Judiciary Committee adds credence to our earlier impression that a Congressional turf battle was threatening [the] NIH policy....

Unfortunately for open access fans, this year's bill seems to present a greater threat to the NIH's policy. For one, Congress is distracted by other issues, which might allow a minor amendment to a funding bill to slip through unnoticed. In addition, Zerhouni gave a strong defense of the open access policy on scientific grounds at last year's hearings; he's since stepped down and, with Tom Daschle withdrawing his nomination, it's not even clear when a new head of Health and Human Services will be named; until that position is filled, there won't be a new NIH chief named.

Also see:

  • My own comments on the re-introduction of the bill, my comments on the first two press releases from the publishing lobby, and my analysis of the bill the first time around, which still applies in full since the bill is unchanged.
  • Our three collections of comments on the bill from the first time around (1, 2, 3).

OA monographs on education research

Yesterday Vandenhoeck & Ruprecht started providing OA to selected monographs on education research published by its imprint, V&R unipress.  The monographs are from the series, Studien des Georg-Eckert-Instituts zur Internationalen Bildungsmedienforschung, and on deposit in the Fachportal Pädagogik (or pedocs) document server hosted by the Deutsches Institut für Internationale Pädagogische Forschung (DIPF).  (Thanks to BuckMarkt and Open Access Informationsplattform.)

An OA mandate at the Quebec health research fund

The Fonds de la recherche en santé du Québec (FRSQ) adopted an OA mandate on October 24, 2008, which took effect on January 1, 2009.  Excerpt:

...As a manager of public funds, [FRSQ] is firmly committed to ensuring that the full potential of research outputs is used and that the return on research investments is optimized so that society as a whole can reap the benefits thereof. In adopting this Policy, the FRSQ, reaffirming the independence of researchers..., intends to establish measures that will structure and support researchers' initiatives to disseminate research outputs and thereby foster open access to them when possible....

This Policy applies to all research fully or partially funded by FRSQ,...and to any new award or grant issued as of January 1, 2009. Although recipients of awards or grants issued before this date are not bound by this Policy, it would be appropriate that they abide by it....

FRSQ awardees are expected to make every possible effort to have their peer-reviewed publications posted on open-access Web sites as soon as possible, ideally, no later than six months after publication or presentation. Awardees can achieve this through the following options:

Option 1 – Via the publisher’s Web site (in the case of articles) or that of the organizer of the event (in the case of scientific conventions);

Option 2 – Via one or several online repositories (awardees must comply with the publisher’s or the organizer’s policies)....

Comments

  • Kudos to all at the FRSQ.  I applaud the mandatory language, the option to comply with the policy by publishing in an OA journal (alongside the option to publish in a TA journal and deposit in an OA repository), the encouragement for past grantees to comply with the policy, and the six-month embargo.  Note that FRSQ continues a pattern: every medical research funder in the world with an OA mandate, except the NIH, uses a six-month embargo.
  • Unfortunately the policy leaves a loophole for resisting publishers.  (When grantees publish in a TA journal and deposit in an OA repository, they "must comply with the publisher’s...policies.")  The loophole is not at all necessary, and has been closed in the policies of the Wellcome Trust, NIH, MRC, and many others.  For details on how to close it and still ensure that the OA is authorized by the copyright holder, see #10 of my article from last week.


Presentations and notes from Data Library conference

The presentations are now online from the Data Library 25th Anniversary (Edinburgh, December 5, 2008). See also the detailed notes or blog post by Anne Donnelly.

Researcher survey on data preservation and re-use

The PARSE.insight project is conducting a survey of researcher practices and views regarding data re-use and preservation.

... The European Alliance for Permanent Access aims to work together to preserve research data for future re-use. Science and research publishers actively support and partake in the Alliance through the participation of the International Association of Scientific, Technical and Medical Publishers. A number of Alliance partners have initiated a project, partly funded by the EU, entitled PARSE.insight which aims to gain insight into the requirements of researchers, data managers and funding agencies with regard to preserving research data.

PARSE.insight will help to shape the EU's preservation infrastructure, as part of the broader e-Research infrastructure, so that European citizens can continue to gain benefits from its digital resources, in all aspects of life.

In order to complete our picture of what the research community needs, we kindly request that you fill out this questionnaire, which may reach you via other members of the Alliance. It will take you about 20 minutes. The research community as a whole will benefit from your cooperation! ...

See also our past post on the PARSE.insight project.

Spain to introduce OA mandate

José Manuel Nieves, Un Consejo de Política Científica controlará la investigación en España, ABC.es, February 5, 2009. Read it in the original Spanish or Google's English.  Excerpt:

Next Wednesday, [Spain's] Minister of Science and Innovation, Cristina Garmendia, will formally present the draft of the new and awaited Law of Science and Technology ... Regarding the dissemination of results, Chapter III establishes the obligation to provide open access to texts that have been accepted for publication in scientific journals, where these have been financed with public funds from the General Administration of the State. ...



Thursday, February 05, 2009

Asturias adopts an OA mandate

The Spanish principality of Asturias has adopted an OA mandate for its funded research. See this February 4 post from the AccessoAbierto blog (in the original Spanish, or in Google's English):
The Government of the Principality of Asturias has, by agreement of its Governing Council, adopted a clear policy regarding the scientific output resulting from projects funded by the Principality:
... The terms of aid and grants financed by public funds provided or managed by the Administration of the Principality of Asturias, whose purpose is to promote research, will include a clause under which the beneficiary must self-archive its research results in the Institutional Repository of the Principality of Asturias (RIA), to enable the dissemination of the work to the scientific community for study and research. Where the work is to be published, the Government of the Principality of Asturias will observe, whenever necessary, an embargo period not to exceed 6 months before making it available through the Institutional Repository of the Principality of Asturias (RIA) ...


Copyright Alliance and AAP welcome re-introduction of Conyers bill

Two publisher groups which supported the Conyers bill the last time around are supporting it again.  No surprises here. 

From the Statement of Patrick Ross, Executive Director of the Copyright Alliance, February 4, 2009:

The Copyright Alliance praises House Judiciary Committee Chairman John Conyers for introducing HR-801, the Fair Copyright in Research Works Act....

Federal copyright law and years of precedent grant copyright owners control of the right of reproduction, distribution, and public performance and display. But in a troubling reversal of this incentivizing precedent, Congress – without consultation of members with expertise in copyright law – has given the federal government control over the reproduction and distribution of certain research works without regard to the rights of publishers.

The mere fact that a scientist accepts as part of her funding a federal grant should not enable the federal government to commandeer the resulting peer-reviewed research paper and treat it as a public domain work.

Grants are provided to pay for the research and resulting data, which is generally freely and immediately available. But taking the scientist’s copyrighted interpretation of the data is not fair to other funders, and it violates the rights of the publisher. A publisher improves the work through a rigorous peer review process and develops it for publication....

From the press release of the Association of American Publishers, February 4, 2009:

The Association of American Publishers welcomed the re-introduction of legislation to safeguard the rights of authors and publishers of copyrighted, peer-reviewed scientific journal articles, and praised House Judiciary Committee Chairman John Conyers Jr....

The Fair Copyright in Research Works Act, HR 801...would help keep the Federal Government from undermining copyright protection for journal articles where private-sector publishers have added such significant value. The legislation would address serious concerns that the mandate is inconsistent with policies underlying U.S. copyright law and undermines our nation’s ability to comply with international copyright treaty obligations....

Comments

  • Both statements say or imply that the NIH policy violates publishers' copyrights.  That is false.  If it were true, or if the publishers honestly believed it to be true, they would be in court, where they would already have a remedy for copyright infringement.  Instead, they are in Congress lobbying for this bill which would amend US copyright law.  They must change the law to get what they want because the NIH policy does not violate current copyright law.
  • Both statements say or imply that publishers are the copyright owners, without qualification, on the articles reporting the results of NIH funded research.  That is false.  The NIH policy requires grantees to retain a key right and use it to authorize OA.  There are three important consequences:  (1) OA from the NIH is authorized by the copyright holders, (2) NIH-funded authors no longer transfer the full bundle of copyrights to publishers, and publishers no longer acquire the full bundle of copyrights on these articles, (3) publishers acquire fewer rights from authors than in the past, but have undiminished power to enforce the rights they do acquire.
  • The rhetoric that the policy "commandeers" publishers' articles or forces publishers to "surrender" their articles is false.  It suggests that these articles are publishers' property, without qualification, and the NIH somehow expropriates their property or prevents publishers from enforcing their rights.  But see the previous bullet:  publishers are not the full owners of these articles, and they remain free to hold all the rights they acquire and to enforce all the rights they hold.  Perhaps I should also add that US copyright law protects the right of authors to divide the bundle of copyright and transfer some rights, rather than all rights, to a publisher. 
  • It's true that publishers invest money in organizing peer review.  But the commandeer/surrender rhetoric implies that they discover, after the fact, helplessly, that the NIH will distribute OA copies of the peer-reviewed manuscripts.  That is false.  When NIH-funded authors approach publishers, they don't merely ask "will you publish my article?" but also "will you publish it under these terms?"  It's a business proposition that publishers may take or leave.  Publishers are virtually unanimous in taking it.  But in these public statements they pretend that the government is taking their property and suppress the fact that they accept the offer with their eyes open.
  • The publishing lobby and Rep. Conyers are unhappy that the original policy was adopted without consulting the House Judiciary Committee, which vets new bills raising copyright issues.  It's true that the Judiciary Committee was not consulted.  While that may have created a regrettable turf battle among House committees, there ought to be a way to resolve it without distorting the facts or enacting bad policy. 
    1. The NIH policy does not violate copyright law.  Don't take my word for it.  Read the judgment of 46 copyright lawyers.  Pretending that the policy violates copyright in order to justify review by the Judiciary Committee does nothing to clarify the committee's jurisdiction or respect its expertise on copyright issues. 
    2. While the original policy was adopted without consulting the Judiciary Committee, the Judiciary Committee has since held a hearing (September 11, 2008) on the issues raised by the policy and by this particular bill to overturn it.  The no-consultation objection is now moot.  Continuing to press the point puts the turf war ahead of the public interest in good policy. 
    3. Bypassing the Judiciary Committee the first time around was not a stealth maneuver.  William Patry, former Copyright Counsel to the House Judiciary Committee, says it's "absurd" to think that the NIH policy raises copyright issues or that it had to be reviewed by the Judiciary Committee.
  • For a more detailed analysis of all of these points, see my article from last October on the Conyers bill and the rhetoric of the publishing lobby.

Overlooking green OA in The Guardian

Andrew Brown, Digital Britain needs access to science journals, not YouTube, The Guardian, February 5, 2009.  Excerpt:

Scientific journals are a notorious racket: because they are essential tools for the professions that use them, they can charge pretty much what they like....The effect of this...is that the government pays universities to conduct research for the public benefit; the measure of this research is publication in peer-reviewed specialist journals; the peer review is done for free, by academics employed and paid by universities. The results are then sold back to the universities who paid for the research in the first place.  This is bad value for governments....

One answer to all this is to promote the growth of free scientific publishing, and also, increasingly, of free access to the immense quantities of data that lie behind most published papers. For those who just want to know what is going on, open access is unsatisfactory for two reasons. The first is that it is not yet widespread enough. There is no guarantee at all that the interesting work will emerge in open-access journals, which tend to be extremely specialised. The second is more philosophical, and deeper. It may never become as useful a resource as the paid-for generalised magazines, precisely because the open-access journals are written to be enjoyed within particular fields of expertise....

Comment.  Brown is right that OA is a solution to this problem.  Unfortunately, he seems to think that OA can only be delivered by OA journals.  Both of his objections to OA are answered by green OA, or the practice of depositing peer-reviewed postprints in OA repositories.  The best-kept secret of OA is that it's compatible with publishing in a non-OA journal.  (If this is news to you, see #2 and #4 in my list of Six things that researchers need to know about open access.)

Update (2/6/09).  Also see Stevan Harnad's comments:

Open Access (OA) does not mean only, or mainly, open-access journals ("Gold OA"). The other, more widespread way to provide OA is for the authors of articles published in non-OA journals to make them OA by depositing an electronic version in an OA Repository ("Green OA"), thereby making them free for all (including those whose libraries cannot afford a subscription) -- as 34 research funding councils worldwide (including all the UK Research Councils, the European Research Council and the US National Institutes of Health) as well as 31 Universities and Faculties (including Southampton, Glasgow and Stirling in the UK, and Harvard and Stanford in the US) have already adopted mandates requiring the authors they fund and employ to do so.

Norwegian Research Council adopts an OA mandate

The Norwegian Research Council has adopted an OA mandate.  (Thanks to Jan Erik Frantsvåg.)  The full text of the policy has not yet been released, but OpenAccess.no has a two-sentence summary.  Here's the summary in Google's English:

The Research Council of Norway adopted, on 28 January, principles for open access to scientific publications. The Council will require that articles resulting from its funding be made available through self-archiving or Open Access publishing, to the extent this does not conflict with publishing rights.

Comments

  • Kudos to all involved.  As the policy moves toward implementation, I hope the Research Council can close the loophole for resisting publishers, as the Wellcome Trust, NIH, UK MRC, and other funders have done.  (For details on how to do it, see #10 in the article I published on Monday.)
  • Also see our past posts on the consultations leading up to the NRC policy.

Update (2/6/09).  The Norwegian Research Council has released an announcement and the text of the policy.  (Thanks to Jan Erik Frantsvåg.)  Both are in Norwegian.  Because the policy is a PDF, I can't link to a machine translation.  The announcement is in HTML, but for some reason Google Translate chokes on it at the moment.  Here's the link to Google's English in case the problem is merely temporary.

Update (2/18/09). Stian Håklev has translated the key parts of the new policy.


Good press for the latest Houghton study

John Gill, Analysis backs open-access path for scholarly publishing, Times Higher Education Supplement, February 5, 2009.  Excerpt:

An "open-access" future for academic publishing would save money while boosting the profile of research and maximising its economic impact, a study has found.

John Houghton - an economist at Victoria University in Australia and one author of the report, Economic Implications of Alternative Scholarly Publishing Models: Exploring the Costs and Benefits - said that the savings could exceed £200 million a year in the UK, including £165 million in the higher education sector alone....

The Jisc study looked at three models of scholarly publishing: subscription access, where readers are charged and the use of material is restricted; open access, where access is free and publication is funded from the author's side; and open access self-archiving, in which authors post their work in online repositories, making it free to all via the internet.

Professor Houghton said the report's aim was to determine the most cost-effective publishing model, not simply the cheapest....

"When you consider that (more than £20 billion) a year is spent on research and development in the UK, it would be useful if we knew that we were getting value for money.  We have a traditional publishing model based on copyright, principally involving journals, books and databases, and the reader is charged to access it.  At the other end, the Government is putting in huge amounts of taxpayers' money to achieve a public good, so the country is supposed to get something out of it.  These two things seem to me to be contradictory: public money is being put into research, but then the people are being stopped from accessing it," [Houghton] said.

Professor Houghton added that the current model operated by journal publishers was "unsustainable", not least because of the ever-increasing subscription charges faced by universities.

And he noted that researchers would benefit from the wider dissemination of their work, too.  "There's a proven citation advantage of open-access publishing: you get more readers, you get a better range of them, and that should help you get more funding."

While acknowledging that there would be costs involved in reshaping the publishing landscape, Professor Houghton said the Jisc analysis suggested that they would be affordable within current budget allocations, and that the long-term savings would be significant.

PS:  See our post on the latest Houghton study and all our past posts on his research on the economic impact of OA.


Wednesday, February 04, 2009

BMJ unlocked update

BMJ just added this paragraph to its page on BMJ Journals Unlocked, its hybrid OA option: 

Access to and use of Unlocked articles are covered by the terms and conditions of the exclusive licence agreement, which specifically prohibits commercial use of these articles. Authors participating in Unlocked are able to place the full final version of their article in the repositories of their choice. BMJ Journals will deposit all Unlocked articles in PubMed Central immediately on final publication.

New OA search tool for PubMed

Kevin Davies, Bioalma Launches Novoseek PubMed Search Tool, Bio-IT World, February 3, 2009. (Thanks to ResourceShelf.)

Bioalma, the Spanish biomedical IT/text-mining company, has launched a free search tool for the PubMed literature database called novo|seek, which the company claims provides intelligent search functionality to help life scientists guide and refine their searches of the biomedical literature.

The company calls novo|seek “a dynamic information extraction system” for searching biomedical records in repositories, particularly PubMed. Novo|seek indexes the biomedical literature in PubMed and enables researchers to find relevant results efficiently by using external sources of data and contextual term information. The tool provides familiar chronological listings of search results, but a sidebar presents a series of additional related terms based on relevancy, allowing researchers to drill down and refine additional queries. ...

Bioalma downloads and indexes the roughly 18 million documents in Medline, updating the index each day. That information is then put into the company’s own database using the open-source Lucene search engine library. ...

While Bioalma believes that novo|seek will help introduce its other products to a broader audience, it also hopes to generate revenue by selling online advertising through Google ads, targeting companies selling reagents or equipment. Bioalma has not held discussions with NCBI about the tool. ...

The first release of novo|seek focuses on PubMed, but in time Bioalma plans to integrate additional resources, such as grant information and full-text search.
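Novo|seek's pipeline is proprietary, but as a rough illustration of how a third-party tool can retrieve PubMed records programmatically, here is a minimal query against NCBI's public E-utilities interface (the JSON output mode is used for brevity; the query term is arbitrary):

```python
import json
import urllib.parse
import urllib.request

EUTILS = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def pubmed_ids(query, max_results=10):
    """Return PubMed IDs matching a query, via NCBI's esearch E-utility."""
    params = urllib.parse.urlencode({
        "db": "pubmed",
        "term": query,
        "retmax": max_results,
        "retmode": "json",
    })
    with urllib.request.urlopen(EUTILS + "?" + params) as resp:
        result = json.load(resp)
    return result["esearchresult"]["idlist"]

print(pubmed_ids("breast cancer risk factors"))
```

A value-added engine then builds its own index and term extraction (in Bioalma's case, with Lucene) on top of records retrieved in bulk.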

Updates to BASE search tool

The Bielefeld University Library has updated BASE (Bielefeld Academic Search Engine). From the February 3 announcement:

... [BASE] offers a newly designed web interface ... and new advanced search options ... [W]e are normalizing the corresponding OAI metadata where possible. ...

In addition, BASE is still the only search engine for OAI metadata that works with cross-language retrieval. The Eurovoc Thesaurus enables you to search a term in up to 21 languages, no matter which language you use. There are 6,500 basic terms for every language; in all, 239,000 terms are included in the thesaurus.

The number of indexed sources has grown to more than 1,080, so that we have now reached the number of indexed sources in OAIster. ...
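As a toy illustration of how Eurovoc-style cross-language retrieval works: map the query term to a language-independent concept, then search on all of that concept's labels. The two-concept table below is invented; the real thesaurus holds roughly 6,500 concepts with labels in up to 21 languages:

```python
# Invented, tiny stand-in for a Eurovoc-like multilingual thesaurus.
CONCEPTS = {
    "concept:education": {"en": "education", "de": "Bildung", "fr": "éducation"},
    "concept:environment": {"en": "environment", "de": "Umwelt", "fr": "environnement"},
}

def expand(term):
    """Return every language's label for the concept matching `term`."""
    for labels in CONCEPTS.values():
        if term.lower() in (label.lower() for label in labels.values()):
            return sorted(labels.values())
    return [term]   # unknown term: fall back to a literal search

print(expand("Umwelt"))  # ['Umwelt', 'environment', 'environnement']
```

The expanded list would then be OR-ed into the underlying query, so a German search term also matches English and French records.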

See also our past posts on BASE.

Search tool for OA humanities journals

Jurn is a new OA tool for searching more than 900 OA journals in the arts and humanities. See the February 3 announcement or the Jurn blog. (Thanks to Catherine Grant.)

Update. The tool is a Google Custom Search Engine.

Report on libraries and repositories

ARL Digital Repositories Task Force Releases Final Report, press release, February 3, 2009.

The Association of Research Libraries (ARL) Digital Repository Issues Task Force has released its final report. The task force was charged with evaluating trends, contextualizing repository activities among ARL libraries, and recommending leadership roles and activities for ARL.

The report, “The Research Library’s Role in Digital Repository Services,” identifies key issues surrounding repository development, explores common strategies that libraries are using, analyzes relevant environmental trends, discusses issues where ARL and its member libraries should focus attention, and recommends the following actions for research libraries to undertake:

  • Build a range of new kinds of partnerships and alliances, both within institutions and between institutions.
  • Base service-development strategies on substantive assessment of local needs rather than blindly replicating work done at another institution.
  • Engage with key local policy issues and stakeholders to encourage institutional engagement with national and international policy issues.
  • Develop outreach and marketing strategies that assist “early adopters” of repositories to connect with the developing repository-related service system.
  • Define a scope of responsibility to guide the development of repository services for varied forms of content. ...

The task force notes that, due to repository services’ powerful potential to enable key work and enhance the effectiveness of functions across the research enterprise, research institutions cannot afford to do without such services, even in difficult economic times. ... This report presents a fresh perspective on the digital repository environment and is intended to inspire ARL member libraries and others to assess their views and plans for service development. ...

New OA journal on disease prevention

Primary Prevention Insights is a new OA journal published by Libertas Academica. See today's announcement or the inaugural editorial. The article-processing charge is $1395, subject to discount or waiver. Articles are published under the Creative Commons Attribution License.

RePEc January update

Christian Zimmermann, RePEc in January 2009, The RePEc Blog, February 3, 2009.

The big news this month is that we have now surpassed 700,000 bibliographic items listed in RePEc. The last 100,000 additions took only 7 months, something that seems difficult to beat. ... [W]e experienced heavy traffic on our sites, with 766,586 file downloads and 2,756,978 abstract views in January 2009.

New participants in RePEc are: Universität Augsburg, Universität Bielefeld, World Agroforestry Centre, University of Craiova, Centre of Financial and Monetary Research “Victor Slavescu”, Hokkaido University, Kobe University, Romanian National Institute of Economic Research, C.D. Howe Institute, Central Bank and Financial Authority of Ireland, University of Virginia (II), Danubius University, INRA-Nancy, Utah State University, University of Applied Sciences Düsseldorf, University of Resita ...

Harold Varmus' memoir released

Harold Varmus, chairman of the Public Library of Science board and co-chair of Barack Obama's President's Council of Advisors on Science and Technology, has published his memoir, The Art and Politics of Science. From the book blurb:
A Nobel Prize-winning cancer biologist, leader of major scientific institutions, and veteran of science policy wars reflects on his remarkable career. An English major with a year of graduate studies in literature at Harvard University, Harold Varmus discovered he was drawn instead to medicine and eventually found himself at the forefront of cancer research at the University of California, San Francisco. In this warm, engaging memoir, Varmus considers a life's work that thus far includes not only the groundbreaking research that won him a Nobel Prize but also tenure as the director of the National Institutes of Health and his current position as president of Memorial Sloan-Kettering Cancer Center. Varmus also shares his perspective from the trenches of politicized battlegrounds ranging from budget fights to stem-cell research, global health to science publishing. Beyond evidence of Varmus's penetrating intellect, self-deprecating humor, and the deep joy he takes in science, The Art and Politics of Science offers a stimulating reminder to people in all walks of life about the fascinating--and central--role of science in our world.

Sparky Award winners announced

Sparky Award Winners Announced, press release, February 3, 2009.

Four student productions are winners of the second annual Sparky Awards, a contest organized by SPARC (the Scholarly Publishing and Academic Resources Coalition) and adopted by campuses nationwide that calls on entrants to creatively illustrate in a short video the value of sharing ideas. ...

The winners were announced on January 24 at a public screening held in connection with the American Library Association Midwinter Conference in Denver. The videos will also be screened at the Campus MovieFest Southern Regional Grand Finale in Atlanta March 28 and 29, 2009.

This year’s winners are:

  • Grand Prize Winner:
    To Infinity and Beyond
    By Danaya Panya, Sebastian Rivera, Hemanth Sirandas, Uriel Rotstein, and Jaymeni Patel, University of Illinois at Chicago Honors College
  • First Runner Up:
    How to Make Things Easier
    By Taejin Kim, Savannah College of Art and Design
  • Second Runner Up:
    Brighter
    By Christopher Wetzel, Ohio Northern University
  • Special Merit Award:
    GrowUp
    By Cécile Iran, Laurie Glassmann, Christophe Zidler, Aldric de Villartay, University of Versailles-Saint Quentin, France ...

Developed by SPARC, the Sparky Awards is co-sponsored by the Association of College and Research Libraries, the Association of Research Libraries, Campus MovieFest, the University of Pennsylvania Libraries, Students for Free Culture, and The Student PIRGs.

The Grand Prize Winner will receive $1,000 plus a Sparky Award statuette. The two Runners Up each receive $500. All the winners will receive a copy of Freedom of Expression®: Resistance and Repression in the Age of Intellectual Property, a documentary film by Kembrew McLeod that looks at free speech and fair use.

Another approach to the pricing crisis

Joseph Storch, Needed: A Single Electronic Source for Textbooks, Chronicle of Higher Education, February 6, 2009 (accessible only to subscribers).  Excerpt:

...It's time to shift the text-selling system from one between publishers and students to one between publishers and colleges. A consortium-style agreement between the latter two groups would make it advantageous for all publishers and higher-education institutions to participate. The consortium could charge participating colleges a single price for unlimited access, based on their number of full-time enrollments, or FTE's. Each college could then pass that charge on to students as part of tuition or through a dedicated fee, or even seek private donors to help defray the cost....

Institutions acting as single payers on behalf of their students would create cost efficiencies, allowing each student to pay a lower net cost for the single digital source than for purchasing textbooks individually. With the cost included in tuition or charged as a dedicated fee, students could then use financial aid to pay for access....

Most of the article focuses on textbooks, which students now buy directly from publishers.  But in a few places Storch implies that he'd like to see the same solution for journals, which students do not buy directly from publishers:

...Those studying at community colleges and research universities, public and private, large and small, would have access to the same content, allowing them to browse, learn, and choose research areas based solely upon interest — not upon the limitations of their library, school subscriptions, or personal bank account....

Each publishing company, journal, or other content provider would receive a percentage of an institution's enrollment fee based on their market share, and when students select material, the provider would receive a royalty in the form of a "micropayment." ...
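Storch's two-part payout (a market-share slice of the institutional fee, plus a per-selection micropayment) is straightforward to model. Every figure in this sketch is invented for illustration:

```python
# Toy model of Storch's proposed payouts; every figure here is invented.
FTE = 5000               # full-time enrollments at a hypothetical college
FEE_PER_FTE = 30.00      # hypothetical annual access fee per student
MICROPAYMENT = 0.05      # hypothetical royalty per student selection

market_share = {"Publisher A": 0.6, "Publisher B": 0.3, "Journal C": 0.1}
selections = {"Publisher A": 40000, "Publisher B": 25000, "Journal C": 9000}

pool = FTE * FEE_PER_FTE
for provider, share in market_share.items():
    payout = pool * share + selections[provider] * MICROPAYMENT
    print(f"{provider}: ${payout:,.2f}")
```

Whether such a pool would actually cost students less than buying textbooks individually depends on the negotiated fee, which the article leaves open.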

PS:  Storch doesn't mention OA. 

More on OA projects for economic recovery

Charles Lowry, Let's Spur Recovery by Investing in Information, Chronicle of Higher Education, February 6, 2009 (accessible only to subscribers).  Lowry is executive director of the Association of Research Libraries.  Excerpt:

...Investing in an open, universal digital commons will help ease the current economic crisis by creating jobs, equipping workers with 21st-century skills, and laying a foundation for innovation and national competitiveness in business and research....

A large-scale effort to digitize library and related cultural-memory holdings would be an effective response to the mounting problems that challenge America. Beyond retraining workers for technology jobs, such a project would bring high-quality historical, scientific, and cultural materials into every home and workplace. That increased access would give businesses, state and local governments, and job seekers a leg up and would enrich education at all levels by bringing the world's collective knowledge to parents, teachers, and students.

But a digital library for the new millennium should reach beyond books. Consider the value of giving every citizen free access to course materials....Add to those materials all the nonclassified studies emerging from government agencies and the results of scientific research conducted at universities and institutes.

Imagine, too, the possibility of developing and applying computational research techniques that would allow users to discover, analyze, and understand relationships among different information sources. The National Science Foundation and National Institutes of Health expect that such strategies, made relevant in an open environment, will lead to crucial discoveries in the years ahead....

At the current pace, it would take generations to bring entire libraries and other cultural-memory collections to the open Web. But with an injection of funds, the rate could be quickly increased. On the basis of current practices, we know that in short order, up to 10,000 people could be trained and put to work scanning books, manuscripts, journals, and other materials in library collections. If the scanning encompassed materials outside of libraries, that number would soar.

To create an open, universal digital library of 10 million books would require $300 million....That is a small price compared with the costs of many worthy public-works projects being contemplated....By financing it, President Obama and Congress would put Americans — and knowledge — to work. Generations to come will thank them for their foresight.

PS:  This article is adapted from a piece Lowry co-wrote last month with Prue Adler, Establish a Universal, Open Library or Digital Data Commons.  For similar arguments that OA projects should be part of an economic stimulus, and could themselves trigger further economic recovery, see Michael Geist's article from January 2009, the Open Access Working Group proposal from January 2009, and my open letter to the next US president from October 2008.

An archive archive

The US National Association of Government Archives and Records Administrators (NAGARA) has launched NAGARA Resources, an OA repository for documents about archives.  (Thanks to Klaus Graf.)  From today's announcement:

...[NAGARA Resources is] an online document library which allows users to share archives and records management publications. Subject categories include: accessibility, advocacy, disaster preparedness, electronic records, facilities, files management, grants, historical records, inactive records, local government records, microfilm, preservation, legal issues, retention, scanning & digitization, security, storage, training, and miscellaneous....[T]he library currently houses nearly 300 documents and expands in content almost daily. As content grows, site administrators will add new subject categories and/or subdivide existing categories.

No registration is required to download documents; however, users who wish to upload documents must be registered....Only public-domain documents, or copyrighted documents posted by the copyright owner, may be uploaded to the site.

The site was built in partial response to a 2008 recommendation of the Council of State Archivists' "Closest to Home" Task Force on Archival Programs for Local Governments to "develop a coordinated plan for a portal to provide access to web-based resources on local government archives," but as content has expanded, the site has proven to be of interest to archivists in other fields as well.

Libertarian journal converting to OA

Reason Papers is converting to OA, starting retroactively with the Fall 2007 issue.  More details from Stephan Kinsella (thanks to Kimmo Kuusela):

Until today, the great libertarian journal Reason Papers (founded by libertarian icon Tibor Machan) maintained a two-issue moving wall before posting its print articles online. As of today, that changes. The editor, Aeon Skoble, has heroically decided to put all content online as soon as it's published. Issue No. 29 (Fall 2007) has been put online tonight [2/2/09] (the Reason Papers site is hosted by the Mises Institute, natch). Issue No. 30 should be out soon.

PS:  Just 10 days ago, libertarians launched a new OA journal, Libertarian Papers.

110-volume Italian dictionary of biography converting to OA

The Istituto della Enciclopedia Italiana is converting the Dizionario Biografico degli Italiani to gratis OA.  (Thanks to AMS via Klaus Graf.)

The Conyers bill is back

Yesterday Rep. John Conyers (D-MI) re-introduced the Fair Copyright in Research Works Act.  This year it's H.R. 801 (last year it was H.R. 6845), co-sponsored by Steve Cohen (D-TN), Trent Franks (R-AZ), Darrell Issa (R-CA), and Robert Wexler (D-FL).  The language has not changed. 

The Fair Copyright Act is to fair copyright what the Patriot Act was to patriotism.  It would repeal the OA policy at the NIH and prevent similar OA policies at any federal agency.  The bill has been referred to the House Judiciary Committee, where Conyers is Chairman, and where he has consolidated his power since last year by abolishing the Subcommittee on Courts, the Internet, and Intellectual Property.  The Judiciary Committee does not specialize in science, science policy, or science funding, but in copyright. 

The premise of the bill, urged by the publishing lobby, is that the NIH policy somehow violates copyright law.  The premise is false and cynical.  If the NIH policy violated copyrights, or permitted the violation of copyrights, publishers wouldn't have to back this bill to amend US copyright law.  Instead, they'd be in court, where they'd already have a remedy.  For a detailed analysis of the bill and a point-by-point rebuttal to the publishing lobby's rhetoric, see my article from October 2008.

I'll have more soon on ways to mobilize in opposition to the bill and support the NIH and the principle of public access to publicly-funded research.  Meantime, if you're a US citizen and your representative is a member of the Judiciary Committee, it's not too early to fire off an email/fax/letter/phone call to your representative opposing the bill and defending the NIH policy.  You can find ammo here.


U of Bath launches an IR

The University of Bath has launched Opus, its institutional repository.  (Thanks to Kara Jones.)   From yesterday's announcement:

Opus is an Online Publications Store: a repository that collects and hosts the research publications of University of Bath authors.

Currently there are over 12,000 references in the system, including journal articles, books and book sections, conference items, patents, reports and working papers, and research degree theses. Some of these items, including the theses, are available in full-text....

Describing the service as a way to enhance the visibility of Bath publications, [Professor and Pro-Vice-Chancellor for Research Jane Millar] emphasised that the transfer of knowledge and the impact of our research are essential to the University, its community, and its collaborators.

Making research publications openly accessible also meets funder requirements for the dissemination of publicly funded work, and helps increase the visibility of Bath outputs, providing a ‘shop window’ for scholarly communication....

Using the EPrints repository software, Opus is based on international standards that make it easier for search services such as Google to find and harvest publication details. Opus also hosts the University of Bath PhD and research degree e-theses, and participates in the British Library EThOS service for improving access to research theses in the UK.

As well as encouraging authors to upload their research publications to Opus, Research Publications Librarian Kara Jones also outlined the assistance available to Bath researchers.

This includes training to use Opus to store publications, information on document versions, and alternatives to Copyright Transfer Agreements from publishers. The aim, she said, was "to give both authors and publishers the rights needed to maximise the dissemination and communication of research publications".


Tuesday, February 03, 2009

More on open bibliographic data

Paula J. Hane, Open Solutions for Libraries Gain Momentum, Information Today, February 2, 2009.

Open sharing of data has often collided with issues of ownership and licensing. Nobody does more of a balancing act than librarians, who work to provide materials freely to patrons and other libraries while protecting owners’ rights. LibLime, an upstart company that has provided open source software solutions for libraries for several years (best known for its Koha ILS), has made its move to the next frontier of openness—providing open data and open library content. In 2008, LibLime introduced ‡biblios, an open source, web-based metadata tool for libraries, and it has just launched ‡biblios.net, a free, browser-based cataloging service with a data archive containing more than 30 million bibliographic and authority records. Records are licensed under the Open Data Commons, making the service the world’s largest repository of freely licensed library records. Moves like this by LibLime and other open source and open data providers, such as U.K.-based Talis, clearly have the potential to shake up some competitors, notably OCLC. ...

The infrastructure of free speech

Jack M. Balkin, The Future of Free Expression in a Digital Age, Pepperdine Law Review, Vol. 36, 2008.  From the abstract:

In the twenty-first century, at the very moment that our economic and social lives are increasingly dominated by information technology and information flows, the judge-made doctrines of the First Amendment seem increasingly irrelevant to the key free speech battles of the future. The most important decisions affecting the future of freedom of speech will not occur in constitutional law; they will be decisions about technological design, legislative and administrative regulations, the formation of new business models, and the collective activities of end-users....

Freedom of speech depends not only on the mere absence of state censorship, but also on an infrastructure of free expression. Properly designed, it gives people opportunities to create and build technologies and institutions that other people can use for communication and association. Hence policies that promote innovation and protect the freedom to create new technologies and applications are increasingly central to free speech values....

New technologies offer ordinary citizens a vast range of new opportunities to speak, create and publish; they decentralize control over culture, over information production and over access to mass audiences. But these same technologies also make information and culture increasingly valuable commodities that can be bought and sold and exported to markets around the world. These two conflicting effects, toward greater participation and propertization, are produced by the same set of technological advances. Technologies that create new possibilities for democratic cultural participation often threaten business models that seek to commodify knowledge and control its access and distribution. Intellectual property and telecommunications law may be the terrain on which this struggle occurs, but what is at stake is the practical structure of freedom of speech in the new century.

Notes on OA at ALA

Jonathan Miller, Blogging from Midwinter -- 2, The Director's Blog, January 24, 2009. Notes from the American Library Association Midwinter Meeting (Denver, January 23-28, 2009) on SCOAP3 and OERs.

Obama's Secretary of HHS won't be Tom Daschle

Tom Daschle has withdrawn his name from consideration as Barack Obama's Secretary of Health and Human Services.  He had been expected to win Senate confirmation, but was damaged by his failure to pay $140,000 in personal income taxes until after Obama nominated him.  More coverage.

Daschle is a former Senator from South Dakota (1987-2005), former Senate majority leader (2001-2003), and co-author of a book on reforming the US health care system.  As far as I can tell, he had no public track record on OA issues.

The OA connection:  The National Institutes of Health (NIH) belongs to the Department of Health and Human Services (HHS), giving the next HHS Secretary significant control over the future of OA in the US federal government.  The position of NIH Director is also vacant and will probably remain vacant until we have a new HHS Secretary.  Stay tuned.

Italian rectors comment on EU green paper

The OA working group of the Library Commission of the Conference of Italian Rectors has released its comments on the European Commission's green paper, Copyright in the Knowledge Economy.

See also our past posts on the green paper.

New OA book on open art

FLOSS+Art is an OA book released in late 2008; the OA edition can be downloaded from The Pirate Bay.

FLOSS+Art critically reflects on the growing relationship between Free Software ideology, open content and digital art. It provides a view onto the social, political and economic myths and realities linked to this phenomenon.

Harold Varmus again on NPR's Science Friday

Harold Varmus, chairman of the Public Library of Science board and former U.S. National Institutes of Health director, was the guest on National Public Radio's Science Friday on January 30. The recording is now online; the discussion includes OA. (Thanks to Jonathan Eisen.)

What is open archaeology?

Jo Cook, On being open and what that means, Computing, GIS and Archaeology in the UK, January 29, 2009.

... “Open Archaeology” comprises three strands: open standards, open access, and open source. We see this as the only logical way of fulfilling our remit as a commercial archaeological organisation, and an educational charity. Our job is to record the cultural remains that are damaged or destroyed by development. Our remit is to make those records available in perpetuity, to anyone who wants to see them. At the end of the day, pretty objects in museums are of little use without the background information that gives them context and fires the imagination.

While the three strands are not the same thing (as they are often made out to be), open data is useless without open standards and open software. Open software is useless without open data and open standards. ...

2008 Open Archaeology Prize winners announced

Alexandria Archive Institute, AAI announces winners of the 2008 ASOR Open Archaeology Prize, press release, January 30, 2009.

Winners of the 2008 ASOR Open Archaeology Prize competition were announced on November 21, 2008 at the annual [American Schools of Oriental Research] meeting in Boston. ...

First prize ($500) was awarded to the Abzu web site, led by Charles E. Jones, Head Librarian at the Institute for the Study of the Ancient World, New York University and Research Associate, The Oriental Institute, University of Chicago. Launched in 1994, Abzu collects and manages open access scholarly material relating to the ancient Near East and Mediterranean world, including the rich corpus of ETANA Core Texts, which are available for free for noncommercial teaching and research. ...

Second prize ($200 in books, co-sponsored by the David Brown Book Company) was awarded to the Badè Museum of Archaeology web site, led by Aaron Brody (Pacific School of Religion). ... The web site provides access to reusable content from archaeological excavations at Tell en-Nasbeh, conducted by W. F. Badè in the 1920s and 1930s under the auspices of Pacific School of Religion. The new web site provides digital versions of the contents in the Museum's exhibits, overviews of research projects and facilitates the ordering of traveling exhibit materials. By openly licensing all content with Creative Commons licenses, the Badè team has ensured that these free and open resources can be downloaded for reuse by anyone. ...

New OA collections on history, philosophy, religion

Two publications have recently gone OA with Revues.org; each contains conference presentations by researchers in the corresponding section of the École pratique des hautes études. The 2006-7 issues are now online.

New OA economics portal launches

Economists Online has launched. See the announcement from Nereus:

Economists Online went on air on 2 February 2009. This is a cross-searchable portal with a focus on quality economics research results. It includes profiles of hundreds of leading authors and their comprehensive publication lists with links to many full texts free of charge. Over 300 researchers from 6 top economics institutions (The London School of Economics, Tilburg University, Erasmus University Rotterdam, Maastricht University, Université Libre de Bruxelles and Katholieke Universiteit Leuven) are currently feeding this first phase of the service.

Feedback from the NEEO project participants and their researchers will be considered for the design of the next version of the service. The public can expect access to the second version, which will cover access to the direct research results of over 800 researchers in some 20 prestigious economics institutions (e.g. the universities of Oxford, Warwick and Carlos III Madrid) in autumn 2009.

See also our past posts on Nereus.

Organizing a web of volunteers for OA transcription and digitization

Does your field of study depend on public-domain records which are not yet digitized or OA?  A lot of us have a lot to learn from the genealogists.

See Volunteers rally to put Norwegian records online, Mormon Times, February 2, 2009.  Excerpt:

FamilySearch International, the University of Tromsø, and DIS-Norge announced today a joint initiative to transcribe the 1875 Norway Census for free online access. It is the only Norway census that has not been indexed and the first to be tackled as a global, Internet-based effort. Volunteers who can read Norwegian are being sought to complete the project....

FamilySearch digitized the census images and is using its Web-based transcription tool and volunteers to create the automated index. The University of Tromsø and DIS-Norge are sponsoring the project, but many more online volunteers are needed to transcribe the 1.6 million individuals found in the tens of thousands of census sheets....

"The biggest challenge is the Norwegian handwriting and names," said Jeff Svare, collection management specialist....

Indexers do not need to worry about their skill level at reading censuses. Each census page is transcribed by two different indexers. Any discrepancies between the two entries will be arbitrated by a third indexer. The result is a highly accurate, free index of tremendous value to family history enthusiasts....

The FamilySearch Records Access program has already generated over 500 million names and images through its volunteer initiatives. The collections can be searched for free at FamilySearch.org....

PS:  Also see our past posts on OA to genealogical information.
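The double-keying workflow described above (two independent transcriptions of each page, with a third indexer arbitrating disagreements) is simple to express in code. A minimal sketch, with invented field names and sample rows:

```python
# Minimal sketch of the double-keying workflow: two independent
# transcriptions of the same census row are compared field by field,
# and only the disagreements are queued for a third (arbitrating)
# indexer. Field names and values are invented for illustration.

def reconcile(entry_a, entry_b):
    """Return (agreed_fields, disputed_fields) for one census row."""
    agreed, disputed = {}, {}
    for field in entry_a.keys() | entry_b.keys():
        a, b = entry_a.get(field), entry_b.get(field)
        if a == b:
            agreed[field] = a
        else:
            disputed[field] = (a, b)   # sent to the arbitrator
    return agreed, disputed

first  = {"name": "Ole Hansen",  "birth_year": "1843", "parish": "Tromsø"}
second = {"name": "Ole Hanssen", "birth_year": "1843", "parish": "Tromsø"}

agreed, disputed = reconcile(first, second)
print("agreed:", agreed)
print("needs arbitration:", disputed)
```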

The state of data sharing in Canada

Stewardship of Research Data in Canada:  A Gap Analysis, a report from Canada's Research Data Strategy Working Group, October 2008.  Also see the press release, January 12, 2009.  (Thanks to Stéphane Goldstein.)  From the report itself, Section IX on Access:

Ideal state:  There is widespread access to publicly funded research data, with appropriate mechanisms in place for regulating access that takes into account security, ethical, legal, and economic interests where appropriate.

Current state:

  • Much of the research data being produced today is hard to access by other Canadian research communities, and is often not ideally structured to be as useful or as open as possible....
  • There are large reservoirs of existing data not in current use and not available online.
  • Researchers are reluctant to share data because they feel it is their intellectual property.
  • Researchers lack the expertise to ensure that data are accessible by others in the future.
  • While there is a growing international trend towards free access to data held in repositories (e.g. GeoConnections), many repositories still charge fees for access...or restrict access to community members only (e.g. Statistics Canada and Natural Resources Canada).
  • Few research organizations have policies requiring researchers to provide access to data....

Comment.  The Working Group turned off cutting and pasting in the PDF report.  (Why?)  Normally under these circumstances I'd point you to the relevant pages and not bother to rekey an excerpt.  But the Working Group didn't paginate the report either.  (Why?)  This is not a good sign of how well the Working Group understands the gap between the current state and the ideal state of information sharing.

Update (2/4/09).  Kathleen Shearer, the lead author of the report, has unlocked the PDF and added page numbers.  Thanks, Kathleen!

The Google settlement will not improve access outside the US

Adam Hodgkin, Google Book Search and the Tragedy of the Anti-Commons, ExactEditions, February 1, 2009.  Excerpt:

Michael Heller, a property lawyer at Columbia University, has coined the term the 'tragedy of the anti-commons'....Heller's insight is that too much private ownership can be as much of a problem as too little: “When too many owners control a single resource, cooperation breaks down, wealth disappears and everybody loses.” He gives plenty of examples in his book The Gridlock Economy -- the book's argument is forcibly stated in its subtitle: How Too Much Ownership Wrecks Markets, Stops Innovation, and Costs Lives.

There is a good chance that the Google Books Settlement is going to show us all how this tragedy of the anti-commons works out in the world of books. The Google project, which is backed by American publishers' and authors' representatives, should be (in my view will be) a wonderful resource for American universities, schools, public libraries and through them for American consumers. By 2011, if the Settlement is approved, at least 5 million out of print but not yet out of copyright [OOPnotYOOC] titles will be available to readers in the US market. This resource will have little opportunity to work so well for authors, readers and consumers in the rest of the world....

Google is already serving a very different and vastly narrower view of Google Book Search to the rest of the world (even to Canada and Mexico). Books which are public domain and wholly visible and readable in the US are not visible or readable elsewhere. And this copyright caution about territorial rights is unlikely to change, because the Settlement, when it is approved, is only going to be approved and agreed for the US market. Google has been persuaded (or has volunteered?) to accept the territorial restrictions and complications inherent in the market of copyright books. In my view, Google will not risk starting court actions in other jurisdictions, for the very simple reason that they might be lost, or worse still settled on a different basis from the US dispute. Google will be bound to leave the ex-US position of its wonderful aggregate of unloved (mostly 'orphan') copyrights in a national limbo. The orphans will remain unloved outside the 50 states.

The complexity of the rights situations of these millions of titles is effectively unmanageable and un-negotiable, which is pretty much what Michael Heller means by a tragedy of the anti-commons....

PS:  See our past posts (1, 2, 3, 4, 5) showing that Google Book Search already tends to block access to users outside the US.

Is Yemen about to mandate access to research?

The Yemen parliament is considering a bill to enhance citizen access to information.  Most of it concerns public sector information and government sunshine practices.  But it also touches on access to research.

See Faisal Darem, Information law: guarantees access to information, calls for more transparency and privacy protection, Yemen Observer, February 3, 2009.  Excerpt:

Parliament is scheduled to discuss the information draft law during its present legislative session, after it was passed at the meeting of the Council of Ministers, held last December, acting on a proposal made by the National Information Center (NIC).

The advanced information draft law...places Yemen among those countries with modern legislatures that ensure the freedom of information and the citizen’s right of access....

To ensure intellectual property rights, the draft law obliges all scientific and research institutions to put down copies of research or intellectual periodicals which they carry out or supervise....

Comment.  Unfortunately, the key paragraph is obscure.  What does it mean to "put down copies of research"?  Deposit them in repositories?  Pull them off the internet?  The general context suggests that access to research must be wider and easier, if not open.  But the immediate context ("To ensure intellectual property rights...") suggests the opposite.  Can any OAN readers help sort this out?  If so, please drop me a line or post your findings or translations to SOAF.

Technical introduction to SWORD

Sarah Currier, SWORD: Cutting Through the Red Tape to Populate Learning Materials Repositories, JISC e-Learning Focus, February 2, 2009.  Excerpt:

...From the start, the vision for repositories was that of easy self-deposit by resource authors, and widely available access to as many educators as possible....However, the reality to date has been somewhat at odds with this vision, due to, for example, requirements for good quality metadata....The labyrinth of barriers, some of which need good technical or usability solutions, some of which require cultural shifts, and some of which demand courageous decisions from funders and managers to side-step, have given many potential contributors to repositories too much of a headache to fully participate....

[A]mongst many efforts to overcome the multitude of barriers to use of repositories, in 2006 the first JISC Repositories Programme, along with JISC CETIS and UKOLN (in the form of the original Repositories Research Team), identified enabling remote and bulk deposit as a work item that could make a huge difference for all kinds of repositories. Agreeing an open protocol that could be supported by the major repository systems, and used by anyone creating tools and widgets for deposit, became a top priority, and SWORD was funded....

This article gives an introduction to SWORD from an educational technology perspective; most other resources on SWORD focus on research outputs repositories.....
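For readers who want a feel for what a SWORD deposit looks like on the wire, here is a minimal sketch of a SWORD 1.3-style deposit: a single HTTP POST of a package to a repository's deposit URL. The URL, credentials, filename, and packaging value are placeholders; a real repository advertises its deposit URLs and accepted packaging formats in its SWORD service document.

```python
# A minimal sketch of a SWORD deposit, assuming a SWORD 1.3-style
# endpoint. The URL, credentials, and packaging value below are
# placeholders, not a real service.

import requests

DEPOSIT_URL = "https://repository.example.edu/sword/deposit/my-collection"

with open("article-package.zip", "rb") as pkg:
    response = requests.post(
        DEPOSIT_URL,
        data=pkg,
        auth=("depositor", "secret"),
        headers={
            "Content-Type": "application/zip",
            "Content-Disposition": "filename=article-package.zip",
            # Tells the server how the zip is structured (placeholder value):
            "X-Packaging": "http://purl.org/net/sword-types/METSDSpaceSIP",
        },
    )

# A successful deposit returns 201 Created plus an Atom entry
# describing the new repository item.
print(response.status_code)
print(response.text[:500])
```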

Fighting artificial scarcity

Georgia Harper, OA, IRs and IP: Open Access, Digital Copyright and Marketplace Competition, text of a presentation at the ALA Midwinter meeting (Denver, January 23-28, 2009).  Also see the slides.  (Thanks to Robert Richards.) 

Abstract:   The fundamental concerns about intellectual property for open access institutional repositories are not about who owns what rights, or who can do what with them, or what you have to require contributors to give your institution to be sure you’ve got the rights you need to provide open access to their works. Those guidelines are readily devised and applied. The copyright conundrum created by open access is more basic than this: Is it appropriate, is it even necessary, and certainly, is it the best way going forward, to artificially make our works difficult to find and access and saddle them with high prices in an era when people all over the world could quickly know about our current research results through the Web for no more than the cost to them of their own infrastructure to find and read our works?

For more than 200 years copyright law has enabled, and scholars and their publishers have depended on, the mechanism of state-granted monopoly, "creating artificial scarcity" to give publishers a period of time during which they can charge higher prices than the market would otherwise dictate and recover their costs of publishing plus a profit in most cases. But today we have instant access to digital creative works, and easy, world-wide distribution for almost no cost for the reader beyond the cost of computers, internet access and electricity. In this world, the monopolistic mechanism of "artificial scarcity" turns what is one of the most important, most critical advantages of the digital world into something to be fought tooth and nail. The solution isn’t stronger and longer copyrights. It more likely will emerge from massive experimentation to find satisfactory business models that can fund the creation of works, still a costly undertaking, without sacrificing the digital benefit of relatively free distribution to anyone and everyone who might desire to access our works.

Everyone in this room probably knows this. But what we may not realize is the magnitude of the experiment and what it also tests -- that if in fact it is not just possible but profitable to create and disseminate digital research results (or any creative digital work for that matter) without relying on the copyright monopoly, and if the social costs of monopoly begin to outweigh its public benefits in the coming world of ubiquitous repositories adding to ubiquitous Web content, all providing ubiquitous open access, the experiment tests the validity of the fundamental assumption underlying copyright, that monopoly is the only way (or even the best way) to achieve optimal production of most creative works.


Monday, February 02, 2009

On institutional vs. thematic vs. funder repositories

Bernard Rentier, Dépôts institutionnels, thématiques ou centralisés ?, Bernard Rentier, Recteur, February 1, 2009. Read it in the original French or Google's English. Also includes this note:

Our institutional repository ORBi keeps its promises: it has this week surpassed 4,000 deposits and more importantly, 79% are accompanied by a full text and it is thus ahead of schedule. ...

See also our past posts on Rentier or the Université de Liège, where he is rector.

Update. See especially this graph showing deposits (all deposits in red, full-text deposits in blue) before and after the repository went into general production, with its accompanying mandate.

Update. See also Rentier's English translation (thanks to Stevan Harnad):

... The latest [green OA] initiative comes from the very active EUROHORCs (European Association of Heads of Research Funding Organisations and Research Performing Organisations), well known for its EURYI prizes and its prominent influence on European thinking in the research area. EUROHORCs is working to convince the European Science Foundation (ESF) to set up, through a large subsidy from the EC, a centralised repository (CR) which would be both thematic (Biomedical) and geographic (European). The concept is inspired by PubMed Central, among others.

The EUROHORCs initiative is very well-intentioned. ... But the initiative also reveals a profound misunderstanding about what OA and researchers’ real needs are all about.

The vision underlying the EUROHORCs initiative is that research results should be deposited directly in a CR. However, if research results are not OA today, this is not because of the lack of a CR to deposit them in, but rather because most authors are simply not yet depositing their articles at all, not even in an IR.

Creating a new repository is hence not the solution for making research OA. The solution lies in universal deposit mandates, from both institutions and funding agencies. ...

What is worrisome is the needless double investment in creating two distinct kinds of repositories for direct deposit. This trend seems to rest on the naive notion that, in the Internet era, it is somehow still necessary to deposit things centrally. But in reality, the centralising tool is the harvester, and its search engine. Google Scholar, for example, is quite efficient in finding articles in any repository, institutional or central, yet no one deposits articles directly in Google Scholar. ...

Giving priority to creating more CRs for direct deposit today is not only a waste of time: it is also counterproductive for the growth of convergent funder and institutional mandates. It would generate multiple competing loci of primary deposit for authors ...
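Rentier's claim that "the centralising tool is the harvester" refers to the machine interface that both institutional and central repositories expose, typically OAI-PMH. A minimal sketch of what a harvester does, against a placeholder base URL (each real repository publishes its own):

```python
# Minimal sketch of OAI-PMH harvesting: one ListRecords request over
# plain HTTP, then titles extracted from the Dublin Core metadata.
# The base URL is a placeholder; the namespaces are those of the
# OAI-PMH 2.0 and Dublin Core specifications.

import urllib.request
import xml.etree.ElementTree as ET

BASE_URL = "https://repository.example.org/oai"   # placeholder base URL
OAI = "{http://www.openarchives.org/OAI/2.0/}"
DC = "{http://purl.org/dc/elements/1.1/}"

url = BASE_URL + "?verb=ListRecords&metadataPrefix=oai_dc"
with urllib.request.urlopen(url) as resp:
    tree = ET.parse(resp)

# Print the title of each harvested record.
for record in tree.iter(OAI + "record"):
    title = record.find(f".//{DC}title")
    if title is not None:
        print(title.text)
```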

On open data and licensing

Rufus Pollock, Open Data Openness and Licensing, Open Knowledge Foundation Blog, February 2, 2009.

Why bother about openness and licensing for data? ...

It’s crucial because open data is so much easier to break-up and recombine, to use and reuse. We therefore want people to have incentives to make their data open and for open data to be easily usable and reusable — i.e. for open data to form a ‘commons’.

A good definition of openness acts as a standard that ensures different open datasets are ‘interoperable’ and therefore do form a commons. Licensing is important because it reduces uncertainty. ...

See also our recent post on licensing and open science.

An IR for UNAM's humanities school

The Universidad Nacional Autónoma de México's humanities school (Facultad de Filosofía y Letras) has launched a new IR, RU-FFYL, as part of UNAM's 3R project. See the January 31 announcement by Juan Manuel Zurita Sánchez in the original Spanish or Google's English.

See also our past posts on the 3R project.

More on OA for bibliographic data

Joshua Ferraro, Beyond Open Source : Other Types of Open, Open Sesame, January 29, 2009.

... Historically, libraries haven’t had openly-licensed, community-maintained sources of library metadata. What we do have are:

  • National libraries, and the Library of Congress, that create records and make them available freely via Z39.50 or on CD-ROM. These types of databases are openly-licensed in the sense that they are often public domain and free. However, the databases themselves are tightly managed in a top-down fashion, and often don’t contain metadata for many of the materials libraries own.
  • Membership-driven organizations, consortia, and ILS vendors often provide subscription-based access to their comparatively large metadata databases, and let members add and maintain the database. This solves the community-maintained piece, but the underlying data is typically not available to non-members and is viewed as the intellectual property of the organization hosting access to the platform, rather than collectively owned by the users.

The philosophy behind an Open Data movement scarcely needs an explanation to a library audience. The mission of most libraries is to provide open, free access to ideas and information. Certainly that same mission applies to the metadata created BY libraries. If we can’t freely share the stuff we’re creating among ourselves, how effective can we possibly be at sharing with our communities? ...

2007 saw the launch of the Open Library project, with a goal of creating ‘a page for every book’. Libraries responded by donating over 30 million of their MARC Bibliographic Records, making them freely available by uploading them to the Internet Archive, thereby placing them forever into the public domain. Notable additions to the effort were data sets obtained from the Library of Congress (over 7 million records) as well as UK ILS vendor Talis (over 5 million records). Around the same time, The Library of Congress Authority file surfaced publicly in MARCXML format. And LoC itself opened up access to their records via an XML web service (MARCXML) using LCCN as an identifier, making it possible to access newly created and modified records more easily. ...

[Rich Internet Applications] were all the rage when LibLime was selected as a 2007 Google Summer of Code mentor, and thus was born ‡biblios, an open-source web-based metadata editor. Last year LibLime released ‡biblios under the GPL and the software is freely available for download from http://biblios.org. ...

‡biblios the editor provides one part of a technology framework for community-maintained data. The other part, a web-scale, production-ready platform where librarians can search, create, share and collaborate, is where ‡biblios.net comes in. ‡biblios.net is the world’s first community-built and maintained database of freely-licensed library records. It’s more than a cataloging editor, it’s a comprehensive cataloging productivity suite ...

Perhaps best of all, not only does ‡biblios.net contain freely-licensed library records, the service itself, including the cataloging editor is made available for use at no cost. ...
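Ferraro's mention of an LoC web service that returns MARCXML keyed by LCCN suggests how little code such openness demands. A minimal sketch follows; the URL pattern is a placeholder assumption, while the MARCXML namespace and the use of field 245 for the title statement come from the MARC 21/MARCXML specifications.

```python
# Minimal sketch of consuming a MARCXML-by-LCCN web service. The URL
# pattern and the LCCN value are placeholders, not a documented
# endpoint; the namespace and field tags are from MARCXML/MARC 21.

import urllib.request
import xml.etree.ElementTree as ET

MARC = "{http://www.loc.gov/MARC21/slim}"

def fetch_title(lccn):
    url = f"https://lccn.example.gov/{lccn}/marcxml"   # placeholder pattern
    with urllib.request.urlopen(url) as resp:
        root = ET.parse(resp).getroot()
    # MARC field 245 holds the title statement; subfield 'a' is the title proper.
    for field in root.iter(MARC + "datafield"):
        if field.get("tag") == "245":
            for sub in field.iter(MARC + "subfield"):
                if sub.get("code") == "a":
                    return sub.text
    return None

print(fetch_title("2004558915"))   # invented LCCN for illustration
```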

John Mark Ockerbloom, Open catalog APIs and data: ALA presentation notes posted, Everybody’s Libraries, January 28, 2009.

I’ve now posted my materials for the two panels I participated in at ALA Midwinter.

I have slides available for “Opening the ILS for Discovery: The Digital Library Federation’s ILS-Discovery Interface Recommendations”, a presentation for LITA’s Next Generation Catalog interest group, where I gave an overview of the recommendations and their use. At the same session, Beth Jefferson of BiblioCommons talked about some of the social and legal issues of sharing user content in library catalogs and other discovery applications.

And I have the slides and remarks I prepared for “Open Records, Open Possibilities”, a presentation for the ALCTS panel on shared bibliographic records and the future of WorldCat. In that one, I argue for more open access to bibliographic records, showing some of the benefits and sustainability strategies of open access models.

Karen Calhoun has also posted the slides from her presentation at that panel. Peter Murray also presented; I haven’t yet found his slides online, but he’s blogging about what he said. ...

Talis and LibLime Open Data on ‡biblios.net, press release, January 30, 2009.

Talis, the UK market leader in providing academic and public library solutions, and LibLime, the leader in open solutions for libraries, are pleased to announce a partnership to make available over five million bibliographic records to the library community on the ‡biblios.net platform.

‡biblios.net is LibLime's free, browser-based cataloguing service with a data store containing over thirty million records. The database is maintained by ‡biblios.net users and uses a model similar to Wikipedia's. Cataloguers can use and contribute to the database without restrictions because records in ‡biblios.net are freely-licensed under the Open Data Commons Public Domain Dedication and License.

Talis is providing data from the Talis Union Catalogue, the open shared core of records from the Talis Base service, to ‡biblios.net: over 5 million bibliographic records catalogued by public and academic libraries in the UK over the last 30 years. ...

Richard Wallis, Technology Evangelist at Talis adds "The open sharing of data, the default motivation for most librarians, has often been stifled by confusion and fear about ownership and licensing. Open Data Commons helps clarify and dispel those fears, opening up data that can confidently be shared. ‡biblios.net is a great example of the innovation that results when data is really open.” ...

Google: Don't re-host public domain books from Google Books

Yakov Shafranovich, Change in Google Book Search Guidelines for Public Domain Books, Personal Website of Yakov Shafranovich, January 30, 2009.

For a few years, Google Book Search has provided PDF downloads of public domain books. The books came with a page listing some guidelines that Google asked people to follow and the same guidelines are listed in their Google Book Search help center. ...

[The guidelines have been changed] - they used to say that only non-commercial use is [allowed]. Now that restriction has been replaced by two new guidelines: no hosting, and no reprints (including helping people reprint).

An interesting wrinkle about the new hosting restriction is that the Internet Archive is currently hosting about 537,000 PDFs of public domain books from Google Book Search. Under the old rules, non-commercial hosting was ok. What is the story under the new guidelines?

The new no reprint guideline seems to be directed towards services like my own PublicDomainReprints.org ...

Of course, the elephant in the room is that these books are in the public domain and thus have no copyrights. Without significant creative change it would not be possible to re-assert copyrights over the public domain scans (sweat of the brow [was struck] down back in 1992). Whatever terms apply are being pushed via contracts and not the traditional route of copyright licensing. This may or may not be similar to what OCLC has recently been doing by trying to enforce contract rules on stuff that cannot be copyrighted. Can a contract override federal copyright law, placing a public domain book under someone else’s legal power? ...

See also the comments by James Grimmelmann and others.

See also our past post on the Google Book Search guidelines.

More on TA liturgical texts

Jeffrey Tucker, Religion and Royalties on Ritual Texts, New Liturgical Movement, February 2, 2009.  (Thanks to Gino D'Oca.)  Excerpt:

The Catholic Church is alone among major Christian denominations in placing its core ritual texts under copyright and charging royalties for their use. The texts of the Episcopal, Lutheran, and Orthodox Churches are in the public domain, and free for anyone to print under any conditions. This encourages publishers to disseminate the texts, composers to use them for setting music, and website builders and bloggers to freely quote them in any form.

If the mission of the church is to spread the Gospel and evangelize for the faith, what possible rationale could there be for charging for the right to publish the ritual? ...

PS:  See our past posts on toll access for Catholic liturgical texts.

An OA repository for nanoscience

ICPCNanoNet is a new OA repository for nanoscience research.  From today's announcement:

...Successful outcomes to [nano] research will have a measurable impact on the future well-being of our global society; however this can only be achieved through improving access to information and opportunities for international collaboration. The ICPCNanoNet project is one such initiative in this framework. Funded by the EU under FP7 for four years from June 2008, it provides:

  • an electronic archive of nanoscience publications that is freely accessible to researchers around the globe;
  • an electronic database of nanoscience organizations and networks in ICPC;
  • links to nanoscience researchers and stakeholders across the globe;
  • annual reports on nanoscience developments in eight ICPC regions: Africa, Caribbean, Pacific, Asia, Eastern Europe and Central Asia (EECA), Latin America, Mediterranean Partner Countries (MPC), Western Balkan Countries (WBC);
  • online networking tools (forums, workshops);
  • annual workshops, one in each of EU, China, India, and Russia, which will also be webcast to facilitate greater access.

The project brings together partners from the EU, China, India and Russia and aims to...[create] an open access electronic archive of nanoscience publications and tools to facilitate networking between scientists in different world regions.....

NEH's Brett Bobley on the digital humanities

Kathleen Smith and Michael Gavin, Q&A with Brett Bobley, Director of the NEH's Office of Digital Humanities (ODH), HASTAC, February 1, 2009.  Bobley is the Chief Information Officer for the US National Endowment for the Humanities (NEH) and the Director of its Office of Digital Humanities (ODH).  Excerpt:

1) What are the most interesting innovations happening right now in the field of digital humanities, and is it possible to predict or anticipate what will be most important in the future?

The most interesting innovations?  That's a great question - one that I could talk about all day!  First, let me briefly explain [that]...under the digital humanities rubric, I would include topics like open access to materials, intellectual property rights, tool development, digital libraries, data mining, born-digital preservation, multimedia publication, visualization, GIS, digital reconstruction, study of the impact of technology on numerous fields, technology for teaching and learning, sustainability models, media studies, and many others....

Before we look at humanities scholarship, let me throw out an analogy [to music]....

Now let's look at these three areas again (Access, Production, and Consumption) but in the context of humanities scholarship....The change in access may not be quite as far along as it is for music, but it will be soon.  Like with music, you'll have access to materials from all over the world.  You won't have to send a book via airmail from New York to Chicago because you'll have instant access to it on your PC (or your mobile device).  If you want to study materials in China, you'll be able to view them (or for that matter, find out about them) using the Web.

On the production side, we're already seeing more and more scholars producing their work for the Web.  It might take the form of scholarly websites, blogs, wikis, or whatever.  But, like with music, a scholar (even an amateur, part-time scholar) can make her work available to the entire world at very low cost of production....

If I had to predict some interesting things for the future in the area of access, I'd sum it up in one word:  scale.   Big, massive, scale.  That's what digitization brings - access to far, far more cultural heritage materials than you could ever access before.  If you're a scholar of, say, 19th century British literature, how does your work change when, for the first time, you have every book from your era at your fingertips?  Far more books than you could ever read in your lifetime.  How does this scale change things?  How might quantitative tech-based methodologies like data mining help you to better understand a giant corpus?...

4) How will digital technology in the academic system in general (for example, in the changing role of textbooks in the classroom, open-access databases, or publishing requirements for tenure) affect the way research is performed and shared?

...Let's face it:  sometimes scholarship is constrained by seemingly mundane hurdles like copyright, travel costs, or language barriers....
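Bobley's point about scale (mining a corpus larger than anyone could read) can be made concrete with even the crudest technique: counting words. A minimal sketch over a placeholder directory of plain-text books:

```python
# Minimal sketch of corpus-scale counting: tally word frequencies
# across a directory of plain-text books. The corpus path is a
# placeholder, not a real dataset.

import collections
import pathlib
import re

def corpus_frequencies(corpus_dir):
    counts = collections.Counter()
    for path in pathlib.Path(corpus_dir).glob("*.txt"):
        text = path.read_text(encoding="utf-8", errors="ignore").lower()
        counts.update(re.findall(r"[a-z']+", text))
    return counts

freqs = corpus_frequencies("19c-british-novels/")   # placeholder corpus
for word, n in freqs.most_common(20):
    print(f"{word}\t{n}")
```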

Making share-alike the default

Guy Pessach, Reciprocal Share-Alike Exemptions in Copyright Law, Cardozo Law Review, Vol. 30, No. 3, 2008.  (Thanks to Charles Bailey.)

Abstract:   This article introduces a novel element to copyright law's scheme of exemptions, and particularly the fair use doctrine: a reciprocal share-alike requirement. I argue that beneficiaries of a copyright exemption should comply with a complementary set of ex-post reciprocal share-alike obligations that come on top of the exemption that they benefit from. Among other aspects, reciprocal share-alike obligations may trump contractual limitations and technological protection measures that are imposed by parties who relied on a copyright exemption in the course of their own use of copyrighted materials. Thus, fair use beneficiaries should be obliged to treat alike subsequent third parties who wish to access and use copyrighted materials - now located in their new "hosting institution" - for additional legitimate uses.

For example, if Google argues that its Book Project's scanning of entire copyrighted works is fair use, a similar exemption should apply to the benefit of future third parties who wish to use, for similar socially valuable purposes and under similar limitations, digital copies of books from Google's databases and applications. Google should also be prohibited from imposing technological protection measures and contractual obligations that revoke its reciprocal share-alike obligations. Similar quid-pro-quo schemes may apply in the context of content sharing platforms that initially rely on the Digital Millennium Copyright Act's (DMCA's) safe harbor for hosting services providers but later on impose proprietary restrictions on third parties who wish to reproduce and further use materials that were uploaded to the platform by end-users (e.g. as in the case of YouTube.com). And one could go on and apply this basic logic of a reciprocal share-alike quid-pro-quo to many other elements in copyright law's scheme of exemptions and limitations.

I argue that making copyright's exemptions reciprocal corresponds well with, and improves, the economics of copyright and public-welfare considerations. Overall, reciprocal share-alike exemptions structure copyright law in a manner that strikes a better balance between copyright's contribution (incentive) to cultural production and copyright's social cost - the burdens it imposes on future creators. As long as a reciprocal share-alike requirement is structured in a scope that maintains enough incentives to produce secondary works, it represents a social benefit that copyright law should capture. In addition, the article argues that reciprocal share-alike exemptions further enhance democratic, autonomy and distributive values that underlie a public-oriented vision of copyright law.

European Parliament: Licensing to improve access

On January 26, Manuel Medina Ortega, a Member of the European Parliament from Spain, introduced the Commission's report on the application of Directive 2001/29/EC on the harmonisation of certain aspects of copyright and related rights in the information society. The report is "non-legislative" but makes recommendations. An unofficial text of the report is available here (changes from the previous draft are in bold text). The European Parliament's Legislative Observatory forecasts it could be taken up in the parliament's next plenary sitting, scheduled for March 2009. Quoting:

The European Parliament ...

16bis. Applauds the success of the Europeana project in that it demonstrates the viability of the European approach combining respect for copyright with better access for users to creative content online; notes that Europeana, predicated as it is on partnership and ongoing dialogue extending to all stakeholders, enables works to be preserved unimpaired, as well as making for a high standard of legal digitisation; points out in addition that [European] Community copyright legislation stipulates that protected works may not be digitised and made accessible, even in extract form, unless authorisation has been obtained from the rightholders; stresses that this principle is a cornerstone of Europeana; ...

19. Takes the view that digitisation of works should take account of copyright and neighbouring rights and must not conflict with normal exploitation of the works on the internet, particularly as regards revenue earned by virtue of the right of making available to the public; ...

21. Takes the view that the creation of online digital libraries on the basis of large-scale digitisation projects must be carried out entirely in agreement with holders of copyright and neighbouring rights on the basis of voluntarily negotiated agreements; ...

24. Wishes the scientific community and researchers to enter into voluntary licence-issuing schemes with publishers in order to improve access to works for purposes of teaching and research; however, takes particular note of the value of learned journals, which play a key role in the peer review process of validating the results of academic research, and the financial viability of which is dependent on paid subscriptions; ...

Bernard Lang calls attention to the report, with alarm. Fred Friend also comments:

Like Bernard Lang, I have very severe concerns about the direction the European Parliament is taking in looking at revisions to the 2001 Directive. A group of us from European academic and cultural organisations met Manuel Medina Ortega MEP last October. It was a rushed meeting and although we sent a follow-up paper outlining the need to take account of user interests, the points we made have been totally ignored. The continued emphasis upon licensing is a sign that users' interests are not being taken seriously. Licensing has its place but not as a substitute for strong Exceptions, which will not damage the financial viability of journals. ...

European readers of this list are encouraged to make their local MEPs aware of these issues before it is too late. ...

Comments.
  • The recent revisions seem, overall, to support expansive copyright and publishers' interests even more strongly. The section on Europeana, which is new, has some pro-access language, but also points out that "protected works may not be digitised and made accessible, even in extract form, unless authorisation has been obtained from the rightholders" (emphasis added), a seeming swipe at Google Book Search.
  • The EC's green paper, Copyright in the Knowledge Economy, is attached, but the suggestions of the many OA advocates who commented on the green paper appear to have had little influence on the report.
Update. For additional background, see this post by Anne-Catherine Lorrain.

February SOAN

I just mailed the February issue of the SPARC Open Access Newsletter.  This issue takes a close look at 18 choice-points facing funding agencies and universities when they draft a new OA policy, review an existing policy, or think about policies elsewhere.  The round-up section briefly notes 159 OA developments from January.



Sunday, February 01, 2009

Australia's draft research assessment system requires deposit in IRs

The Australian Research Council (ARC) has released the Draft Technical Specifications (January 22, 2009) for its System to Evaluate the Excellence of Research (SEER).  (Thanks to Colin Steele.) 

The draft guidelines (the result of a public consultation that ended in June 2008) require deposit in institutional repositories and OA whenever possible.  Excerpt:

...[The Excellence in Research for Australia (ERA)] recognises that the Australian Government has provided support for the establishment of digital repositories through the Australian Scheme for Higher Education Repositories (ASHER) program....

Where the institution or the researcher is the copyright owner, or where the copyright owner has given express permission for the research output to be stored in an ‘open access’ repository, the research output should be stored in an ‘open access’ repository or in an ‘open access’ part of an institution’s ERA repository....

The ARC will manage and control access to research outputs and will only allow ARC staff, REC [Research Evaluation Committee] members and peer reviewers and other ARC authorised personnel access to research outputs that are stored in an institution’s repository for the purposes of the ERA initiative....
