Open Access News

News from the open access movement


Saturday, November 29, 2008

Nov-Dec issue of Science Editor

The November-December issue of Science Editor is now online.  It has two OA-related articles, but not even abstracts or the authors' names are free online, at least so far.


Friday, November 28, 2008

OA info on improving access to info in developing countries

USAID has launched GlobalDevelopmentCommons.  (Thanks to ResourceShelf.)  From the announcement (November 24, 2008):

...The Global Development Commons initiative seeks innovations to make knowledge more accessible and affordable to help people in developing countries solve social and economic problems. The Commons seeks to catalyze the international development community to become more open and collaborative through information and communication technologies.

The site profiles successful applications of technology that improve access to information in developing countries....

"Seal of approval" for data

The Data Seal of Approval is an initiative of Data Archiving and Networked Services launched earlier this year. It sets out guidelines for research data, with these principles:
Digital research data must meet five quality criteria:
  1. The research data can be found on the internet.
  2. The research data are accessible, while taking into account ruling legislation with regard to personal information and intellectual property of the data.
  3. The research data are available in a usable data format.
  4. The research data are reliable.
  5. The research data can be referred to.
See also the summary and comments by Stéphane Goldstein.

New issue of the eIFL newsletter

The November/December issue of the eIFL newsletter has a section on OA/IR news. Developments discussed:
  • DRIVER and eIFL.net sign a Memorandum of Understanding
  • Events celebrating Open Access Day in eIFL countries
  • eIFL.net is joining the Open Access Scholarly Publishers Association
  • The Serbian Citation Index – SCIndeks 1.6 published

Impact of OA on clinical trial results

Maxine Clarke, Positive skew of clinical-trial publication, Peer-to-Peer, November 27, 2008.

A news story in Nature Medicine discusses an investigation into the publication status of the clinical-trials literature, which concludes that positive results of clinical trials for drugs or devices have a higher chance of getting published than negative trials. ...

This 'positive publication bias' is a serious problem, because it can make a drug or device appear in the literature to be more effective than it is. ...

According to the Nature Medicine article, a paper in Science indicates that "the FDA Amendments Act of 2007 has improved transparency, because the law mandates that sponsors or primary investigators of clinical trials for approved drugs post a summary of their results in a national open-access database." The lead author of the report, Deborah Zarin, oversees the ClinicalTrials.gov registry at the National Library of Medicine of the US National Institutes of Health ...

More blog notes on the Students for Free Culture conference

Rich Jones, The Future of Free Culture, The New Freedom, November 27, 2008. Blog notes on the Free Culture 2008 conference (Berkeley, October 11-12, 2008).

See also our past posts on the conference:

Leeds Met U launches its IR

Leeds Metropolitan University is hosting a launch event for its IR today.

See also our past posts on Leeds Met U's IR: 1, 2.

Update. See these further notes on the IR.

Videos on complying with NIH policy

The U.S. National Institutes of Health has posted two videos on complying with its Public Access Policy: one on depositing the final peer-reviewed manuscript in PubMed Central and one on approving submission of a publisher-deposited manuscript. (Thanks to SPARC.)

Medscape founder: most medical journals will be OA in 5 years

Peter Frishauf, The End of Peer Review and Traditional Publishing as We Know It, video commentary, Medscape Journal of Medicine, November 24, 2008. Frishauf is the founder of the journal. (Thanks to Mike Cadogan.)

Australian economics journal converts to OA

Economic Analysis and Policy, a journal published by the Economic Society of Australia, Queensland Branch, converted to OA in March 2008. See the journal's editorial on the change. The journal's backfiles to 1970 are also OA. (Thanks to Janet Baker.)

Universities would save money if all journals were OA

Philip Johnson, University libraries, budgets, and open access, Biocurious, November 26, 2008.  Excerpt:

I’ve been playing with numbers in my head based on the statistics from my home institution – the University of Toronto – relative to publications, the real cost of open access publishing, and the U of T library’s annual budget for journal subscriptions.

It turns out that U of T is listed as an institution on some 6470 publications per year (averaged during 2000-2004, data from Thomson Scientific), and as of 2005-2006, the U of T libraries (spread over a couple of campuses) have a periodicals budget of just over $10 million per year....For the rest of this article, let’s assume the American and Canadian dollars are roughly equal in value (which was the case until a couple of months ago)....

What, then, if the university decided to embrace the open access movement entirely? This is obviously a pie-in-the-sky idea, but bear with me, as the result is interesting.

Subscription costs would obviously be nil for an open access journal: we are all free to access the content of an open access journal via the internet, with no restrictions on who can read the content. In contrast, the author would pay to publish the article. This is perhaps the biggest resistance from scientists (and I’m sure the situation would be similar in the arts, or law, or what have you) to the open access movement, many feeling they don’t have enough funding for students or experimental equipment as is, and couldn’t possibly afford to pay to publish as well. I can appreciate this argument, though some progress is being made as you can specifically request funding to cover open access publication charges from some of the granting agencies.

(Also, let’s be honest, the current situation of paying for page charges and to have colour figures means the author is already paying to publish, and sometimes non-trivial amounts.)

The funding supplied to the library for journal subscriptions could instead go towards paying for the publishing costs in open access journals. Using the PLoS journals as our benchmark, premium quality publications would cost around $2500/article (the current fee to publish in PLoS Biology or PLoS Medicine), while the bread-and-butter publication costs are maybe closer to those of PLoS ONE — $1300/article.

Let us imagine that 10% of the publications coming out of U of T are of the premium variety, while 90% are your more run of the mill papers, and that there are open access journals in which to publish them. Using the current costs from the Public Library of Science, 650 premium papers would run around $1.6 million, while the 5850 “bread-and-butter” papers would cost an additional $7.6 million each year. This is already less than the 2005-2006 periodicals budget of slightly over $10 million!

Let’s further assume that the economies of scale would kick in if universities around the world decided to embrace this philosophy. This should lead to an overall lowering of the publication costs, all the while bringing access to academic literature to everyone with an internet connection....

[T]he take-home message is reasonably clear, at least using the University of Toronto numbers: we could already afford going entirely open access....

So what’s the hold up?
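Johnson's back-of-the-envelope arithmetic is easy to verify.  A minimal sketch, using his own assumptions (the ~6470-paper count, the 10%/90% split, and the 2008 PLoS fee schedule are all from the excerpt above):

```python
# Johnson's assumptions: ~6470 U of T papers/year, rounded to
# 650 "premium" papers (PLoS Biology/Medicine fee) and
# 5850 "bread-and-butter" papers (PLoS ONE fee), at 2008 prices.
premium_papers = 650
standard_papers = 5850
premium_fee = 2500    # USD per article, PLoS Biology / PLoS Medicine (2008)
standard_fee = 1300   # USD per article, PLoS ONE (2008)

premium_cost = premium_papers * premium_fee      # $1,625,000 (~$1.6M)
standard_cost = standard_papers * standard_fee   # $7,605,000 (~$7.6M)
total_cost = premium_cost + standard_cost

print(f"Total OA publication cost: ${total_cost:,}")
# Total OA publication cost: $9,230,000
```

The total of about $9.23 million does come in under the $10 million periodicals budget, as the post claims.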

Comments

  • Johnson is right that universities would save money.  But they'd save even more than he calculates.  His calculation assumes (1) that all OA journals charge publication fees and (2) that universities would pay all of them.  Both assumptions are untrue.  Refining the calculation to reflect no-fee OA journals and fee-paying funding agencies would lower the projected cost to universities even further.
  • See my June 2006 article for a full analysis of those two assumptions, focusing on earlier versions of the same calculation, which tended to show that switching over to OA would cost more than the current cost of subscriptions.  Also see my November 2006 article on the majority of OA journals which charge no publication fees. 
  • Since November 2006 several studies have given us newer data on the ratio of no-fee OA journals to fee-based OA journals.  The best current estimates are that 67% of the journals listed in the DOAJ charge no publication fees, and 83% of OA journals from society publishers charge no publication fees.  Clearly if Johnson zeroed out the publication fees for two-thirds of the articles published by Toronto faculty, the projected saving would rise significantly.
  • The bargain would be even more compelling if we could subtract the publication fees paid by funding agencies rather than universities.  Unfortunately there are no studies yet estimating that number.
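To give a rough sense of the first refinement above: if fees were zeroed out for two-thirds of articles, only the remaining third of Johnson's ~$9.2 million estimate would fall on the university.  A sketch of that calculation (the even distribution of fees across articles is an illustrative assumption of mine, not a figure from any study):

```python
# Hypothetical refinement of Johnson's estimate using the figure that
# ~67% of DOAJ-listed OA journals charge no publication fee.  Assuming,
# for illustration only, that fees are spread evenly across articles:
johnson_estimate = 9_230_000   # Johnson's total: 650*$2500 + 5850*$1300
no_fee_share = 0.67            # share of articles assumed to incur no fee

refined_cost = johnson_estimate * (1 - no_fee_share)
print(f"Refined cost to the university: ${refined_cost:,.0f}")
# roughly $3.0M, far below the $10M periodicals budget
```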

More comments on the EU green paper

Remember that comments on the EU green paper, Copyright in the Knowledge Economy, are due on November 30.  One of the questions raised in the paper --Question 19, quoted below-- has a strong OA connection.  See other comments I've posted recently (1, 2, 3).  Here are two more.

From SURF:

...The interest that scientists and scholars have in the dissemination of their knowledge and research results therefore differs from that of publishers....

SURF notes that the Green Paper has been written from a rather one-sided perspective – that of right holders, in particular publishers – and that it clearly expects a great deal from contractual arrangements. In actual fact, educational and research institutions, scientists and scholars, consumers, and other re-users of knowledge products would be better served by clearly formulated exceptions than by contractual arrangements, not least because of the weak negotiating position in which they find themselves.

It is also notable that the Green Paper takes the old traditional publisher’s model as its starting point, i.e. with the author transferring his copyright to the publisher. By doing this, the Green Paper disregards the new developments that have been underway for quite some time as regards alternative business models in which there is no such transfer of copyright from author to publisher. Open Access can be facilitated, for example, by the publisher being granted a “Licence to Publish” such as those for which models have been drawn up by SURF (in collaboration with JISC) and Creative Commons. One positive trend is that an increasing number of articles and dissertations are being lodged – sometimes after an embargo period – in the “repositories” of research universities and universities of applied sciences, where they are then freely accessible....

Matters are complicated, however, by the advent of new types of re-use made possible by new technologies. These are hampered by existing licences, quite simply because those licences do not permit them. One good example is the data mining of large numbers of scientific or scholarly articles; having computers analyse articles can reveal relationships that humans could not possibly have discovered. Such previously hidden relationships can then open up new solutions to research problems....

(3) Is an approach based on a list of non-mandatory exceptions adequate in the light of evolving Internet technologies and the prevalent economic and social expectations?

No; that is undesirable for two reasons. For one thing, a limitative list of exceptions cannot take account of future developments....Secondly, non-mandatory exceptions are insufficiently proof against the monopoly position that right holders have on the basis of their copyright. In order to protect the interests of users such as those in the teaching and research sectors, it is necessary for certain specific exceptions to be made obligatory. SURF believes that two aspects are involved here:

  1. It must be made mandatory for the provisions of the exceptions concerned to be implemented in all Member States, thus achieving harmonisation at European level.
  2. A contractual provision that runs counter to the exception must be deemed null and void, and the exception must not be thwarted by technical protection measures....

(19) Should the scientific and research community enter into licensing schemes with publishers in order to increase access to works for teaching or research purposes? Are there examples of successful licensing schemes enabling online use of works for teaching or research purposes?

SURF’s answer to the first of these questions is yes....

SURF adopts the position that scientific and scholarly information and research data generated by researchers should be freely accessible for the general public, specifically when the research has been publicly financed. This principle does not conflict with the economic interests of authors at research universities and universities of applied sciences....

In contrast to this, one has the economic interests of the publishers. Traditionally, they have protected those interests by requiring authors to transfer their copyright, thus ensuring themselves of an exclusive monopoly to information, including on the Internet. They only make information available in return for payment. The teaching and research exception in the Directive is of no benefit in this situation because someone who does not actually have the content at their disposal cannot legally reuse it. Technical protection measures or contracts thus make it impossible to reuse work on the basis of the teaching and research exception.

Publishers are dependent, however, on the content provided by scientists and scholars and they are increasingly acceding to their wish for Open Access. Rather than arranging this subsequently by means of licensing agreements, it is more efficient (and also cheaper) to tackle the problem at source by doing away with the information monopoly enjoyed by the publishers. This can be done in a number of different ways:

  • The author does not transfer his copyright but grants the publisher a non-exclusive licence, a “Licence to Publish”. SURF and JISC have developed a model for this, with associated principles.
  • Information can be made available online free of charge via a number of sources, for example via repositories operated by universities.

The Creative Commons licensing system is an example of an effective system in which it is the author himself who determines the conditions under which his work may be reused, for example for non-commercial purposes such as teaching and research. SURF is not in favour of Open Access models which require the author to pay the publisher in order for his articles to be accessible solely on the publisher’s website; such a system still does not make the content available for everyone free of charge....

From Heather Morrison:

19. Should the scientific and research community enter into licensing schemes with publishers in order to increase access to works for teaching or research purposes? Are there examples of successful licensing schemes enabling online use of works for teaching or research purposes?

Comment:

In the Internet age, the optimum dissemination of scholarship involves open sharing of peer-reviewed research articles, data, and more. The human genome project illustrated just how rapidly humankind can advance in knowledge through open sharing of information and cooperation, building a solid base of knowledge on which researchers - both academic and commercial - will be able to build for many years to come. We need to apply such approaches in other areas, such as finding solutions to global warming and sustainable energy sources....

In the Internet age, the copyright that makes sense is the Creative Commons approach, not at all the traditional copyright transfer agreement. Many scholarly journals are using Creative Commons licenses. Even traditional publishers are increasingly seeking only a License to Publish, leaving copyright with the author.

Research funding agencies, universities and research organizations around the world have, or are developing, strong policies requiring open access to the published research of funded research, and to research data....

Notes on the U of Cagliari's OA workshop

An unnamed blogger at the University of Cagliari library blog has posted some notes from the university's "study day" on OA, Accesso aperto e comunicazione scientifica (Cagliari, November 18, 2008).  Read them in Italian or Google's English.

The six presentations from the workshop are now on deposit in the Cagliari IR.

Norwegian govt official on OA

Øystein Johannessen, Økt bruk av Open Access i vitenskapelig publisering [Increased use of open access in scientific publishing], a slide presentation at Money talks – New institutional policies in scholarly publishing (Tromsø, Norway, November 28, 2008) --now in progress.  The other presentations aren't online yet, but they should be soon.  Johannessen is the Deputy Director General of Norway's Ministry of Education and Research.

Founder of OA medical journal honored as Junior Scientist of the Year

Junior Scientist of the Year award goes to founder of an open-access (OA) journal, Informationsplattform Open Access, November 28, 2008.  Excerpt:

The career portal academics is a joint initiative of the German weekly newspaper DIE ZEIT and the magazine Forschung & Lehre. Each year it awards a prize to the "Junior Scientist of the Year". This year's award, which came with prize money of 2000 Euros, went to Dr. med. Christoph Kleinschnitz of the University Hospital, Wuerzburg. Dr. Kleinschnitz received the award in recognition of the fact that he founded the Journal of Experimental Stroke & Translational Medicine (JESTM), the first ever OA journal worldwide in the field of stroke research. JESTM provides cost-free publication of and access to scientific literature which is of particular benefit to researchers in developing countries....


Wednesday, November 26, 2008

Presentation on the CIHR policy

Geoff Hynes, Access to Research Outputs, presented on the Canadian Institutes of Health Research OA policy at the University of Ottawa, November 4, 2008. (Thanks to Jim Till.)

See also our past posts on CIHR.

Podcast of talk by PLoS CEO Peter Jerram

A podcast is now available of the talk by Public Library of Science CEO Peter Jerram entitled Open Access: Accelerating Scientific Progress at Rockefeller University on October 27, 2008. (Thanks to Joe Bonner.)

Varmus one of the 10 most influential people in science

Discover Magazine has named Harold Varmus one of The 10 Most Influential People in Science (November 26, 2008).  Excerpt:

Harold Varmus; Former Director, National Institutes of Health: Champion of Open Access

Nobel laureate Harold Varmus was one of the driving forces of medical research even before he tried to revolutionize the way scientists do their work. In the 1970s Varmus and his colleague Michael Bishop discovered the cellular origin of retroviral genes that turn cancerous, launching the modern era of cancer research.

In the Clinton administration, Varmus led the National Institutes of Health (NIH) and transformed it into a biomedical powerhouse. “As director of NIH, Harold cultivated bipartisan support of biomedical research,” Bishop recalls, “and sound science was always at the heart of the agenda.”

Varmus’s latest challenge has been an attempt to overhaul the system of publishing research in journals so that all papers are freely available on the Internet —instead of only by expensive subscription. This allows researchers at any level of income, in any part of the world, to build on the body of knowledge. The manifestation of Varmus’s effort, the Public Library of Science and its roster of academic publications, has become one of the most cited sources in academic research and has inspired others worldwide to follow its lead.

OKF joins COMMUNIA

The Open Knowledge Foundation has joined the COMMUNIA network (“the European Thematic Network on the Digital Public Domain”).

JISC report on subject and institutional repositories

Catherine Jones and four co-authors, Report of the Subject and Institutional Repositories Interactions Study (SIRIS), from JISC and STFC, November 2008.  Excerpt:

This report was commissioned by JISC to produce a set of practical recommendations for steps that can be taken to improve the interactions between institutional and subject repositories in the UK....

Key findings

  • The majority of institutional repositories (IRs) are at an early stage of development and the desired ‘critical mass’ of content is far from having been achieved;
  • despite the declared interest of IR administrators in a co-ordinated approach to the gathering and sharing of information, there is in fact very little interaction between repositories;
  • most deposit is initiated and mediated by repository staff, while self-archiving is not yet embedded in author workflows. Technical and administrative solutions for management of research outputs, developments in reporting of article usage statistics, and the requirements of the Research Excellence Framework (REF) are likely to drive cultural change;
  • content collection is strongest in established subject/funder repositories;
  • there may be scope for greater collaboration with publishers in the development of deposit and distribution procedures;
  • repository administrators struggle to identify relevant content/metadata in external sources because identification by author or organisational association is highly problematic;
  • content transfer between repositories requires a relationship of trust, which must in turn be based on explicit metadata standards, clear provenance and rights statements, and agreed protocols for transfer and updates to objects and metadata;
  • there is considerable interest throughout the community in creating aggregations of content held in repositories and other sources by linking to data and related items. The OAI-ORE web content aggregation specification represents one potentially valuable model of a user-centred content organisation technology;
  • there is no coherent approach to content preservation among repositories, and in many cases long-term preservation policy appears underdeveloped. This is a critical issue for the long term;
  • there is wide variation between repositories in metadata formats and quality;
  • for pragmatic reasons many IRs collect largely metadata-only records. The extraction of metrics to support local and national assessment and administration is an important driver for collection. There is a different imperative to acquire, preserve and make freely available full-text content. There is evidence of a trend towards integration of institutional repositories with research management systems.
  • Funding organisations and HEIs share many common purposes and would each benefit from collaboration. That such collaboration is not as yet taking place on any significant scale is attributable less to technical barriers than to the absence of any established structure for the negotiation of co-operative working practices.

Recommendations

We make a total of seven recommendations, which are intended to be achievable in whole or in part in the immediate future. They are variously addressed to a number of stakeholder groups: JISC, funding organisations, repository managers, software developers and creators of content.  This report recommends:

    with regard to standardisation

  1. that continued support be given to implementation of national standards for unambiguous identification of authors, funders and higher education institutions;
  2. that the community work towards the adoption of common information interchange standards;
  3. that a watching brief be kept on the Trusted Repository certification process and that all repository managers participate in this scheme when fully established;

    with regard to best practice

  4. that records transferred from one repository to another contain clear provenance information;
  5. that repositories implement version identification at object and metadata levels;

    with regard to community engagement and dialogue

  6. that a UK repository community forum be established where representatives of subject/funder and institutional repository communities can work to agree and implement standards and protocols for co-ordinated information management;
  7. that continued efforts be made to engage with users and ensure that developments address user needs in viable ways.

Also see the appendix of survey questions and responses.  The report and appendix are also available as DOC files.

Update (11/30/08).  Also see Stevan Harnad's comments:

The JISC/SIRIS report...fails to make clear the single most important reason why Institutional Repositories' "desired ‘critical mass’ of content is far from having been achieved."

The following has been repeatedly demonstrated (1) in cross-national, cross-disciplinary surveys (by Alma Swan, uncited in the report) on what authors state that they will and won't do and (2) in outcome studies (by Arthur Sale, likewise uncited in the report) that confirm the survey findings, reporting what authors actually do:

Most authors will not deposit until and unless their universities and/or their funders make deposit mandatory. But if and when deposit is made mandatory, over 80% will deposit, and deposit willingly. (A further 15% will deposit reluctantly, and 5% will not comply with the mandate at all.) In contrast, the spontaneous (unmandated) deposit rate is and remains at about 15%, for years now (and adding incentives and assistance but no mandate only raises this deposit rate to about 30%).

The JISC/SIRIS report merely states: "Whether deposit of content is mandatory is a decision that will be made by each institution," but it does not even list the necessity of mandating deposit as one of its recommendations, even though it is the crucial determinant of whether or not the institutional repository ever manages to attract its target content....


Another comment on the EU green paper

Stevan Harnad, Comment on EU Green Paper: "Copyright in the Knowledge Economy", Open Access Archivangelism, November 25, 2008.  Excerpt:

I am commenting only on the bearing of EC policy on one specific body of content: The 2.5 million articles per year published in the world's 25,000 peer-reviewed research journals in all fields of science and scholarship.

The authors of all these articles neither receive nor seek royalty or fees from access-tolls to their users or their users' institutions. These authors only seek that these research findings should be accessed and used as fully and widely and possible, to the benefit of research progress and applications, and hence to the benefit of the society that funds their research and their institutions....

Sixty-three percent of journals already formally endorse depositing the author's final, revised, peer-reviewed draft in their institutional repository immediately upon acceptance for publication, and immediately making that deposited draft accessible free for all.

For that 63% of articles, it should be evident that no copyright reform whatsoever is needed. What is needed is that the authors' institutions and funders mandate (require) that they deposit and make them Open Access immediately upon acceptance by those journals.

The remaining 37% of articles can also be deposited in the author's institutional repository immediately upon acceptance for publication, but unless their publisher endorses making them immediately Open Access, the deposit has to be set initially as Closed Access (accessible only institution-internally, to the author and his employer).
It is here that legislation can help, although it is not certain that even that is necessary: A Europe-wide law requiring that publicly-funded research and research produced by employees of publicly funded universities must be made openly accessible will exert the requisite pressure on the remaining 37% journals so that they too should endorse that the deposited articles are immediately made Open Access rather than Closed Access.

Note that peer-reviewed research is fundamentally unlike books, textbooks, software, music, and videos. It is in its very essence author give-away content....

[Basic scholarly] uses all come automatically with free online access....But there are further uses, over and above these, that some fields of research feel they need, including modification and republication. It is likely that free online access will moot the need for copyright modification to guarantee these further uses, but there is no harm in trying to stipulate them formally in advance, as long as it is not treated as a prerequisite for Open Access, or for Open Access Mandates....

PS:  Remember that comments on the EU green paper, Copyright in the Knowledge Economy, are due on November 30.  One of the questions raised in the paper --Question 19, quoted below-- has a strong OA connection.  For other comments on the green paper, see the two batches I posted recently (1, 2).

Elsevier opens its article API

Wouter Gerritsma, The changing face of Elsevier Science, Wouter on the web, November 22, 2008.  Excerpt:

The last couple of days I had the pleasure to attend the Elsevier Development Partners meeting. The exact products they are working on might be of interest to some people, but that’s up to Elsevier to announce. But what was really the big surprise at this meeting -which lasted 3 days- was the tone from Elsevier. It was all about open Science. They clearly wanted to open up. There was a lot of talk about sharing information, making mash-ups possible, Application programming Interfaces (API). Elsevier Science wanted to move away from the double barred information silo to become an open solution provider in the scholarly world....

This change will take time. It doesn’t happen overnight. But Raphael Sidi just announced the other day on his blog the Elsevier Article API at the programmable Web. So, Elsevier is not only talking, they are acting upon it as well....

From the Elsevier Article API (November 16, 2008):

The Elsevier Article API facilitates search and access to scientific journals and scientific articles. The API provides web services for searching for journals, journal volumes, specific issues, articles, and article images. The Article and Article Image specific API interactions provide access to the full-text article XML (and the associated images) and enable a mashup developer to render the returned article in customizable formats....
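The announcement describes a workflow of fetching full-text article XML and re-rendering it in a customizable format, but gives no endpoint or schema details.  A minimal sketch of that mashup step is below; note that the XML structure and tag names are entirely invented for illustration and are not the Elsevier Article API's actual response format:

```python
import xml.etree.ElementTree as ET

# Hypothetical article XML, standing in for the "full-text article XML"
# the excerpt says the API returns.  All tag names here are invented.
sample_response = """
<article>
  <title>An Example Study</title>
  <authors><author>A. Researcher</author></authors>
  <body><p>First paragraph.</p><p>Second paragraph.</p></body>
</article>
"""

def render_as_text(xml_doc: str) -> str:
    """Re-render article XML in a custom plain-text format (the 'mashup' step)."""
    root = ET.fromstring(xml_doc)
    title = root.findtext("title", default="(untitled)")
    paragraphs = [p.text or "" for p in root.findall("./body/p")]
    return title + "\n" + "=" * len(title) + "\n" + "\n\n".join(paragraphs)

print(render_as_text(sample_response))
```

A real client would obtain the XML over HTTP from the API rather than from an inline string, and would render to HTML or some richer format instead of plain text.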

Update (11/27/08).  Elsevier opened its article API at least to some degree, and temporarily, for its Article 2.0 contest.  Does anyone know whether the recent announcement goes beyond that?

More on OA in the humanities

Peter Plener, Open Access: Einführung, a 24:20 minute podcast introducing the Emergenzen 7 // Open Access meeting (Vienna, October 4, 2008).  (Thanks to Klaus Graf.)  Read the blurb in German or Google's English.

Blog notes from public domain conference

Blog notes (in Spanish) are now available on the Seminario Internacional de Dominio Público (Santiago, November 21, 2008). Topics include OA, CC, OERs, and the public domain.

Update. See also Google's English translation.

18thConnect, OA platform for digital humanities

18thConnect is an OA research platform and aggregator for collections of materials on the 18th century. See this interview from HPCwire with project leaders Laura Mandell and Robert Markley.

Tuesday, November 25, 2008

More on SciVee

Bruce V. Bigelow, The Bourne Innovation: UC Researchers Launch a YouTube for Scientists, Xconomy, November 24, 2008.

... [Phil] Bourne saw the emerging trend [of OA] and recruited Leo Chalupa, a friend and colleague at UC Davis, to launch an online video project to help scientists make their research better-known. Bourne and Chalupa were initially unsure if the project they started last year was merely an interesting science project or a business. But they decided to form a startup company earlier this year around what they call SciVee. It is basically a YouTube for academic researchers. ...

Bourne says his first idea was to essentially create “pubcasts,” which typically consist of a 15-minute video in which the author of a published and peer-reviewed article explains the research and highlights the key findings. ...

Scientists also are using SciVee to enhance research published in so-called “poster sessions” at scientific conferences and to form online communities of interest. ...

By providing a technology platform much like YouTube’s, SciVee enables scientists across a host of disciplines to create content in any field of science, technology, or medicine. Bourne says K-12 teachers and other educational users also are using SciVee to post videos like this for younger students to access. About 1,000 users have posted videos on the site so far. ...

[T]he National Science Foundation ... provided the $175,000 “exploratory grant” that enabled Bourne to get SciVee started. ...

“It is not an advertising supported model,” says [Marc] Friedmann, SciVee’s CEO. “We are distinctly not pursuing the approach of putting up a website, trying to build a lot of traffic for the site, and then realizing revenue through advertising based on that traffic. To this point in time, we’ve built the business with a little bit of government grant support, but we’re looking to commercialize it by generating revenue from paying customers.”

See also our past posts on SciVee.

Correlation between self-archiving in subject repositories and IRs

Jingfeng Xia, A Comparison of Subject and Institutional Repositories in Self-archiving Practices, Journal of Academic Librarianship, November 2008. Only this abstract is OA, at least so far:
The disciplinary culture theory presumes that if a scholar has been familiar with self-archiving through an existing subject-based repository, this scholar will be more enthusiastic about contributing his/her research to an institutional repository than one who has not had the experience. To test the theory, this article examines self-archiving practices of a group of physicists in both a subject repository and an institutional repository. It does not find a correlation between a disciplinary culture and self-archiving practices.

Update (1/5/09). Also see Stevan Harnad's comments.

Update (1/8/09). Xia has now self-archived an OA edition of this article.

Papers from Ibero-American e-publishing conference

Some of the papers and presentations from the II Conferência Ibero-Americana de Periódicos Eletrônicos no Contexto da Comunicação Cientifica (Rio de Janeiro, November 17-21, 2008) are now online. Some are currently available in full text, some are abstract only, and others don't even have an abstract. Many are OA-related.

See, e.g., this conference paper with an English abstract:

Fernando César Lima Leite and Patrícia Rocha Bello Bertin, Acesso aberto à informação científica em pesquisa agropecuária: modelo metodológico de gestão da informação com foco na melhoria da comunicação científica.

This work describes the ongoing transformations in both the scientific communication process and scientific information management, and presents a methodological model for the implementation and development of Open Access at the Brazilian Agricultural Research Corporation (Embrapa). The aim is to provide the necessary mechanisms to capture, store, organize, preserve and widely disseminate the scientific knowledge produced by Embrapa, SNPA, and the scientific community involved in agricultural research, through the implementation of Open Access strategies. It is our contention that effective information management improves institutional scientific communication, which contributes to the betterment of scientific research processes.

RIN recommendations to university administrators

Research Information Network (UK), Ensuring a bright future for research libraries: a guide for vice-chancellors and senior institutional managers, report, November 2008. (Thanks to Fabrizio Tinti.)

... Effective communication of research results is an integral part of the research process. Library and information professionals are key sources of advice and expertise about the rapid changes taking place in disseminating, publishing and sharing research results ...

HEIs therefore should:

(a) establish policies and procedures, in consultation with researchers and library and information services professionals, so that research outputs are made accessible widely and rapidly, and as far as possible, free of charge to users, ensuring compliance with the requirements of the major research funding bodies in the UK and overseas
(b) develop clear policies and procedures as to the roles that institutional and/or subject-based repositories should play in promoting access to institutional research outputs, as well as in facilitating the creation of registers of these outputs for research evaluation and other purposes; and as to their staffs’ obligations to deposit outputs in such repositories
(c) establish clear policies and procedures on how to support researchers in meeting the costs of the publication fees charged by open access journals ...

Library and information services need sustainable resourcing at a level that enables them to deliver and develop their mission to support and enhance research performance. ...

HEIs therefore should: ...

(c) note that while moves towards open access to research outputs have the potential to reduce costs for some libraries in the long term, open access initiatives will impose additional costs on research-intensive institutions in the short to medium term, in terms of establishing and running institutional repositories and in meeting the costs of publication fees for open access journals ...

Report on Canadian research

Momentum: The 2008 report on university research and knowledge mobilization is a report by the Association of Universities and Colleges of Canada, released on October 21, 2008. (Thanks to Fabrizio Tinti.)

... Universities are also taking steps to make more knowledge available online, for example, archiving of research results through “open access” initiatives. All of these activities require significant investments to maximize the reach and impact of university research both nationally and internationally. ...

Universities are important creators of collections and databases that play crucial roles as platforms for research and dissemination of knowledge of cultural and scientific value. ...

[T]he University of Calgary has a comprehensive collection of designs from 20th century Canadian architects, with 10,000 images available to the general public over the Internet. ...

The University of Prince Edward Island has built up a comprehensive collection relating to P.E.I., with a number of digitization initiatives to bring materials to the public over the Internet.

In addition, libraries and archives from universities and other organizations have been working together for the past 30 years to build a shared collection of early Canadian materials, first on microfiche and then online. With close to three million pages of printed materials available online through Canadiana.org, and approximately 15 million pages of printed text available on microfiche, this is the largest collection of early published Canadiana in the world. Key items in the collection include pre-1920 Canadian periodicals, as well as government publications (such as acts, bills, debates, and sessional papers) from the 18th and 19th centuries. The collection has been popular with the public, with as many as four million hits in a single month. ...

Universities are also taking steps to make research results available online. A notable example is the BALSAC population register for Quebec, initiated by the Université du Québec à Chicoutimi. This is a computerized database designed for automatic construction of family histories and genealogies, going back as far as 18 generations. Another example is the CFI-funded Synergies project led by the Université de Montréal, which will provide access to datasets, theses, conference proceedings and 170 journals currently assisted by SSHRC. A third example, in the natural sciences, is the Biodiversity Institute of Ontario, located at the University of Guelph. The Institute ... houses the Canadian Centre for DNA Barcoding for identifying and discovering new species, with the data available online. ...

See also our past posts on Canadiana or the Synergies project (1, 2, 3, 4, 5).

End of the line for the House IP subcommittee

Andrew Noyes, Conyers To Abolish IP Subcommittee, CongressDaily, November 12, 2008.  (Thanks to Kara Malenfant.)  Only the first two paragraphs are free online:

House Judiciary Committee Chairman John Conyers will abolish the Subcommittee on Courts, the Internet, and Intellectual Property in the new Congress and instead keep intellectual property issues at the full committee level, a Judiciary aide told CongressDaily today. A Subcommittee on Courts and Antitrust will be created, but no other subcommittee changes are expected, the staffer said.

In the 110th Congress, the IP subcommittee was among the House's most active under the direction of Rep. Howard Berman, D-Calif., who plans to chair the House Foreign Affairs Committee in the coming session....Conyers plans to remain just as active on IP issues at the full committee level, the staffer said....

Comment.  What's the OA connection?  The Subcommittee on Courts, the Internet, and Intellectual Property held the September hearing on the Fair Copyright in Research Works Act (a.k.a. Conyers bill).  What does its abolition mean for OA?  All the subcommittee members who sided with publishers against OA are still members of the larger House Judiciary Committee, which is still chaired by the bill's sponsor, John Conyers.  The new committee structure won't change any votes --although some might have changed because of the testimony at the hearing, subsequent lobbying by both sides, and the election three weeks ago.  However, the new structure will give Conyers greater control over copyright-related legislation, and withdraw the control formerly held by the subcommittee chair, Howard Berman.  Conyers, of course, supports the anti-OA bill, and Berman had doubts about it.

43% of Lund's 2008 dissertations are OA

Lund University gives its grad students the choice whether to make their dissertation OA.  Of the 350 dissertations deposited in the Lund IR in 2008, 152 or 43% are OA.  (Thanks to Klaus Graf.)

Update.  While exploring the Lund Libraries pages on OA, I found this 10 question quiz.  Test yourself.  How much do you really know about OA?  If you're reading this blog, you ought to pass.  But can you get 10 out of 10?

More on OA and university presses

Sarah F. Gold, Academic Publishers Debate The Digital Future, Publishers Weekly, November 24, 2008.  Excerpt:

The tension between the digital world and the world of the printed book was highlighted during “Why Books Still Matter,” a one-day symposium on scholarly publishing held November 14 at Yale University. The conference was sponsored by Yale's Whitney Humanities Center, the Beinecke Rare Book and Manuscript Library and Yale University Press to mark the press's centennial....

Michael Heller, an expert on property theory at Columbia Law School, challenged the current publishing business model, emphasizing that all forms of culture today, from music to news, involve assembling information from various sources....Heller said...universities and their presses “shrink fair use, clamp down on copyright 'pirates,' monetize every shard of an idea. I'm all for the survival of university presses, but let's not fund them by crushing the leading edge of art and science.” ...

The final panel, “Whither the University Press,” featured four heads of presses—Donatich of Yale, Peter Dougherty of Princeton, Ellen Faran of MIT and William Sisler of Harvard—who dealt with the nuts and bolts of experiments in digital and open-access publishing and surveyed other ways of expanding their readership. Faran reported on MIT publishing books simultaneously in print and open-access formats. In areas of business and economics, she said, the model has worked. “What I mean by 'work,' ” she explained, “is that we sold 4,000 to 6,000 copies” of the print edition.

Sisler reported that Harvard's press is working on an open-access journal with the law school and is seeking ways of collaborating with various arms of the university....

Two more responses to the EU green paper

Remember that comments on the EU green paper, Copyright in the Knowledge Economy, are due on November 30.  One of the questions raised in the paper --Question 19, quoted below-- has a strong OA connection.

I excerpted two public responses yesterday.  Here are two more:

From Wikimedia Nederland (English version):

...(19) Should the scientific and research community enter into licensing schemes with publishers in order to increase access to works for teaching or research purposes? Are there examples of successful licensing schemes enabling online use of works for teaching or research purposes?

Education and research are excellent examples of areas in which the separation between copyright owners and content users is not clear. Teachers are historically also the producers of teaching material. Education therefore has a large interest in leeway offered to end users....Not only are Wikipedia and Wiktionary built upon free licenses, but so are Wikibooks and Wikiversity, which are specifically targeted towards educational goals. This approach also contributes to the empowerment and engagement of people around the world to collect and develop educational content under a free content license, one of the goals of the Wikimedia Foundation. The advantage of free licenses is that they avoid the excessive bureaucracy arising from the demand of remunerations. Where these are used, it seems fair to tie them to a reasonable time frame, after which the work would become freely usable....

From Science Commons:

...Science Commons - Response to Question 19 ...

With respect to governmentally-funded research, the fruits of research should be openly available to the scientific community and the public, in accordance with the principles laid out in the Budapest Open Access Initiative, the Berlin Declaration on Open Access to Knowledge in the Science and Humanities, and the Bethesda Statement on Open Access Publishing....

[TA publishers] have argued that an exclusivity or “embargo” period is needed in order to fund investments in quality control and to support publication costs. We believe that fee-for-access publishing models are not necessarily inconsistent with the broad goals of open access, as long as the embargo period(s), if any, are reasonable, and that subsequent to the embargo period, scholarly papers published in journals are deposited in an online repository and made available for download free of charge and free of technical or legal restrictions....

Furthermore, such works should be licensed to the public under terms that permit redistribution and appropriate reuse....Examples of licenses that support the ability to disseminate and to reuse works include the Creative Commons licenses....[S]uch open licenses need not be limited only to open access journals, but they can also be used as a model for licensing works made available after any relevant embargo periods. Such licenses ensure that open access is not only available at a technical level through download (read-only access) but also at a legal level through appropriate licensing of copyright in order to permit the preparation of derivative works and other transformative uses (read-write access), which are central to scientific and cultural enterprises....

For almost two years, Science Commons has operated a portal and a tool called the “Scholars Copyright Addendum Engine,” which aggregates a wide variety of recognized “Author Addenda” by means of which scholars can enter into negotiations with publishers to retain rights of reuse for scholarly and teaching purposes....[But case-by-case negotiations will not bring about widespread transformation.]  Therefore, we believe that effective policy intervention requires action at the funder or governmental level to set the appropriate standards, through mandates and incentives that ensure that fruits of research, and especially government-funded research, are disseminated as broadly as possible and with the fewest legal restrictions, consistent with sustainability and quality.

Science Commons also supports broad digital access to the scholarly and scientific corpus because we believe that many difficult and important scientific and social problems require that scientists and researchers be empowered to take advantage of software, Web tools, and other data management technologies to support advanced searching, querying, and information integration. However, in the present environment in which access to the corpus of scientific knowledge is restricted and fragmented into a variety of “walled gardens,” our ability to use that corpus and to apply modern computer technology to it is likewise fragmented and piecemeal. This has important implications for scientific productivity, impact of funding for research, knowledge dissemination and preservation, and the achievement of social and governmental goals.

Therefore, Science Commons encourages the Commission to consider strategies that incorporate the broad goals of open access, adoption of standardized licenses that facilitate appropriate reuse and exchange of knowledge and research products, and the enablement of digital information technology. We encourage the Commission to consider a variety of tools, including mandates, policies, and incentives to achieve this goal.

Self-archiving policies worldwide

Hélène Bosc, Le droit des chercheurs à mettre leurs résultats de recherche en libre accès : appropriation des archives ouvertes par différentes communautés dans le monde [Researchers' Right to Self-Archive Their Articles In Open Access Repositories: Evolving Policy Worldwide], a preprint self-archived November 22, 2008.  (Thanks to Stevan Harnad.)

Abstract: In 2002, a group of researchers, librarians and publishers launched the Budapest Open Access Initiative (BOAI), formulating the concept of Open Access (OA) as well as the two strategies for achieving it – OA self-archiving (BOAI-1, “Green OA”) and OA publishing (BOAI-2, “Gold OA”). The concept of OA spread rapidly among researchers and research policy-makers, but was at first equated almost completely with Gold OA publishing alone, neglecting Green OA self-archiving, despite the fact that it is Green OA that has the greatest immediate scope for growth. After considerable countervailing effort in the form of strategic analysis, research impact and outcome studies, and the development of technical tools for creating OA archives (or “Institutional Repositories”, IRs) and measuring their impact, the importance and power of Green OA has been demonstrated and recognised, and with it has come a growing number of IRs and the adoption of mandatory OA self-archiving policies by universities, research institutions and research funders. In some countries OA self-archiving policies have even been debated and proposed at the governmental level. This strong engagement in Green OA by policy makers has begun to alarm journal publishers, who are now lobbying vigorously against OA, successfully slowing or halting legislation in some cases. It is for this reason that the research community itself – not vulnerable to publisher lobbying as politicians are – is now taking the initiative in OA policy-making, mandating self-archiving at the university level.

Repository software: open source or outsource?

Dorothea Salo, Home-grown versus outsourced repository software, Caveat Lector, November 24, 2008.  Excerpt:

The usual way to characterize the decision to run an institutional repository on open-source software versus outsourcing it is to think of it as in-house IT expertise versus lack of same. If you’re a big library with an IT shop, you run open source. If you’re not, you call up the vendors.

I used to subscribe to that view. After some things I heard at SPARC-DR, I have changed my thinking. Truthfully, your choices may be constrained in either direction; if you have money but no IT staff, you’re hiring a vendor, whereas if you have IT expertise (even at my borderline-competent level) but no budget, you have no choice but to free-ride on open source.

Let’s imagine that you have a real choice, though. You have in-house IT. You also have a budget. You have to make a choice. In that case, the question is where you want to use your people. Where is their time best spent?

My current thinking is that if you have that choice, the only defensible reason to use open-source right now is if you are seriously planning to write novel code on top of the platform you choose....

I’m willing to give a little bit on this stance in the case of EPrints, which I believe stands up very well on its own against vendor offerings....I would also encourage repositories that use vendors to demand that those vendors support SWORD....

Bluntly, I don’t think there’s any excuse to adopt DSpace or Fedora unless you’re planning some serious hacking—unless the value you add actually depends on that hacking....

This equation would change if there were livelier code exchanges in the open-source software communities....

The simple fact is that every minute I spend on DSpace, I’m not spending on content recruitment, advocacy, metadata, copyright clearance, service development and provision, and all the other aspects of repository management. Honestly, those things are where a conscientious repository manager should be spending her time....
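For readers unfamiliar with the SWORD protocol Salo urges vendors to support: SWORD 1.3 is an AtomPub profile for depositing packaged content into repositories over HTTP. The sketch below only builds the headers for a deposit request; the collection URL and credentials are hypothetical, while the `X-Packaging` value is the standard SWORD 1.3 identifier for a METS/DSpace SIP package.

```python
import base64

# Hypothetical collection URL -- real repositories advertise theirs
# in a SWORD service document.
COLLECTION = "https://repository.example.edu/sword/deposit/theses"

def sword_deposit_headers(filename, username, password):
    """Headers for a SWORD 1.3 deposit of a zipped METS package."""
    token = base64.b64encode(f"{username}:{password}".encode()).decode()
    return {
        "Content-Type": "application/zip",
        "Content-Disposition": f"filename={filename}",
        # SWORD 1.3 packaging identifier for a METS/DSpace SIP:
        "X-Packaging": "http://purl.org/net/sword-types/METSDSpaceSIP",
        "Authorization": f"Basic {token}",
    }

headers = sword_deposit_headers("article.zip", "depositor", "secret")
```

In practice the package body would be POSTed to the collection URL with these headers, and the repository would answer with an Atom entry describing the new deposit; this is the interoperability layer that lets one deposit client work against EPrints, DSpace, or Fedora alike.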

How to make an IR, in Portuguese

Fernando César Lima Leite, Curso sobre diretrizes para a construção de repositórios institucionais de acesso aberto à informação científica, presented at Seminário Nacional de Bibliotecas Universitárias (São Paulo, November 10-14, 2008). English abstract:
Building open access institutional repositories workshop (presentation). Provides a conceptual basis and practical training for participants to build and maintain their institutional repositories.

Monday, November 24, 2008

Role of E-LIS in Brazil

Simone da Rocha Weitzel, et al., E-LIS: um repositório digital para a biblioteconomia e ciência da informação no Brasil, presented at Seminário Nacional de Bibliotecas Universitárias (São Paulo, November 10-14, 2008). English abstract:
This paper describes the bases of digital repositories, their function within scientific communication, and the role of E-LIS in the organization, dissemination and international access to Brazilian scientific publications in Library and Information Science.

New issue of The Electronic Library

The new issue of The Electronic Library is now available. See especially:

Making IRs relevant for the next generation of researchers

Ryan Randall, et al., The Next Generation of Academics: A Report on a Study Conducted at the University of Rochester, report, September 17, 2008. (Thanks to Charles Bailey.) Abstract:
This document reports on the user research portion of “Enhancing Repositories for the Next Generation of Academics” (IMLS Grant No. LG-06-06-0051). We conducted user research from December 2006 through March 2008 to support development of a suite of authoring tools to be integrated into an institutional repository. Our understanding of the work practices of graduate students enabled us to design the authoring tools to meet their needs for individual and collaborative writing and to make it easy for them to move completed documents from the authoring system into the repository.
See also our past post on the study.

New issue of Online Information Review

The new issue of Online Information Review is now available. See especially:

Notes on open data seminar

Stuart Macdonald, Consorcio Madrono - research data seminar, DataShare Blog, November 19, 2008. Blog notes on Seminario sobre datasets (Madrid, November 17, 2008).

60,000 new documents from ERIC

Education Resources Information Center, Digitization Project Releases Full-Text Documents, notice, November 19, 2008.

ERIC is pleased to announce the release of nearly 60,000 additional full-text electronic documents. This release is part of the continuing effort underway within the ERIC Microfiche Digitization Project. To date, more than 170,000 documents scanned from microfiche are now available in full text. The majority of the documents in the current release were published prior to 1978.

The Microfiche Digitization Project will close in March 2009, and ERIC is still seeking to discuss ERIC permissions with authors and copyright holders of works that were indexed 1966-1992. If you or someone you know has work indexed in ERIC that is available in microfiche only, please visit the digitization page to learn more about this initiative. Granting permission for ERIC to release copyrighted works in electronic format will ensure that these materials are widely accessible.

See also our past posts on the ERIC Microfiche Digitization Project.

Winners: best OA content in anthropology

Savage Minds has posted the winners of its contest for best OA content in anthropology. Top honors for journal go to Cultural Analysis, with Anthropology Matters the runner-up.

See also our past posts on the contest: 1, 2.

Amazon gets into the data hosting business

Amazon Web Services has launched Hosted Public Data Sets. (Thanks to ReadWriteWeb.)

AWS Hosted Public Data Sets provide a convenient way to share, access, and use public data within your Amazon EC2 [Elastic Compute Cloud] environment. Select public data sets are hosted on AWS for free as an Amazon EBS [Elastic Block Store] snapshot. Any Amazon EC2 customer can access this data by creating their own personal Amazon EBS volume from a publicly shared Amazon EBS public data set snapshot. They can then access, modify, and perform computation on these data sets directly using an Amazon EC2 instance and just pay for the compute and storage resources that they use. Common use cases for these public data sets would include scientific research, academic studies, and market research. Our goal is to provide easy access to commonly used public data sets like the human genome, astronomy data, and the United States census information.

As ReadWriteWeb points out:

If you have a public data set and hold the rights to the distribution of it, you can submit a request on the AWS Public Hosted Data Sets site to have it included.

See also our past posts on Google's entree into the same business: 1, 2, and 3.
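The workflow Amazon describes (create a personal EBS volume from a public data-set snapshot, then attach it to your own EC2 instance) can be sketched as the two API calls involved. The helpers below only build the parameter dicts; in practice they would be passed to an EC2 client such as boto3's `create_volume` and `attach_volume` (a modern SDK that postdates this announcement), and all IDs are placeholders.

```python
# Sketch of the AWS public data set workflow: a volume is created from
# a publicly shared EBS snapshot, then attached to an EC2 instance,
# after which it can be mounted and used like any local disk.

def public_dataset_volume_request(snapshot_id, zone):
    """Parameters for creating an EBS volume from a public snapshot."""
    return {"SnapshotId": snapshot_id, "AvailabilityZone": zone}

def attach_volume_request(volume_id, instance_id, device="/dev/sdf"):
    """Parameters for attaching the new volume to an EC2 instance."""
    return {"VolumeId": volume_id, "InstanceId": instance_id, "Device": device}

create = public_dataset_volume_request("snap-PLACEHOLDER", "us-east-1a")
attach = attach_volume_request("vol-PLACEHOLDER", "i-PLACEHOLDER")
```

Note that the volume must be created in the same availability zone as the instance it will attach to; the user pays only for the EBS volume and EC2 compute time, not for the hosted snapshot itself.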


Spanish art history journal converts to OA

Imafronte, an art history journal published (in Spanish) by the University of Murcia, converted to OA earlier this year. Its backfiles (to 1985) are also available.

Good advice on pushing for openness under the Obama administration

Tim O'Reilly, It's Not Over: We are "the change we need." O'Reilly Radar, November 22, 2008.  Excerpt:

...[T]he idea that [the campaign for change] is over till the next election is, well, "so 20th century." As Barack Obama said in his presidential acceptance speech:

"What began twenty-one months ago in the depths of winter must not end on this autumn night. This victory alone is not the change we seek - it is only the chance for us to make that change. And that cannot happen if we go back to the way things were. It cannot happen without you."
The question, of course, is the right way to get involved. What do we do next?

There are four biggies for the tech community:

  1. Actually apply for one of the jobs in the new administration. If there's going to be any substance to the incoming administration's plans for change, there will be a need for people with clue from outside the beltway to join in. And this doesn't just mean more lawyers. There are great technical people who've been working from the outside on government transparency. I'm thinking of folks behind initiatives like the Sunlight Foundation, or Everyblock, or public.resource.org. Heck, I'd even reach out to the geniuses behind mysociety.org in the UK. You do a great job of showing what's possible. I'm wondering whether some of you ought to be on the inside, helping to implement "the change we need." Seeing Kevin Werbach and Susan Crawford as the FCC transition team leads was an awesome wakeup call. Hey, these aren't Washington insiders or telecom lobbyists! They are our peeps from the internet community!
  2. Whether inside or out, the tech community can continue to lead by example. I'm imagining legions of bureaucrats saying "it can't be done" countered by demonstration projects that show that "yes we can." I'm remembering Carl Malamud's heroic work putting SEC data online in 1993. The project started with activism by Jamie Love - "you guys ought to do this." Told by the SEC that it would take many years and tens of millions of dollars, Carl got a small team together, built an online database in a few months, and showed them how to do it. After Carl operated the service for two years as a non-profit, the SEC took it over....
  3. Identifying specific proposals for best practices and points of leverage. We held an open government summit at O'Reilly at the end of last year, and came up with some guiding principles for open data, but we need to identify specific government data sets that could be opened up, specific channels for citizen involvement and oversight, and concrete actions that we can take together to make change. Hopefully, change.gov will become a platform for independent citizen efforts....
  4. We really need to weigh in on the issues that matter....There's a lot of discussion on the net, but we need to remember to channel it to the people who are actually making the decisions. If it gets loud enough, maybe they will hear it on their own, but it's good if we can make concerted efforts to bring our suggestions to them via the channels they've provided. Let's give change.gov a chance! ...

PS:  Last week I sent my An open letter to the next President of the United States to Change.gov.  And it's not too late to vote for the OA proposal at Obama CTO.  It's been up for less than 10 days and it's already the 26th most popular proposal (out of 630+) on the site.  If we could rise to 25th or above, we'd appear on the front page and get a lot more attention.  Spread the word.

Two responses to the EU green paper on copyright

Remember that comments on the EU green paper, Copyright in the Knowledge Economy, are due on November 30.  One of the questions raised in the paper has a strong OA connection.

Here are two recent submissions to the EU:

From Knowledge Ecology International:

In general, KEI's position on copyright limitation and exceptions to exclusive rights are as follows:...

[4] The world that is connected to the Internet is experiencing a new era of access, and a proliferation of new approaches to authorizing and using creative works and data. As dramatic as the changes have been, there will be more.

[5] Among the most important features of the new environment are the empowering experience of much more equal access to creative works and data, and an acceleration of the creation, transformation, dissemination and re-purposing of information.

[6] The new information technologies have greatly expanded the opportunities to create and use information resources, enabling new actors, new business models and an immense role for non-commercial and user generated publishing.

[7] A legal framework that focuses on protecting commercial rights under the older publishing technologies is inferior to one that recognizes, appreciates and focuses on the value of the new opportunities to create and use information....

Given the growing interest in access to older works that can be digitized, including but not limited to the many works that have been orphaned by authors or publishers, there is great value now wasted because the European Union (and other countries) provide decades of extra years of exclusive rights, even when the owners of the works don't bother to exploit the works commercially. Solutions to these issues are examined in Part 3 of the attached Draft Treaty on Access to Knowledge. From the point of view of users, there is no reason to extend exclusive rights to owners that are not even willing to register works for sale. While registration of works is not a condition for copyright protection during the minimum Berne/Rome/TRIPS mandatory terms, registration can be required for the extended terms, by converting the extended terms to sui generis regimes with different conditions (not dissimilar to supplementary protection regimes used to extend exclusivity for patented medicines in some countries). Rights owners that have valuable assets to protect can easily do so, without laying waste to an enormously valuable body of knowledge....

Question 19: Should the scientific and research community enter into licensing schemes with publishers in order to increase access to works for teaching or research purposes? Are there examples of successful licensing schemes enabling online use of works for teaching or research purposes?

The scientific and research community already enters into many such licensing schemes, with mixed results. As one might expect, there are examples where licenses work well, and examples where licenses are excessively priced and too restrictive. The progress of scientific and research efforts is too important to be solely determined by these licensing negotiations, particularly given the excessive concentration of ownership among publishers of journals, and the widespread evidence of excessive pricing for journals and textbooks. Such licenses also often lack the flexibility to allow works to be distributed across borders, inhibiting the development of distance education services....

From Glyn Moody:

...[T]oday's copyright is around 300 years old; little wonder, then, if it is in need of updating.

Although traditional copyright has been adapted over the years to encompass new technologies such as sound recordings and films, we are currently going through a technological transition that is fundamentally [different] from all those that have gone before. Where sound recordings and film are essentially analogue (physical) objects – wax cylinders or vinyl LPs for early sound recordings, celluloid strips for films – just like books, today's content is fast becoming an immaterial, *digital* artefact.

This is not a change of degree, but of kind. The most dramatic manifestation of that can be seen in the marginal cost of production: the cost of making a digital copy is small, and becoming smaller all the time as technology advances, even for large digital objects like films. Potentially, then, we can contemplate universal access to most if not all digital knowledge, for effectively zero cost. Again, this is a radical departure from the past, where the physical nature of knowledge – books, recordings, etc. – and its costs necessarily imposed limits on how widely it could be distributed.

Today, then, the question is not: “How much can we afford to spend on making knowledge widely available?”, but: “How much should we limit the free, universal access to all digital knowledge because of historical constraints like copyright?” ...

Of course, this implies major changes within the world of content creation. But just as manufacturers of horse carriages had to reach accommodations with motor car technology when it appeared, so I believe the content industries must face up to the reality of a digital world....

To those who say it is not possible to operate in a world where digital content is made more freely available, I would draw their attention to the success of open source software, which has been variously estimated to be worth tens of billions of pounds in terms of the value of the lines of code that have been created. Thousands of companies now thrive in the open source ecosystem – from IBM and Google downwards – making money by providing goods and services that use and build on the vast pool of freely-available software.

And so it will be in a world where knowledge is made more freely available. There will be huge opportunities for publishers to make money by providing services around the content, rather than directly by selling that content. In a world where digital knowledge is inherently abundant – because it can be copied endlessly for near-zero cost – it does not make economic sense trying to mandate artificial scarcities that existed when content was analogue....

Against this background I would therefore urge the European Commission to have the courage to move on from old-style thinking regarding content, and to seek to seize the initiative by embracing the extraordinary possibilities of digital knowledge that can be freely shared to create more knowledge, and with it more wealth – and happiness. This means lowering the unduly high barriers imposed by copyright, not enshrining them further in future legislation.

Unfortunately, many of the questions posed explicitly in the current Green Paper are framed in terms that make little sense in this world of digital abundance, couched as they are purely in terms of “exceptions” to current legislation....

Toward an OA directory of special collections in North American libraries

Jeffrey Makala, On the Need for a New, Open Access, Online Directory of Special Collections, ARL Bimonthly Report #260, November 21, 2008. Excerpt:

One year ago this month, ARL published an impressive book and Web site to commemorate its 75th anniversary: Celebrating Research: Rare and Special Collections from the Membership of the Association of Research Libraries. With this joint publication, a significant and surprising gap in our contemporary information environment has been highlighted: there is no current, freely available directory of major research collections or academic and research library subject strengths in North America. ARL’s Celebrating Research Web site could serve as the springboard for the library community to create one....

So let this be a call:

Let ARL’s Celebrating Research Web site be the start of a new, open access, online directory of all special collections in North American libraries....

OA to Arab genome data

Andrew Hammond, Saudi project hopes to put Arabs on genetic map, Reuters, September 25, 2008. (Thanks to Jose Manuel Lopez Castro.)

Saudi researchers have mapped the first Arab genome in a project to put the Arab world on the global genetic map and improve healthcare. ...

The collaboration between the private Saudi company [Saudi Biosciences], Danish firm CLC Bio and the Beijing Genomics Institute will make their sequencing of Arab genomes available on a public database. ...

The Arab Genome Project this year completed initial sequencing and analysis of its first volunteer, an anonymous tribal figure from Saudi Arabia, in the space of six months. ...

The [Arab Human Genome Project] team want to complete 100 results by the end of 2010 as part of an international "1000 Genomes Project" to establish a detailed map of DNA variations the world over. ...

See also our past post on the 1000 Genomes Project.

Sunday, November 23, 2008

Award for the Cervantes Virtual Library

The Biblioteca Virtual Miguel de Cervantes, an OA library of Hispanic texts, has been awarded the Palma de Alicante for expanding access to its contents. (Thanks to Fulgencio García Ros.)

See also our past post on the Cervantes Virtual Library.

Digitization project at the Royal Galician Academy

Paola Obelleiro, Los fondos de la Real Academia abren la era digital del patrimonio gallego, El País, November 20, 2008. (Thanks to ANABAD Galicia.) Announces a €150,000 project by the Galician government to digitize documents held by the Royal Galician Academy and the National Library of Galicia.

New tool for repository automation: BibApp

Dorothea Salo, About the BibApp, Caveat Lector, November 21, 2008.

... The public face of BibApp is a set of researcher profiles (see live examples from Woods Hole Oceanographic Institution), anchored in but not limited to their publication record. Researchers or their proxies can add photographs, statements of interest, and so on. BibApp is browseable and searchable, and results can be limited by facet. Publication results have OpenURLs attached to them, so interested readers can be directed to their local link resolver.

The behind-the-scenes face of BibApp is a publication-list manager and repository-populator. BibApp vacuums up citation lists in popular bibliography formats such as RIS and RefWorks XML. ...

It does its level best to assign authorship automatically to individuals it knows about ... BibApp performs similar authority control on publisher and journal names, and it will shortly be possible to federate this information in order to pre-populate new BibApp installations with the knowledge other adopters have built.

Once BibApp recognizes a journal or publisher in a just-imported citation, it checks with SHERPA/RoMEO for policies relating to green open access. If it recognizes that the publisher’s typeset PDF is archivable, it bundles up a package for import to a repository via the SWORD protocol, which currently works over DSpace, EPrints, and Fedora. No muss, no fuss, no bothering faculty for keystrokes! ...
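The pipeline Salo describes (parse a citation, normalize the journal name, check the publisher's self-archiving policy, then decide whether to package the item for SWORD deposit) can be sketched roughly as follows. This is a hypothetical illustration, not BibApp's actual code or API: the `ROMEO_STUB` table stands in for a live SHERPA/RoMEO lookup, and the RIS parsing is deliberately minimal.

```python
# Sketch of a BibApp-style pipeline stage: parse a RIS citation,
# apply light authority control to the journal name, and decide
# whether the item is eligible for repository deposit.

def parse_ris(text):
    """Parse a minimal RIS record into a dict of tag -> list of values."""
    record = {}
    for line in text.strip().splitlines():
        if "  - " not in line:
            continue
        tag, _, value = line.partition("  - ")
        record.setdefault(tag.strip(), []).append(value.strip())
    return record

def normalize_journal(name, authority):
    """Very light authority control: case-insensitive match against
    a table of canonical journal names."""
    return authority.get(name.strip().lower(), name.strip())

# Stub standing in for a SHERPA/RoMEO query: maps a canonical journal
# name to whether the publisher's version may be self-archived.
ROMEO_STUB = {"Serials Review": True}

def deposit_decision(ris_text, authority):
    """Return the metadata needed to decide on a SWORD deposit."""
    record = parse_ris(ris_text)
    journal = normalize_journal(record.get("JO", [""])[0], authority)
    return {
        "title": record.get("TI", [""])[0],
        "journal": journal,
        "deposit": ROMEO_STUB.get(journal, False),
    }

AUTHORITY = {"serials review": "Serials Review"}

RIS = """TY  - JOUR
TI  - The Potential Impact of 'Public Access' Legislation
JO  - serials review
"""

result = deposit_decision(RIS, AUTHORITY)
```

In a real installation, the deposit step would then build a package and POST it to the repository's SWORD endpoint; the point of the sketch is only the flow from citation import to an automated archivability decision.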

New OA resource on women's medicine

The Global Library of Women's Medicine is a new OA resource launched on November 21, 2008. From the announcement:

... The Global Library of Women's Medicine was launched at a dinner held at The Royal College of Obstetricians and Gynaecologists in London last night. It is a unique web library incorporating a vast range of detailed clinical information across the whole field of women's medicine. It consists of 442 main chapters and 53 supplementary chapters, supported by over 40,000 references, which will be kept permanently up-to-date. The chapters have been written by more than 650 specialists and will reflect some of the very best worldwide opinion. ...

The Global Library of Women's Medicine will support doctors and other health professionals in their care of women. As an open access resource it is hoped that in addition to providing an expert resource for the medical profession in the Western World, it may also be of real value to doctors and others in parts of the Developing World where access to current clinical information has always been challenging. ...

Spanish report on OA

Julio Alonso, et al., Informe APEI sobre acceso abierto, report by the Asociación Profesional de Especialistas en Información (Spain), self-deposited November 14, 2008. (Thanks to BusalBlog.) English abstract:

A report on open access that explains what the 'open access' movement is, the new forms of scientific communication, what repositories and harvesters are, and the state of the art of open access in Spain.

Contents:

  1. The electronic edition (e-journals, scientific communication).
  2. Free access to research results (movement for open access, intellectual property).
  3. The publishing and free access (editorial policies, data on open access).
  4. Repositories, harvesters and services (repositories, self-archiving, harvesters, open-access journals).
  5. Repositories and harvesters in Spain (the development of repositories, Government initiatives).
  6. Information resources (information sources and bibliography).

Sparky Awards deadline is Nov. 30

The deadline to enter the Sparky Awards for short videos on the value of information sharing is coming up on November 30, 2008. A $1,000 prize awaits the winner.

See also our past posts on the Sparky Awards.

Google-Europeana synergies

Anders Bylund, How the European Union Joined Google's Mission, The Motley Fool, November 20, 2008.  Excerpt:

...[W]hen the EU opens up Europeana, a massive digital library that pulls a wealth of information on European culture out of dusty old museums or private collections and into the digital world, it's like giving Google a helping hand. After all, Google likes to show us the way to important info that somebody else owns.

Its own efforts to digitize university library collections, host historical photographs, and the like are happening because nobody else stepped up to shoulder the responsibility of getting it done. Where's Amazon...when you need 'em? Those deep, long-standing connections to publishing houses could lead to a massive public-domain library one day.

Welcome to the party, Europeana. Don't be scared when Google's indexing spiders crawl across your back. They're just scratching your back -- and sending more traffic than you could muster on your own. In return, you're scratching Google's back simply by existing. With gentle backrubs all around, everybody wins.

OA to US forestry research

Bradley Brazzeal and Patrick L. Carr, The Potential Impact of ‘Public Access’ Legislation on Access to Forestry Literature, Serials Review, December 2008.

Abstract:   There have been increasing calls for the United States (U.S.) government’s implementation of broad public access policies mandating free online access to federally funded research. This study examines the potential impact of such a policy on peer-reviewed forestry literature. The authors analyze information about federal government authorship, federal government funding, and U.S. authorship indicated in articles published in five core forestry journals in 2006. The results of the analysis provide evidence that federal public access legislation would have a significant impact on the accessibility of forestry literature published in leading journals in the field.