Open Access News

News from the open access movement

Saturday, December 23, 2006

Housekeeping notes

  1. I'm slowing down for the holidays, and will try to catch up later with the developments I missed while making merry.
  2. Sometime yesterday Open Access News passed the milestone of 10,000 posts.  About 200 of them were posted by my co-contributors during the time when OAN was a group blog.

Happy holidays!

Open publishing tools for non-profits in developing countries

Tactical Tech and iCommons have released the Open Publishing Edition of NGO-in-a-box.  From the site: 

The Open Publishing Edition of NGO-in-a-box is a toolkit of Free and Open Source software, tutorials and guides for producing, publishing and distributing content. It is aimed at small to medium-sized non-profits, independent media organisations, free culture creators and grassroots journalists, with a particular emphasis on those in developing and transition countries.

The contents of the toolkit were selected by an editorial team made up of leading international practitioners working in DIY publishing, free culture, technology for social justice and the development and deployment of free and open source software....

Movement toward OA in classics

On September 26, 2006, the American Philological Association (APA) and the Archaeological Institute of America (AIA) issued a joint statement on electronic publishing.  (Thanks to Ross Scaife for alerts to this and the documents below.)

The statement didn't mention OA, but the draft Issues and Recommendations for Discussion (October 20, 2006) discussed the problem of OA in the humanities in good detail --but for the false assumption that "most" OA journals charge author-side fees (in fact, most charge no fees).  It recommended a study of the extent to which American classicists lack access to American classics journals.  It also recommended OA archiving and the launch of new OA archives.  The two organizations called for a period of public comments, which ended on December 20.

Gregory Crane, Editor-in-Chief of the OA Perseus Project, has made his December 20 comment public.  Excerpt:

[O]ne way to improve working conditions and research opportunities for university and college teachers is to support, in every possible way and with as much energy as we can muster, the creation of massive digital libraries based on open access such as those now being built by Google, Microsoft, the Open Content Alliance and others. Even if only partially realized, these efforts will expand the intellectual reach of all college and university teachers. If these efforts come close to their original goals, we will find online and freely accessible a larger and far more useful research library than any institution of higher learning has ever created. Classicists stand to gain more than any other discipline, for the field is often strongest at liberal arts colleges which have never had access to first class research environments. Nor is open access alone always enough. After sustained requests from an increasing set of researchers, we at Perseus decided to make all the content that we could available under a Creative Commons attribution/sharealike/non-commercial license. Researchers want to apply their own analytical tools to the full source texts and to create derivative works....

Saving money is a good way to go bankrupt. The strategic challenge that classics faces as a field is to maintain and expand its role in the broader intellectual life of society. Our situation is much closer to that of a university competing with other universities than it is to Hollywood or the music industry raising capital from mass market sales....

The APA reports 3,195 members. In our last survey of users (April 2005), 400,000 unique users consulted the Perseus Digital Library, 90% of whom were working with classical materials....

The joint APA-AIA Task Force on Electronic Publication "plans to reach some conclusions when it meets in San Diego in January 2007."

John Blossom on PLoS ONE

John Blossom, Second Nature: PLoS One Picks Up Where Nature Left Off, Content Blogger, December 22, 2006.  Excerpt:

The launch of the new PLoS ONE scholarly research portal looks like a big win for open access research content from a number of angles. PLoS ONE is posting research and will allow interactive review before and after publication for scientific articles via a very sophisticated publishing environment. The PLoS ONE platform applies many of the best practices of social media, providing ready access to comments posting and awareness of active discussions to draw in more active discussions. PLoS ONE will publish all papers that are judged to be rigorous and technically sound, and had already posted more than 100 papers by its launch - a remarkable number for a just-launched scholarly journal of any kind. By contrast Nature's recently shuttered open-review portal trial, which ran for around four months, attracted only 71 authors willing to post their work online and attracted 92 technical comments.

As we noted in our latest news analysis article, one of the keys to successful social media products is a dedicated core of trusted contributors who will be able to ensure editorial success. PLoS ONE starts with a global editorial board of more than 200 scholars, ensuring a broad array of inputs for reviewing content. Some of the fears about having content rejected after having had it exposed to comments prior to publication may be relieved by the PLoS ONE policy that allows papers that have already been rejected by the PLoS Biology and Medicine journals to be re-submitted via PLoS ONE. This is a potentially valuable feature, allowing research that may not have yet reached the highest levels of acceptance to mature through its exposure to comments from a broader audience.

PLoS ONE is finally opening the doors to the potential for fundamental changes in how scholarly research proves its worth. With an open exchange of ideas and commentary facilitated by technologies long available to the general public and a solid body of research and reviewers PLoS ONE holds out the potential to liberate the highest levels of scholarly innovation from the regimen of the printing press. Changing the way that research is paid for was a good first step for open access, but with the ability to eliminate artificial distribution bottlenecks that choke off natural conversations PLoS ONE may do for scholarly research what Wikipedia has done for reference materials - with much more integrity in the underlying editorial processes.

After PLoS ONE, PLoS Too

Alf Eaton has launched PLoS Too, a mirror of PLoS ONE that he'll use as a "testing ground for trying out article display formats" --taking advantage of the fact that all PLoS ONE articles are free to manipulate under CC licenses and published in XML under the NLM DTD.  If you know Alf's earlier work, this experiment will be worth watching.
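For readers curious what "published in XML under the NLM DTD" makes possible, here is a minimal sketch (not Alf's code) of pulling basic metadata out of such a file. The sample document is a hypothetical, heavily trimmed fragment that uses real NLM DTD element names such as `article-title`:

```python
# A minimal sketch of reading article metadata from an NLM-DTD XML file,
# the format in which PLoS ONE articles are published. The sample XML is
# a hypothetical, heavily trimmed fragment.
import xml.etree.ElementTree as ET

SAMPLE = """<article>
  <front>
    <article-meta>
      <title-group>
        <article-title>An Example Article</article-title>
      </title-group>
      <abstract><p>Example abstract text.</p></abstract>
    </article-meta>
  </front>
</article>"""

def extract_metadata(xml_text):
    """Return (title, abstract) from a trimmed NLM-DTD article document."""
    root = ET.fromstring(xml_text)
    title = root.findtext(".//article-title")
    abstract = root.findtext(".//abstract/p")
    return title, abstract

title, abstract = extract_metadata(SAMPLE)
print(title)     # An Example Article
print(abstract)  # Example abstract text.
```

Because the full text is structured the same way, the same approach extends to body sections, references, and figures, which is part of what makes display experiments like PLoS Too cheap to try.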

Undoing bad copyright transfer agreements

Mia Garlick, “Returning Author’s Rights: Termination of Transfer” Beta Tool Launched, Creative Commons blog, December 21, 2006.  Excerpt:

Creative Commons is excited to launch a beta version of its “Returning Authors Rights: Termination of Transfer” tool....It’s a beta demo so it doesn’t produce any useable results at this stage. We have launched it to get your feedback.

Briefly, the U.S. Copyright Act gives creators a mechanism by which they can reclaim rights that they sold or licensed away many years ago. Often artists sign away their rights at the start of their careers when they lack sophisticated negotiating experience, access to good legal advice or any knowledge of the true value of their work so they face an unequal bargaining situation. The “termination of transfer” provisions are intended to give artists a way to rebalance the bargain, giving them a “second bite of the apple.” By allowing artists to reclaim their rights, the U.S. Congress hoped that authors could renegotiate old deals or negotiate new deals on stronger footing....A longer explanation of the purpose of the “termination of transfer” provisions is set out in this FAQ.

Despite this admirable Congressional intention, the provisions are very complex and have not been frequently used. CC’s tool is intended to go some way towards redressing that....Unfortunately, the termination provisions are currently so complex and technical that this tool can only serve an informational role. Many aspects of the “termination of transfer” provisions require legal analysis which is impossible to code so we are working on linking the tool to legal referrals. This FAQ provides an explanation of the tool’s intended architecture....

We have set up a page on the Creative Commons wiki to gather comments....

Comment. I've never heard of a researcher using this provision to reclaim rights to a published journal article.  If anyone knows of a past case, please drop me a line.  I'd also like to hear about any researcher who uses the provision in the future to reclaim key rights, especially for the purpose of providing OA to the peer-reviewed full-text.

Update. Read Lawrence Lessig's blog post on the right of termination of transfer. The right doesn't kick in until an agreement is 35 years old, which makes it moot for most journal articles. Journals that don't provide OA to 35-year-old articles, and don't let authors do it themselves, are an endangered species.
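For concreteness, the timing rule can be sketched as simple arithmetic. This is a deliberate simplification: under 17 U.S.C. 203, termination may generally be effected during a five-year window opening 35 years after the grant, and the real statute adds notice requirements and special cases not modeled here.

```python
# A simplified sketch of the termination-of-transfer timing rule: a
# five-year window opening 35 years after the grant. Notice requirements
# and special cases in the statute are ignored; this only illustrates
# why the provision is moot for recent journal articles.
def termination_window(grant_year):
    """Return the (open, close) years of the simplified termination window."""
    return grant_year + 35, grant_year + 40

open_year, close_year = termination_window(1971)
print(open_year, close_year)  # 2006 2011
```

An article whose rights were transferred in 2006, on this simplified view, would not reach its window until 2041.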

Profile of South Africa's only OA publisher and its marketing manager

Spotlight on HSRC Press, SA BookNews Online, December 15, 2006.  Excerpt:

As the Marketing Manager of HSRC Press, Karen Bruns has a fairly good idea of what is needed to achieve success in this line of business....

"As far as we know, the HSRC Press is South Africa's only open access publisher. We think we might be the only open access publisher in Africa but as we haven't been able to verify that, we really can't make that claim. We publish both in print and in electronic form. It's one of the things that make us unique and it's probably the most exciting part of what it is that we're doing. It feels very pioneering and at the same time, we're increasing both the pool of and access to high quality social science research-based publications.”

"Considering where we're located, I am often asked whether we only publish the research output of the Human Science Research Council (HSRC). While we do manage all of the intellectual property of the HSRC, the answer is no, as we publish many externally authored works - provided they're furthering the social sciences, which is the mandate of the organisation in terms of a statutory act.” ...

"[W]hat most people ask us most often is whether the open access model assists in selling more books!  The question is most often accompanied by a cynical eyebrow and a wary expression. I am just as wary to answer, because my answer would have to be that we have seen significant year on year sales increases since our inception in 2002."

According to Karen, their success can be linked to the improvement of their products, the increase of their sales network, and their growing efficiencies. She is wary of pinning their success to the adoption of the open access model, as she wouldn’t want publishers, librarians, authors, academics, policymakers, or civil society to think for one minute that the adoption thereof was a marketing ploy!

“The reason that we have adopted this is that we wanted to assist in opening access to quality social science in Africa - both to Africa and from Africa.” ...

Thanks to Eve Gray for the alert, as well as for this comment:

Karen strikes a chord for me in this interview when she comments wryly on the fact that the most common question people ask is whether Open Access online provision sells more books. As she says, that is not the point at all. I cannot imagine that book sales come anywhere near covering the costs of the publishing department. The HSRC provides generous financial support for the HSRC Press, presumably because the organisation finds that this is a good investment. Having its research effectively and widely disseminated achieves the purpose of the research council, ensuring that its research findings have significant development impact. Moreover, I would imagine that its successfully marketed publications profile the HSRC very effectively in the eyes of the government that funds it and contribute to its ability to attract private research contracts to expand its research activities and supplement its public funding.

Making use of AGORA and HINARI

Gracian Chimwaza and Vimbai M. Hungwe, AGORA/HINARI Training of Trainer workshops: imparting hands-on skills on the use of e-resources in agriculture and health in Sub-Saharan Africa, INASP Newsletter, December 2006.  Excerpt:

Since April 2004, ITOCA [Information Training and Outreach Centre for Africa] has carried out 20 AGORA/HINARI National Training of Trainer workshops in 14 Sub-Saharan African countries, and over 500 professionals in the health and agriculture sectors have been trained. The aim of the training is to equip participants with adequate practical knowledge of accessing and using scholarly literature and relevant electronic resources, to enable researchers and information managers to improve research and teaching in the region. The train-the-trainer model has seen over 6,000 users trained downstream at participating institutions....

ITOCA spearheads outreach and training programmes for TEEAL (The Essential Electronic Agricultural library), FAO's Access to Global Online Research in Agriculture (AGORA) and WHO's Health InterNetwork Access to research Initiative (HINARI) programmes in the region.

These workshops are conducted over 3-4 days for 25-30 participants. During the workshops, researchers, policy-makers, educators, librarians and extension specialists have access to high quality, relevant and timely information on agriculture and health via the Internet and CD-ROM....

SPARC comment on a draft Australian OA mandate

SPARC has released its December 20 comment on the draft OA mandate from the Australian Government Productivity Commission.  The public comment period ended on December 21.  (For background on this proposal, see my blog post from November 13, 2006.)  Excerpt:

SPARC enthusiastically commends the far-sighted draft recommendation of the Productivity Commission that “published papers and data from ARC and NHMRC-funded projects should be freely and publicly available” (Draft Finding 5.1). This is an important step that will be welcomed by all beneficiaries of research.

While your recommendation recognizes the tremendous potential to expand the impact of Australian research outputs, we encourage you to go a step further in delineating the kind of public policies that are needed. The experiences of other nations have demonstrated that the effectiveness of well-intended policies can easily be undermined by unnecessarily timid implementations. For example, the voluntary public access policy of the U.S. National Institutes of Health, implemented in May 2005, has resulted in deposit of less than five percent of eligible articles in NIH’s digital repository. The agency is now evaluating steps to improve on this unfortunate outcome, but success has been delayed by years.

We believe that, in order to guarantee a better result, it would be useful if your report called for Australian public access policies that fall within these parameters:

  • Research funders should include in all grants and contracts a provision reserving for the government relevant non-exclusive rights (as described below) to research papers and data.
  • All peer-reviewed research papers and associated data stemming from public funding should be required to be maintained in stable digital repositories that permit free, timely public access, interoperability with other resources on the Internet, and long-term preservation. Exemptions should be strictly limited and justified.
  • Users should be permitted to read, print, search, link to, or crawl these research outputs. In addition, policies that make possible the download and manipulation of text and data by software tools should be considered.
  • Deposit of their works in qualified digital archives should be required of all funded investigators, extramural and intramural alike. While this responsibility might be delegated to a journal or other agent, to assure accountability the responsibility should ultimately be that of the funds recipient.
  • Public access to research outputs should be provided as early as possible after peer review and acceptance for publication. For research papers, this should be not later than six months after publication in a peer-reviewed journal. This embargo period represents a reasonable, adequate, and fair compromise between the public interest and the needs of journals.

We also recommend that, as a means of further accelerating innovation, a portion of each grant be earmarked to cover the cost of publishing papers in peer-reviewed open-access journals, if authors so choose. This would provide potential readers with immediate access to results, rather than after an embargo period.

While SPARC is not in a position to evaluate whether Australian public access provisions should be limited to ARC and NHMRC, we believe the benefits apply to all publicly funded research. We urge that your recommendations with regard to public access be framed broadly....

PS:  In October 2006, the Productivity Commission recommended OA mandates for the Australian Research Council (ARC) and National Health and Medical Research Council (NHMRC), and since then both agencies have adopted strong OA policies.  See the ARC policy (c. December 3, 2006) and the NHMRC policy (c. December 8, 2006).

Update (12/25/06). See comments by Stevan Harnad and Arthur Sale on the SPARC letter.

Grassroots book-scanning for uncompromising OA

Nick Hodson has recently launched a pilot project to let web users post OA copies of public-domain books to the Internet Archive.  From his announcement (on Klaus Graf's blog, Archivalia):

I have recently started a project to upload the scans in PDF form of many of the above books to the Internet Archive. The main purpose is to clear the path so that people from all over the world can upload their scans; the idea was suggested to me by Brewster Kahle, who calls it a Grassroots Book-Scanning enterprise. I am doing a pilot study, with twenty-one books in Stage One, and a further fifty in Stage Two. All the problems should be ironed out by the time this is complete, a few weeks from now. I am working on a manual to advise people wanting to get involved. After that a further hundred books will be prepared, put onto a DVD, and possibly posted for me directly at the Internet Archive. There will be many more to follow after that. You can review progress on this project [from this page].

In addition to the PDF I have posted an HTML file for each entire book, and a TEXT file that can be used to make an audiobook. The spelling in the latter has been converted to the American style (some of the posted books have not been done yet). There is also in each book's folder a small text file that explains how easy it is to make a good audiobook, with a recommendation that people should use TextAloud MP3 available from NextUp whence you can also get the highly recommended voices from Acapela. These are of course once-off purchases, and after that you can make the audiobooks for free, except for the small cost of storing them on CDs. The technology also works for most novels on Project Gutenberg. There is a very easy process available within TextAloud for splitting the book into chapter files, correctly named, and from this creating a set of MP3 files for the book, one for each chapter.
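The chapter-splitting step Hodson describes can be sketched in a few lines of code. This is an illustration, not TextAloud's actual mechanism, and the chapter-heading pattern is an assumption; real Project Gutenberg texts vary widely:

```python
# A rough sketch of splitting a plain-text book into per-chapter pieces,
# analogous to the chapter-splitting step described above. The heading
# pattern ("CHAPTER ONE", "CHAPTER TWO", ...) is an assumption; real
# texts use many different conventions.
import re

def split_chapters(text):
    """Return a list of (heading, body) pairs, one per chapter."""
    # Split on lines consisting of the word CHAPTER plus a spelled-out number.
    parts = re.split(r"(?m)^(CHAPTER [A-Z]+)\s*$", text)
    chapters = []
    # parts[0] is front matter; heading/body pairs alternate after it.
    for i in range(1, len(parts) - 1, 2):
        chapters.append((parts[i], parts[i + 1].strip()))
    return chapters

book = "Front matter.\nCHAPTER ONE\nIt was a dark night.\nCHAPTER TWO\nMorning came."
for heading, body in split_chapters(book):
    print(heading, "->", body)
```

Each (heading, body) pair could then be written to its own correctly named file, ready for per-chapter MP3 conversion.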

Wikipedia-based search engine to challenge Google

James Doran, Founder of Wikipedia plans search engine to rival Google, Times Online, December 23, 2006. Excerpt:

Jimmy Wales, the founder of Wikipedia, the online encyclopaedia, is set to launch an internet search engine that he hopes will become a rival to Google and Yahoo!

Mr Wales has begun working on a search engine that exploits the same user-based technology as his open-access encyclopaedia, which was launched in 2003.

The project has been dubbed Wikiasari — a combination of wiki, the Hawaiian word for quick, and asari, which is Japanese for “rummaging search”.

Mr Wales told The Times that he was planning to develop a commercial version of the search engine through Wikia Inc, his for-profit company, with a provisional launch date in the first quarter of next year....

Mr Wales believes that Google’s computer-based algorithmic search program is no match for the editorial judgment of humans....Mr Wales aims to exploit the same network of followers and the same type of free software [used for Wikipedia] to create his search engine.

“Essentially, if you consider one of the basic tasks of a search engine, it is to make a decision: ‘this page is good, this page sucks’,” Mr Wales said. “Computers are notoriously bad at making such judgments, so algorithmic search has to go about it in a roundabout way. But we have a really great method for doing that ourselves,” he added. “We just look at the page. It usually only takes a second to figure out if the page is good, so the key here is building a community of trust that can do that.”

Mr Wales believes that the reputation already fostered by his Wikipedia community and the transparency of his technology will build sufficient trust in his search engine to bring in advertising revenue and make the Wikiasari venture profitable....

Update (12/29/06). Ben Vershbow corrects some widespread errors about this project. For more corrections and new details, see Danny Sullivan's interview with Jimmy Wales.

Update (1/6/07). To follow this story, subscribe to the mailing list or read the Search Wikia blog set up by Jimmy Wales.

OA for negative results

Enrico Alleva and Igor Branchi, Making available scientific information in the third millennium: perspectives for the neuroscientific community, a presentation delivered at Institutional archives for research : experiences and projects in Open Access (Rome, November 30 - December 1, 2006). 

Abstract:   The rules governing the globalised process of sharing scientific information in the research community are rapidly changing. From the 1950s, commercial publishers started to own a large number of scientific journals, and consequently the marketable value of a submitted manuscript has become an increasingly important factor in publishing decisions. Recently some publishers have developed Open Access (OA), a business model which may help stop this tendency. Indeed, in the case of an open-access publication, the marketable value of a manuscript may not be the primary consideration, since access to the research is not being sold. This may push scientists to reconsider the purpose of peer review. However, costs remain a key issue in managing scientific journals, because the OA model does not eliminate the peer review process. Thus, OA may not solve the problem of market pressures on publishing strategies. OA does have another strong point: everyone can read OA papers, including scientists living in poor countries. But will the OA model create new discrimination over who can publish in OA journals? Will it be possible to truly exclude, or strongly limit, the influence of the market on scientific publishing? The example of the non-profit e-print arXiv, a fully automated electronic archive and distribution server for research papers with no peer review, will be discussed. For neuroscientists, the possibility of making scientific data available, even in the case of negative results (usually very difficult to publish), is an important step toward avoiding purposeless repetition of costly experiments involving animal subjects. The possibility of collecting internationally or locally peer-reviewed papers in institutional repositories (IR) is a necessity. However, access to IRs should be regulated, e.g. by banning or limiting for-profit organizations, and by exploiting internet systems, professional organizations or network groups.

OA resources for medical education

The Alfa Institute of Biomedical Sciences has launched E-Meducation, a portal of OA resources for medical education.

The IR at the University of Naples

Maria Rosaria Bacchini, fedOA, Open access archives a "Federico II" University of Naples, a presentation delivered at Institutional archives for research : experiences and projects in Open Access (Rome, November 30 - December 1, 2006).  In Italian but with this English-language abstract:

In this paper the author describes the institutional repository of the “Federico II” University of Naples: fedOA. The archive has been under development since 2004 and was officially introduced in November 2005, built on the GNU EPrints software. The paper reviews the following aspects of the software: [1] supported eprint types, [2] the metadata set, and [3] supported full-text formats. The fedOA archive will also hold all doctoral e-theses, as well as the scientific output of the university's teaching staff.

Getting good data on OA journals

Franco Toni, Statistics of Open Access Journals, a presentation delivered at Institutional archives for research : experiences and projects in Open Access (Rome, November 30 - December 1, 2006). 

Abstract:   The exponential growth of e-journal access and downloads has strongly enhanced the role of statistical data, in order to evaluate the use of resources and define subscription acquisition strategies and their management.

On one hand, the automatic data harvesting performed by computers provides statistics; on the other hand, it does not guarantee the comparability and harmonisation of the collected data. The process of compiling statistical data therefore has to be supported by standards, the most important being COUNTER, which is gradually becoming the de facto standard in this field and could permit the merging of results obtained from different systems.

Furthermore, all the main commercial publishers regularly supply reliable statistics, unlike Open Access resource suppliers and aggregators, with the exception of BioMed Central, which does provide statistics.

All this could have negative implications for decision makers, who lack a suitable basis for choosing between Open Access resources and equivalent or similar non-Open Access ones.

Recent studies have established that Open Access articles have a higher impact factor (IF) and citation level than others. It would be important to compare the use of Open Access periodicals with that of non-Open Access periodicals in the same fields. If the results of such an analysis favour Open Access journals, they could become an important factor in the success of the Open Access initiative, in terms of reducing library expenditure on serials. Identifying journal accesses through the user's IP address is a globally adopted and easily applied method, and it could bring about a significant increase in the diffusion of Open Access periodicals.
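The IP-based usage comparison Toni proposes could be sketched as a simple log tally. The log format and the OA classification below are invented for illustration; a real study would use COUNTER-compliant reports and a curated list of OA titles:

```python
# A hypothetical sketch of the comparison proposed above: tallying
# journal accesses from a server log and comparing OA vs. non-OA titles.
# The log format and the OA classification are invented for illustration.
from collections import Counter

OA_JOURNALS = {"plos-one", "bmc-biology"}          # assumed OA titles
LOG = [                                            # (user IP, journal) pairs
    ("10.0.0.1", "plos-one"),
    ("10.0.0.2", "plos-one"),
    ("10.0.0.1", "nature"),
    ("10.0.0.3", "bmc-biology"),
]

def usage_by_access_type(log):
    """Count accesses to OA and non-OA journals separately."""
    tally = Counter()
    for _ip, journal in log:
        tally["OA" if journal in OA_JOURNALS else "non-OA"] += 1
    return tally

print(usage_by_access_type(LOG))  # Counter({'OA': 3, 'non-OA': 1})
```

The same tally, broken down by field and normalised by title counts, would give the per-field comparison the abstract calls for.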

More on OA to American law reviews

Dan Hunter, Open Access to Infinite Content (or 'In Praise of Law Reviews'), Lewis & Clark Law Review, 10, 4 (2006).  Self-archived December 20, 2006. 

Abstract:   This Article is about legal scholarly publication in a time of plenitude. It is an attempt to explain why the most pressing questions in legal scholarly publishing are about how we ensure access to an infinity of content. It explains why standard assumptions about resource scarcity in publication are wrong in general, and how the changes in the modality of publication affect legal scholarship. It talks about the economics of open access to legal material, and how this connects to a future where there is infinite content. And because student-edited law reviews fit this future better than their commercially-produced, peer-refereed cousins, this Article is, in part, a defense of the crazy-beautiful institution that is the American law review.

Friday, December 22, 2006

OA and Web 2.0 tools in medicine

Dean Giustini, How Web 2.0 is changing medicine: Is a medical wikipedia the next step?  BMJ, December 23, 2006.  An editorial.  Excerpt:

Few concepts in information technology create more confusion than Web 2.0. The truth is that Web 2.0 is a difficult term to define, even for web experts. Nebulous phrases like "the web as platform" and "architecture of participation" are often used to describe Web 2.0. Medical librarians suggest that rather than intrinsic benefits of the platform itself, it's the spirit of open sharing and collaboration that is paramount. The more we use, share, and exchange information on the web in a continual loop of analysis and refinement, the more open and creative the platform becomes; hence, the more useful it is in our work....

This tour through Web 2.0 ultimately returns to the idea of using software to create optimal knowledge building opportunities for doctors. The rise of wikis as a publishing medium—especially Wikipedia—holds some unexamined pearls for the advancement of medicine. The notion of a medical wikipedia—freely accessible and continually updated by doctors—is worthy of further exploration. Could wikis be used, for example, as a low cost alternative to commercial point of care tools like UpToDate? To a certain extent, this is happening now as the search portal Trip already indexes Ganfyd, one of a handful of medical wikis being developed....

The web is a reflection of who we are as human beings—but it also reflects who we aspire to be. In that sense, Web 2.0 may be one of the most influential technologies in the history of publishing, as old proprietary notions of control and ownership fall away. An expert (that is, doctor) moderated repository of the knowledge base, in the form of a medical wiki, may be the answer to the world's inequities of information access in medicine if we have the will to create one.

PS:  Also see the call for a medical Wikipedia by Peter Frishauf, founder of Medscape.

More on the benefits of OA for science and scientists

John Wilbanks, Another reason for opening access to research, BMJ, December 23, 2006.  (Thanks to Dean Giustini.)  Wilbanks is the Executive Director of Science Commons.

Summary points:  [1] Authors should be prioritising open access to their works —for the good of other scientists and to ensure that the full benefits of the internet and advanced technology may be realised.  [2] Open access is rapidly becoming a mainstream idea in scholarly publishing, with more than 2000 open access journals and more than a million author self archived open access papers.  [3] Legal and technical barriers to open access are easily overcome using freely available tools.

From the conclusion:

Authors have several resources, including open access journals, open access archives, educational materials, and legal tools, that make open access easy and legal to achieve. Although open access has made considerable progress, and more scholarly work is publicly available than ever before, most peer reviewed articles remain closed to both human study and indexing by software. Authors, institutions, funders of research, and scholarly publishers should continue the movement towards open access so that no scholar is disadvantaged by his or her economic status and so that the full value of technological progress can be applied to the scholarly literature.

Copyright management to facilitate OA

Antonella De Robbio, Open access e copyright, a presentation delivered at Institutional archives for research : experiences and projects in Open Access (Rome, November 30 - December 1, 2006).  In Italian but with this English-language abstract:

The aim of the international Open Access movement is the removal of any economic, legal or technical barrier to access to scientific information, in order to guarantee scientific and technological progress for the benefit of the community. Copyright management in higher education is a strategic issue because it is involved in every process from the creation to the dissemination of scholarly works produced at the university. Whatever the situation regarding ownership of copyright, university policies should balance the interests of stakeholders by reserving rights or benefits for research and teaching uses. A variety of approaches can exist even within one country, depending on laws or on faculty custom. Copyright laws - tailored to the music and film industries - are often inadequate to deal with the complex issues surrounding the management of intellectual works created at universities. Nowadays, within the world of scholarly communication, copyright is perceived as a very strong legal barrier, because copyright laws negatively influence the dissemination of intellectual research output, and most intellectual content (90%) is locked inside publishers' platforms. Furthermore, authors, and also universities, are not always aware of the difference between authorship and ownership, with disastrous consequences: rights ceded to third-party market actors limit or slow down the dissemination process and negatively influence its impact on the community, with heavy cultural, social and economic repercussions. For this reason authors must take control of their rights and learn to determine the conditions under which their work is made available on an open access basis, choosing to deposit a copy of a work in a repository or to publish in an open access journal. For their part, universities, particularly in Italy, should prioritise the identification of stakeholders and the allocation of their interests. This is a crucial step toward the development of policies or agreements that assure the university and its authors the ability to use and manage their works in fulfillment of their most important interests.

More on OA to publicly-funded data

Michael Cross, Commercial case for free data rises overseas, The Guardian, December 21, 2006.  Excerpt:

Slowly but relentlessly, the question of how government should encourage people to exploit the vast resources of digital information it holds is creeping above the political horizon. Last week, opposition MPs tabled questions to ministers about what action they plan to take following the Office of Fair Trading's criticisms of the way the public agencies behave in the market for data.... 

Technology Guardian's Free Our Data campaign proposes a simple solution: government bodies should make available all data owned or funded by taxpayers, subject to the demands of privacy and security, at the marginal cost of dissemination. In the internet age, this is usually zero. There is a stark contrast between the UK's approach, running parts of government as businesses selling data and the US, where federal data is available for free.

However, an international study carried out for the Office of Fair Trading paints a more complex picture. The study is a rare attempt to perform a consistent analysis of how three different countries' agencies work. They are the US, where federal data is almost all free; Sweden, which has a similar position to Britain's, and Australia, which falls between the two. For each country, the study examines the state mapping, meteorological and companies registration agency. These generate the most commercially exploited public-sector data. But the way they run their businesses varies greatly: Australia's meteorological agency makes a profit of US$6.15m (£3.12m) a year, Sweden's mapping agency US$5.3m. Their US equivalents depend on government funding....

PS: For background, see my post from December 8, 2006.

Thursday, December 21, 2006

Selective webliography on OA and scholarly communication reform

Jim Stemper and Karen Williams, Scholarly communication: Turning crisis into opportunity, C&RL News, December 2006. (Thanks to ResourceShelf.) Excerpt:

Scholarly communication first entered our professional consciousness in the 1990s, centered on the topic of rising serials prices and their impact on libraries’ budgets. Our lexicon was one of problems, crises, and the clear definition of an enemy. Several years’ experience working in this arena has led to a more informed, broader perspective —part of a natural evolutionary process. Formerly we focused almost exclusively on the economic case, with some real successes. A number of faculty and administrators became outraged and engaged. But many also told us the system works just fine for them; publishers told regulators that the real problem is underfunding of universities. To achieve a marked, sustained impact on scholarly communication, librarians need to be advocates for faculty and administrative action. Scholars must be the new face of this effort and focus on how the present system restricts access to their scholarship. In other words, this is no longer just a library problem of serials inflation (with a spillover effect of reduced monograph purchases), but a series of scholarly communication issues and opportunities owned by scholars, their campuses, and their societies.

We still recognize access problems caused by continued high subscription costs, changing copyright laws, and the licensing of access. Current publishing models are still not economically sustainable. But there is a growing awareness of new opportunities for more sustainable models through ongoing advances in technology. There is genuine hope that the symbiotic relationship between higher education institutions, scholarly societies, and commercial publishers, which could previously be characterized as tense and antagonistic, will realize more cooperative and beneficial partnerships in the future.

Where do we go from here? ... Through the cumulative effect of our actions we can accomplish infinitely more than we could alone. In that spirit, the goal of this resource guide is to give nascent scholarly communication efforts a shared knowledge base, one that provides colleagues with tools to build effective programs on their campuses....

Preview of PhysMath Central

Chris Leonard, Details about PhysMath Central, Egg, December 21, 2006.  Excerpt:

[E]arlier this week I was interviewed for a longer article by the people at First Author.  I thought it would be good to share my answers with you all as it gives an insight into what we are planning for in terms of functionality for PhysMath Central. I should have stated that some of these features may not be there on launch, but it gives our development team something to aim for (sorry guys). In any case they will be there very soon afterwards....

[FA] The physics and maths academic communities were pioneering in their adoption of open access, notably with the founding of Arxiv. You also have experience in the commercial sector. How will you work with and borrow from the experience of both these sectors?

We are a commercial company providing an open access service. From a commercial standpoint open access makes sense. Scientists are demanding it and it is almost seen as unethical in some fields to publish results in a subscription journal. It is difficult to see the future of subscription journals as rosy.

But open access does not necessarily imply 'free'. If we are based on a sound financial footing, that bodes well for the long-term future of open access. We are not dependent on grants or philanthropy and will be able to grow with the growing interest in open access in the future.

[FA] You recently promised to take advantage of new technologies to communicate research findings clearly and to meet the challenges of the future. Can you give some examples of these technologies and how you believe they will change the ways scientists research, collaborate and publish?

Sure - this is one of the most exciting parts of working in open access. Not only can we develop tools and services around our data, but anyone can. All articles are available, for free, to anyone in fully-formed XML, so we hope to see some suite of services like 'Google Labs' develop around this data.

However, for our part we intend to use new technology to support the scientific process in many ways. Apart from the tight arXiv integration already mentioned we are also going to use wikis with the editorial board members to refine the scope of the journals, journal blogs to inform everyone of editorial developments, OAI-PMH to update A&I services, RSS for journal content updates, multimedia to support the online text, comments from readers on each article, and we are very keen in working on ways to further structure and open up our data to other services. Other developments, such as 'tagging' of articles and refining the peer-review process will be considered if there is an appetite for it from the community we serve.
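The OAI-PMH interface mentioned above is a standard HTTP-plus-XML protocol for exposing article metadata to A&I services and harvesters. As a minimal sketch of what consuming such a feed involves (the sample XML below is illustrative, not a real PhysMath Central response):

```python
# Parsing Dublin Core titles out of an OAI-PMH ListRecords response.
# In practice the XML would come from an HTTP GET like
# <base-url>?verb=ListRecords&metadataPrefix=oai_dc
import xml.etree.ElementTree as ET

DC = "{http://purl.org/dc/elements/1.1/}"  # Dublin Core namespace

def parse_oai_titles(xml_text):
    """Extract all dc:title values from an OAI-PMH ListRecords response."""
    root = ET.fromstring(xml_text)
    return [el.text for el in root.iter(DC + "title")]

# Illustrative sample response, trimmed to the essentials.
SAMPLE = """<?xml version="1.0"?>
<OAI-PMH xmlns="http://www.openarchives.org/OAI/2.0/">
  <ListRecords>
    <record><metadata>
      <oai_dc:dc xmlns:oai_dc="http://www.openarchives.org/OAI/2.0/oai_dc/"
                 xmlns:dc="http://purl.org/dc/elements/1.1/">
        <dc:title>A sample preprint</dc:title>
      </oai_dc:dc>
    </metadata></record>
  </ListRecords>
</OAI-PMH>"""

print(parse_oai_titles(SAMPLE))  # → ['A sample preprint']
```

Because every record arrives as namespaced XML, an A&I service can re-harvest incrementally and build its own index without screen-scraping the journal site.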

There is also an increasing drive to make raw data of experimental results available alongside the article itself. For particle collision data, for example, this would be problematic given the sheer volume of data - but this barrier will come down with time and for some fields it is already possible to publish raw data, so we will be investigating this option in the coming weeks.

More on the Google journal digitization program

Peter Brantley blogged the gist of Google's journal digitization program on November 9, and wrote an open letter to the AAUP about it on November 10, both well before my own first post on the subject on December 17.  (Thanks to Dorothea Salo.) 

Oxford's book-scanning project with Google

Ronald Milne, The Google mass digitisation project at Oxford, Liber Quarterly, 16, 3/4 (2006).  Only this abstract is free online:

For most of the 400 years of the Bodleian Library's existence, users have had to travel to Oxford to use its collections. In recent years, Oxford has undertaken a number of focused, 'boutique' digitisation projects. Now, as a partner in the Google Library Project, an immense range of scholarly and other 19th century out-of-copyright library materials from the Bodleian's collections will be digitised en masse and will be made freely available over the internet to anyone who has Web access. Millions of books and journals will be scanned in the course of the project and the author contends that digitisation on such a scale represents a revolution in the dissemination of information that parallels the impact of the invention of printing from moveable type in the 15th century.

OA, annotated edition of Iraq report

Lapham's Quarterly and the Institute for the Future of the Book have launched an OA, annotated edition of the Iraq Study Group Report.  See the background and details by Ben Vershbow on the Institute's blog.

First free online edition of UK consolidated laws

The UK government has released The UK Statute Law Database, the first free online edition of the country's consolidated laws.  Thanks to Francis Irving, who also provided this comment:

It’s super. Last week, access to consolidated versions of the law of the UK wasn’t possible without paying lots of money. Now it is free.

There are some down sides - 40 acts are not covered at all, law is only guaranteed included up until the end of 2001, and the data only has history of changes back to 1991 (details on status here).

Worse, from the point of view of OKF [Open Knowledge Foundation], the copyright/licensing situation is still not good. Now the data is free as in beer, can we have it free as in speech as well please? (More details in my previous post on the subject)

Even so, it is a fantastic new resource, and congratulations to everyone involved in creating it. Meanwhile, make sure you don’t bear armour, you maintain the dykes on the edges of your property, and you don’t write blank cheques.

Update (1/10/07). The source I quoted above has added the following correction: "Thanks to the enquiries of Nick Holmes it has been confirmed that the original copyright notice was a mistake and the database will be fully open, available for anyone to use and reuse under the standard terms of the PSI click-use license. Hurrah!"

Digital media and scientific culture

The new issue of Zeitenblicke (vol. 5, no. 3, 2006) is devoted to Digitale Medien und Wissenschaftskulturen.  The articles are in German but have English-language abstracts.

Fee increase at BMC

BioMed Central has announced an increase in its article processing charges for 2007:

BioMed Central's standard article-processing charge (APC) has been fixed at £750 since July 2005. From January 2007, this will be increased to £850 for the BMC-series of journals and other titles for which peer review is organised in-house. In-house journals with higher editorial input and costs such as Genome Biology will have a higher APC. For independent journals, for which external Editors manage the editorial work, APC changes will take effect from 1 July 2007. The standard APC for independent journals will be £800, but prices will vary according to funding arrangements and agreements with journal Editors.

These increases reflect inflation and increased costs since the introduction of the standard APC of £750 in July 2005 and will keep APCs in line with the true cost to BioMed Central for article production, developing and maintaining websites and editorial systems, providing customer service, and marketing and editorial support.

Full details of all prices will be available on BioMed Central's FAQ page on 1 January 2007.

More on the hybrid OA journal model

Jean-Pierre Lardy, Le modèle de publication hybride: lecteur payant/auteur payant, DADI, October 2006.  (Thanks to Actu-enstblog.)  An overview of the many hybrid journal programs and their different policies on key questions.  (In French.)

More on Citizendium

Mark Chillingworth, Expert edition, Information World Review, November 27, 2006.  Excerpt:

...Discussing Citizendium with IWR, [Larry] Sanger says the main difference between it and Wikipedia is that those who contribute content will have to use their real name and that it will have expert editors....

Sanger says experts will have to specify their credentials to become a Citizendium expert on a user page, as well as offer links to independent sources highlighting their credentials.  He admits, though, that Citizendium offers experts who make their living from writing or publishing material little or no incentive to offer up their knowledge.

Wikipedia has already shown there is a wealth of people prepared to create content or offer expertise for free. “There is a quorum of people who are willing to volunteer for the good of the world,” says Sanger. He adds that he has “dozens” of editors lined up, including PhDs. “A lot of them are disaffected Wikipedia members,” he says....

“It would be foolish not to consult [experts], and with the internet it is possible to consult them. There is a potential here now to create a global village of experts who can be consulted on everything. The results could be really useful. A wiki-like project with experts was not possible until now. We needed Wikipedia to show the way.” ...

Despite Citizendium’s non-profit status, Sanger is researching revenue models to pay for its running and possibly for some experts. Sponsorship is a possibility, using a statement form of branding rather than display adverts. But Sanger’s preferred revenue model is to attract philanthropic individuals to pay for the content on certain subjects.

“We are inviting ordinary people who require a certain piece of information to pay for its production. They will pay for the content as a gift to the world.” In this scenario, Citizendium acts as a content broker, selecting and paying an independent author. “Patrons will specify the content, and will be recognised for it as the patron, but they cannot choose the person who writes the content,” Sanger says.

Of the phenomenal growth of Wikipedia, Sanger says: “It is simply a matter of convenience. Most of the information on Wikipedia is locatable on the web, but Wikipedia brings it all together in one place.”  And that is also why Sanger believes that Wikipedia will not be replaced by Citizendium. “Wikipedia has a giant support network. It’s hard to imagine that Citizendium could ever damage it.” ...

More on the Open-Access Text Archive

Tamina Vahidy, A New Digital Library, Line 56, December 21, 2006.  Excerpt:

The latest book digitization effort is The Open-Access Text Archive (OATA) run by the Internet Archive. OATA integrates with Project Gutenberg, so those who are used to going to Project Gutenberg for their plain text or Plucker e-content can go to OATA instead.

The OATA has a number of advantages over the existing digital book archives we've visited. For one, it has a box marked "batting average" that ranks items by the percentage of people who downloaded them after visiting details pages. This is a refreshing complement to a mere popularity box, which OATA also has. OATA says that keeping a batting average "avoids the 'rich-get-richer' behavior of ranking by the number of downloads."

So far, the most downloaded items from OATA have been, in order, The New Hacker's Dictionary, Ethics of Sex Acts, and Pictures of ARPANET. The items with the highest batting averages are all advanced science texts....

PS:  The article brings welcome new exposure to OATA, but OATA is not "the latest book digitization effort".  It was launched in December 2004.
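The "batting average" ranking described in the excerpt can be sketched in a few lines: rank by downloads divided by detail-page visits, rather than by raw download counts. The item names and numbers below are invented for illustration, not OATA data.

```python
# Ranking by "batting average" (downloads / detail-page visits)
# versus raw download counts. A popular item with many casual visits
# can rank below a niche item that nearly every visitor downloads.

def batting_average(downloads, visits):
    """Fraction of detail-page visitors who went on to download."""
    return downloads / visits if visits else 0.0

items = {
    "popular-title": {"downloads": 9000, "visits": 100000},  # 9% conversion
    "niche-text":    {"downloads": 450,  "visits": 500},     # 90% conversion
}

by_downloads = sorted(items, key=lambda k: items[k]["downloads"], reverse=True)
by_average = sorted(
    items,
    key=lambda k: batting_average(items[k]["downloads"], items[k]["visits"]),
    reverse=True,
)

print(by_downloads)  # → ['popular-title', 'niche-text']
print(by_average)    # → ['niche-text', 'popular-title']
```

Normalizing by visits is what "avoids the 'rich-get-richer' behavior" the archive describes: heavily promoted items no longer dominate simply because more people land on their pages.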

Nature shuts down its open-review experiment

Nicholas Zamiska, Nature Cancels Public Reviews Of Scientific Papers, Wall Street Journal, December 21, 2006.  (Thanks to TechDirt and Glyn Moody, who both have interesting comments.)  Excerpt:

The journal Nature is abandoning an experiment aimed at bringing Wikipedia-like group editing into the world of scientific publishing.

For several months beginning this past summer, Nature has invited scientists whose articles were shortlisted for publication in the journal to first post their work online for public review....The scientific magazine's move was intended in part to see if a more-open review process could expose low-quality or fraudulent papers that critics of the current system say too often slip into print.

But Nature, which is published by a unit of Macmillan Publishers Ltd., said in an editorial in Thursday's issue that it was ending the experiment due to lack of participation. The journal found that in the competitive world of scientific publishing, the vast majority of authors were unwilling to post their papers and few scientists were willing to criticize their peers' work publicly by posting comments on Nature's Web site....

Meanwhile, another experiment with collaborative editing got under way this week. A new online scientific journal called PloS ONE invites readers to post comments or questions about articles once they are published....


  1. First my standard disclaimer:  open access is compatible with every form of peer review and does not intrinsically favor or resist any of them.
  2. It's a mistake to compare open review to Wikipedia.  Open review invites all comers to offer assessments or judgments, while Wikipedia invites all comers to revise the primary text itself.  Moreover, most systems of open review only post judgments with attribution, while Wikipedia contributions are anonymous.  Finally, of course, most systems of open review --including Nature, PLoS ONE and early proponents of the model like Atmospheric Chemistry and Physics-- combine internal expert review with open external comment. 
  3. Reader comments at Techdirt suggest that the Nature experiment was not widely known, even by Nature readers.  (I blogged it on September 18, 2006.)  In addition, since individual reader assessments may be partial, addressing only a point of intersection with their own research, their value grows over time as the mosaic fills in.  Hence authors and referees may be more willing to take part in a permanent open-review forum than in an experiment that could disappear at any moment.  (The Nature experiment was only scheduled to last three months.)  And it's certainly possible that hopeful Nature authors would rather have the premium whuffie of Nature's standard imprimatur than the unknown value of the new alternative.  If true, then PLoS ONE and Atmospheric Chemistry and Physics should give us a better test of open review than Nature did.

Update. See Nature's own account of the experiment. This article is OA, but Nature's editorial on the experiment is not.

Update. The most detailed account I've seen of the experiment, outside Nature itself, is Christen Brownlee, Peer Review Under the Microscope, Science News, December 16, 2006. Unfortunately, it's only accessible to subscribers.

Presentations on open education projects

The presentations from the meeting, Open Educational Resources: Institutional Challenges (Barcelona, November 22-24, 2006), are now online.

So are the presentations and videos from Univers Lliure (Barcelona, November 29, 2006), on free and open projects at Catalonian universities.

Thanks for both tips to Ignasi Labastida i Juan.

PLoS ONE launches

PLoS ONE officially launched yesterday, at least in beta.  See the launch photos on Flickr and read the PLoS press release:

Until now, online scientific journals have been little more than electronic versions of the printed copy. Today, that all changes with the launch of PLoS ONE, which publishes primary research from all areas of science and employs both pre- and post-publication peer review to maximize the impact of every report it publishes. PLoS ONE is published by the Public Library of Science (PLoS), the open access publisher whose goal is to make the world's scientific and medical literature a public resource.

PLoS has taken a close look at the way scientific and medical publishing works now, and has asked how the Internet can be used to make it work better. As a result, virtually everything about PLoS ONE is new: the peer-review strategy, the production workflow, the author experience, the user interface, and the software that provides the publishing platform.

Harold Varmus, Co-Founder and Chair of the Board of PLoS and President of the Memorial Sloan-Kettering Cancer Center, remarks, "For those of us who have been engaged with PLoS from its conception, the launch of PLoS ONE is tremendously exciting --this is the moment when we seize the full potential of the Internet to make communication of research findings an interactive and fully accessible process that gives greater value to what we do as scientists." ...

Although PLoS ONE opened its doors to manuscript submissions only in August, it already receives in excess of 100 submissions each month and launches with the publication of 100 peer-reviewed research articles. The volume of articles is unprecedented for a journal launch, and is an indication of the strong support within the research community for the PLoS ONE approach....

In almost all other journals, publication of a research paper is a full stop. The next significant step forward will be the publication of another paper following on from the previous work. But in PLoS ONE, as soon as an article is published, a conversation between authors and readers can begin. There might be a question about a method that is described in the article, a link to another useful work or resource that can be added, or an alternative interpretation that can be offered for some of the results. In each case, readers and authors can respond to the addition, and everyone else can benefit from the resulting dialogue. The possibilities are without limit, and the applications of this technology will no doubt hold some surprises.

The beta version of PLoS ONE that is launched today is a work-in-progress. It is presented in beta because PLoS wishes the community to help shape PLoS ONE, and the underlying publishing platform, into its most valuable form. The software is open source, and will form the first part of an innovative and flexible publishing system that will be developed over the next two years and will be available to all groups for storing, disseminating, and sharing literature and data....

PS:  Congratulations to PLoS and bon voyage.
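The reader-author dialogue the press release describes amounts to threaded annotations attached to each article. A toy sketch of that data model (purely illustrative, not based on PLoS ONE's actual implementation):

```python
# Toy model of per-article threaded comments: readers attach comments
# to an article, and anyone can reply, building a dialogue tree.
from dataclasses import dataclass, field

@dataclass
class Comment:
    author: str
    text: str
    replies: list = field(default_factory=list)

    def reply(self, author, text):
        child = Comment(author, text)
        self.replies.append(child)
        return child

@dataclass
class Article:
    title: str
    comments: list = field(default_factory=list)

    def comment(self, author, text):
        c = Comment(author, text)
        self.comments.append(c)
        return c

article = Article("Example paper")
q = article.comment("reader", "How was the sample size chosen?")
q.reply("author", "Power analysis; see the supplement.")
print(len(article.comments), len(q.replies))  # → 1 1
```

The point of the structure is that publication is no longer "a full stop": each article carries an open-ended, branching conversation alongside the fixed text.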

More on the mass digitization of books

Karen Coyle, Mass Digitization of Books, The Journal of Academic Librarianship, November 2006 (accessible only to subscribers).  (Thanks to Current Cites.) 

Abstract:  Mass digitization of the bound volumes that we generally call “books” has begun, and, thanks to the interest in Google and all that it does, it is getting widespread media attention. The Open Content Alliance (OCA), a library initiative formed after Google announced its library book digitization project, has brought library digitization projects into the public eye, even though libraries were experimenting with digitization for at least a decade. What is different today from some earlier digitization of books is not just the scale of these new initiatives, but the quality of “mass.”

Update. If you're like me and don't have access to the full text, Klaus Graf has blogged an excerpt. Here's an excerpt from his excerpt:

Google has clearly stated that their book project is solely aimed at providing a searchable index to the books on library shelves. They are quite careful not to promise an online reading experience, which would increase the quality control effort of their project and possibly make rapid digitization of the libraries impossible. Library leaders are enticed by the speed of mass digitization, but seem unable to give up their desire to provide online access to the content of the books themselves. If mass digitization is the best way to bring all of the world's knowledge together in a single format, we are going to have to make some reconciliation between the economy of “mass” and the satisfaction of the needs of library users.

Lectures on access and curation

University College London has posted podcasts and slides for the lectures in its series, C21st Curation: access and service delivery.  (Thanks to Richard Akerman.) 

Most are OA-related but see especially Astrid Wissenburg's lecture, Scholarly communications and the role of researcher funders, from April 26, 2006.  Wissenburg is the Director of Communications at the UK's Economic & Social Research Council, which adopted an OA policy about two months after this lecture.  Unfortunately her lecture is one of the few in the series with slides only and no podcast.

OA archive of dead government records

The University of North Texas maintains CyberCemetery, an OA archive of digital documents from defunct US government agencies and programs.  From an article about it in Pegasus News: 

Don't bother asking UNT librarians to remove sensitive but not classified information on the CyberCemetery. They'll refuse. "It's for the people of this country," said [Cathy N. Hartman, assistant dean of digital and information technologies at UNT Libraries] of the CyberCemetery's free and open access. "It belongs to the people."

Update on the European Digital Library

Daniel Griffin, Workshop pieces together European Library digitisation project, Information World Review, December 19, 2006.  Excerpt:

In an initiative launched through the European Digital Library Project (EDLProject), representatives from the European Community’s (EC) various eContentPlus funded programmes which are designed to create easier and improved access to digital collections, have gathered for a workshop held at the Austrian National Library.

The workshop was intended to begin the process of building the European digital library. Remarking on the initiative, Hans Petschar, the director of the Austrian National Library’s picture archive said “It created an atmosphere of enormous willingness to work together with the European Library to achieve a European digital library.” ...

Content...from books, audio, video, photography, rare texts and archived documents from across the EC will need to adhere to certain common criteria including metadata standards, which will mean that users such as researchers can search by theme, across a variety of formats and all under one search inquiry....

It is estimated the European digital library will be built over a three to four year period and its next move is to consider keeping the project sustainable with an organised business model. A second workshop is due to be held once more at the Austrian National Library at the end of January to consider the framework and timelines for the next phase.

Update. Also see the EDLProject's own press release on the meeting.

Wednesday, December 20, 2006

$1 million grant for OA digitization

The Sloan Foundation has given the Internet Archive a $1 million grant to support digitizing projects for the Open Content Alliance.  See today's story by the Associated Press:

...The latest tensions [in the Google Library Program] revolve around Google's insistence on chaining the digital content to its Internet-leading search engine and the nine major libraries that have aligned themselves with the Mountain View-based company.

A splinter group called the Open Content Alliance favors a less restrictive approach to prevent mankind's accumulated knowledge from being controlled by a commercial entity, even if it's a company like Google that has embraced "Don't Be Evil" as its creed.

"You are talking about the fruits of our civilization and culture. You want to keep it open and certainly don't want any company to enclose it," said Doron Weber, program director of public understanding of science and technology for the Alfred P. Sloan Foundation.

The New York-based foundation on Wednesday will announce a $1 million grant to the Internet Archive, a leader in the Open Content Alliance, to help pay for digital copies of collections owned by the Boston Public Library, the Getty Research Institute, and the Metropolitan Museum of Art.  The works to be scanned include the personal library of John Adams, the nation's second president, and thousands of images from the Metropolitan Museum.  The Sloan grant also will be used to scan a collection of anti-slavery material provided by the Johns Hopkins University Libraries and documents about the Gold Rush from a library at the University of California at Berkeley.

The deal represents a coup for Internet Archive founder Brewster Kahle, a strident critic of the controls that Google has imposed on its book-scanning initiative.

"They don't want the books to appear in anyone else's search engine but their own, which is a little peculiar for a company that says its mission is to make information universally accessible," Kahle said....

PS:  Congratulations to Brewster Kahle, the IA, and the OCA.  This is much-needed support for a very important project. 

Final version of ACLS report

Our Cultural Commonwealth, the final version of the report by the American Council of Learned Societies (ACLS) Commission on Cyberinfrastructure for the Humanities & Social Sciences, released December 13, 2006.  (Thanks to if:book.)  From the Executive Summary:

In chapter 3, the Commission also recommends the following measures...:

1. Invest in cyberinfrastructure for the humanities and social sciences, as a matter of strategic priority....

2. Develop public and institutional policies that foster openness and access.

Addressed to: University presidents, boards of trustees, provosts, and counsels; university presses; funding agencies; libraries; scholarly societies; Congress

Implementation: The leadership of the humanities and social sciences should develop, adopt, and advocate for public and institutional polices that foster openness and access....

5. Encourage digital scholarship....

From the body of the report:

By comparison with print, born-digital scholarship will be expensive for publishers to create and, over time, even more expensive for libraries to maintain. Even considering these costs, however, owning and maintaining digital collections locally or consortially, rather than renting access to them from commercial publishers, is likely to be a cost-cutting strategy in the long run. If universities do not own the content they produce —if they do not collect it, hold it, and preserve it— then commercial interests will certainly step in to do the job, and they will do it on the basis of market demand rather than as a public good. If universities do collect, preserve, and provide open access to the content they produce, and if everyone in the system of scholarly communication understands that the goods being produced and shared are in fact public goods and not private property, the remaining challenge will be to determine how much, and what, to produce....

[U]niversity presses could (and should) expand the audience for humanities scholarship by making it more readily available online. Unless this public good can easily be found by the public —by readers outside the university— demand is certain to be underestimated and undersupplied....

These and other experiments in electronic publishing in the humanities and social sciences, and experiments in building and maintaining digital collections in libraries and institutional repositories, need to be supported as they move toward sustainability, and they need to be funded (by universities, by private foundations, and by the public) with the expectation that they will move toward open access —an area in which many of the natural sciences and some social sciences are conspicuously ahead of the humanities.  Open-source software is an instructive analogue here, and the experience in that community suggests, strongly, that one can build scalable and successful economic enterprises on the basis of free intellectual property. It is worth noting, too, that the “Economy of Regard” (that is, prestige) is one of the factors used to explain why this open economy works....

Open access is critical to constructing and deploying meaningful cyberinfrastructure, and it will be important for the humanities and social sciences to engage in active dialogue and then to lobby effectively concerning legislative and policy developments in this area —for example, in support of the Federal Research Public Access Act of 2006. The Open Content Alliance offers one good platform for the dialogue the Commission wishes to promote....We encourage scholarly societies and university presses —currently unrepresented— to join the Alliance.

The Commission also strongly encourages the funders of research in the humanities and social sciences to require from applicants a plan for sharing and preserving data generated using grant funding, and we urge universities with commercial digitization partners to address long-term ownership and access issues when creating those partnerships. We also call on university counsels, boards of trustees, and provosts to provide aggressive support for the principles of fair use and open access, and to promote awareness and use of Creative Commons licenses. We call on senior academic leaders to ensure that their own practices (as producers of intellectual property and as editors of journals) and the practices of university presses, libraries, and museums support fair use and open access. And, finally, the Commission calls on scholarly societies and universities to advocate that Congress redress imbalances in intellectual property law that currently prevent or inhibit preservation, discourage scholarship, and restrain research and creativity.

Comment.  This is a superb report making exactly the right recommendations:  mandate OA, especially for publicly-funded research, lobby for it, support it within universities, support FRPAA, and join the OCA.  Universities that agree needn't wait for funding agencies or governments to act; they can mandate OA to their own research output right now.  Spread the word.

For background, see my blog posts on earlier drafts of the ACLS report.

Librarians back OA to publicly-funded research

Ray English and Molly Raphael, The Next Big Library Legislative Issue, American Libraries, September 2006.  Excerpt:

Taxpayers are storming the fee-based barricades that keep them from federally funded research....

It’s not just the general public that lacks access: College and university researchers face serious problems, too. Just ask Gary Ward, associate professor of microbiology and molecular genetics at the University of Vermont in Burlington. Ward conducts research in a specialized area of cell biology. The work that he and his departmental colleagues do is relevant to many illnesses, including cancer and AIDS.

Ward estimates that he can access 66%–75% of the articles that he needs for his research thanks to journal subscriptions and database license agreements at the university’s Dana Medical Library. For the articles that his library can’t provide, he’s dependent on the much-more-time-consuming process of interlibrary loan. Given time constraints and the cost of transactions, he utilizes ILL only if he’s certain that the article he’s seeking will be relevant.

Ward points out the importance of ready access to the scientific literature. As a scientist he needs to see the full text of promising articles —not just abstracts— in order to determine which ones are useful. Time lags are also problematic. “Research scientists need immediate access to all relevant articles so that they can browse and go from one study to another without delay,” he says....

But it’s impossible for Ward to get that kind of access to a significant portion of the journal articles he needs, despite the fact that his library access is far better than that at most academic and research libraries: The university’s medical and general libraries spend over $5 million annually on acquisitions. The result, Ward asserts, is that faculty members “end up teaching what we have access to, rather than what our students most need to know.” ...

Even the largest and most wealthy university libraries, such as Harvard and Cornell, have problems providing all the journals their constituent communities need. “All academic libraries are challenged to provide their students and faculty with effective access to the research literature,” says Jim Neal, university librarian at Columbia University in New York City. “It is one of our profession’s most pressing issues.” ...

The principle of public access to taxpayer-funded research is gaining significant backing. Congressional support for a strong NIH public access policy and for the Federal Research Public Access Act [FRPAA] is bipartisan, and spans the political spectrum. The principle also appears to have very strong public support. A recent Harris Interactive poll, published in the May 31 Wall Street Journal, indicates that over 80% of Americans share the view that government-funded research should be made publicly accessible.

The library community is uniting behind the principle.  Seven major library associations, including the American Library Association, are members of the Alliance for Taxpayer Access. The alliance is coordinated by SPARC, the Scholarly Publishing and Academic Resources Coalition, a leading advocate for change in the system of scholarly publishing. At the ALA Annual Conference in New Orleans, the Association’s governing Council unanimously endorsed a resolution in favor of [FRPAA]....

PLoS ONE is almost ready

Liz Allen, The wait is nearly over! PLoS Publishing Blog, December 20, 2006.  Excerpt:

...Since 2003, we’ve shown the world what high-quality open access journals can achieve. Now we’re ready to shake things up again and we’re involving the scientific community like never before.

Thanks to widespread media and blogosphere coverage, more scientists than we’d ever imagined are aware that we are about to launch PLoS ONE. That’s fine with us - in the words of Oscar Wilde “the only thing worse than being talked about is not being talked about”.

As we are about to release the beta version of this site, we want to acknowledge the inherent challenges of the project and the philosophy that compels us to confront them.

We want to speed up scientific progress and believe that scientific debate is as important as the experimentation itself. PLoS ONE is a forum where research can be both published and discussed – we are launching it in beta so that the whole scientific community can help us to develop it further.

Everything about PLoS ONE is new: the platform, the software, the user interface, the production workflow, the peer-review strategy, the author experience - the development cycle has been intense in order to launch it as soon as possible.

Why did we put ourselves through this? So that we can launch PLoS ONE with its core web 2.0 features at the earliest opportunity. We invite the research community to give us feedback and help us to develop new features.

What makes the site beta? Not the content, which consists solely of quality, peer-reviewed research from hundreds of authors across a diverse range of scientific disciplines. Rather, it’s the tools and functionality surrounding these papers that will be continually refined and developed in response to user feedback....

OA impact advantage beyond self-selection

Travis Metcalfe, The Citation Impact of Digital Preprint Archives for Solar Physics Papers, Solar Physics, Volume 239, Issue 1-2 (December 2006) pp. 549-553.  (Thanks to Michael Kurtz via Stevan Harnad.)

Abstract:   Papers that are posted to a digital preprint archive are typically cited twice as often as papers that are not posted. This has been demonstrated for papers published in a wide variety of journals, and in many different subfields of astronomy. Most astronomers now use the arXiv server (astro-ph) to distribute preprints, but the solar physics community has an independent archive hosted at Montana State University. For several samples of solar physics papers published in 2003, I quantify the boost in citation rates for preprints posted to each of these servers. I show that papers on the MSU archive typically have citation rates 1.7 times higher than the average of similar papers that are not posted as preprints, while those posted to astro-ph get 2.6 times the average. A comparable boost is found for papers published in conference proceedings, suggesting that the higher citation rates are not the result of self-selection of above-average papers.

From the conclusion: 

Despite the slower adoption by the solar physics community, digital preprint archives boost the citation rates of posted papers to twice the level of unposted papers, a conclusion first noted in the comprehensive study of Schwarz and Kennicutt (2004). The evidence suggests that, like many other subfields in astronomy, the citation rate is elevated from the improved visibility of the paper rather than from self-selection by authors choosing to post above-average papers. Unlike other subfields, solar physicists maintain an independent preprint archive which also boosts citation rates, though the broader user-base of astro-ph provides a larger boost.

If citation rates track the assimilation of new results by the community, then astro-ph seems to be the best single form of communication available. Editors who want to maximize the impact of their journals should encourage authors to post their preprints to astro-ph. Authors in solar physics, where astro-ph is currently underutilized, should consider the advantages that other subfields have already discovered.

An author addendum for open data

Bill Hooker, Where are the data? Can I have them? What can I do with them? Open Reading Frame, December 17, 2006.  This is a short excerpt from a long post.  For critical detail, you should read the whole thing.

...Now, 12 years [after Stevan Harnad's subversive proposal], Open Access is gathering momentum and forward-looking advocates of knowledge as a public good are thinking about Open Data (some extra background here).  Peter Murray-Rust recently stepped up with a subversive proposal of his own:

The simplest thing that researchers can do [to promote Open Data] is to add a Creative Commons license to their data....

I think Peter's proposal is a good one, similar in form and effect to the SPARC author addendum.  Importantly, Science Commons also offers author addenda, and will soon offer them in the machine-, human- and lawyer-readable versions that come with all Creative Commons licenses; as Peter notes, the machine-readable version is crucial to full Open Data utility.  Use of the proposed Open Data addendum (in combination, where necessary, with an Open Access addendum) would clarify the legal status of an author's data, provided we get the wording right.  Herewith some thoughts on how to do that, based on the questions in the title.
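As an illustration (not drawn from Hooker's post): the machine-readable layer of a Creative Commons license is typically expressed as markup embedded in the web page hosting the work, so that crawlers and data-mining robots can detect the license without human intervention. A minimal sketch, assuming a hypothetical dataset page released under a CC Attribution license:

```html
<!-- Hypothetical dataset page; all URLs are placeholders.
     The rel="license" attribute is the machine-readable hook
     that Creative Commons tools and web crawlers look for. -->
<div about="http://example.org/data/experiment1.csv">
  This dataset is licensed under a
  <a rel="license" href="http://creativecommons.org/licenses/by/2.5/">
    Creative Commons Attribution 2.5 License</a>.
</div>
```

A robot harvesting this page can follow the `rel="license"` link to determine, automatically, what it may do with the dataset — which is precisely the "machine-readable version" Hooker argues is crucial to full Open Data utility.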

First, note that papers do not usually contain raw (useful, useable) data. They contain, say, graphs made from such data, or bitmapped images of it -- as Peter says, the paper offers hamburger when what we want is the original cow....

So if authors want to make their data openly and usefully available, they will need to host it themselves or find someone to host it for them.  Many journals will host supplementary information, and many institutional repositories will take datasets as well as manuscripts.  I have been saying for some time that it should by now be de rigueur to make one's raw data available with each publication. This is very rarely done -- even supplementary information, when I have come across it, tends to be of the hamburger-rather-than-cow variety and so not very useful....

Second, there is the issue of licensing ("Can I have them?  What can I do with them?").... 

It's true but it's simply not enough that, having published in BMC, the authors are probably amenable to giving me the data and allowing me to do with them as I please.  I need unfettered access to the data at the same time as I access the paper.  Even as a human I don't have time to chase down permission for every dataset I want to re-use, and if I'm data-mining by web crawler I need machine-readable licenses that tell my robot what it can have....

So how about Public Library of Science and Hindawi, the other major OA publishers?  Well, Hindawi seems to say nothing about data whatsoever, only that authors retain copyright and articles are published under a CC Attribution license.  PLoS also publishes everything under a CC Attribution license, which says nothing about data, but if you dig a bit you find encouraging things in the editorial/publishing policies....

I suspect another toothless tiger.  It's not that I want the tiger to have teeth, that is, for journals to actively police data availability, but that I wonder why I have to go digging around the website just to find this wishy-washy nod in the general direction of Open Data.  To illustrate my point here, suppose I read a paper in PLoS Biology, and I want to get my hands on some raw data from that paper: where are they?  Can I have them?  What can I do with them?  All of these things are, basically, left up to the authors.

Now remember that these highly unsatisfactory examples are drawn from the most prominent Open Access publishing houses, which might be expected to be much more supportive of Open Data than commercial publishers.  Thus the power of Peter's Open Data addendum becomes apparent: it is attached directly to the paper, so readers do not have to go hunting through journal websites to find out the intellectual property status and location of interesting datasets.  It allows authors to take control....

So, finally, let me take a stab at a draft Open Data addendum....

That's not perfect, not by a long shot -- most especially not for automated data mining, which requires machine-readable metadata and data. It should, however, do what Peter suggests: provide some relief from endless rounds of find-the-permissions, and get a much-needed conversation underway.

Another university press focusing on OA

I just found a few nice surprises at the English-language home page of the Karlsruhe University Press:

Karlsruhe University Press focuses on electronic publications. We also offer the production of printed books. Karlsruhe University Press has four main goals:  Speed....Quality control....Long term availability....[and] Free access: Research results of the University are transferred by the University Press to the scholarly community in accordance with Open Access, to help ensure the free access to scholarly information.

With the continuing proliferation of scholarly information, the future rests in electronic publishing and the internet. Online information is characterized by ease of access at any time and from any place, and by reusability; it will lead to improvements in the field of scholarly information and to greater efficiency of scholarly work in research, teaching and study.

Karlsruhe University Press thereby follows the positions laid out in the paper "Information vernetzen - Wissen aktivieren" [September 2002] of the German Federal Ministry for Education and Research, demanding free access to scholarly information world-wide for everyone at any time and from any place under fair conditions, as well as the promotion of information systems with high-quality service products under the criteria of the global market and efficiency....

Karlsruhe University Press combines the two concepts of Creative Commons and Open Access by publishing scholarly papers electronically on the Internet. It supports scholars in the self-determination of their copyrights while simultaneously guaranteeing the authenticity, world-wide access and long-term availability of their works.

YouTube videos enter scholarly debate

When you really want to reach a wide audience, OA is the solution.  But when an OA article in a journal or repository isn't enough, how about an OA video on YouTube?

When Evangelical preacher James Dobson used the scholarship of NYU psychologist Carol Gilligan to argue that same-sex couples should not raise children, Gilligan made a YouTube video to assert that Dobson had distorted and misrepresented her research.

For background, see Paul Thacker, Fighting a Distortion of Research, Inside Higher Ed, December 19, 2006.  Don't miss the comments at the end of the article.

Eprints 3.0, Release Candidate 1

Eprints 3.0, Release Candidate 1 was released on Monday.  For details see the Eprints wiki page on it, the evolving manual, or the demo.

Podcast interviews from CNI's 2006 Meeting

Matt Pasiewicz has recorded podcast interviews with a range of people from CNI's 2006 Fall Task Force Meeting.  See esp. those with Herbert Van De Sompel (on IRs), MacKenzie Smith (on DSpace), David Rosenthal (on LOCKSS), Bill Arms (on digital libraries), Chris Greer (on cyberinfrastructure) and Cliff Lynch (on cyberinfrastructure).

WebGencat over DSpace for ETDs

Lisa A. Atkinson, The Rejection of D-Space: Selecting Theses Database Software at the University of Calgary Archives, a presentation delivered at the 9th International Symposium on Electronic Theses and Dissertations (Quebec City, June 7-10, 2006).  (Thanks to Richard Jones.)

Abstract:   This paper presents the results of a University of Calgary Archives project to investigate the replacement of our existing theses and dissertations database. In collaboration with the University Library’s Information Technologies department, the Archives undertook a pilot project aimed at replacing the existing descriptive database and expanding on its capabilities by implementing D-Space as an institutional repository for electronic theses and dissertations. Ultimately D-Space was rejected as a viable option for housing the theses database, although the software is still used by the University for its Institutional Repository, which contains other forms of scholarly output by the university’s academic community. The Archives selected another software program [WebGencat] for its theses database, the details of which are given at the end of the paper.

OA at the U of Lyon

S. Dalhoumi, J-P. Lardy, and O. Larouk, Open access at the University of Lyon: a comparative and disciplinary approach.  This undated report is based on a survey from February-March 2006.  Excerpt:

Free documents exert a real attraction for researchers: 93.5% of them say they consult free digital documents first and by preference, against 4% who first consult documents they have to pay for.

Practically every discipline has its favourite open archives server....

Researchers consult these open archives regularly: 52% of those interviewed say they consult them at least once a week.

An active minority of 32 % use alert systems by mail or RSS....

[T]raining actions are indispensable and have tangible results in daily practice. Thus 46% of the users come from biology, but this alone does not explain the wide use....

The practice of open archives at Lyon 1 University is now more than an emerging use, even if it is not yet well established. The practice divides into two distinct but complementary types of behaviour: using the resources deposited by others, and contributing to the enrichment of these databases by depositing one's own scientific production. An active minority combines both. Among this active minority, 15.5% know titles of open access journals and 11% have already published some of their works on an open archives server, with pre-publication and post-publication equally represented. More than half of this minority (32%) belongs to medicine and 20.5% to biology.

It should be said that the urge to publish is clear among these researchers: 22% have already paid to publish some of their articles and 29% are ready to pay to be published.

The only brakes that make researchers hesitate are the fear of plagiarism and the lack of peer validation of spontaneously deposited documents. This leaves a fine opening for university libraries in France to select, promote and develop this open archives movement.

Tuesday, December 19, 2006

New IR at University of London's School of Advanced Study

The University of London School of Advanced Study (SAS) has launched an institutional repository, SAS-Space.  For details, see yesterday's announcement.

New OA journal of philosophy and theory

Shibboleths is a new peer-reviewed, open-access journal of philosophy and theory, especially in Caribbean culture.  The inaugural issue (September 2006) is now online.  From the site: 

We welcome submissions that seek to trace the history of these fields (including discussions of particular philosophers or theorists, specific phases and periods, and schools of thought), explore their development in particular socio-cultural contexts (e.g. African as opposed to Asian thought), and examine particular topics, problems or issues of interest to philosophers and theorists....

Though not exclusively Caribbean in focus, attempts to theorise the Caribbean, theories by Caribbean thinkers, and efforts to situate Caribbean thought within a wider socio-historical and intellectual context inevitably form an important part of what we do, given the roots of many of us involved in this project, and are particularly welcome. 

Rockefeller / InnoCentive deal fosters open innovation

The Rockefeller Foundation will pay the fees for selected non-profits to use InnoCentive to solve problems in the developing world.  From yesterday's announcement:

The Rockefeller Foundation and InnoCentive today announced that the Foundation will create a non-profit area on InnoCentive’s global scientific network, specifically designed to spur science and technology solutions to pressing development problems....

This “open innovation model,” which InnoCentive has pioneered in the “for-profit” arena using challenges posed by R&D-focused companies, will now be turned, for the first time, toward technological problems faced by poor or vulnerable people in the developing world. The new agreement is the first step in a larger Rockefeller Foundation initiative aimed at promoting innovation in a manner that spurs development, and that specifically increases access to proven innovation models for work on behalf of poor or vulnerable populations around the world....

Under the agreement announced today, The Rockefeller Foundation will select non-profit entities and others with charitable intent eligible to use the InnoCentive platform under preferred conditions, and will pay access, posting and service fees on their behalf to InnoCentive, as well as challenge awards to those researchers solving the technology problems the non-profits pose....

Notes on the Kansas OA meeting

Gavin Yamey has blogged some notes on the OA meeting at the University of Kansas Medical Center on October 6, 2006:

If you're looking for a primer on open access publishing --the benefits and challenges--you may be interested in watching a video recording of a one day seminar held at Kansas University Medical Center.

I should declare up front that I was one of the speakers. My talk was on how restricted access is impeding global health, a topic that I feel passionate about. Other speakers who advocated for open access were Heather Joseph, executive director of SPARC, Melissa Norton, medical editor at BioMed Central, and John Wilbanks, executive director of Science Commons.

John's talk was the most eloquent and persuasive account of the potential benefits of sharing science that I have ever heard. His organization, Science Commons, is working to advance science by "removing unnecessary legal and technical barriers to scientific collaboration and innovation." One of its demonstration projects is the Neurocommons, which is building on open access scientific knowledge to create a Semantic Web for neurological research. Now that's what I call open access in action.

Just so that I don't get accused of misrepresenting the seminar, I ought to tell you that some of the speakers were from "the rear-guard" of publishing. Daviess Menefee, the Director of Library Relations for the Americas at Elsevier, discussed Elsevier's for-profit subscription-based publishing model. Alice Ra'anan, Director of Government Relations and Public Policy for the American Physiological Society, outlined a non-profit society publishing model, also based on subscription fees.

I suspect, however, that the days of granting access to scientific and health information only to those rich enough to pay for it are well and truly numbered. Disseminating knowledge globally, and allowing everyone to use it freely, is a crucially important tool for human development.

Indeed, Professor Calestous Juma and I have recently published a paper, called Improving Human Welfare: The Crucial Role of Open Access, in the journal Science Editor. We agreed to publish the piece on the condition that we could make it freely available, and we've now done that by depositing a copy into the open archive E-LIS.

As we say in our article, "Developing countries are increasingly improving their capacity to use scientific and technical knowledge to solve local problems. They are investing in communication infrastructure and improving technology policies. For such measures to be effective, those countries also need greater access to the world’s pool of knowledge."

Open bibliographic data

Paul Miller, 'Open Data is not the point'? Oh yes it is, Panlibus, December 19, 2006.  Excerpt:

Over on One Big Library, Dan Chudnov provides a thoughtful post in response to our Open Data podcast. Although notionally on holiday, Richard beat me to a response this morning....

Whilst there is value to those libraries that already buy (or otherwise acquire) Library of Congress data and subscribe to various OCLC services in order to ease the flow of their bibliographic workflows, for many of them this value may not be readily apparent as the payments for these services simply disappear from some ring-fenced account every year as part of their ongoing operational budget. Whether the big pool of data upon which they draw is paid for by some other arm of their organisation or acquired Openly matters little to many of those actually putting the data to work in their catalogue today. That is not to say that they don't want open data, or that open data is not a better way to proceed. It simply recognises the reality that changing the financial model doesn't immediately impact in their world.

Casey's premise is that smaller institutions should be entitled to access bibliographic data - and software - previously beyond their means, and it is this that he sees as the important issue here. Good luck to him on that.

We have a vast quantity of high quality data. We're wasting it. Let's stop. Opening up access to the data allows existing stakeholders to do some interesting things. It also creates opportunities for whole new groups of beneficiaries to do things we haven't thought of yet....

Map data has been around for a very long time. Satellite imagery has been around for quite a long time. Even in countries like the US, where federally funded data were notionally free, third parties were heavily involved in adding value to the basic cartographic units. For the explosion in usage which we see today, it required a change in attitude and the involvement of Google et al. We went from essentially lousy but free or better but costly map data and ridiculously expensive satellite imagery to an explosion in the use of both, essentially free at the point of use.

More on redirecting funds from TA to OA

Heather Morrison, Transitioning to open access: beyond fear of change, Imaginary Journal of Poetic Economics, December 18, 2006.  Excerpt:

...In a recent very well-written post to Liblicense, Sandy Thatcher, Director of Penn State University Press, expresses the viewpoint that a critical mass of self-archived articles will cause sudden, nonlinear change, "a point at which major disruption to the system could occur, with at least very harmful short-term consequences". Sandy is concerned about the possibility of "a real 'tipping point,' which would lead some major publishers to abandon the field of STM journal publishing in the belief (however erroneous) that they could not sustain their expected profit margins under the new regime thus legislated".

Sandy goes on to point out that an exodus of a few major publishers from the system, due to perhaps erroneous beliefs about cancellations, would cause major disruption in the scholarly communications system. Sandy worries that journals would not be able to find new homes, because "smaller publishers, especially university presses, simply do not have the capital to launch the kinds of sophisticated systems that these major players can provide..."

What Sandy is overlooking is the fact that if a few large, highly profitable publishers were to suddenly decide to go into other businesses - there would suddenly be large amounts of capital available from library budgets, to support university presses such as Penn State. So if, in such a scenario, there are "plenty of professors, administrators, and librarians appealing to us to expand our journals program", as Sandy predicts - no worries, they will have lots of cash! ...

Welcome to SPARC Japan

SPARC and its Japanese partners have launched SPARC Japan.  From yesterday's announcement:

SPARC (Scholarly Publishing and Academic Resources Coalition) today announced the launch of SPARC Japan, a collaboration of Japanese academic institutions and scholarly societies working to promote and make widely available the work of Japanese researchers. SPARC Japan is an initiative of the Tokyo-based National Institute of Informatics (NII), a national research institute which unites Japanese academic librarians, scholars, researchers, universities and learned societies to support initiatives that improve scholarly communications in Japan.

SPARC Japan is supported by more than 600 university libraries affiliated with library associations, including those of national, private and prefectural/municipal university libraries. As an authorized SPARC affiliate in Japan, SPARC Japan will create initiatives that encourage improved access to Japanese research and support expanded institutional and scholarly community roles in, and control over, the scholarly communication process.

"Japan is among the top contributors to scientific research worldwide. The launch of SPARC Japan will promote access to Japanese research so that others may benefit from our progress," said Jun Adachi, SPARC Japan Director and Professor at NII. "SPARC Japan will partner primarily with Japanese scientific societies to make their research widely available, while our close association with NII will facilitate the enhancement of Japanese university libraries' open access institutional repositories. These initiatives have been officially recognized and approved by the government in its working report on the Japan Cyber Science Infrastructure, and the overwhelming support we have received so far indicates a great need in the scholarly publishing community that SPARC Japan has already begun to fill." ...

SPARC Japan has already lined up a number of publishing partners. They include the UniBio Press, a non-profit aggregation of biological journals such as the international journal Zoological Science and leading mathematical journals such as Tohoku Mathematical Journal, Japanese Journal of Applied Physics (JJAP), and others. (For a complete list of SPARC Japan partners, please see [here].) ...

Support for the new OA policy at the ARC

The Queensland University of Technology Open Access to Knowledge Law Project has announced its support for the new OA policy at the Australian Research Council.  From yesterday's press release:

Moves to encourage legal open access to all research funded under the Australian Research Council's Discovery Projects scheme are welcomed by Queensland University of Technology legal copyright academics.

Professor Brian Fitzgerald, who heads the Open Access to Knowledge (OAK) Law Project, funded by the Federal Government and based at QUT, said the ARC had announced it would encourage researchers whose projects it had funded to place data and publications in online, open repositories such as QUT's eprints.

He said researchers concerned about copyright issues when making their findings openly and easily accessible according to ARC requirements would find a comprehensive guide to copyright management strategies in the OAK Law Report published online this year.

"The OAK Law Report provides a blueprint for research copyright management that sets out the relevant stakeholders' copyright interests and ways to manage relationships between stakeholders to achieve effective open access....The copyright guidelines will provide examples of publishing agreements and model clauses consistent with open access principles," Professor Fitzgerald said....

First newspaper chain to use CC licenses will increase revenue, market cap

Lisa Williams, Newspaper Chain Goes Creative Commons: GateHouse Media Rolls CC Over 96 Newspaper Sites, PressThink, December 19, 2006.  (Thanks to Richard Baer.)  Excerpt:

Over the weekend, the Watertown TAB [newspaper] of Watertown, Massachusetts, revamped its website....[A]t the very bottom, a small silver badge with a line of text that reads: "Original content available for non-commercial use under a Creative Commons license."

That little badge is news. The TAB is owned by GateHouse Media, a newspaper conglomerate that owns 75 daily and 231 weekly newspapers. And the TAB isn’t the only paper that got a silver CC badge this week. Without fanfare, the company is rolling out Creative Commons licenses covering nearly all of the 121 dailies and weeklies they own in Massachusetts. The CC license now covers 96 of the company’s TownOnline sites, which are grouped within a portal for their many Eastern Massachusetts newspapers.

“I don’t know of any other newspaper or any MSM site for that matter, publishing under CC,” said Howard Owens, director of digital publishing for Gatehouse, in a comment on the TAB’s blog. “It’s really not a big change from how a lot of newspaper sites handle content — free non-commercial use, but generally only if you ask. This removes the middle man of asking, because now it’s explicitly stated that free non-commercial use is permitted.”

Mia Garlick, chief counsel of the Creative Commons Foundation, concurs: she’s not aware of any newspaper chains or major papers that are releasing content under CC. “For a major publisher with significant numbers of people reading to be doing this is great.” ...

Making it easier — and legal — for bloggers to quote stories at length means that bloggers are pointing their audience at the newspaper. Getting a boost in traffic from weblogs may have an impact on online advertising revenue, and links from weblogs also have an impact on how high a site’s pages appear in search results from search engines such as Google. Higher traffic, and higher search engine rankings build a site’s ability to make money on online ads....

GateHouse, which went public in October, saw its stock rise 20 percent in the first day of trading: investors were clearly treating GateHouse like an internet stock, not a newspaper play. The run-up in price made GateHouse the most valuable newspaper company in America, leading Dow Jones, Scripps, The New York Times Company and far above cellar-dwellers Gannett and Tribune. GateHouse’s move towards open source, open licensing, and open conversations is the biggest experiment to date in whether a media company with open source ambitions can walk hand in hand with Wall Street.

Support for the CIHR draft OA policy

Heather Morrison, BCLA response to draft CIHR policy, OA Librarian, December 18, 2006.  Excerpt:

The British Columbia Library Association response to the Canadian Institutes of Health Research consultation on the Draft Policy on Access to Research Outputs has been posted, and is available for download.

Highlights: BCLA congratulates CIHR for developing a draft policy that is considered exemplary, a role model for other funding agencies.

"This draft policy will make research funded by Canadian taxpayers readily available to many more Canadians, as well as to researchers around the world. In a province like British Columbia, this will particularly make a difference to students, faculty, high school students, and health care professionals outside of the major research centres. This policy is considered exemplary by open access advocates, for good reason. The support for open access to research data is particularly noteworthy. This is an area where Canada has an opportunity to become a world leader, advancing research in the medical arena more quickly for the benefit of all, and also advancing Canadian economic interests in the knowledge economy.

Several particular strengths of the policy we would like to highlight are the requirement to deposit peer-reviewed research articles immediately on publication, the encouragement to retroactively archive important articles, and the indication that a researcher’s track record in providing access will be considered with future grant applications."

Profile of the National Geospatial Digital Archive

Larry Carver, National Geospatial Digital Archive: A Partnership Network, a 52-minute webcast of a talk at the Library of Congress, December 6, 2006.  (Thanks to ResourceShelf.)  From the LOC description:

Geospatial data is information such as maps, imagery and data sets that help us better understand, manage and monitor change in the present while providing insight into the past. From the first colonial maps to the time-sequenced satellite imagery of the 21st century, cartographic information has helped define our view of the country and the world. Today, cartographic materials in digitized form are being collected across a broad spectrum of types. Preserving these digitized images in distributed Internet accessible archives will ensure perpetual access to data vital for disaster relief, resource management, management of environmental policy, analysis of population demographics, education and teaching, plus countless other areas of public interest. To preserve this enormous amount of digitized data, the Library of Congress, the University of California, Santa Barbara and Stanford University have partnered to form the National Geospatial Digital Archive.

Larry Carver is the director of Library Technologies and Digital Initiatives at the University of California, Santa Barbara, and the principal investigator of the National Geospatial Digital Archive partnership for the National Digital Information Infrastructure and Preservation Program. NDIIPP is the Library's national program dedicated to the preservation of America's digital heritage.

New blog on medical OA and licensing

Gavin Baker has created a new blog, Essential Medicines News, that will cover OA to medical research along with topics like the law, policy, and licensing of essential medicines.  He's looking for co-contributors.  If you're interested, read his inaugural post and drop him a line.

More on OA to NASA data

David Shiga, Google and NASA pair up for virtual space exploration, New Scientist, December 18, 2006.  Excerpt:

...Google has already produced interactive maps of Mars and the Moon by combining their own software with NASA imagery (see NASA and Google bring Mars to PCs everywhere).

Now, NASA and Google have signed a Space Act Agreement that will see the two organisations cooperating to make more NASA data accessible to anyone on the internet.

"As we go back to the Moon, as we go on to Mars, as we go to near-Earth asteroids, we want every person not only in America but throughout the world to be able to travel with us and to feel the excitement of what it feels like to be on a new planet," says S. Pete Worden, director of NASA's Ames Research Center in Moffett Field, California, US.

The collaboration could eventually lead to capabilities resembling those of the "holodeck" in Star Trek, Worden says. As future robots explore the surface of Mars, this technology "would enable people to feel the crunch of Martian soil underneath their feet as the robots move around – maybe feel the Martian wind on your face," he says.

In the nearer term, the collaboration will make more of NASA's Moon and Mars imagery available for online exploration. Some Mars imagery is already accessible in 3D through a programme NASA developed called World Wind (see Space exploration program is out of this world)....

Even though it is theoretically in the public domain, much of the data from NASA missions is not readily available to the public because it is not in an easily usable form, says Ames business development director Chris Kemp.

He says NASA and Google are set to change that. "We want to make that information as useful and as accessible to everyone as possible," he says.

Monday, December 18, 2006

Open access and handicap access

Brian Kelly, Accessibility and Institutional Repositories, UK Web Focus, December 12, 2006.  Excerpt:

There has been some discussion on the JISC-Repositories JISCMail list (under the confusing subject line of “PLoS business models, global village”) on the issue of file formats for depositing scholarly papers. Some people (including myself) feel that open formats such as XHTML should be the preferred format; others feel that the effort required in creating XHTML can be a barrier to populating digital repositories, and that use of PDF can provide a simple low-effort solution, especially if authors are expected to take responsibility for uploading their papers to an institutional repository.

An issue I raised was the accessibility of resources in digital repositories. There are well established guidelines developed by WAI [Web Accessibility Initiative] which can help to ensure that HTML content can be accessible to people with disabilities....

WCAG 1 ... requirements...[are] pretty unfriendly towards PDFs, I would argue. WCAG 2.0 (which is in draft form) is, however, neutral regarding file formats....

Milestone for Hindawi

Hindawi now publishes more than 50 OA journals.  From today's announcement:

The Hindawi Publishing Corporation is pleased to announce that its open access journal collection has increased to more than 50 titles following the launch of the International Journal of Aerospace Engineering, the International Journal of Navigation and Observation, and Science and Technology of Nuclear Installations. These three new journals will join Hindawi's growing open access collection, which currently includes titles in a wide range of subjects from engineering and mathematics, to biomedicine, chemistry, and materials science.

The launch of these new titles marks the further expansion of open access publishing into new areas of scientific research. "This is the future," explained Ramesh Talreja, Editor-in-Chief of the International Journal of Aerospace Engineering, "the open access concept is particularly timely for the aeronautics and astronautics field." In addition to providing unrestricted access to the full text of all published articles, all of Hindawi's open access journals allow authors to retain the rights to their work, by releasing their articles under a Creative Commons Attribution License.

Following the rapid expansion of Hindawi's open access collection in 2006, during which the collection increased from 12 titles to their current number of 52, we are planning further expansions in 2007. Hindawi is confident that its open access business model is scalable across a wide range of scholarly disciplines.

PS:  And by the way, the Hindawi OA journal program is profitable.  Kudos to Hindawi on this year-end news.

New version of Bailey bibliography

Charles W. Bailey Jr. has released version 66 of his monumental Scholarly Electronic Publishing Bibliography. (Note the new URL.) The new version cites and organizes over 2,830 print and online articles, books, and other sources on scholarly electronic publishing.

Give the gift of access

Bill Hubbard of SHERPA has made a Christmas card for researchers who are or ought to be self-archiving. 

PS:  Thanks, Bill.  And may you have a happy and healthy new year.

Open Access: Threat or Menace?

The Fifth Annual UK Journal Publishers’ Forum has picked the theme for its February 2007 meeting:

Apocalypse Now? The consequences of system collapse

Each year the PA, ALPSP and STM come together to sponsor a debate among UK journal publishers on events over the last year and to appraise our prospects for the future. The event is open to publishers only....

With UKPMC about to come on stream, the FRPAA heading for law in the US, and the OA debate opening up in Europe and in the Commission, what might be the impact on the journal system of self-archiving mandates with only six month embargoes attached? What happens if the hitherto progressive rate of change accelerates towards collapse of the established system of peer review and research dissemination?

What messages can we use to advocate evolutionary, organic change and to avert the potential for collapse? Can scholarly communication operate without journals? Are these forces for change really working for the public good?

Come along and be challenged to think through our future. Have your say in a lively ‘question time’ style of debate. Help to refine our message and identify where we should be doing more and where we could better serve our authors and readers.

Australian govt funds OA repositories

Julie Bishop, Australia's Minister for Education, Science and Training, has allocated $25.5 million to build OA repositories at Australian universities as part of the country's new Research Quality Framework (RQF).  Here's the key part of today's press release:

$25.5 million - Australian Scheme for Higher Education Repositories [ASHER] programme - to assist with the establishment of university digital data storage systems that will allow research outputs to be submitted for RQF assessment. This programme builds on the Australian Government’s $35 million investment in the research and development of data repository technology, funded as part of Backing Australia’s Ability.

(Thanks to Colin Steele.)

Yesterday's chat on synergies between wikis and the OA movement

The chatlog from yesterday's Wikimedia Open Access Chat is now available.  When you click on the link, the log downloads as a text file.

Michael Geist's A-Z on 2006

Michael Geist, The Letters of the Law: The Year in Canadian Tech Law, Toronto Star, December 18, 2006.  Excerpt:

This past year in law and technology has been marked by a series of noteworthy developments including the explosive interest in user-generated content, the emergence of several artists-backed copyright coalitions, and the arrival of Industry Minister Maxime Bernier, who has focused on reshaping Canadian telecommunications regulations.  From A to Z, it has been a remarkably busy twelve months....

O is for open access, the growing movement that focuses on providing the public with greater access to research and other academic publications.  Several Canadian funding agencies, including the Canadian Institutes of Health Research, introduced draft guidelines in 2006 that require researchers to make the results of their research available in an open access manner within months of initial publication....

Another look at Scholarpedia

Will Scholarpedia Pass or Fail?  Linux Insider, December 17, 2006.  Excerpt:

Have we reached another milestone in the evolution of academic publishing? Scholarpedia is the first "free peer-reviewed encyclopedia," a kind of morphing of open access (OA) publishing with wiki technology.

Initial reaction may be, not another Wikipedia wannabe!, especially as the ink is barely dry on Larry Sanger's Citizendium manifesto, which he describes as a "progressive fork" of Wikipedia. Scholarpedia could be very different, however....

Although suffering from a few gremlins when the blogosphere took a look, Scholarpedia could disrupt publishing models.

For a start, it takes the headache out of setting up and maintaining an online publishing operation for scholars inclined to develop their own OA journal....

Anyone can suggest changes to an article, and there's an anonymous forum for initial peer review. Scholarpedia appears far more inclusive than Citizendium and less obsessed with creating something worthy of "intellectuals."

With concerns continuing to mount about errors in Wikipedia (many put there for malicious reasons) and even hackers using it to hide malware, then something more managed and controlled like Scholarpedia may well be an answer to freely available scholarship online....

Comment.  Two quick ones:  (1) Scholarpedia launched eight months before Citizendium, not after it.  (2) If Scholarpedia is superior to Wikipedia, then let's find a way to praise it, even as "an answer to freely available scholarship online", without leaving the false impression that the large and growing body of peer-reviewed OA literature suffers from the same problems as Wikipedia.

What does it mean for a university to be open?

Ethan Zuckerman, Charles Nesson’s lunch at Berkman: what does it mean for a university to be “open”?  My heart's in accra, December 12, 2006.  Excerpt:

Charlie Nesson - who introduces himself as “Charles the Infuriator” - is chewing on an interesting new train of thought: the openness of universities. Charlie is the founder of the Berkman Center and often drags our center in interesting new intellectual directions. Larry Lessig dedicated Code and Other Laws of Cyberspace to Charlie, declaring, “For Charlie Nesson: Whose Every Idea Seems Crazy… For About A Year.” ...

Charlie is organizing a conference at the University this summer, asking the question, “How Open Will Harvard Be to Internet and Society?” Some examples make it a bit easier to understand what he’s asking:

- At Harvard’s business school, it’s forbidden to use Google to “solve” a case study by figuring out how the business actually turned out. Is this a broken educational model, where you need to shut out the openness of knowledge to make your teaching methods work? This expands to a larger question: do we need to rethink how classrooms work in an era where everyone is capable of being in an online space at the same time as they’re in a physical space?

- Does it make sense for scholars to use public money to do research, then hand that research over to a private company, which farms it out to other scholars who perform peer review - for free - then binds and sells the research for an awful lot of money? This model may have made sense years ago, but does it make sense in a digital age, or should Harvard move towards an Open Access model for publishing?

- How does corporate or government funding of research work to make parts of universities open and closed? How do we feel about closing off areas of knowledge due to the constraints of funding?

- Universities like MIT have taken big steps towards making their courses and software open and accessible to the wider world. Why have so few American universities embraced what’s available through these repositories?

Given these questions, Charlie invites the room to suggest their topics for this conference. The room is packed, and we get a wide range of ideas offered: ...

Charlie offers his own question at the end of the brainstorm: Is the deal between Google and universities a good deal for the university world? He’s referring to the deal where Google has agreed to digitize a large portion of Harvard’s library and make the works available to the web via their search engine, and to Harvard as well. Charlie points out that the terms of the deal between Google and Harvard are secret, but that the deals with University of Michigan and the University of California are available online - he’s read the Michigan deal and refers to it as “one of the worst contracts I’ve ever read”.

The questions Charlie and many others are asking about the Google deal have to do with whether the terms of the deal open up knowledge, or tether it to a single provider....

PS:  This is an excellent set of questions.  Every university should be discussing them.

Update. A podcast of Nesson's talk is now online.

A Spanish take on open science

Antonio Lafuente, Ciencia 2.0, Madri+d, undated but apparently December 2006.  On the rise of open access, open data, and open science (in Spanish). 

Presentations on the European Digital Library

The presentations from the First EDLproject Workshop on developing the European Digital Library (Vienna, November 27-28, 2006) are now online.

OA archiving in law

Carol A. Parker, Institutional Repositories and the Principle of Open Access: Changing the Way We Think About Legal Scholarship, a preprint forthcoming from the New Mexico Law Review, Vol. 37, No. 2, Summer 2007.  (Thanks to Law Librarian.) 

Abstract:   Open access to scholarship, that is, making scholarship freely available to the public via the Internet without subscription or access fees, is a natural fit for legal scholarship given our tradition of making government and legal information available to citizens, and the many benefits that flow from freely disseminating information for its own sake. Law schools, journals and scholars should espouse the principle of open access to legal scholarship, not only for the public good, but also for the enhanced visibility it provides journals and authors. Open access can be accomplished by archiving digital works in online institutional repositories. Legal scholars have enjoyed the benefits of open access to working paper repositories such as SSRN for more than ten years - even if they have not thought of this practice as 'open access.' It is a natural progression for legal scholars to now self-archive published works as well, and they are beginning to do so as awareness grows of the benefits of providing open access to published legal scholarship. Institutional repositories provide new ways to publish student scholarship, empirical data, teaching materials, and original historical documents uncovered during the research process. Author self-archiving does not threaten the existence of law school-subsidized journals, and institutional repositories generate new audiences for legal scholarship, including international and multidisciplinary audiences. Not insignificantly, repositories also help preserve digital work. Law schools are discovering that the publicity and download counts generated by repositories provide new ways to measure scholarly impact and reputation. Approximately 40% of U.S. law schools now have some form of institutional repository, all of which are indexed by Internet search engines. Law schools seeking to establish institutional repositories enjoy a variety of options to choose from, ranging from proprietary applications like Digital Commons, SSRN's Legal Scholarship Network, the Berkeley Electronic Press' Legal Repository, and NELLCO's Legal Scholarship Repository, to open source applications like EPrints and DSpace.
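PS:  One reason repositories built on platforms like EPrints and DSpace get indexed so readily is that they expose their metadata through the OAI-PMH protocol, which any harvester or search engine can poll. As a rough sketch only (the XML below is a made-up minimal response, not taken from any real repository, and `extract_titles` is a hypothetical helper), here is how a harvester might pull titles out of an OAI-PMH ListRecords reply:

```python
import xml.etree.ElementTree as ET

# A minimal, invented OAI-PMH ListRecords response; real EPrints and
# DSpace installations return the same overall structure.
SAMPLE = """<?xml version="1.0"?>
<OAI-PMH xmlns="http://www.openarchives.org/OAI/2.0/">
  <ListRecords>
    <record>
      <metadata>
        <oai_dc:dc xmlns:oai_dc="http://www.openarchives.org/OAI/2.0/oai_dc/"
                   xmlns:dc="http://purl.org/dc/elements/1.1/">
          <dc:title>Open Access and Legal Scholarship</dc:title>
          <dc:creator>Doe, Jane</dc:creator>
        </oai_dc:dc>
      </metadata>
    </record>
  </ListRecords>
</OAI-PMH>"""

# Namespace prefixes used in the search paths below.
NS = {"dc": "http://purl.org/dc/elements/1.1/"}

def extract_titles(xml_text):
    """Collect every dc:title value from a ListRecords response."""
    root = ET.fromstring(xml_text)
    return [el.text for el in root.findall(".//dc:title", NS)]

print(extract_titles(SAMPLE))  # ['Open Access and Legal Scholarship']
```

In practice a harvester would fetch this XML from the repository's OAI-PMH endpoint (with a verb=ListRecords query) and page through resumption tokens; the parsing step is the same.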

The case against the Google journal-digitization project

Dorothea Salo, Control your bits, Caveat Lector, December 17, 2006.  Excerpt:

Well, here’s a position I never thought I’d find myself in: disagreeing with Peter Suber. In his comments to the news that Google is offering to digitize journal backruns for free, he says that he doesn’t see any downside for publishers who don’t already have a digitized backrun.

I do. I see a ton of downside, so much downside that I don’t think any self-respecting journal should take this deal. I do agree with Suber that should Google’s offer be accepted by a lot of publishers, open access would benefit hugely, at least in the short term —and to be honest, knowledge of that immediate short-term benefit is making it very hard for me to write this post....

My stubborn objection to the shape of this deal stems from my ebook days, and boils down to this: never, ever, EVER agree to a digitization deal that doesn’t leave you in control of a copy of the bits....

According to Suber, Google isn’t demanding de jure exclusivity. You don’t like what Google does to your journal, you’re free to shop it around elsewhere for re-digitization. Looks good on the surface, but let’s be real here: what library or other digitization shop is going to work with a journal that’s already done a Google run, unless the journal coughs up a whale-load of cash? ...

Quality? I scoff. We know from the book project that Google is doing crappy work.  We’ve seen it. And that’s just the scanning! We also know they’re not going to proof their OCR results, much less mark them up. (Has Google even heard of the NLM DTD suite, I wonder?) Journal publishers can do better, and should if they consider themselves responsible agents of scholarly communication....

There’s no way that I see for a publisher to withdraw its material from Google once the contract is signed; as an OA advocate, I love that, but if I were a publisher, I’d hate and fear it....

Preservation? Google isn’t signing up with CLOCKSS or Portico that I’ve heard, nor is it allowing its publisher partners to do so....

Comment. It may seem odd that I agree with nearly all of Dorothea's critique here and disagree only with her conclusion.  Yes, it's better to control the bits than not (which I said in my last post); the journal scans should be higher in quality than the book scans; and preservation matters.  We agree that the deal could be better than it is, even if we disagree about the amount.  But the next question is whether it's better than nothing.  I think it clearly is.  For many journals, it's this deal and OA via Google or it's no digitization and no OA for the indefinite future.

In short, improving the deal would be better than accepting it in its current form, but accepting it as is would be better than rejecting it.

Dorothea rightly distinguishes the appeal for OA advocates from the appeal for publishers, and I should be more precise.  When I say the deal is better than nothing, I mean that it's better by a wide margin for OA and better by a somewhat smaller margin for publishers.   But it's only worse than nothing for publishers who don't want OA for their back runs or who have a better way to get it. 

Dorothea seems to admit that most publishers don't have a better way to get it.  That is, we agree that most journals won't be able to take advantage of Google's non-exclusivity, since they won't be able to find another partner with the cash to pay for re-digitization.  (I've made the same argument about the non-exclusivity of Google's book-digitization program.)  But we have to see which way this consideration cuts.  If publishers don't have a better way to digitize their back run for OA, and they want OA for their back run, then accepting this deal is better than rejecting it.

Conceivably Google's offer will elicit better offers from rivals.  That's roughly what happened with the OCA, which not only gave its digitization partners better terms than Google but also pressured Google to liberalize its own terms (e.g. permitting printing and downloading for public-domain books when it previously barred them).  That would be wonderful, and journals could help the cause by talking to the OCA.  But it doesn't change the balance for the current deal considered on its own.

Update (December 19, 2006). See Dorothea's response to my comments, focusing on ways that publishers might get a better deal from Google or find other digitization partners.

Sunday, December 17, 2006

Two OA developments in France

An update from Hélène Bosc, convenor of the workgroup on scientific publishing at Euroscience:

[1] In October 2006 the French Universities, Grandes Ecoles, and research organisations (including CEMAGREF, CIRAD, CNRS, INRA, INRIA, INSERM, the PASTEUR Institute and IRD) signed a draft agreement on adopting a common online deposit platform for all French research publications and scientific writings. This common platform is based on the CCSD's HAL software.

[2] INRA's Institutional bibliographic database PRODINRA will soon be an Institutional Repository. It will then be able to communicate and feed into HAL directly. To accompany the implementation of its Institutional Repository PRODINRA, INRA's Scientific Information Directorate commissioned a study on self-archiving practice, "Open Archives: Towards A Deposit Mandate?" by Dominique L'hostis & Pascal Aventurier. The study is a synthesis of the information on existing implementations, researcher practices and the role of institutions.

The study is accessible at Archivesic [here] and at Prodinra [here].

A remark: The PRODINRA database contains 100 000 bibliographic references (metadata) for publications by INRA researchers, with a growth rate of 5000 per year. If INRA adds to PRODINRA the "EMAIL EPRINT REQUEST" button (see its explanation in the report), it will make it possible for INRA users to request and INRA authors to provide immediate free access.

This is not yet Open Access, but for researcher purposes, it is almost as fast, and almost as effective, during the first year of newly published findings.

Google's offer to digitize journal back runs for OA

Google is offering to digitize and provide OA to the back runs of scholarly journals.  The terms of the offer are not online, as far as I can tell, but here's an excerpt from Google's Overview and FAQ.

While many publishers and organizations are working on bringing journal collections online, a substantial fraction of scholarly journals are currently offline and may remain offline for the foreseeable future. Google is offering publishers an archival journal digitization program to bring these archival collections online and to make them more accessible.


  • Publishers maintain copyright and ownership of their content. Publishers select the journal volumes to be digitized.
  • This service is free and will make articles from the selected issues fully [and freely] accessible to all users.
  • Hosted pages that display journal articles will include publisher co-branding/logo and a link back to the publisher’s website.
  • This program is non-exclusive. There is no restriction on redigitization of this material or on working with other partners.

Additional Details:

  • The digitized journal articles will be included in Google search indices including Google Web Search and Google Scholar.
  • Publishers can create a table of contents on their website and link to their digitized articles in Google, allowing users to browse their archives from their website....
  • Publishers have the option to include Google's ads on the hosted pages that display the journal articles. This is set up as a revenue share between the publisher and Google.

Frequently Asked Questions

Q. Does this mean all my journals have to be offered openly accessible?
A. No, you can choose the journal volumes to be included. The goal of this program is to bring archival journal collections online.

Q. Can you provide us with the digital files?
A: We are unable to provide the digital files unless we terminate the service or the agreement. We would like to point out that this is a non-exclusive program....

Q: Will Google provide us with reporting tools so I know how many users are reading my digitized content?
A: Yes, we will provide tools for publishers to view the usage of their content.

Q: What will users see when they find my digitized articles?
A: Users will see the full digitized article and will have an option to download PDFs for reading or printing. The pages that display the article will also include the publisher's
logo and url....

Comments.  I've been hoping to see this development ever since Google started digitizing books two years ago.

  1. The offer could be better, but it's definitely worth taking.  I'd like to see open access to cut-and-pasteable text, not just to images.  (I've said the same about Google-scanned books.)  I'd also like to see journals get their own copies of the files so that Google wouldn't be the only host.  But then, of course, the files would be indexed by Yahoo, Microsoft, and Scirus as well.  Google naturally wants some return on its investment, and a period of exclusivity, and this provision of the contract shows the effect of the profit motive.  If the OCA gets into journal digitization, and I hope it will, I suspect that it would be delighted if the resulting files were indexed by every search engine on the planet.  Meantime, free online access through Google alone is much better, for authors, readers, and journals, than no free online access at all.
  2. I don't see a downside for a journal that doesn't already have a digitized back run.  Google will charge nothing to do the job, the contract is non-exclusive, and it needn't include current issues.  The only journals that might hesitate are those already selling online access to their back runs and happy with the revenue.
  3. Because the offer targets non-OA issues, it could bring about OA to virtually the whole corpus of journal literature that isn't already OA.  And it could do this without compelling non-OA journals to convert to OA for their newer issues. 
  4. The offer is new and I don't yet know of any journals taking Google up on it.  If you hear of any, please let me know.  Meantime, if an important journal in your field hasn't yet digitized its back run, tell it about this offer. 

Finding hidden gems in repositories

Bill Warters, Finding Hidden Gems in Online Databases & Repositories, a presentation in the Wayne State University series, Emerging Technology for Scholars.  Also listen to the 52 minute podcast.  (Undated but sometime in December 2006.)

How open is your country's budget?

The Center on Budget and Policy Priorities has published the Open Budget Initiative 2006.  (Thanks to the Scout Report.)  From the site: 

On October 18, 2006 civil society organizations from 59 countries around the world unveiled the Open Budget Index.  This is the first index to rate countries on how open their budget books are to their citizens.  It is intended to provide citizens, legislators, and civil society advocates with the comprehensive and practical information needed to gauge a government’s commitment to budget transparency and accountability.  Armed with this kind of information, lenders, development advocates, and aid organizations can identify meaningful budget reforms needed in specific countries to combat corruption and strengthen basic services to improve people's lives.

Scholarly wikis for rapid dissemination

Jon Gresham, Social Software and Research Dissemination: E-Speed is Useful, Electronic Journal of Sociology, 2006. 

Abstract:   Fast and wide dissemination of research promotes successful discussion, debate and dialogue. This paper describes internet-facilitated discussion on ethno-religious research as one component of a communication plan. International organizations asked me to keep them informed on my research when I began a pilot study in Iraq after the 2003 war ended. I began a "wiki" (an open co-authoring forum, and collaboration tool) as a public place to post concept pieces and research-in-progress reports and to organize internet links and resources. The wiki rapidly became a no-cost discussion arena for scholars, practitioners, and the public about social and political systems. This collaboration became global, with often more than one hundred daily readers! Research application is important, and there is no substitute for multi-disciplinary live discussion, archiving online the facts and opinions for future reference, with print versions supplemental instead of primary. While this style of e-review and e-reporting will not take the honoured place of print publishing, it certainly should be considered for rapidly disseminating research in progress, exploring theoretical challenges, and providing resources for practitioners.

PS:  I don't want to speak for Gresham, but he might have meant to say that the aspect of traditional publication with an honored place, not likely to be displaced by wikis, is peer review, not print.