News from the open access movement
Amedeo is challenging physicians to write an OA textbook on tuberculosis. From the Amedeo Challenge:
For more of Amedeo's position on OA medical textbooks, see its philosophy page:
Tom Roper has blogged some notes on Arne Jakobson's talk on OA at the recent HSLG meeting, Thinking the Unthinkable: Open Access (Kilkenny, February 23-24, 2006). Excerpt:
Bernard Barrett opened the 4th conference of HSLG, saying, in introducing the first session on open access, that it was up to us to change things. Mary Burke of UCD chaired the session and referred to the words of the Budapest declaration on open access on combining old traditions and new technology. She introduced Arne Jakobson, President of EAHIL. Arne spoke on open access and institutional repositories (IRs). EAHIL supports both, he said, but he was going to concentrate on IRs. He referred to the Budapest and Berlin statements, and to RCUK's, though also to the controversial Royal Society statement. He gave some figures on numbers of IRs based on ePrints, OAIster and OpenDOAR. He introduced the audience to the distinction between pre- and post-prints. He listed a number of benefits of IRs: increased visibility of an institution's research, no delay, better availability, and secure and sustainable storage. Libraries are the natural hosts for IRs; storage and software costs are low. The challenges are cultural.

Arne then illustrated his point by describing the development of the Oslo repository, DUO. They will require all postgraduate theses to be submitted electronically from 2007. For electronic journal articles they have a sister project, FRIDA, and deposit is mandatory for scientific staff. Nationally they have NORA, the Norwegian Open Research Archive. He concluded that a strategic plan was important, and that it was difficult to change scientists' and postgraduates' behaviour.

Questions: Who adds metadata? Arne said that researchers do so on submission, using a simplified set of subject headings. What about peer review? Items in FRIDA are peer-reviewed by definition, because they are all post-prints. What's the biggest block to compliance? Researchers' time is short and workload heavy. Arne is considering offering financial incentives for submission. Funding bodies are not yet insisting on deposit in IRs. Citation counts? The database will contain citations to journals. It's clear open access increases visibility.
You've probably heard the news that Perfect 10 won a preliminary ruling in its copyright lawsuit against Google. A U.S. District Court ruled on February 17 that Google's display of thumbnail images from the Perfect 10 web site is likely not fair use. I won't be blogging this story in depth, but I may blog occasional pieces arguing that the case has implications for the Google Library project. Here are two, pulling in opposite directions.
In the New York Times this morning, Edward Wyatt quotes representatives of the Authors Guild and the Association of American Publishers (AAP), who believe the Perfect 10 decision helps their separate lawsuits against the Google Library project. Excerpt:
"I think it takes the wind out of their sails," Jan Constantine, the general counsel for the Authors Guild, said of the Perfect 10 decision. The guild and the Association of American Publishers brought copyright infringement lawsuits against Google over its Book Search program. Michael Kwun, litigation counsel for Google, disagreed, saying that the case "will affect only searches related to Perfect 10, and will not have any effect on other Google products."...Allan R. Adler, a vice president for governmental and legal affairs at the Association of American Publishers, said the California court's willingness to rule against the Arriba Soft precedent under a different set of facts was encouraging to the publishers' group, as was the judge's statement that the public benefit of Google's search engine does not necessarily outweigh the rights of copyright holders. "Google is going to have a difficult time arguing that there isn't a marketplace for publishers to license their works" given the Perfect 10 decision, Mr. Adler said. Neither the Perfect 10 case nor the Arriba Soft case are direct precedents for the Book Search lawsuits, which were filed in Federal District Court in New York. But Mr. Adler said that even if the publishers do not assert that there is currently a market for the few lines of text displayed by Google Book Search, the fact that a market exists for the digital copies created by Google could work in the publishers' favor. Ms. Frank agreed. She noted that the judge in the Perfect 10 case further differentiated that case from Arriba Soft by noting that Google's AdSense program allows it to generate revenue from its search technology.
By contrast, the Electronic Frontier Foundation (EFF) emphasizes the narrowness of the judge's ruling and the principles within it that support search indexing and even the permissionless display of low-res thumbnail images. Excerpt:
[The ruling] will be remembered as a little bad for Google, but a lot good for the Web....First, the court firmly rejected the notion that in-line linking of images directly infringes a copyright owner's public display right. That's a huge victory for the World Wide Web, which has long relied on in-line linking. Had Perfect 10 won on this point, every in-line link could potentially trigger automatic liability unless you got prior permission for the link....Second, the court rejected Perfect 10's secondary liability arguments. Basically, Perfect 10 argued that because Google "created the audience" for infringing websites, it should be held responsible for the infringements on those sites. Imagine that -- because you help someone find a site, you're held responsible for what happens on that site? That would have been a catastrophe not only for search engines, but for linking generally....Third, the court reasoned that merely visiting a website that includes infringing material does not make you an infringer. When you visit a website, your browser makes a copy of images in its cache. According to Perfect 10, that means every person who views a webpage that includes an infringing image becomes an infringer....The court rejected that argument, pointing out that most people don't treat their browser cache as a repository for infringing goodies, and concluding that copies made automatically by your browser are probably fair uses. So that's three major victories for the Web at large. Now what about the bad fair use ruling? While I don't agree with the court's analysis, let's start by examining how narrow it really is. First, the court is not condemning all thumbnails created by image search engines. In fact, the court can't do that because the Ninth Circuit (whose precedents bind the district court here) has already approved that practice as a fair use in the Kelly v. Arriba Soft decision. So the court's ruling only tells us that there is a line out beyond Kelly v. Arriba Soft that search engines may not cross. Second, the court did not announce any new fair use legal principles....So the fair use ruling really boils down to one fact-bound question: what distinguishes Google's thumbnails from Ditto's (the search engine in Kelly v. Arriba Soft)? Two things, according to the court: (1) Google's ability to share ad revenues from the infringing sites, thanks to AdSense, and (2) Perfect 10's deal with Fonestarz to provide low-rez images for cellphones.
David Worlock, OUP: OA In The World Of Intelligent Experiment, EPS Insights, February 24, 2006 (accessible only to subscribers). Excerpt:
Oxford University Press's Open Access programme is a model of pragmatic experimentation, and repositions the publisher as the intermediary of choice who picks the publishing and revenue models most appreciated by the widely differentiated sub-sector markets of STM - and social sciences and humanities....The difference between fundamental research and work on research technique can be demonstrated by a willingness in the former to pay the access bill, and in the latter to go for conventional publishing. These are the type of symptomatic lessons that Oxford are learning through a steady process of experimentation since 2003....In some areas of molecular biology some 30-40% of submissions in some journals are author-paid, and Oxford has accepted around 1,000 articles on these terms. It is probable that only BioMed Central have accepted more. Oxford's tariff is a moderate one: UKP800 (USD1,500) for authors based in a subscribing institution, UKP1,500 (USD2,800) where the author is based in a non-subscribing institution. As 'free' articles mount in the journal's portfolio, so subscription levels fall. OUP have successful journals where OA author-paid articles are 30% of content, and subscription pricing has declined pro rata.... It seems likely that the first result will be a demonstration of the complexity of the market - some OUP journals will be 100% OA one day - others will never get underway. Oxford Open will prove the poverty of generalisation.
Linda O'Brien, E-Research: Strengthening institutional partnerships, University of Melbourne UniNews, February 20, 2006. Excerpt:
Whether it’s e-research in Australia, cyberinfrastructure in the USA, the grid in Europe, or e-science in the UK, a transformation is occurring in research practice, a transformation that will have a profound impact on the roles of researchers and information professionals working in higher education....Arguably technology is the easy part; harder is the human dimension. The matter of connecting people (researchers) to resources is not only an international issue but also a national, regional, and local issue. Linking people to resources – researchers to scholarly materials – has been the role of the librarian for centuries. Libraries have traditionally been central to the research endeavour, managing and preserving resources increasingly in digital form and making these resources accessible to the researcher, often through collaboration and partnerships with other libraries. Hence, libraries have know-how not only in managing, making accessible and preserving scholarly resources but also in forming federations and collaborations to share published scholarly work. But the nature of scholarly communication is changing, with researchers wanting access to primary research data, often in digital form. No longer is scholarly communication a final discrete publication that is to be managed, made accessible, and preserved. Libraries may even risk fading from existence if they don’t respond effectively to the changing environment. In e-research, it is the primary research data that must often be managed, made accessible, and curated....But who will take responsibility for the longer-term curation of and access to this data?
PS: This article first appeared in Educause Review for November/December 2005.
Gervase Markham, Free software? You can't just give it away, London Times, February 21, 2006. (Thanks to Seth Johnson.) If you've been defending open-access literature for long, you've encountered your share of incredulity at the very idea. Enjoy Markham's encounter with incredulity at the very idea of open-source software. Excerpt:
Who could possibly be upset with the Mozilla Foundation for giving away its Firefox browser? One of my roles at the Mozilla Foundation relates to copyright licensing. I'm responsible for making sure that the software we distribute respects the conditions of the free software licences of the underlying code. I'm also the first point of contact for licensing questions....A little while ago, I received an e-mail from a lady in the Trading Standards department of a large northern town [in the UK]. They had encountered businesses which were selling copies of Firefox, and wanted to confirm that this was in violation of our licence agreements before taking action against them. I wrote back, politely explaining the principles of copyleft – that the software was free, both as in speech and as in price, and that people copying and redistributing it was a feature, not a bug. I said that selling verbatim copies of Firefox on physical media was absolutely fine with us, and we would like her to return any confiscated CDs and allow us to continue with our plan for world domination (or words to that effect). Unfortunately, this was not well received. Her reply was incredulous: "I can't believe that your company would allow people to make money from something that you allow people to have free access to. Is this really the case?" she asked. "If Mozilla permit the sale of copied versions of its software, it makes it virtually impossible for us, from a practical point of view, to enforce UK anti-piracy legislation, as it is difficult for us to give general advice to businesses over what is/is not permitted." I felt somewhat unnerved at being held responsible for the disintegration of the UK anti-piracy system. Who would have thought giving away software could cause such difficulties?...In a world where both types of software exist, greater discernment is required on the part of the enforcers. 
I hope this is the beginning of the end of any automatic assumption that sharing software with your neighbour must be a crime.
Gavin Baker is running for the Student Senate at the University of Florida. Baker co-founded the Florida chapter of Free Culture and is making open access a campaign issue. His platform is offline at the moment (but the problem is probably temporary, so keep trying). He tells me in an email:
I'll advocate for open access to university research and journals, work to expand library digitization projects, promote open source software and open file formats....As far as I know, I am the first student to make open access an electoral issue.
He's the first as far as I know too. His candidacy and position could make a difference: At UF, the Student Senate controls an $11 million budget, the third largest in the US. Go, Gavin!
Note to students elsewhere: learn about open access and what you and your university can do to promote it. Take your commitment into your research and future career. But in the meantime, take it into your student government!
William R. Riedel, R. David Polly, and Whitey Hagadorn, Coming of Age: ISI & Googling, Palaeontologia Electronica, February 2006. (Thanks to Bruno Granier.) Excerpt:
Palaeontologia Electronica has taken two major steps this past year: contributors have been actively probing the potential of the World Wide Web to further paleontology, and ISI began indexing the journal in its Science Citation Index and Web of Science.
The February issue of ScieCom.Info is now online. Here are the OA-related articles.
Implementation Science is an independent, Open Access journal hosted by BioMed Central. From the inaugural editorial:
Implementation research is the scientific study of methods to promote the systematic uptake of research findings and other evidence-based practices into routine practice, and, hence, to improve the quality and effectiveness of health services and care. This relatively new field includes the study of influences on healthcare professional and organisational behaviour.
Implementation Science - Fulltext v1+ (2006+); ISSN: 1748-5908.
Bioinformatics.org has announced that Michael Ashburner has won the Benjamin Franklin Award for 2006. Excerpt:
Bioinformatics.Org is proud to present the 2006 Benjamin Franklin Award in the Life Sciences to Michael Ashburner of Cambridge University. As expressed by his nominators, Prof. Ashburner has made fundamental contributions to many open access bioinformatics projects including FlyBase, the GASP project, the Gene Ontology project, and the Open Biological Ontologies project, and he was instrumental in the establishment of the European Bioinformatics Institute. He is also known for advocating open access to biological information.
Tove Iren S. Gerhardsen, Experts Discuss Balance Between Digital Content Access, Protection, IPWatch, February 24, 2006. A summary of three presentations at a February 21 WIPO meeting on the Development Agenda. The OA position was presented in different ways by Teresa Hackett, speaking for eIFL, and Ronaldo Lemos, speaking for Creative Commons Brazil. Excerpt:
Teresa Hackett of Electronic Information for Libraries (eIFL) had another point of view. She argued that protection of electronic works can make access for libraries difficult and this constitutes a double burden for libraries in developing countries which may not be able to afford the extra costs incurred, such as licence fees and rights clearance. In order to lessen the digital divide between north and south, eIFL is working in more than 50 developing and transition countries to negotiate discount prices and new business models with publishers for access to electronic resources, she said. Hackett said that the copyright agenda is increasingly driven by multinational mass entertainment industries, which have particular and legitimate concerns. These are however not directly applicable to other situations such as not-for-profit education and research, yet libraries find themselves in this “digital marketplace.” According to the WIPO Copyright Treaty, “exceptions and limitations” in copyrights may be extended to digital works, but libraries have met with strong opposition from rights holders when they have tried to implement this, she said. Finally, she cited a problem with technological protection measures (TPMs) which she said may jeopardise public access to works and about which the British Library recently expressed concern during a hearing of the UK All Parliamentary Internet Group. TPMs provide ways of controlling access and use of copyrighted material which, it has been estimated, have an average life cycle of three to five years. One possible solution might be to provide libraries “clean” copies of works with no TPMs, she said. Hackett said eIFL supports the work on a development agenda at WIPO. The group also welcomes a Chilean proposal that suggests studies be conducted on the impact of intellectual property rights on issues such as education in developing countries. 
“Why not give developing countries the same flexibilities that developed countries had when they developed?” Hackett said.
ALA President Michael Gorman has issued a statement on the closing of the EPA libraries (February 23, 2006). Excerpt:
The American Library Association is deeply concerned about the very negative impact on public access to environmental information that will result if the proposed 80% cuts to funding for the Environmental Protection Agency’s (EPA) libraries are made. ALA has a long-standing commitment to promoting free public access to government information and we are troubled by what seems to be an accelerating trend in increased restrictions on access to government information. Individuals and communities need to be able to find high quality, accurate information about issues that concern them, such as the health and safety of their families and communities. EPA has, since its creation in the early 1970s, been a key source of such information. We fear that the drastic budget cuts proposed in the FY 2007 EPA budget will have severely deleterious effects on the ability of the EPA libraries to continue their essential role in ensuring public access to critical health and safety information. We encourage members of Congress to maintain the funding necessary to support key government information programs such as the EPA libraries and to ensure adequate future funding for this purpose.
President Abdul Kalam of India gave a speech last Sunday to the Indian Institute of Science (IISc) in Bangalore. The title was, Towards World Knowledge Platform. (Thanks to Subbiah Arunachalam.) Excerpt:
Initially, the mission of World Knowledge Platform is to connect and network the R&D Institutions, Universities and Industries using fiber broadband from the partner nations on selected R&D Missions. The underground fiber cable infrastructure already exists between the many partners. It is only waiting to be lighted up with state-of-the-art optical networks and to ignite the minds of the knowledge workers. This knowledge GRID will support multitude of seamless connections supporting both synchronous and asynchronous communication, carrying either text or audio or video. We can then use this network in the academic environments to teach courses online and share expensive equipments remotely....The components of the vision [for IISc] are:...(f) Be a partner in the World Knowledge Platform to promote world class knowledge creation, knowledge dissemination and knowledge sharing among all partner countries....(i) Create an IISc-Virtual Education Hub - so that the quality education from IISc can reach out to the entire nation. It should also act as a Virtual Collaborative Hub, which will become the platform for the scientists, researchers from IISc, world wide scientists and Nobel laureates to share their knowledge among the students and faculty across India.
Comment. I've only excerpted the parts most relevant to OA above. But the whole speech is worth reading if only to see how an educated President can speak, in case you've forgotten. If you're not from India, try to imagine your head of state speaking these lines. "It is reported that gene differences between humans and most animals are very nominal. More than 90% of our DNA is similar. This property is a boon to researchers since animal models can be subsequently used for curing human diseases based on trial data." "When I think of Nanoscience and Nanotechnology, I...[think of] Mr. Richard Feynman...Mr. Eric Drexler...[and] Prof CNR Rao." "The era of wood and bio-mass is almost nearing its end. The age of oil and natural gas would soon be over even within the next few decades. The world energy forum has predicted that fossil based oil, coal and gas reserves will last for another 5 - 10 decades only." "I would like to discuss the latest research in the area of photo-voltaic cells using Carbon nano tubes which can give an efficiency of over 45%, nearly three times the efficiency which the present technology can offer."
Update (7/13/07). The Royal Society awarded President Kalam its King Charles II Medal. The award is limited to heads of state who make extraordinary contributions to the promotion of science. It has only been given once before, in 1998 to Emperor Akihito of Japan.
Andrea Foster, Google Wages Fresh Campaign Against Critics of Project to Digitize Library Books, Chronicle of Higher Education, February 23, 2006 (accessible only to subscribers). Excerpt:
The battle between publishers and Google over the Internet-search company's project to digitize library books has heated up with an announcement this month by Google that it was starting a campaign to dispel misperceptions about the project. In an e-mail message posted online and addressed to "Google Book Search supporters," two Google officials said they were creating a "fact-checking brigade" about the company's digitization effort. And they proceeded to rip apart a column in Newsday by the writer Susan Cheever, in which she accused Google of stealing authors' works....Google says its project is permissible under copyright law's "fair use" exemption. But Ms. Cheever disputes the point in her column, stating that fair use allows people to distribute only a set amount of words from a work. "The amount of words that constitute fair use varies according to court case," she writes. "At present, it is 400 words." Taking aim at that statement, Alexander Macgillivray, a Google lawyer, and Jen Grant, a marketing manager for the company, say Ms. Cheever "fundamentally misstates copyright law and misleads readers about Google Book Search." The Google employees say that there is no word limit associated with fair use and that some courts have ruled that republishing an entire work is fair use. They made the statement in a February e-mail message to supporters of the Google project. The message also urges people to "help clear the air when misleading articles like this one are published."
Peng Dong, Marie Loh, and Adrian Mondry, The "impact factor" revisited, Biomedical Digital Libraries, December 2005.
Abstract: The number of scientific journals has become so large that individuals, institutions and institutional libraries cannot completely store their physical content. In order to prioritize the choice of quality information sources, librarians and scientists are in need of reliable decision aids. The "impact factor" (IF) is the most commonly used assessment aid for deciding which journals should receive a scholarly submission or attention from research readership. It is also an often misunderstood tool. This narrative review explains how the IF is calculated, how bias is introduced into the calculation, which questions the IF can or cannot answer, and how different professional groups can benefit from IF use.
Excerpt from the body of the text. Note the sentence I've put in bold.
Given the rapid growth of electronic publications, the online availability of articles has recently become an important factor influencing the IF. Murali et al. determined how the IF of medical journals is affected by their online availability. In that study, a document set obtained from MEDLINE was classified into three groups, namely FUTON (full text on the Net), abstracts only and NAA (no abstract available). Online availability clearly increased the IF. In the FUTON subcategory, there was an IF gradient favoring journals with freely available articles [PS: emphasis added]. This is exemplified by the success of several "open access" journals published by BioMed Central (BMC) and the Public Library of Science (PLoS). Open access journals publish full-text online papers free of subscription fees. BioMed Central (BMC) is an "open access" publisher in business since 2000. BMC hosts over 100 biomedical journals ranging from general interest to specialized research. More than twenty journals published by BMC are currently tracked by the ISI and over half of these have IFs available in recent years. BMC Bioinformatics was assigned its first IF for 2004. At 5.4, it places the journal second in the field, only marginally below the traditional competitor Bioinformatics (IF = 5.7), which has a 20-year publishing history and is connected to a major learned society within this field of research (International Society for Computational Biology). PLoS (Public Library of Science) is another example of a successful "open access" publishing strategy. It started publishing two open access journals in biology and medical research in 2003 and 2004 respectively. PLoS Biology was assigned its first IF of 13.9 for 2004. In the ISI subject category "biology", it is thus placed at the number 1 position of 64 in its first year of reporting an IF. FASEB journal at position 2 has an IF of 6.8, but has been in circulation since 1987.
Similarly, in the other SCI subject category ("biochemistry and molecular biology") in which PLoS Biology is listed, it ranks at position 8 out of 261. Monitoring the development of such journals' IF will inform the determination of the online-availability bias in the future. This effect will increase in the future with the availability of new search engines with deep penetration such as Google Scholar, allowing researchers to find relevant articles in an instant, and then choose those with immediately and freely available content over those with barriers, economic and otherwise.
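For readers unfamiliar with the metric the paper dissects, the standard two-year impact factor is a simple ratio: citations received in a given year to items the journal published in the two preceding years, divided by the number of citable items published in those two years. A minimal sketch, using made-up citation counts for an imaginary journal (not real ISI data):

```python
def impact_factor(citations_by_year, citable_items, year):
    """Two-year impact factor for `year`:
    citations received in `year` to items published in the two
    preceding years, divided by the citable items published then."""
    cites = citations_by_year[year]  # citations in `year`, keyed by cited year
    numerator = cites[year - 1] + cites[year - 2]
    denominator = citable_items[year - 1] + citable_items[year - 2]
    return numerator / denominator

# Hypothetical counts (illustrative only):
citations = {2004: {2003: 540, 2002: 460}}  # cites received in 2004
items = {2003: 100, 2002: 85}               # citable articles per year
print(round(impact_factor(citations, items, 2004), 1))  # prints 5.4
```

The toy numbers are chosen so the ratio (1000/185) lands near the 5.4 the excerpt reports for BMC Bioinformatics, purely to make the scale of the figure concrete.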
Dean Giustini maintains a list of Resources in Evidence-Based Complementary and Alternative Medicine, labelling them as OA or locked.
In June 2004, James Howison and Abby Goodrum asked, Why can’t I manage academic papers like MP3s? They decided that the answer lies in metadata.
Why can’t downloaded academic papers be managed in the simple and effective manner in which digital music files are managed? We make the case that the answer is different treatments of metadata. Two key differences are identified: Firstly, digital music metadata is standardized and moves with the content file, while academic metadata is not and does not. Secondly, digital music metadata lookup services are collaborative and automate the movement from a digital file to the appropriate metadata, while academic metadata services do not.
Two days ago, Alf Eaton took a whack at solving this problem by proposing a way to manage metadata for Academic PDFs.
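The second difference Howison and Goodrum identify — music players resolve a file to its metadata automatically via shared lookup services, while academic PDFs have no such service — can be sketched as follows. The lookup table, record fields, and paper bytes here are all hypothetical stand-ins for a CDDB-style service, not any real system:

```python
import hashlib

# Hypothetical shared metadata service, keyed by a fingerprint of the
# file's bytes -- analogous to how CDDB-style services map a disc or
# track fingerprint to standardized, community-contributed metadata.
METADATA_SERVICE = {}

def fingerprint(data):
    """Content-based key: identical files resolve to the same record."""
    return hashlib.sha256(data).hexdigest()

def register(data, record):
    """A contributor publishes metadata for a file (the collaborative step)."""
    METADATA_SERVICE[fingerprint(data)] = record

def lookup(data):
    """Automated file -> metadata resolution, as music software does;
    returns None when no one has registered the file's metadata."""
    return METADATA_SERVICE.get(fingerprint(data))

paper = b"%PDF-1.4 ... (downloaded article bytes)"
register(paper, {"title": "Why can't I manage academic papers like MP3s?",
                 "authors": ["Howison", "Goodrum"], "year": 2004})
print(lookup(paper)["title"])
```

The sketch makes the paper's point concrete: the hard part is not the lookup mechanics but the absence of a standardized, collaboratively populated service on the academic side.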
Louise Krabbe Boserup et al., An Introduction to Openness and Access to Information, Danish Institute for Human Rights, December 2005. (Thanks to Archivalia.) Principles and recommendations on openness to government information (in general, not merely in Denmark). Does not cover government-funded scientific research.
Amanda L. Brewster, Audrey R. Chapman, Stephen A. Hansen, Facilitating Humanitarian Access to Pharmaceutical and Agricultural Innovation, Innovation Strategy Today, 1, 3 (2005). Focusing more on access to patented technology than on access to copyrighted literature. Excerpt:
This paper seeks to raise awareness about the importance of managing IP to facilitate humanitarian use and applications. Our goal is to identify intellectual property approaches that can promote access to and use of health and agricultural product innovations by poor and disadvantaged groups, particularly in low-income countries. The paper encourages more public-sector IP managers to understand and employ strategies that will accomplish these goals. Humanitarian use approaches should become the norm, and we seek to help private-sector licensees understand the rationale and potential benefits behind such strategies. This paper focuses on the pharmaceutical and agricultural sectors, but the principles noted could potentially be applied to other areas as well. There are key moments when technology managers can improve the likelihood that their IP will benefit people in need: when they decide 1) who will receive a license, 2) whether the license will be exclusive, 3) what types of applications will be covered, and 4) how long the duration of the license will be....We acknowledge that improved IP management cannot by itself solve the access crisis. Even if technology managers adopt humanitarian IP management strategies, they will need to connect with development partners who can utilize the protected technologies. In some cases, these partners may not yet exist. But when partners are found, it will be important to establish simple, efficient ways for them to identify technologies that public sector institutions are willing to share. We believe that the number and variety of technologies being managed with humanitarian goals in mind will continue to increase, and so the SIPPI project plans to explore ways to increase the transparency of license terms covering these technologies, thus making this information more widely available to potential beneficiaries.
Paul Ginsparg has won the CNI/ARL/Educause Paul Evan Peters Award for 2006. From today's announcement:
Paul Ginsparg, physicist and Internet scholarly communications pioneer, is the latest recipient of the Paul Evan Peters Award, announced today by the Coalition for Networked Information (CNI), the Association of Research Libraries (ARL), and EDUCAUSE. The award will be presented on April 3, 2006 at the CNI Membership Meeting in Arlington, VA, where Ginsparg will deliver the Paul Peters Award lecture at the opening plenary. A professor of physics, computing and information science at Cornell University, Ginsparg has distinguished himself as the visionary behind arXiv, an Internet e-print archive for articles in the sciences, which allows scholars to circulate and comment on research prior to publication in traditional peer-reviewed journals, thereby significantly reducing the amount of time it takes for an article to be available to researchers. Started in 1991 as a service for preprints in physics, arXiv eventually expanded to include mathematics, computer science and quantitative biology. Today, the resource boasts open access to over 350,000 articles.
Update. Also see the 3/7/06 story in Cornell's Chronicle Online.
John Lorinc, The bottom line on open access, University Affairs, March 2006. Also available in French. (Thanks to Stevan Harnad.) Excerpt:
The rapidly evolving debate over free online scholarship drives right to the heart of some of the most fundamental questions about research....Worldwide, there are about 24,000 scholarly journals, but only three to seven percent of them are considered to be “open access” – OA for short – meaning that they make their research papers available for free on the Internet. But the rapidly evolving debate over open-access scholarship extends well beyond academic journals like the [new] one at www.econtheory.org, and drives right to the heart of some of the most fundamental questions about research: Do publicly funded universities and granting bodies have a democratic – indeed a moral – obligation to ensure that academic scholarship is available on the Internet? What kinds of public and institutional policies are needed to make such wide-ranging dissemination both possible and useful? And what are the implications for publishers, research libraries, copyright, and for scholarship itself? Few self-respecting researchers argue with the idea per se. “It’s easy to get people to sign off on a principle,” says Stevan Harnad, who holds the Canada Research Chair in Cognitive Science at Université du Québec à Montréal. “It becomes interesting and substantive when you take a practical policy.” Since the mid-1990s, Dr. Harnad has been at the centre of an international campaign to promote open access. But it’s only in the last three or four years – since the George Soros Open Society Institute orchestrated a 2002 summit of OA activists in Budapest – that granting bodies and universities have begun to look hard at how to translate open access from a feel-good cyber principle into something entrenched in the way academics do business – either by encouraging them to patronize open-access journals or urging them to routinely upload all their published research papers to a growing network of institutional repositories. 
“The right to know is at the forefront [of OA],” says John Willinsky, a language and literacy education professor at the University of British Columbia who heads the Public Knowledge Project, a research initiative that asks whether and how online technologies can improve the quality of academic research. “The critical point we’re at now is mandated access. We’re seeing a momentum build.” The epistemological benefits are difficult to dispute. Dr. Harnad refers to studies showing that citations can more than double for articles that are freely available on the web. Accessible online papers benefit academics in poor countries where universities have few resources. And research libraries see institutional electronic repositories as one way of ensuring the preservation of digitized online material that is highly vulnerable to the problem of disappearing URLs....In the past few years, large research councils in the U.S. and U.K. have grappled with the mechanics of applying the OA principle to publicly funded research. In Canada, in late 2004, the board of the Social Sciences and Humanities Research Council approved OA in principle; council staff are preparing recommendations based on public consultations....But open-access advocates contend that universities must now step up to the plate and adopt policies that compel faculty to “self-archive.”...Tim Mark, executive director of the Canadian Association of Research Libraries, describes...the “absurd” situation whereby academics working for publicly funded institutions give up their intellectual property rights to commercial journal publishers, who turn around and sell the fruits of their labour right back to those institutions in the form of costly journal subscriptions....The potential of the OA movement, [Harnad] argues, doesn’t begin with policy conditions aimed at altering the operating conditions for a small subset of journal publishers [to make them convert to OA]. 
Rather, it needs a much broader-based effort to make institutional self-archiving a routine and unquestioned part of the work of scholarship – as basic as including bibliographies and reference lists at the end of any paper. OA advocates say the pieces for such a cultural change are beginning to fall into place. There are now numerous open-source software and “harvesting” systems that allow institutions to create searchable, indexed and networked electronic repositories....But, as Dr. Harnad observes, the availability of user-friendly archiving software is necessary but not sufficient. That’s why, in 2005, OA activists approved the Berlin 3 institutional policy commitment. It calls on universities and research institutions to establish policies requiring academics to self-archive, as well as encouraging them to publish in open-access journals....So far 17 universities and research institutions – including the University of Zurich, Portugal’s University of Minho, and the University of Southampton, where Dr. Harnad taught before joining UQAM – have signed the 2005 Berlin commitment. No Canadian universities are signatories. How do academics feel about self-archiving? “Authors haven’t picked it up,” says Dr. Willinsky at UBC. “It has a lot to do with the fact that the focus of [academics’] work is getting published, not getting circulated.” Indeed, a U.K. survey of scholars showed that about half of the respondents had self-archived at one point, mainly on personal websites, but many didn’t do it routinely. Yet 95 percent said they’d be prepared to self-archive if their university required it as a condition of tenure or employment. What’s become increasingly apparent is that copyright issues aren’t a roadblock for the OA movement....While journal publishers, from giants like Elsevier to upstarts like Econtheory.org, will continue to work out a sustainable online business model, the OA policy ball has now landed squarely in the university sector’s court.
Stevan Harnad announces that Eprints has renamed two of its services.
Andy Carvin has blogged some notes on Nancy Davenport's keynote address at the University of Missouri conference, Open Access, Open Source, Open CourseWare: Sharing as a Solution to the Digital Divide (Columbia, February 22, 2006). Davenport is the president of Council on Library and Information Resources (CLIR). Excerpt:
Scholarly Communications. What are the issues? What are the options? What are the leadership issues?...Scholars are the supply and the demand. Research has to be distributed, through print, e-format, open access, repositories, self-publishing, even blogs. Who is in the middle, mediating scholarly discussions? Societies, reviewers, publishers - for profit and nonprofit - aggregators, librarians, provosts, administrators, the Internet....Digital scholarship: only way to integrate disparate content, allows new research and scholarship, encourages using material in new ways, creates new fields and communities of practice, creates new knowledge. CLIR call to action: tells publishers that librarians want independent, third party preservation of your content. "The academic community is built upon a sham. More and more you don't own your content - you're paying rent." What impedes open access? The academic reward system. Tenure requires publishing in "the right journals." Scientists can put an open access fee into their budgets. But in the humanities, you don't get that kind of funding. PLoS.org won't work for most humanities scholars....Where are we now? We pay a lot of money. Most institutions are paying 24% for digital serial journals in their collections budget. Libraries each pay large fees to access the same material. Meanwhile, libraries are digitizing their own special, unique materials.
Ulrich Herb, Open Access to Grey Resources – Die siebente internationale Konferenz zu grauer Literatur, forthcoming from Information: Wissenschaft und Praxis, 57(2) (2006), pp. 119-121. In German, but Herb will post an English edition after the article is published. A report on the Open Access to Grey Resources: Seventh International Conference on Grey Literature (Nancy, December 5-6, 2005).
Rajarshi Guha and seven co-authors, The Blue Obelisk: Interoperability in Chemical Informatics, Journal of Chemical Information and Modeling, February 22, 2006. Only this abstract is free online, at least so far:
The Blue Obelisk Movement is the name used by a diverse Internet group promoting reusable chemistry via open source software development, consistent and complementary chemoinformatics research, open data, and open standards. We outline recent examples of cooperation in the Blue Obelisk group: a shared dictionary of algorithms and implementations in chemoinformatics algorithms drawing from our various software projects; a shared repository of chemoinformatics data including elemental properties, atomic radii, isotopes, atom typing rules, and so forth; and Web services for the platform-independent use of chemoinformatics programs.
The Texas Digital Library (TDL), announced last July, is now online. (Thanks to Adrian Ho.) TDL is a consortial OA repository for five Texas universities: Rice, Texas A&M, Texas Tech, the University of Houston, and the University of Texas. From the site: "The Texas Digital Library will serve as a repository for research output including electronic theses and dissertations, faculty datasets, departmental databases, digital archives, course management and learning materials, digital media, and special collections."
Jan Velterop, Rituals, The Parachute, February 22, 2006. Excerpt:
Open access publishing, in addition to all the other benefits it has, also keeps the cost of scientific literature in line with research spending. This isn't, of course, proven yet, let alone scientifically. But how would one prove it without doing it in the first place? The proof of this pudding, I'm afraid, can only be in the eating, as the saying goes.
Comment. Exactly right. How do we respond to those who want the proof in advance? I'm thinking of those who want proof that OA publishing costs less before doing it, as well as those who want proof that high-volume OA archiving won't harm journals -- at least outside physics, where we already have proof of both harmlessness and synergy. Part of the answer is that the subscription model, with annual price increases above inflation, could never have gotten off the ground if it had to provide proof prior to experience. (Nor could it justify its continued existence if it had to prove its continuing sustainability.) No innovation could get off the ground if it had to answer all skeptics in advance. There's a nice generalization of this observation by Ronald Bailey in the February 17 issue of Reason, Culture of Fear: Dealing with cultural panic attacks. Excerpt:
Earlier this week, the American Enterprise Institute in Washington, DC, held a remarkably interesting conference titled "Panic Attack: The New Precautionary Culture, the Politics of Fear, and the Risks to Innovation."...[I]t looked at how many Western countries are losing their cultural nerve, as evidenced by the increasing cultural acceptance of the so-called precautionary principle. The strongest versions of the precautionary principle demand that innovators prove that their inventions will never cause harm before they are allowed to deploy or sell them. In other words, if an action might cause harm, then inaction is preferable. The problem is that all new activities, especially those involving scientific research and technological innovation, always carry some risks. Attempting to avoid all risk is a recipe for technological and economic stagnation.
The National Science Foundation (NSF) is funding enhancements to the National Science Digital Library (NSDL) that will bring the NSDL's vast body of OA research closer to students taking courses. (Thanks to ResourceShelf.) From yesterday's announcement:
Virginia Tech and Villanova University researchers have received a $450,000 grant from the National Science Foundation to extend the benefits of its free, online library by developing technology that will allow college students and professors to conduct flexible and customized information searches directly from course Web sites. The project is directed by Manuel Pérez-Quiñones, Virginia Tech assistant professor of computer science. Other team members are: Weiguo Fan, assistant professor of accounting and information systems, and Edward Fox, professor of computer science, both at Virginia Tech, and Lillian Cassel, professor of computing sciences at Villanova University. "Our goal," Pérez said, "is to get content from the National Science Digital Library (NSDL) closer to its intended audience." The project's target beneficiaries are students and professors in all areas of computing....The NSDL has a vast collection of resources for education and research in science, technology, engineering, and mathematics....As its collections grow, the NSDL is placing more emphasis on user services and higher-level functionality, Pérez said. But in helping to expand the library's usefulness, he adds, researchers must not assume that a "feature-rich" Web site -- with browsing, searching, recommendations, and discussion forums -- would be enough to draw users. That would be like "building a large library with lots of facilities, but placing it at the outskirts of campus -- few teachers and students will interrupt their daily routine to use the new facilities." The project focuses on course Web sites, he said, as they are "the most commonly used application in today's educational environment, typically providing access to a course syllabus, schedule, assignments, grades, and discussion forums." The technology that the researchers will develop will allow a course Web site to be the entry point into the NSDL's collections. 
The product would be "a flexible and personalized information interface that supports both exploratory and focused searches and provides the ability to obtain context-sensitive services at various levels of interaction."
Quoting Marybeth Peters, the U.S. Register of Copyrights, in a talk on November 2, 2005, at the University of North Carolina Law School:
We've certainly lengthened the term [of copyright] perhaps -- I won't even say perhaps -- too long a term. I think it is too long. I think that was probably a big mistake.
Comment. This admission comes very late in the game. I'd love to see Congress shorten the term of copyright, but I know that the political obstacles are nearly insuperable. (These are the same political obstacles that make the length of the term of copyright into a ratchet that always goes up, never down.) But Peters' admission is nevertheless extraordinary and helpful. It shows that the U.S. Copyright Office sees encroachments on the public domain as harmful to all of us, even if helpful to one wealthy industry. It shows that the boneheaded Bono extension of 1998 might be the last. It shows that the Copyright Office might take a more balanced approach to other large copyright issues in the future, as it recently did with its guidelines on orphan works.
Effective immediately, Hindawi Publishing is converting 13 of its subscription-based journals to OA. From today's announcement:
Comment. This is the largest bulk-conversion of non-OA journals to OA in the history of OA. It's especially welcome because journals that convert to OA bring their readership, reputation, impact factors, and prestige with them, unlike new OA launches, which have to develop these from scratch. Note to other publishers: Hindawi's direct experience with OA publishing is leading it to accelerate its expansion into OA rather than to retreat.
From Brendan O'Keefe's story in today's The Australian on OA archiving at Australian universities:
University of Sydney library manager of innovation and development Ross Coleman told the conference: "We're moving from the technical to the community."...The Queensland University of Technology has begun a project that academics hope will make knowledge free to all, from members of the public to top-tier researchers. QUT school of law professor Brian Fitzgerald has $1.3 million and two years to develop the Open Access to Knowledge Law project, which aims to remove barriers to the use of information on university websites....Open repositories were "probably not a threat to large publishers, though some people like to think [they are]", Mr Coleman said. "Some of the journals allow authors to put their work in the local repositories, but that's not the final, edited copy. The large publishers are still getting their heads around this."
Subbiah Arunachalam, Public access to the Internet, a chapter in Word Matters: Multicultural perspectives on information societies, C & F Editions, 2005. Self-archived February 21, 2006. Excerpt:
One can see a parallel between the telecenters and the open access archives. Both of them are using advances in technology to include the excluded and making available much needed information at a low cost through the "public commons" approach. Both of them are overcoming a serious problem by intelligently marrying technology and the public commons approach. Both of them are about sharing and caring. Both of them are eminently suited to increase the overall productivity of the world as a whole and lead to greater collective happiness. It sounds almost utopian. But many publishers, including some scientific societies, are working to stall the progress of the open access movement, as they see it as a potential threat to their business interests. On the other hand, many donor agencies, such as the Wellcome Trust, who fund scientists to perform research, are avid supporters of the movement. In the area of scientific data, as distinct from full texts of research papers, organizations such as ICSU (and CODATA) are promoting the culture of open access. Even Celera Genomics Corp., the for-profit company that sequenced the human genome simultaneously with the public-funded Human Genome Project, has stopped selling subscriptions for access to its sequence/data and would donate the data to the National Center for Biotechnology Information, USA. As Francis Collins of the National Human Genome Research Institute put it, "data just wants to be public." Scientists in developing countries need particular attention, says Bruce Alberts, former President of the US National Academy of Sciences.
In his 1999 presidential address to the National Academy of Sciences, USA, he suggested "Connecting all scientists to the World Wide Web, where necessary by providing subsidized Internet access through commercial satellite networks," and "taking responsibility for generating a rich array of scientifically validated knowledge resources, made available free on the Web, in preparation for a time when universal Internet access for scientists is achieved in both developing and industrialized nations." ...Unfortunately digital technology also has brought about new forms of information enclosure that undermine the public's right to use, share, and reproduce information. Such enclosures threaten to undermine the political discourse, free speech, and creativity needed for a healthy democracy. And in reality governments can and do have considerable control over what is transacted on the Internet. As the Economist pointed out, "the Internet is part of the real world. Like all frontiers, it was wild for a while, but policemen always show up eventually."...As Nancy Kranich has said, public access to the Internet is vital to rekindle civic participation, to claim public space, and to promote the public interest in the digital age. There are two ways of promoting public access to the Internet: (1) enabling citizens the world over to use the tools of the information network to gain access to available information, as well as to create their own information and circulate it worldwide; and (2) ensuring free access to essential information, so that the opportunities provided by the Internet are actually used to spread access to knowledge throughout the world.
Scott Shane, U.S. Reclassifies Many Documents in Secret Review, New York Times, February 21, 2006. Excerpt:
In a seven-year-old secret program at the National Archives, intelligence agencies have been removing from public access thousands of historical documents that were available for years, including some already published by the State Department and others photocopied years ago by private historians. The restoration of classified status to more than 55,000 previously declassified pages began in 1999, when the Central Intelligence Agency and five other agencies objected to what they saw as a hasty release of sensitive information after a 1995 declassification order signed by President Bill Clinton. It accelerated after the Bush administration took office and especially after the 2001 terrorist attacks, according to archives records. But because the reclassification program is itself shrouded in secrecy — governed by a still-classified memorandum that prohibits the National Archives even from saying which agencies are involved — it continued virtually without outside notice until December. That was when an intelligence historian, Matthew M. Aid, noticed that dozens of documents he had copied years ago had been withdrawn from the archives' open shelves. Mr. Aid was struck by what seemed to him the innocuous contents of the documents — mostly decades-old State Department reports from the Korean War and the early cold war. He found that eight reclassified documents had been previously published in the State Department's history series, "Foreign Relations of the United States." "The stuff they pulled should never have been removed," he said. "Some of it is mundane, and some of it is outright ridiculous." After Mr. Aid and other historians complained, the archives' Information Security Oversight Office, which oversees government classification, began an audit of the reclassification program, said J. William Leonard, director of the office. Mr. Leonard said he ordered the audit after reviewing 16 withdrawn documents and concluding that none should be secret. 
"If those sample records were removed because somebody thought they were classified, I'm shocked and disappointed," Mr. Leonard said in an interview. "It just boggles the mind."
Charles W. Bailey Jr. has released version 61 of his authoritative Scholarly Electronic Publishing Bibliography. The new version cites and organizes over 2,610 print and online articles, books, and other sources on scholarly electronic publishing.
The University of Strathclyde, with JISC funding, has launched STARGATE (Static Repository Gateway and Toolkit), whose motto is "Enabling small publishers to participate in OAI-PMH-based services." From today's announcement:
The STARGATE project is exploring the use of static repositories as a means of exposing publisher metadata to OAI-based disclosure, discovery and alerting services within the JISC Information Environment and beyond. Such an approach allows smaller publishers to participate in OAI-based services without having to implement and maintain a full repository. The project is implementing a series of static repositories of publisher metadata, and will demonstrate the interoperability of the exposed metadata through harvesting and cross-searching via a static repository gateway, and conduct a critical evaluation of the static repository approach with publishers and service providers....Although the project will initially focus on e-journals, the static repository approach also lends itself to other types of publishing, including e-books, e-learning materials and other digital resources.
Currently four journals and an OA archiving project are participating in STARGATE: (1) Journal of Digital Information, (2) Information Research, (3) Library and Information Research, (4) Information Scotland, and (5) the PerX project.
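For readers unfamiliar with the plumbing: what a gateway like STARGATE exposes is an ordinary OAI-PMH interface, so a service provider harvests it by issuing a ListRecords request and parsing the Dublin Core metadata in the response. A minimal sketch, assuming a hypothetical gateway base URL and an invented sample response (neither is from the STARGATE announcement):

```python
# Sketch of an OAI-PMH harvest against a static repository gateway.
# The base URL and the sample XML below are hypothetical illustrations.
from urllib.parse import urlencode
import xml.etree.ElementTree as ET

DC_NS = "{http://purl.org/dc/elements/1.1/}"  # Dublin Core namespace

def list_records_url(base_url, metadata_prefix="oai_dc", set_spec=None):
    """Build the URL for an OAI-PMH ListRecords request."""
    params = {"verb": "ListRecords", "metadataPrefix": metadata_prefix}
    if set_spec:
        params["set"] = set_spec  # optional selective harvesting
    return base_url + "?" + urlencode(params)

def record_titles(oai_xml):
    """Extract the Dublin Core titles from a ListRecords response."""
    root = ET.fromstring(oai_xml)
    return [t.text for t in root.iter(DC_NS + "title")]

# Hypothetical gateway address:
url = list_records_url("http://example.org/oai")
# -> http://example.org/oai?verb=ListRecords&metadataPrefix=oai_dc

# Invented fragment of a ListRecords response:
sample = """<OAI-PMH xmlns="http://www.openarchives.org/OAI/2.0/">
  <ListRecords><record><metadata>
    <oai_dc:dc xmlns:oai_dc="http://www.openarchives.org/OAI/2.0/oai_dc/"
               xmlns:dc="http://purl.org/dc/elements/1.1/">
      <dc:title>Open Access and Small Publishers</dc:title>
    </oai_dc:dc>
  </metadata></record></ListRecords>
</OAI-PMH>"""

print(record_titles(sample))  # -> ['Open Access and Small Publishers']
```

The point of the static repository approach is that the publisher only maintains the XML file; the gateway answers protocol requests like the one built above on the publisher's behalf.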
Laura Grant, Open university courseware trend comes to SA, Tectonic, February 21, 2006. Excerpt:
The University of the Western Cape (UWC) has made a policy decision to make its course material freely available over the Internet. This is believed to be a first for an African university. Material that will be made available includes courses, syllabuses, lecture notes and exam papers. UWC's "free content and free open courseware strategy" was approved by the university's senate in October 2005. It marks what Derek Keats, the executive director of information and communication services at UWC, describes as "the beginning of a journey". "Tertiary institutions the world over are recognising the value of freely sharing educational curricula and content, collaborating in their further development and extension, and doing so under the umbrella of free and unrestricted access to knowledge," the strategy document states. UWC is following a precedent set by the Massachusetts Institute of Technology (MIT) in the United States....Other universities in Japan, France and the US have followed suit. But, says Keats, to the best of his knowledge, UWC is the first in Africa to adopt a free and open courseware policy....Unlike MIT, which received $11 million in grants to get its OCW initiative going, UWC's free and open content project has no outside funding, says Keats. "It's a UWC initiative. We see it as part of our normal university function."...Unlike MIT, which uses a restrictive Creative Commons licence, UWC licenses its material under the Creative Commons Attribution-ShareAlike (BY-SA) licence because it promotes greater freedom, says Keats.
3D has released a new report, Policy Brief On Intellectual Property, Development And Human Rights: How Human Rights Can Support Proposals For A World Intellectual Property Organization (WIPO) Development Agenda, February 2006. Excerpt:
The Friends of Development proposal also covers a number of additional development issues, including a Treaty on Access to Knowledge and Technology (A2K). The proposed A2K Treaty aims to respond to concerns that current trends in IP laws, particularly in relation to copyright, patents and databases, are limiting access to knowledge for public goods and thereby constraining innovation. Its objectives include increasing technology transfer to developing countries and promoting access by developing countries to the results of publicly funded research that might aid development. The principle of such a treaty is supported by the African Group proposal. The Chilean proposal does not make explicit mention of the A2K Treaty, but supports the idea of stronger protection for the public domain, in order to increase the availability and dissemination of knowledge. A number of human rights rules and mechanisms promote these objectives and could be harnessed as a supporting framework for the drafting of such a treaty. For example, human rights law calls for measures that respect, protect and fulfil the right to education, the right to seek, receive and impart information, which is part of the right to freedom of expression, and the right to the enjoyment of the benefits of scientific progress and its applications. These human rights all have access to information as a core element. Therefore, they can be supportive of a treaty aimed at ensuring that IP rules and policies do not stifle access to public goods such as educational materials, public libraries, archives, commons databases, public broadcasts or publicly funded scientific research.
From a John Blossom post on today's ContentBlogger. He's talking about digital TV clips, not scholarship. But how far does his analysis carry over?
This is only the beginning of what promises to be a very long and frustrating battle between media executives who are still focusing on controlling distribution channels as the primary way to protect intellectual property and a Web-enabled audience who know that they are both the present and the future of content distribution. As with their music industry brethren, TV execs completely ignored grass-roots distribution until it now promises to create something of a crisis in their industry. They think in terms of traditional marketing and channel control while the markets are already declaring on their own the "hit singles" that everyone wants to see and building their own distribution channels. While this can be frustrating to traditional producers, at the end of the day it's the most efficient and effective way to maximize the value of content.
Inera has created a web form that takes bibliographic citations in many different free-text formats and returns the DOI, if any, for the cited work. Unfortunately, access is limited to publisher-members of CrossRef. For details see the February 14 press release. (Thanks to ResourceShelf.)
Comment. Note to CrossRef: wouldn't it spread the use of DOIs to open up access to this useful service? If we could all find DOIs for arbitrary citations (when there are DOIs), then we could build more versatile URLs for our users, give more users links to pre-paid copies of non-OA works, and help publishers, authors, and readers. If the Inera service, or a new layer on top of it, could provide the DOI-based URL as well as the DOI, it would save users that step and be even more useful.
Update (3/2/06). CrossRef Executive Director Ed Pentz writes, "CrossRef is listening....In your post "Citations In, DOIs Out" you suggest CrossRef opening up its Free-Text Query service. It's a good idea but our initial focus is on testing this with our member publishers (including the leading OA publishers) to enable better reference linking. I wanted to point out that CrossRef has a freely available form to look up DOIs and we have an OpenURL 1.0 resolver that enables metadata to DOI and DOI to metadata queries." (Thanks, Ed.)
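As a rough illustration of what "DOI-based URLs" means in practice, here is a sketch that turns a bare DOI into a resolver link and composes an OpenURL-style metadata query string. The resolver hostname for the query builder and the parameter names in the usage example are assumptions for illustration, not CrossRef's documented interface; the DOI proxy address, however, is the standard one.

```python
# Sketch: from DOI (or citation metadata) to a clickable link.
from urllib.parse import urlencode, quote

def doi_url(doi):
    """Turn a bare DOI into a link that resolves through the DOI proxy."""
    return "http://dx.doi.org/" + quote(doi, safe="/")

def openurl_query(base_url, **fields):
    """Compose an OpenURL-style metadata query string.

    The field names the caller passes are illustrative; a real
    resolver publishes its own list of accepted keys.
    """
    return base_url + "?" + urlencode(fields)

print(doi_url("10.1000/182"))
# -> http://dx.doi.org/10.1000/182

# Hypothetical resolver and field names:
print(openurl_query("http://resolver.example.org/openurl",
                    issn="1234-5678", volume="57", spage="119"))
```

A service like Inera's closes the loop in the other direction: free-text citation in, DOI out, after which a link like the one above can be built mechanically.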
Atiz is selling BookDrive, a desktop book-scanner with automatic page turning. The price is $35,000. (Thanks to Gizmodo via LIS News.)
Comment. The price is high and I have no idea about the quality of the scans. But Atiz shows what's coming. Book scanning won't be limited to giants like Google, the OCA, the EU, or even university libraries. Publisher groups like the AAP were always wrong to complain that the Google book-scanning project threatened to give one company "control" of book literature. Google wouldn't control the print editions and Google wouldn't control other people's digital copies of the same books. However, it was true that the barrier to making separate digital copies was insurmountable for most people and most organizations. Imagine what will be possible as the barrier comes down.
James Boyle, Cultural environmentalism? Financial Times, February 20, 2006. Excerpt:
The United States’ ludicrous proposal for a “Webcasting Right” shows how much life there is in the old style of politics without evidence or democratic debate. But I see grounds for optimism. The existence of forums like this one shows a recognition that these are issues which deserve a public airing, and about which reasonable people can disagree. There are now a host of civil society, scientific, and civil rights groups that deal with these issues – not just the trade associations who have long held sway. We have our equivalents of Greenpeace, but also the National Trust, or the Conservation Societies. Industry groups have, slowly, come to understand that their interests on these issues are not monolithic; think of the differences between IBM and Microsoft, or the hardware companies and the record companies. Scientists and scholars are now more actively involved in shaping the rules that regulate their activities. And as intellectual property is reaching out to touch everyone who is a participant in digital culture creation, citizens are asking for more of a say. If the debate is not yet balanced or fair, it is at least now a debate, not a one-sided lecture from the content industry. Three concrete events in the last year make me think that the shift will continue. The first is a matter of private action. Google’s plans to make books as searchable as the world wide web present a classic case in which the conflicts will be highlighted for society. Can Google do this without paying permission fees to every publisher, something that would doom the project? As I explain elsewhere, I think the answer is “yes” but either way, the ideas at stake will be made concrete in a way that should transform the public debate for the better. Second, the European Commission released a report on the Database Directive, a report I featured in a previous column. 
For the first time I can remember, a governmental body assessed intellectual property policy on the basis of empirical evidence, rather than faith and anecdote. In any other field – drugs, the environment, traffic safety – this would be unremarkable. In intellectual property, it is both shocking and heartening. Finally, the US Register of Copyrights released a report on Orphan Works – copyrighted material whose owners cannot be identified or found and which is thus extremely hard to use legally. Orphan works constitute the majority of 20th century culture and the Copyright Office made an extremely balanced set of recommendations that we need to amend copyright law to make using those works easier and less risky. These initiatives may seem banal. Arguing that making works searchable would be good, that policy should be balanced and based on evidence, that copyright should not stand in the way of the dissemination it is supposed to promote – it hardly sounds like a radical agenda. It isn’t. But from where we have been in the last 10 years, banality looks great.
Consumer International has released a new report, Copyright and Access to Knowledge, February 2006. Excerpt:
Access to knowledge is critical for developing countries that seek to educate their masses. Educational materials therefore need to be made accessible to the public. Unfortunately, the international copyright regime has developed in a manner to increasingly curtail access. A variety of efforts have been mounted to safeguard the public right to freely participate in the cultural life of the community, to enjoy the arts and to share in scientific advancement and its benefits. They include the efforts of developing countries for a review of the TRIPS Agreement, the push for a development agenda in the WIPO, the civil society campaign for an Access to Knowledge Treaty and the various initiatives to promote access to copyrighted materials. This report by CI seeks to contribute to these efforts. It examines the existing international instruments on copyright to identify the provisions that may be relied on by national lawmakers to improve access to educational materials in their respective countries. The international copyright instruments examined are the Berne Convention, the TRIPS Agreement and the WCT. The report also examines the copyright laws of 11 developing countries in the Asia Pacific region to ascertain the extent to which the national lawmakers have availed themselves of the flexibilities presented in these instruments. The 11 countries are Bhutan, Cambodia, China, India, Indonesia, Kazakhstan, Malaysia, Mongolia, Papua New Guinea, the Philippines and Thailand. The flexibilities that exist in the three instruments appear in three forms: [A] the scope of copyright protection, [B] the duration of copyright protection, and [C] the limitations and exceptions....The international copyright instruments have, over the years, progressively expanded the scope of copyright protection, namely the works that are protected by copyright and the rights that are granted to copyright owners.
The scope of copyright protection specified in the international instruments is however only the “minimum standard” and countries are therefore free to widen the scope beyond their obligations under the international instruments which they are parties to. Developing countries are net importers of copyright materials and it is in their interest to maintain the scope of copyright protection at its minimum. In addition, the international copyright instruments specify the “bundle of rights” that should be granted to copyright owners. Unfortunately, all 11 countries studied have either expanded the scope beyond what they are required to do or given copyright owners more rights than necessary under the relevant international instruments.
The report then details 14 ways in which the 11 countries could make better use of limitations and exceptions in the international copyright agreements they have signed. It also urges these countries to consider strategies such as supporting the Access to Knowledge Treaty, open-access journals, open-source software, and Creative Commons licenses.
Also see Consumer International's accompanying Statement to WIPO to coincide with the meeting of WIPO's Provisional Committee on Proposals Related to a WIPO Development Agenda (Geneva, February 20-24, 2006). Excerpt:
CI contends that the copyright law of developing countries should provide for public access to educational materials where this is already permitted and that all available flexibilities in the international copyright instruments should be relied on to achieve this purpose. The study found that this has not happened. For example:
- The Berne Convention does not prohibit the utilisation of the whole of a work for the purpose of teaching. However, only three of the 11 countries allow such a possibility.
- The Berne Convention does not restrict the number of copies of publications or sound or visual recordings that can be made for the purpose of illustrations for teaching. However, five of the 11 countries studied expressly restrict the number of copies of these materials for teaching purposes.
- The Berne Convention does not place any limitation on the purpose for which quotations can be made. Despite this, five of the 11 countries permit quotations to be made for only certain purposes.
In addition, the study reveals that the draft laws used by WIPO to provide technical assistance are failing the interests of developing countries. It is inconceivable for a United Nations body like WIPO not to act in the best interests of its member states. WIPO's actions are a disservice to developing countries. The flexibilities available under the various international copyright instruments are critical to enable better access to knowledge and ensure that such knowledge is in the public domain.
In view of the above findings, CI calls on the PCDA to do the following:
- Implement the proposal of Chile, in particular Proposal 3, to commission a study examining the extent to which the intellectual property laws of developing countries provide for access to knowledge in the public domain; and
- Implement a review of the WIPO draft laws on copyright to ensure that WIPO's legislative advice to developing countries contains all the available flexibilities for access to knowledge in the public domain permitted by the international treaties.
Update. See Frances Williams, Bad copyright advice ‘stunts learning’, Financial Times, February 20, 2006. A news story on the CI recommendations. Excerpt:
Poor countries are being wrongly advised to enact tougher copyright laws than required by international treaties, making access to copyright publications prohibitively expensive, Consumers International charged on Monday....“As a result, copyrighted educational materials in these countries are expensive and consumers are being priced out of access to knowledge,” said a CI report. A book costing $27 in Indonesia was equivalent to a US student paying more than $1,000 in GDP per capita terms. The London-based group, which links more than 230 consumer organisations in 113 countries, said the United Nations’ World Intellectual Property Organisation was giving “thoroughly inadequate” advice to poor nations. Such countries were already under pressure from the US and other industrialised countries to provide ever stronger copyright protection. WIPO’s “misleading” draft laws were reinforcing this pressure by including rights not required by international treaties and by failing to point out flexibilities, especially those relating to use of copyright work for educational purposes.
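The "GDP per capita terms" comparison in the FT story is a simple scaling: divide the local price by local GDP per capita, then multiply by the reference country's GDP per capita. A minimal sketch, assuming rough 2006-era GDP figures chosen only for illustration (the report's own figures may differ):

```python
# Illustrative sketch of the "GDP per capita terms" comparison quoted above.
# Only the $27 book price comes from the article; the GDP-per-capita figures
# below are approximate 2006-era assumptions, not values from the CI report.

def equivalent_price(price, local_gdp_per_capita, reference_gdp_per_capita):
    """Scale a local price by the ratio of reference to local GDP per capita."""
    return price * reference_gdp_per_capita / local_gdp_per_capita

book_price_indonesia = 27   # USD, from the article
gdp_indonesia = 1_100       # USD per capita (assumed)
gdp_usa = 42_000            # USD per capita (assumed)

# Roughly what the same relative burden would feel like to a US buyer:
print(round(equivalent_price(book_price_indonesia, gdp_indonesia, gdp_usa)))
```

With these assumed figures the result comes out a little over $1,000, consistent with the article's claim.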
Yesterday I noted that six postings from February 17-18 had disappeared from Open Access News. Blogger has owned up to the explanation: a buggy database. (Thanks to Alf Eaton for pointing this out.)
Bear with me. Here are new copies of the six missing posts.
Recommendations for an Oxford OA repository
Michael Fraser, Towards a Research Repository for Oxford University, version 1.1, January 23, 2006. (Thanks to Stevan Harnad.) Excerpt:
In 2005 the Digital Archiving Group, a working group of Oxford’s Information and Communication Technologies Committee (ICTC), suggested that a pilot project for the digital archiving of scholarly papers be developed. To support this suggestion, and to investigate the wider issues relating to eprints and institutional repositories, a workshop was organised on 10 June 2005 (see Programme in Appendix B). The purpose of the workshop was to investigate the opportunities and challenges of developing an open access institutional repository for research....
(originally posted 2/18/06, 2:54 pm)
Report on OECD meeting on access and DRM
Philipp Bohn, The Future Digital Economy: A session report on DRM, INDICARE, February 17, 2006.
Abstract: On January 30th and 31st, the Organisation for Economic Co-Operation and Development (OECD) and the Italian Minister of Innovation and Technology, Lucio Stanca, invited delegates from all OECD countries to Rome. Several speakers were scheduled to discuss digital content creation, distribution and access. One panel specifically addressed "Content diffusion: IPR, DRM, licensing, content security, standards". This article summarizes some key ideas and statements, primarily concerning DRM.
(originally posted 2/18/06, 11:00 am)
India's president on bandwidth and access
T. Jayaraman, Enhance bandwidth immediately: Kalam, The Hindu, February 17, 2006. Excerpt:
India must aim at enhancing bandwidth immediately, and "as a nation, we must get 1 gigabit per second connectivity," President A.P.J. Abdul Kalam said here on Friday....Addressing some of India's leading scientists, he said, "I have a vision that bandwidth should be free and made available to all those who need it." Calling on the Government to take the lead in making it available, he dwelt at length on equitable access to education and knowledge in the digital era. Describing it as the "primary goal" of virtual universities, he said availability of high bandwidth would ensure that the best resources were accessible to all participants. "Bandwidth is the demolisher of imbalances and a great leveller in the knowledge society."
(originally posted 2/18/06, 10:39 am)
Esther Dyson on open content
The Value of Attention, Open Business, February 17, 2006. An interview with Esther Dyson. After discussing open content in social networking, music, journalism, and non-academic books, Dyson makes this remark. It's not about scholarship, but how far does it transfer?
What’s good for authors is not necessarily good for (traditional) publishers. End of story. They [publishers] have to figure out non-traditional value to add.
(originally posted 2/18/06, 9:49 am)
New Elsevier policy on NIH-funded authors
US National Institutes of Health (NIH) voluntary posting ('Public Access') policy: Elsevier facilitates author posting in connection with the voluntary posting request of the NIH (referred to as the NIH 'Public Access Policy') by submitting the peer-reviewed author's manuscript directly to PubMed Central on request from the author, immediately after final publication. Please e-mail us at NIHauthorrequest@elsevier.com confirming that your work has received NIH funding (with the NIH grant/project number(s), as well as the name and e-mail address of the Principal Investigator(s)) and that you intend to respond to the NIH request. Upon such confirmation, Elsevier will submit to PubMed Central on your behalf a version of your manuscript that will include peer-review comments, for public access posting 12 months after the final publication date. This will ensure that you will have responded fully to the NIH request policy. There will be no need for you to post your manuscript directly to PubMed Central, and any such posting is prohibited [PS: my emphasis] (although Elsevier will not request that manuscripts authored and posted by US government employees should be taken down from PubMed Central). Individual modifications to this general policy may apply to some Elsevier journals and its society publishing partners.
Comment. The NIH request is directed to authors (grantees), not publishers. Compliance is the author's decision. Like any other publisher, Elsevier has the right to put conditions on what it will accept and to refuse to publish anything that doesn't meet its conditions. But it's the only publisher I know of that explicitly prohibits its NIH-funded authors from submitting their manuscripts directly to PubMed Central, even when authors are willing to respect Elsevier's embargo. I have two objections. (1) Elsevier is making authors into the rope in its tug-of-war with NIH. Authors shouldn't have to choose between their funder and their publisher. Instead of prohibiting authors from depositing directly in PubMed Central, Elsevier could give authors good reasons to ask Elsevier to deposit their work for them and let authors decide whether to take advantage of the offer. The NIH could also save authors from these dilemmas by mandating deposit and cutting publishers out of the loop, and I expect that publisher policies like this one will help NIH come to that decision. (2) Some Elsevier journals (like Reproductive Toxicology) in effect warn authors about this policy up front on their web sites, but other Elsevier journals apparently follow the same policy without warning authors. Instead, they surprise authors by showing them this paragraph only in the copyright transfer agreement after they've submitted their work and gone through peer review.
(originally posted 2/17/06, 10:48 pm)
OA to geospatial data
Public Geo Data has launched an online petition calling for OA to EU-collected geospatial data. In particular, it calls on the EU Parliament to reject the current draft of the INSPIRE Directive on European Spatial Data Infrastructure, which provides for "cost recovery" instead of OA. Only citizens of the EU may sign the petition.
(originally posted 2/17/06, 5:20 pm)
Jo Walsh, Why Europe Needs to Provide its Own Public Geodata, Directions Magazine, February 20, 2006. Excerpt:
Last week, I asked my friend Norm Vine, a stalwart of the open source GIS community, if he could introduce me to someone at the U.S. Geological Survey who might be prepared to make a public statement about public access to national mapping data being a good thing for the national economy. He seemed genuinely perplexed for a moment, as if I'd just asked him to put me in touch with a fish that might be prepared to make a public statement on how water is a good thing for the marine ecology. In the U.S., public access to geographic data, "geodata" for short, has always been taken for granted as being part of the national heritage. Norm suggested that in the U.S., one of the reasons to have a government is to have good map data. George Washington himself was a surveyor and mapmaker. The interior of the North American continent was unknown to the colonists. In order to establish autonomy in the face of the colonial powers, they had to create and share accurate spatial models of where they were. At that time, Europe was gripped by bitter feuds over scarce but well-mapped resources. Quarrels resulted in attempts to gain access to the external resources vital to victory....Philip II kept the national maps of Spain under lock and key. Maps were powerful military technologies for both attack and defense, and thus were kept as military secrets. The British Empire published its maps openly and they were reprinted widely. When you look at the modern maps of Newfoundland, of the whole North American Eastern Seaboard, it is the British-applied names that have stayed with us to the present day....Sharing geographic data across borders is a keystone in the European effort to collaboratively manage resources, create fairer governance structures and contribute to each others' economic prosperity. The national mapping agencies (NMA) of Europe sometimes seem to be living in the colonial past. 
Public geodata are kept under lock and key, through copyright and commercial licensing terms that are prohibitive to ordinary citizens and spare-time, free-software enthusiasts who want to undertake amateur GIS projects. Europe's governing agencies, especially those that collect census information and manage resource networks, find it hard to cooperate....The proposed INSPIRE Directive (press release) on establishing a common spatial data infrastructure in Europe is the latest and greatest in a long series of initiatives undertaken by NMA representatives to fix some of these problems. INSPIRE aims to establish common standards for describing the physical world and the things in it, and to establish a framework across which different agencies that collect data can share data with one another. A common framework is something that Europe badly needs to maintain integrity....Each time the INSPIRE Directive draft has gone to a new stage in the co-decision process, the thematic types of data that it covers decrease; therefore the options that the public will have to even view images of geodata, let alone get access to them and work with them in their own GISs, are decreasing. The second version of the INSPIRE Directive has a new emphasis on protecting the intellectual property rights of the agencies that collect and distribute public geodata....As Europe moves into the 21st century, it needs to design a common spatial data infrastructure that works for all of its citizens. INSPIRE is not that infrastructure. The NMAs that designed it are quite rightly fearful for their role, with increasingly viable commercial alternatives to the data they collect and provide on the one hand, and government pressure to privatize formerly state-owned information infrastructure and gain short term profit from it, on the other. 
INSPIRE does not reflect the full debate around, or the full potential in, spatial data infrastructure as a tremendous engine of research innovation, new kinds of economic activity, and a reformed practice of civic engineering. So far the debate has been largely polarized between "information wants to be free!" and "you get what you pay for!" There are plenty of alternative models that can exist....Many European academics, researchers, small business persons and open source software developers are crying out for public access to the geodata that describe their world. They offer many new and accurate insights into how Europe can overcome the description problems inherent in having 25 different spatial models in as many different languages. I started talking about this with Norm Vine, because I've been working with Benjamin Henrion of the Foundation for Free Information Infrastructures. Henrion worked hard to roll back the Software Patents Directive that would have put the brakes on the potential for small businesses and academics in Europe to create their own software. Henrion and I are putting together a wiki website at which people can: learn more about the history of INSPIRE; find out how to get involved in the lobbying process; find others who consider INSPIRE to be designed without proper public consultation and without consideration for the negative economic and social effects that it may have. We've also started a public petition to give to members of the European Parliament. We’re urging them to look again at what must appear to most people outside of the geographic information industry to be a pretty obscure technical directive. But they underestimate the impact it will actually have on how Europe is managed and governed. If you're in Europe, please support this effort by signing the petition, talking to your non-GI friends about it and asking them to sign it too. If you're outside Europe, keep watching this space. 
The decisions made here and now about the next generation of spatial data infrastructure may impact your rights to get access to public geodata describing the world around you, and faster than you think.
Roger Cohen, U.S.-German Flare-Up Over Vast Nazi Camp Archives, New York Times, February 20, 2006. Excerpt:
Tempers are flaring over a United States demand to open to scholars and researchers a huge repository of information about the Holocaust contained in the files of the International Tracing Service at Bad Arolsen, Germany. Based in part on documents gathered by Allied forces as they liberated Nazi concentration camps, the stock of files held by the organization stretches for about 15.5 miles, and holds information on 17.5 million people. It amounts to one of the largest closed archives anywhere. The collection is unique in its intimate personal detailing of a catastrophe, which is what makes the question of open access so delicate. The papers may reveal who was treated for lice at which camp, what ghoulish medical experiment was conducted on which prisoner and why, who was accused by the Nazis of homosexuality or murder or incest or pedophilia, which Jews collaborated and how they were induced to do so. Since the end of World War II the Tracing Service, operating as an arm of the International Committee of the Red Cross, has used the files to help people trace the fates of relatives who disappeared into the murderous vortex of Nazi terror. Now, more than 60 years after the end of the war, the United States says that task is largely done and it is time to open up the archive, copy it so that it can also be stored in other countries and make it available to historians. "The U.S. government favors opening up all records on the Holocaust," said Edward O'Donnell, the special envoy for Holocaust issues at the State Department. "Our objective is to open the archive, and we will continue to push." But that push has met a wall of legal and procedural objections - from Charles Biedermann, the Red Cross official who has been director of the Tracing Service for two decades, and from the German and Italian governments. The atmosphere within the 11-nation international commission that oversees the operation has become poisonous. 
At meetings to discuss the opening of the archive, German officials have asked whether it is really in anyone's interest to have accusations about particular Jews being murderers or homosexuals made public. Because German privacy laws are much stricter than those in the United States, German authorities are concerned that an opening could lead to lawsuits charging that personal information was handed out illegally. Wide access to the papers could also provoke new claims for compensation.
Comment. In this article, "open access" means open to scholars even if the documents are still offline and limited to paper. But I have no doubt that if the archive is opened in that sense, then it will eventually be scanned and become open in our sense as well.
Will Open-Access Data Hurt Journals? Linux Insider, February 19, 2006. An opinion piece signed by Medical Marketing and Media, which describes itself as "a monthly business publication that has been serving healthcare marketers since 1966....MM&M's...editorial advisory board [is] a panel of twelve industry executives representing healthcare manufacturers, advertising agencies, media, and marketing research." Excerpt:
Increasingly, physicians can find raw clinical-trials data and peer-reviewed, if not necessarily polished, presentations of studies online. Will open-source publishing and online posting of clinical-trials data devalue journals? The only real impact will be an increase in high-quality clinical information accessible to Internet search engines. The need to publish or report clinical-trial data is unlikely to impact journals. Data will either be presented raw or formatted into a scientific article and put through the traditional publishing process, an expansion of journal content. The call to have all NIH-sponsored research articles published through PubMed Central within 6 to 12 months of original publication is unlikely to significantly impact traditional journals, where there is an urgency to communicate results...."Open-access" publishing, where authors pay to publish and access to content is free (for example, Public Library of Science), is limited to the peer-reviewed, scholarly literature and has yet to demonstrate economic viability. It is an attempt to develop a different business model for journals, not an attack on the journal concept....[PLoS Clinical Trials] will publish trials irrespective of outcome and impact -- possible because PLoS does not derive income from subscriptions. Second, PLoS Clinical Trials is an open-access journal, allowing anyone to read, redistribute and use articles freely for any lawful purpose. This initiative cannot be said to devalue journals but will increase the reporting and public availability and applicability of trial results....Neither of these initiatives [the NIH policy and OA to clinical trial data? one of these and the rise of OA journals?] will significantly affect that role. A major challenge to scientists today is to identify the most critical, accurate and important information within the voluminous amount of published literature.
Peer review remains the cornerstone of ensuring that research findings are interpreted correctly and presented objectively....The growing concern over bias in the dissemination of medical knowledge may well highlight the value of the peer review and editorial review offered by high-quality journals. Authors generally want their articles to appear in the most respected journals, not the ones with the lowest barriers to access. Because good reputations take time to develop, most of the best journals are still conventionally financed print publications....That will never change completely. "Conventional" journals will continue to evolve in unconventional ways to preserve their advantage, making more content available free online while capitalizing on the strengths of print and online publishing. Open access constitutes a healthy challenge to journals, not a threat.
Comment. Four quick responses. (1) I fully agree that OA --to journal literature and to data-- is a healthy challenge to conventional journals, not a threat, and that conventional journals can evolve in unconventional ways to respond to this challenge. My own view is that the most adaptive mutations will involve some forms of OA, new kinds of added value, or both. (2) By suggesting that conventional (non-OA) journals will survive because they perform peer review, the authors imply that OA journals do not perform peer review. This is false, of course. As of today, the DOAJ lists 2047 peer-reviewed OA journals worldwide and adds more almost every day. (3) The authors are right that most OA journals, because they are new, lack the prestige of more established conventional journals. But their conclusion that OA journals will never catch or surpass non-OA journals in prestige is an empty, unargued reassurance. (4) Their view that "authors generally want their articles to appear in the most respected journals, not the ones with the lowest barriers to access" invents a trade-off where there isn't one. Authors want both high prestige and wide access. And they can have both today, either through high-prestige OA journals (like PLoS Biology) or through the combination of high-prestige conventional journals and self-archiving in an OA repository.
JISC has launched a consultation on Internet Archaeology, the journal published by the Council for British Archaeology. JISC is considering an investment that would allow IA to become OA. The consultation is to learn the value of IA to users, the importance of OA for IA, and whether this investment should stand high or low on JISC's priority list. Responses are due by April 7, 2006.
A few new independent, Open Access journals have already launched this year on the BioMed Central platform. In addition, there's a title change and a journal that has moved to its third publisher in less than two decades. Geochemical Transactions is in the process of transferring the files originally published with the Royal Society of Chemistry and the American Institute of Physics to its new website. In the meantime, the content is freely available from its original locations.
Biology Direct - Fulltext v1+ (2006+); ISSN: 1745-6150.
Geochemical Transactions - Fulltext v1+ (2000+); ISSN: 1467-4866.
Substance Abuse Treatment, Prevention, and Policy - Fulltext v1+ (2006+); ISSN: 1747-597X.
Trials - Fulltext v7+ (2006+). Continues Current Controlled Trials in Cardiovascular Medicine; ISSN: 1745-6215.
Current Controlled Trials in Cardiovascular Medicine - Fulltext v1-6 (2000-2005). Continued by Trials; ISSN: 1468-6708.
Hindawi Publishing has embraced Open Access publishing in a major way. Hindawi is an established, low-cost, for-profit STM journal publisher that has decided that OA for-profit publishing is a viable model, both for start-up titles and by converting formerly subscription-based titles. In December 2005, Hindawi had a slate of 12 OA titles. The list has since expanded to 14 titles with the stated intent to grow by another 15 journals by the end of 2006.
* Bioinorganic Chemistry and Applications - Fulltext v1+ (2003+); Print ISSN: 1565-3633, Online ISSN: 1687-479X.
* Differential Equations and Nonlinear Mechanics - Fulltext (2006+); Print ISSN: 1687-4099, Online ISSN: 1687-4102.
* EURASIP Journal on Audio, Speech, and Music Processing - Forthcoming; Print ISSN: 1687-4714, Online ISSN: 1687-4722.
* EURASIP Journal on Bioinformatics and Systems Biology - Forthcoming; Print ISSN: 1687-4145, Online ISSN: 1687-4153.
* EURASIP Journal on Embedded Systems - Forthcoming; Print ISSN: 1687-3955, Online ISSN: 1687-3963.
* EURASIP Journal on Wireless Communications and Networking - Fulltext (2004+); Print ISSN: 1687-1472, Online ISSN: 1687-1499.
* International Journal of Biomedical Imaging - Fulltext (2006+); Print ISSN: 1687-4188, Online ISSN: 1687-4196.
* International Journal of Image and Video Processing - Forthcoming; no ISSN posted.
* International Journal of Photoenergy - Fulltext v1-7 (1999-2005), (2006+); ISSN: 1110-662X.
* International Journal of Rotating Machinery - Fulltext v1-10 (1994-2004), (2005+); Print ISSN: 1023-621X, Online ISSN: 1542-3034.
* Journal of Biomedicine and Biotechnology - Fulltext v1-2 (2001-2002), (2003+); Print ISSN: 1110-7243, Online ISSN: 1110-7251.
* Journal of Nanomaterials - Forthcoming; Print ISSN: 1687-4110, Online ISSN: 1687-4129.
* Mathematical Problems in Engineering - Fulltext v1-8 (1995-2002), (2003+); Print ISSN: 1024-123X, Online ISSN: 1563-5147.
* PPAR Research - Forthcoming; Print ISSN: 1687-4757, Online ISSN: 1687-4765.
Six consecutive postings from February 17 and 18 have disappeared from Open Access News. I didn't delete them and I'm trying to find out what happened. I can reconstruct them and will re-post them soon. But in the meantime, here are the titles (in blog order, most recent first):
All six are missing from the OAN front page, where, because of their dates, they should still be on display. But oddly, only the last five listed (the first five chronologically) are missing from the OAN archive page for February 12-18.
If you look around the blogosphere, especially at the blog search engines, you'll find links to the missing posts. So they did make it out, through the web and my RSS and Atom feeds, before disappearing. The feeds no longer contain the posts because they are polling a source that no longer contains them. My apologies to readers who follow those links and can't find even a shadow of the missing substance. (Until I can re-post the substance, here's a shadow.) I'm as baffled and frustrated as you are.
It's too early to suspect mischief. But to rule it out, I'd appreciate hearing from any Blogger experts who might be able to suggest innocuous theories about what might have happened.
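The feed behavior described above can be sketched concretely: an aggregator only mirrors whatever the source feed currently serves, so once the posts left Blogger's database they left the feed too. A minimal illustration using Python's standard library, with a made-up Atom feed standing in for the real one:

```python
# Sketch of why deleted posts also vanish from readers' feeds: aggregators poll
# the feed, and the feed only contains what the blog's database contains now.
# The feed XML below is a fabricated stand-in, not the real Open Access News feed.
import xml.etree.ElementTree as ET

ATOM_NS = "{http://www.w3.org/2005/Atom}"

feed_xml = """<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom">
  <title>Example blog</title>
  <entry><title>OA to geospatial data</title></entry>
  <entry><title>Esther Dyson on open content</title></entry>
</feed>"""

def current_titles(xml_text):
    """Return the entry titles the feed serves right now."""
    root = ET.fromstring(xml_text)
    return [e.findtext(f"{ATOM_NS}title") for e in root.iter(f"{ATOM_NS}entry")]

titles = current_titles(feed_xml)
# A post dropped from the database simply never appears in a fresh poll:
print("New Elsevier policy on NIH-funded authors" in titles)  # False
```

Readers whose aggregators cached the entries before the deletion would still see them, which is why the blog search engines retained links to the missing posts.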
Teemu Arina, Drucker got something wrong: from formal to informal, Tarina, February 18, 2006. Excerpt:
Over 10 years ago, in 1994, Peter Drucker gave a lecture at Harvard University’s John F. Kennedy School of Government about knowledge workers....When saying that “with knowledge being universally accessible there are no excuses for nonperformance”, he probably didn’t predict the importance of the internet as an informal channel, but rather meant education as something that is universally accessible to all. What he didn’t know is that the internet would finally open access to all: the lowered transaction costs for learning get anyone in contact with anyone else. Those who are eager to learn, especially, will find much wiser people in online social networks than are available locally. Social software will boost this even further.
Merrill Goozner, Roadblocks on the Medical Information Superhighway, GoozNews, February 18, 2006. Excerpt:
Among the four major medical journals, the British Medical Journal is my favorite. It has the most extensive news coverage of the fight against the infectious diseases that are ravaging the developing world, a subject in which I take a keen interest. And it consistently prints iconoclastic studies that take on the medical establishment. And, until this year, it was entirely free on the web. That made it distinct from the other big-time journals (the New England Journal of Medicine, The Lancet, and the Journal of the American Medical Association), which charge hefty fees to view individual articles of interest if you aren’t a subscriber. But, alas, BMJ’s open access policy went the way of the dodo bird this January, leaving medical consumers locked out of all the major journals. Accessing some of BMJ’s original research articles and all of its news section now requires a subscription to the journal. It’s testimony to the poverty of the public policy debate in this country that no one in the mainstream media has yet raised this issue – access to cutting edge medical information – in response to the Bush administration’s push for “consumer-driven health care” through individual health savings accounts. How are consumers to choose wisely when and where to get health care if they are cut off from key sources of information?...[I]f I were to get deathly ill, I wouldn’t trust myself for a minute to have figured out all the options for speeding my own recovery. Even in routine care, I need professional help to sort out the issues. For instance, I had a colonoscopy three years ago and got a clean bill of health. I’m in my mid-50s. Do I need another one? Do I want to make that determination based on the size of my health savings account or some uninformed speculation weighing my need to save that cash for a rainy day against preventive medicine? Of course, I could go to the internet for advice.
But if I search “colonoscopy frequency” on Google, I find a similar set of unspecific advisories on for-profit and non-profit websites....Before I trusted the internet to help me make this decision, I would prefer to read a solid review of all the most recent studies printed in a well-respected, peer-reviewed journal like the BMJ. But the change in policy doesn’t allow me that option anymore.
Comment. A common publisher argument against OA to medical preprints is that lay readers aren't equipped to judge them and need the benefit of peer review. Goozner has exactly the right response: OK, so give us OA to the peer-reviewed research!
BioMed Central (BMC) is a for-profit Open Access journal publisher. BMC hosts independent Open Access journals, in addition to publishing the BMC series and a few others. Currently BMC hosts 81 active independent Open Access journals. Here are the 21 independent OA titles incubating at BMC:
Algorithms for Molecular Biology; ISSN: 1748-7188.
Annals of Surgical Innovation and Research; ISSN: 1750-1164.
Carbon Balance and Management; ISSN: 1750-0680.
Cell Division; ISSN: 1747-1028.
Chinese Medicine; ISSN: 1749-8546.
Diagnostic Pathology; ISSN: 1746-1596.
Education for Evidence-Based Practice; ISSN: 1750-1598.
Implementation Science; ISSN: 1748-5908.
International Breastfeeding Journal; ISSN: 1746-4358.
Journal of Biomedical Discovery and Collaboration; ISSN: 1747-5333.
Journal of Brachial Plexus and Peripheral Nerve Injury; ISSN: 1749-7221.
Journal of Cardiothoracic Surgery; ISSN: 1749-8090.
Journal of Orthopaedic Surgery and Research; ISSN: 1749-799X.
Molecular Neurodegeneration; ISSN: 1750-1326.
Nanotechnology, Diagnostics, and Therapeutics; ISSN: 1750-1156.
Orphanet Journal of Rare Diseases; ISSN: 1750-1172.
Philosophy, Ethics, and Humanities in Medicine; ISSN: 1747-5341.
Radiation Oncology; ISSN: 1748-717X.
Scoliosis; ISSN: 1748-7161.
Synthetic and Systems Biology; ISSN: 1747-8332.
World Journal of Emergency Surgery; ISSN: 1749-7922.
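The ISSNs above carry a mod-11 check digit (per the ISSN standard, ISO 3297): the first seven digits are weighted 8 down to 2, and the final character (0-9, or X for a check value of 10) makes the weighted sum divisible by 11. A minimal validator sketch in Python (the function names are my own):

```python
def issn_check_digit(issn):
    """Compute the check digit from the first 7 digits of an ISSN."""
    digits = issn.replace("-", "")[:7]
    # Weight positions 1..7 by 8 down to 2, then take the mod-11 complement.
    total = sum(int(d) * w for d, w in zip(digits, range(8, 1, -1)))
    check = (11 - total % 11) % 11
    return "X" if check == 10 else str(check)

def is_valid_issn(issn):
    """True if the ISSN has 8 characters (ignoring hyphens) and a correct check digit."""
    s = issn.replace("-", "").upper()
    return len(s) == 8 and issn_check_digit(s) == s[-1]
```

For example, `is_valid_issn("1749-799X")` is true for the Journal of Orthopaedic Surgery and Research above, where the trailing X encodes a check value of 10.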
Heather Morrison, Peer Review & Replication: A Tale of Two Experiments, Imaginary Journal of Poetic Economics, February 18, 2006. Excerpt:
[I]n science, replicability of a study is a far better means of assessing the accuracy of research results than peer review. In an open source science environment, many experiments could be replicated in less than the average time it takes to publish a peer-reviewed paper....In the electronic environment, it should be possible to develop means of tracking replications of a given experiment. This does not necessarily mean that peer review would not be necessary and desirable....
Comment. Exactly. By permitting wider and more rapid dissemination than conventional publication, OA permits wider and quicker uptake, confirmation, and disconfirmation. This benefits everyone who cares about research and it doesn't exclude any form of peer review (traditional or innovative). Let a thousand forms of quality control bloom -- or more simply, let's take full advantage of OA and the internet. They don't strap us into a single model of quality control but free us to try new ones and use multiple models simultaneously.
Hochschulleitungen diskutieren Aktionsprogramm zu Open Access [University leaderships discuss an action program on Open Access], a press release from the German Rectors' Conference (Hochschulrektorenkonferenz, HRK), February 16, 2006. (Thanks to netbib.) A short summary of the OA discussion at a recent meeting of the HRK. Many of the rectors and guest speakers argued that German universities ought to adopt policies supporting OA through repositories and journals. Read the original German or Google's English.
Modern mathematical proofs changing due to collaborations, computers, a press release from Washington University, February 18, 2006. (Thanks to William Walsh.) Excerpt:
Steven Krantz, Ph.D., professor of mathematics in Arts & Sciences at Washington University in St. Louis, said that it is becoming more difficult to verify proofs today and that the concept of the proof has undergone serious change over the course of his 30-plus-year career....Today, many mathematical papers claiming proof of an unsolved problem are often posted on a non-peer-reviewed preprint server called arXiv, located at Cornell University and approved by the American Mathematical Society. "I think that arXiv is a great device for dissemination of mathematical work," Krantz said. "But it is not good for archiving and validation. The reason that arXiv works so well is that there is no refereeing. You just post your work and that is it. Furthermore, those interested in certain subject areas are automatically notified of new postings. The work gets out there quickly, and it's free. Everybody has access to arXiv. But there is no peer review." "Publishing is a process that involves vetting, editing, and several other important steps. We must keep that issue separate from dissemination. And dissemination is important in its own right. But it's a separate issue."...
Comment. Krantz is right that arXiv is primarily an access or dissemination service, not a vetting or peer-review service. I'm aware that some physicists and mathematicians believe that the most important kind of vetting takes place independently of journals, and prior to journal submission, in preprint exchanges and their associated discussions. Whether that kind of vetting is adequate or inadequate is a good question. I won't weigh in on that except to point out that it's a question about the adequate or preferable forms of peer review, not about the value of arXiv or the nature of open access.