Open Access News

News from the open access movement


Saturday, October 15, 2005

November issue of Walt Crawford's Cites & Insights

The November issue of Walt Crawford's Cites & Insights is now online. This issue has a long section on Library Access to Scholarship, covering good general OA resources; selected news items; reflections on the access policies of the NIH, Wellcome Trust, and RCUK; the controversy between the ACS and PubChem; the ALPSP response to the RCUK policy; and recommended recent journal articles on OA-related topics. Excerpt:
The best sources for news and perspectives on open access continue to be Peter Suber’s Open Access News weblog (and the monthly newsletter from Suber that’s publicized on the blog); Charles W. Bailey, Jr.’s Scholarly Electronic Publishing Weblog; and Charles’ other blog, DigitalKoans. Those aren’t the only sources. In her new job, Dorothea Salo’s been offering some fascinating posts at Caveat Lector, about the realities of running a DSpace installation. There are others....

As Peter Suber notes in SPARC open access newsletter 90 (October 2, 2005), [the Wellcome Trust open-access policy] “does not require publisher consent and therefore does not accommodate publisher resistance” --which should also be true of NIH and RCUK policies....

I’m alarmed --alarmed that a society of chemists [the ACS] is headed by someone [Madeleine Jacobs] capable of making such exaggerated claims. Jacobs goes on to note that NIH’s $30 billion budget dwarfs the ACS budget and says NIH “should use its money to support research grants to advance its mission.” (I would suggest that using one-one hundredth of one percent of that money to assure access to research results might be considered an effective way to advance NIH’s mission, but I’m not ACS.)...

[On the early data on the compliance rate for the NIH public-access policy:] Those are appallingly low figures, suggesting that the voluntary process may not be working....

[On the ALPSP response to the draft RCUK policy:] The response from ALPSP is so predictable that it’s hardly worth recounting....Disastrous scenario. Destroyed viability. Threat to peer review. And the indirect assertion that it is the responsibility of “cash-strapped libraries” to subsidize the non-publishing activities of professional societies. One wonders what ALPSP believes “cash-strapped libraries” will do if ALPSP and its allies succeed in making sure that there are no alternatives to current journal prices and practices. Stop buying monographs altogether? Lay off staff? Or, ahem, cancel subscriptions even if that means less access?....

[On the ALPSP concession that publishers don't conduct peer review but only organize it:] When submissions and refereeing are handled electronically, that cost [of organization] should amount to a modest spreadsheet or database (say, MySQL or Access) and a tiny amount of someone’s time to track papers and results: The kind of thing that a good administrative assistant in an academic department could handle in a few hours a week for a midrange journal handling 100-200 submissions a year.

British Library plans for National Digital Library

Mark Chillingworth, BL recruits to define digital library, Information World Review, October 14, 2005. Excerpt:
Ruth Jones has been appointed head of development at the British Library (BL), a new role that will involve redefining the role of the BL as a National Digital Library. Jones was the founding general manager of Extenza e-Publishing, seeing the company through start-up to sell-off, and will use this business experience in her new role. “I think it’s interesting that they brought someone in with a commercial background. I think they wanted someone with a strong empathy with the information industry,” she said. Jones is the latest in a long line of BL recruits from the private sector. The BL strategy for 2005 to 2008 makes the digital library a priority. “Our challenge is to redefine the role and purpose of the British Library in the information revolution of the 21st century,” it said. Jones told IWR that BL’s centrality to the UK information industry made its digital ambitions pivotal. “All routes lead there,” she said. The National Digital Library is already in a pilot stage with 23 journal publishers contributing electronic content to the archive. Jones said the difference between the National Digital Library and the growing number of archives being launched by commercial publishers is the BL’s remit to preserve content in perpetuity. [PS: What about open access?] A team of six licensing experts and five product development experts have been pulled together to direct development, liaise with publishers and work with all areas of the BL in development. Jones said BL Direct, its web-based document delivery service, is the platform from which she will launch the digital library. The trial is currently taking in PDF content, but is looking at a long-term technology that will provide access to future generations. Full-text XML is the preferred technology, but Jones said there needs to be a move towards international standards on archiving digital content.

Forthcoming thesis on OA citation patterns

A pseudonymous grad student with a blog has posted a note about her dissertation topic:
My thesis focuses on informal citation patterns (citation on the web--including blogs, list-serves, syllabi, web sites, etc., as well as vitas, course pages, and other online articles) as a result of publishing in the OA journal, PLoS Biology, comparing these citation patterns to a similar print-only journal citation pattern.
She poses several questions about the topic and would probably welcome answers and suggestions through the comment section of her blog.

OA from the start of a research project

Heather Morrison, Immediate Open Access: Every Step of the Way, Imaginary Journal of Poetic Economics, October 14, 2005. Excerpt:
Here is a thought for the SSHRC Consultation on Open Access: why wait until publication for open access? Indeed - why wait until the research is completed, or even begun? What if a researcher were to post the details of a SSHRC funded research grant just as soon as confirmation of the funding was received? What if this were linked to background information, and supplemented by research data as soon as it was available? Then each draft of the research report as it became available - along with openly available peer reviews, then the final peer-reviewed versions when this was available? With this approach, Canadians (and others, of course) could watch the research as it progressed. For some researchers, this might help to attract needed study participants. In some cases, perhaps students at various levels could conduct some of the research - or, perhaps the students could carry out parallel studies. Would there be excitement? There is no suspense with a completed study - but what about research in progress, where the results are not yet known?

Friday, October 14, 2005

Fred Friend responds to the Kaufman-Wills report

Fred Friend has posted a response to the Kaufman-Wills report published by ALPSP. Excerpt:
The recent ALPSP report on "The facts about open access" is a valuable contribution to our understanding of the journals route to open access, and I pay tribute to the sponsors of this report. It would have been more helpful if the word "journals" had been added to the title, because the report does not cover open access itself but one particular - albeit important - route to open access....By and large the report is fair and balanced, with some perhaps inevitable bias towards the existing journal publication system. The third "conclusion", that "peer review and copy-editing may be less rigorous with full open access journals", is not factual, being based upon an assumption that internal review and new forms like post-publication review are less rigorous than conventional peer review. And the "online usage" statistics in Table 23 do not appear to have been adjusted to reflect the fact that DOAJ journals usually contain fewer articles, an adjustment which would give a fairer picture of the article download and full text page view situation. It would also have given a fairer picture of open access journal income if the facts in Table 24 about the substantial numbers of print subscriptions open access journals are often able to attract had been highlighted. Many of the facts in the report, as is acknowledged in Sally Morris' good "Introduction", reflect the youth of open access journals, and therefore cause no surprise. The finding I find most fascinating is that "most journals surveyed are planning to test or adopt a different business model in the next three years". The study undertaken by Mary Waltham for JISC on learned society business models demonstrated that the current dependence upon institutional subscriptions cannot be relied upon to provide a secure future, and the news that so many journals are looking for change is good news for us all. My appeal to publishers is to discuss their plans more openly with those organizations and individuals who wish them well. I agree with Sally that we need more factual studies of the changes taking place in scholarly communication, and there may be ways in which the work of JISC and other organizations can build upon the results of the ALPSP study.

Jan Velterop responds to the Kaufman-Wills study

Jan Velterop has posted a response to the Kaufman-Wills report published by ALPSP. Excerpt:
[1] The ALPSP, AAMC and HighWire journals considered are well established; on average, they published their first print issues 40 years ago. The DOAJ (Directory of Open Access Journals) titles are much newer; half started publication during the last decade. Might this mean that the comparison was not exactly ‘like-for-like’? Furthermore, the comparison was between subscription journals and fully open access ones; no hybrid journals (i.e. journals that offer the authors a choice) were considered, where that model is obviously the most suited for established journals. [2] Most ALPSP journals made a surplus (75%). 41% of the full OA journals made a loss; 24% broke even, and 35% made a surplus. Might this mean that 1/4 of the long-established ALPSP journals still doesn’t make a profit and that almost 60% of the new open access journals at least break even, and well over half of those are profitable? If the comparison had been between these new open access journals and equally new traditional journals, headlines such as “Open Access Publishing ‘Unprofitable’” (as reported in The Bookseller) would not likely be justified. [3] Furthermore, if one takes a look at the open access journals in the Directory of Open Access Journals, one sees quite a few subsidised journals, of the sort that would also be subsidised were they traditional journals. The ALPSP study hardly, therefore, seems to offer a comparison that enlightens us. Or maybe it does, if indeed nearly 60% breaks even, which is, given the sort of journals we are looking at, quite phenomenal. [4] Of the ALPSP group of journals, 34% are planning to test or adopt a new online business model sometime in the next three years. That's great. It means that quite a few are willing to look forward rather than back.

BMC responds to the Kaufman-Wills report

BioMed Central has released a response (October 14) to the Kaufman-Wills report published by ALPSP. Excerpt:
BioMed Central welcomes objective research into open access publishing. Unfortunately, however, the report published by ALPSP this week ("The Facts about Open Access") contains significant factual inaccuracies. We also disagree with many of the report's interpretations and conclusions. The two most serious problems with the report are that it inaccurately describes the peer review process operated by BioMed Central's journals, and it also draws unjustified conclusions concerning the long-term sustainability of open access journals. The overview of the report incorrectly states that BioMed Central does not operate external peer review on most of its journals. In fact, all of BioMed Central's journals operate full peer review using external peer reviewers....the BioMed Central/ISP group of journals is reported to offer online manuscript submission on a lower percentage of journals than other journal groups. The report picks up on this as a surprising finding, suggesting implicitly that open access journals are lagging behind in this regard. In fact, BioMed Central offers online submission of manuscripts on every one of its journals. Not only that, but BioMed Central's manuscript submission system is widely praised by authors, many of whom tell us that it is the best online submission system they have used....

ALPSP Chief Executive Sally Morris comments in her introduction to the report that "Over 40% of the Open Access journals are not yet covering their costs and, unlike subscription journals, there is no reason why the passage of time - evidenced in increasing submissions, quality or impact - should actually change that". She goes on to suggest that this calls into question the sustainability of the open access publishing model. The suggestion that the economics of open access journals are unlikely to improve over time is not supported by the evidence in the report, and runs strongly counter to BioMed Central's direct experience. According to BioMed Central Publisher, Dr Matthew Cockerill, "The fact that many open access journals currently operate at a loss is simply a sign that these are early days. There is every reason to think that the passage of time will profoundly improve the ability of open access journals to cover their costs. Between September 2004 and September 2005, for example, the journal BMC Bioinformatics almost trebled the number of submissions it received. It also increased its article processing charge during that same time period. Both factors have helped move BioMed Central much closer to overall profitability, and this progress is continuing."

Further evidence for a promising future for open access journals is given in the study's findings on revenue expectations and trends. 92% of open access journals were meeting or exceeding revenue expectations, in comparison to 91% of AAMC journals, 83% of ALPSP journals and 76% of surveyed HighWire journals. Similarly, the study finds that revenues from the last fiscal year to the current fiscal year are "trending upward" for 71% of 209 surveyed open-access journals, compared to between 27% and 67% of subscription-based publishers that were surveyed.

Google reaches out to librarians

Google has launched a Librarian Center to help librarians "keep up to date with Google". (Thanks to Richard Ackerman.)

Interview with Laurent Romary on OA

Laurent Romary, Disposer d’un espace où les données circuleront librement, Libre Accès à l'information scientifique & technique (from INIST-CNRS), October 14, 2005. An interview on OA, especially OA to data. Romary is the director of the French National Institute for Research in Computer Science and Control (INRIA).

OA increases citation impact outside academic literature too

Alec Banning, Possible correlation between citation and cost of content, Librarians at the Gate, October 13, 2005. We know that OA increases citation impact for academic literature. Now we have evidence of the same connection for newspaper columns. Citations to three popular New York Times columnists have decreased since the Times decided to make them TA rather than OA. It appears --no surprise-- that the connection between OA and citation impact is a general phenomenon.

The Adelphi Charter to protect the public domain and advance OA

The Royal Society for the Encouragement of Arts, Manufactures and Commerce (RSA) has published the Adelphi Charter on Creativity, Innovation and Intellectual Property (October 13, 2005).

From the charter web site:

The Adelphi Charter on Creativity, Innovation and Intellectual Property responds to one of the most profound challenges of the 21st century: How to ensure that everyone has access to ideas and knowledge, and that intellectual property laws do not become too restrictive. The Charter sets out new principles for copyrights and patents, and calls on governments to apply a new public interest test. It promotes a new, fair, user-friendly and efficient way of handing out intellectual property rights in the 21st century. The Charter has been written by an international group of artists, scientists, lawyers, politicians, economists, academics and business experts.

From the charter text itself:

Creativity and investment should be recognised and rewarded. The purpose of intellectual property law (such as copyright and patents) should be, now as it was in the past, to ensure both the sharing of knowledge and the rewarding of innovation. The expansion in the law’s breadth, scope and term over the last 30 years has resulted in an intellectual property regime which is radically out of line with modern technological, economic and social trends. This threatens the chain of creativity and innovation on which we and future generations depend....[3] The public interest requires a balance between the public domain and private rights. It also requires a balance between the free competition that is essential for economic vitality and the monopoly rights granted by intellectual property laws. [4] Intellectual property protection must not be extended to abstract ideas, facts or data. [5] Patents must not be extended over mathematical models, scientific theories, computer code, methods for teaching, business processes, methods of medical diagnosis, therapy or surgery. [6] Copyright and patents must be limited in time and their terms must not extend beyond what is proportionate and necessary. [7] Government must facilitate a wide range of policies to stimulate access and innovation, including non-proprietary models such as open source software licensing and open access to scientific literature. [8] Intellectual property laws must take account of developing countries’ social and economic circumstances....There must be an automatic presumption against creating new areas of intellectual property protection, extending existing privileges or extending the duration of rights.

For background, see the charter FAQ or James Boyle's column in the October 14 issue of The Guardian, Protecting the public domain. Excerpt from Boyle:

We have forgotten the fundamental truth that Jefferson and Macaulay understood so well. Property rights are only half of the system. Just as important is the realm of material that is not owned, the public domain, the raw material from which the next invention, novel or song will spring. This debate has been an uneven one over the past few years. The impressive report of the UK Commission on Intellectual Property Rights, which the government seems to have forgotten already, marked a momentary departure from faith-based policy. There have been isolated victories, defensive levees raised here and there against the rising tide of monopoly rents. What has been missing is a positive statement of what good intellectual property policy is. But perhaps things are changing a little. On October 13 in the Great Room of the RSA [Royal Society for the Encouragement of Arts, Manufactures and Commerce], a first stab at answering that question is due to be released. Called the Adelphi charter, it is an attempt to lay out those principles. Central among them are the ideas that policy should be evidence-based and that it should respect the balance between property and the public domain, not eliminate the latter to maximise the former. Full disclosure: I was among those who came up with the idea for the charter even though I can claim scant authorship credit in the text that resulted: the steering committee's members ranged from a Nobel laureate who helped sequence the human genome, to an executive who worked in streaming media. But somehow it seems fitting that the charter will be launched in a room that bears (and bares) eloquent testimony to the fact that these ideas are neither new nor radical.

"Kahle's goal is universal access to all knowledge"

Becky Hogge profiles Brewster Kahle in the October 17 New Statesman. Excerpt:
Like other successful net entrepreneurs, Kahle has ploughed his spoils into a non-profit endeavour, and the result is the Internet Archive, an attempt to achieve what the ancient Greeks and Egyptians tried at the library of Alexandria: to make a permanent record of all human knowledge. But this time around the library will be on the internet, universally accessible and - crucially - flameproof....Yes, Kahle's goal is universal access to all knowledge, and so the Internet Archive aims to make every book ever written available over the web. "The ancient library of Alexandria collected 75 per cent of all the books of all the peoples of the world in 300BC. Our opportunity is to do that again, but then to one-up the ancients by making it available universally. It is technologically within our grasp and it could be one of the greatest achievements of humankind." In building this library, Kahle has had to sue his own country. The case, which is awaiting a hearing date, argues that copyright regulations enacted in anticipation of the digital age are contrary to the First Amendment. He explains: "I think it will come as a surprise to most people that the library system of physical books we grew up with has been made risky, if not illegal, in the digital world. This makes no sense. We needed some clarification as to what we as a library were allowed to do with, for example, out-of-print and orphan works, things that are traditionally found on the shelves of libraries. In the US, the way you ask a question like that is you file a lawsuit." By chance, I catch him on the morning he has announced the Open Content Alliance, a partnership between the Internet Archive, the search engine Yahoo! and the University of California to digitise historical works of fiction. The announcement comes nine months after Google launched a similar, huge-scale digitisation project. The two projects differ in the way they propose to offer the digital books once they have scanned them. Whereas Kahle's team would make full-text, searchable copies available to all users of the web, Google will allow access only from the libraries in which the original books are stored, and as extracts through its commercial service Google Print. When I ask Kahle how he feels about Google's project, he takes a deep breath. "We applaud the enthusiasm of Google to make steps in this area and digitise materials. But we would rather see an open system applied for the open content arena, in the same tradition as the open networks that brought us the internet. That fits much better with civic institutions like libraries."

Thursday, October 13, 2005

TA article on OpenDOAR

OpenDOAR or Directory of Open Access Repositories, Information Services and Use, 25, 2 (2005). The author is not indicated and no abstract is online, at least so far.

More on the Kaufman-Wills report

Randy Dotinga, Open-Access Journals Abound, But Will They Survive? Forbes, October 13, 2005.
To make money, some open-access journals rely mainly on advertising. But others require study authors [PS: or their funders or employers] to pay printing fees. This is very different from the traditional approach of journals such as the New England Journal of Medicine or Science, which charge subscription fees. In some cases, especially among obscure journals with small audiences, subscription costs can run as high as $4,000 or $5,000 a year [PS: or over $20,000/year]. "Most research is done for the benefit of the public, yet the public cannot access it," said Dr. Gavin Yamey, senior editor of Public Library of Science Medicine, a leading open-access journal that charges study authors $1,500 to run their studies. Earlier this year, there were an estimated 1,525 open-access journals in the world, making up perhaps 5 percent to 10 percent of the total. The ones that charge authors have been controversial because critics fear they'll support shoddy research to make money. [PS response.] "In our capitalist society, one of our basic tenets is who pays the fiddler calls the tune," Dr. Jeffrey Drazen, editor-in-chief of the prestigious New England Journal of Medicine, said at a meeting of health journalists earlier this year. But Yamey said editors at his journal don't know whether the authors paid for placement or not (some can get waivers if they can't pay.) And Yamey, along with other open-access editors, reverses the conflict-of-interest accusation, pointing out that many traditional journals can easily be corrupted by advertisers such as pharmaceutical companies.

But what about the viability of open-access journals? Can they make it financially by providing content for free, regardless of whether they accept advertising or charge study authors? A new study suggests that staying alive financially will be a challenge. Researchers studied survey results from 495 open-access journals and found little evidence that many editors have found a way to stay afloat. In fact, 40 percent of the surveyed journals are failing to break even. "Many of them say they have no business plan," said Mark Frankel, co-director of the American Association for the Advancement of Science's Project on Science and Intellectual Property in the Public Interest, which helped fund the study. "They're committed to the social good, they want to get the (research) literature out, but they haven't thought about where they want to be financially." The good news? The growth of open-access journals has convinced traditional journals to allow more people to view their content for free. "The principle of open access is a good one," Frankel said. "It's difficult to disagree with."

The British Academy comment on the draft RCUK policy

The British Academy has publicly released its August comment on the draft RCUK open-access policy. (Thanks to Gerard Lowe.) Excerpt:
[1]...The British Academy responds as the UK national academy for the humanities and social sciences, as a funder of research (with both public and private funds), and as a learned society with its own publishing programme. [2] In April 2005 the Academy published a policy review document on E-resources for research in the humanities and social sciences, which addresses the issues raised in the RCUK statement. The report supported ‘the principle of wide and ready access to research outputs and other research resources’. In particular it stressed how important it was for the humanities and social sciences to engage with open access issues, so that the agenda was not over-dominated by the natural sciences. [3] The RCUK position statement appears to be driven primarily by considerations that relate to the natural sciences....In the humanities, the dissemination of scholarship is less dominated by journal articles and conference proceedings: monographs continue to play a key role. Scholarship can be less driven by the very latest published findings: articles published 30–50 years ago remain important....[4]...The RCUK position implies that an alternative system will have to be devised and implemented. The statement acknowledges that new models will require new solutions, but provides little firm evidence in support of its optimism that these solutions will be found. There are doubts that need to be addressed. [5] The cost in money and time of establishing and maintaining institutional or other repositories should not be underestimated. The statement is vague about likely costs, where the funding will come from, and indeed whether this will be more cost-effective than the existing model....[W]ill there be adequate support for individual researchers seeking to deposit their material? And it is surely doubtful whether learned societies across the humanities and social sciences are equally willing or geared up to take on any ‘kite marking’ responsibilities — at least without any reimbursement of the associated costs. [6] The statement is also vague about the costs associated with open access journals. A typical ‘author-pays’ fee of £1500 might not constitute a significant addition to a typical research grant in the natural sciences, but it would form a significant percentage increase on the small individual grants that are common in the humanities and social sciences. Where is this additional funding to come from? Indeed much output in the humanities does not derive from research grant funding at all: is it likely that funds will be available just for fees? [7]...There is also the question as to whether institutional repositories are best suited to meet the needs of individual researchers, and whether parts, or even all, of the academic community might be better served by subject repositories....[8] With such doubts about future models, one would expect that the existing publishing model should not be undermined in the meantime. The RCUK position accepts that articles should be deposited in e-print repositories ‘subject to copyright and licensing arrangements’, but makes clear its view that such restrictions should be as liberal as possible. 
The Academy is not surprised that some university presses are continuing to assert limitations to defend the value that they provide through the peer review process — for example, imposing a delay in access....[9] An equivalent requirement to deposit articles is not being imposed on British Academy research grants awarded in the academic year 2005/06 because the terms and conditions have already been set and publicly announced. The position will be kept under review, particularly in light of the availability of suitable repositories.

Comment. Four quick replies. (1) On #4: The RCUK isn't seeking a solution yet to be found. It's funding a solution that it's already found. (2) On #5: The cost in time and money of maintaining institutional repositories should not be overestimated. Repositories not already funded are likely to be funded by JISC. Moreover, since the network of interoperable repositories will supplement, not supplant, "the existing model", the call for a comparison of their cost-effectiveness is misleading. (3) On #7: Nothing in the RCUK policy rules out subject repositories or the simultaneous deposit of RCUK-funded research in more than one repository. (4) On #8: We'll have to agree to disagree about whether RCUK should close the copyright loophole in the current draft. Since it allows publishers to impose embargoes of arbitrary length, the loophole effectively removes the teeth from the OA "mandate" and thereby puts publisher prosperity ahead of research productivity.

JISC will fund the launch, filling, and searching of OA repositories

Mark Chillingworth, JISC win digitisation and repository funding, Information World Review, October 13, 2005. Excerpt:
An extra £80 million is to be invested in digital information for education and research by the Joint Information Systems Committee (JISC), it was announced today. JISC has been awarded the funds by the government's Spending Review and the Higher Education Funding Council for England (HEFCE). HEFCE's Strategic Development Fund has awarded JISC £30 million over two years, whilst the Spending Review, part of the Treasury Department, has awarded £50 million over two years to enable JISC to enhance the digital infrastructure of the UK. "The UK leads the world in the use of ICT to support education and research, but continued investment is crucial to the future competitiveness of both," said Sir Howard Newby, HEFCE chief executive, of the award. JISC has outlined five strands that will receive funding. Universities will receive policy advice and technical guidance in setting up digital repositories, with JISC using the funding to develop a critical mass of repository content and a national structure that will allow users to search across federated repositories. Funding will also be applied to investment in the digitisation of scholarly resources in partnerships.

Also see the JISC press release (October 13, 2005).

Another journal adopts author-choice OA

David S. Reeves, C.W. Drummond, and M. Hill, Journal of Antimicrobial Chemotherapy: optional open access and not-for-profit, Journal of Antimicrobial Chemotherapy, October 11, 2005. Only this abstract is free online, at least so far:
All JAC articles are currently freely accessible 12 months after publication, a form of deferred open access. From 2006, JAC will offer the option of author-pays open access, so that individual articles can be made open access immediately upon publication. In addition, JAC will allow the deposition by authors of post-prints of the accepted version of their article as encouraged by granting bodies such as the US National Institutes of Health (NIH), and others. We are adopting such policies to support our authors who must comply with the requirements of their funding bodies and institutions; however, we do not believe that deposition of the post-print form of the article is the most useful step for the progression of research, as we shall note in this article. These changes will enable JAC to deliver the potential for expanded access to articles at a rate determined by the desire among the author community to do so, but without compromising the long-term viability of JAC and the services we offer.

Milestone for HighWire Press

HighWire Press has posted a million free online articles and is still going strong. Excerpt from its press release (October 11, 2005):
On Oct. 6, the millionth scholarly journal article was made freely available to users worldwide by publishers hosted by HighWire Press, a division of the Stanford University Libraries and Academic Information Resources. HighWire is the largest archive of free, full-text, peer-reviewed research literature, and the million-article milestone culminates a decade of collaboration among scholarly publishers, libraries and the research community for the common good. "Stanford started HighWire in 1995 with two aims: to help nonprofit and responsible scholarly publishers compete as publishing entities in the Internet age, and to improve access to scholarly information," said Michael A. Keller, university librarian and publisher of HighWire Press. "A million free full-text articles on our servers comprise a strong validation of this effort and testimony of the importance of responsible, not-for-profit, scholarly publishers in the information economy." As the online host for more than 800 scholarly journals, HighWire Press has championed free access to both archival and recent journal articles for scholars and the public alike since its founding. The publishers of more than 230 journals have made back issues freely available. Most of these journals make their content free on a rolling one-year basis or sooner; some allow articles to be viewed immediately on publication....

Access to journal articles that report research results has been a controversial topic in recent years. Debate has focused in part on changes in the commercial publishing industry, including consolidation of producers, proliferation of journals and large subscription cost increases for institutions. Scholars, librarians and funders of research have raised concerns that institutional subscriptions have grown too expensive overall, resulting in subscription cancellations and inadequate access to the results of research for scholars worldwide. Among them are advocates of the open access movement, who favor alternative business models for publishing research funded by public agencies. Some scientists have called for an overhaul of scholarly publishing with so-called "author-pays" systems, which other scientists and librarians fear may destroy the financial basis for branded, reputable, edited and authoritative journals published by scholarly societies. "The million articles in the HighWire Free Back Issues Program demonstrate that there is a third way between the extremes of prohibitively expensive publication and immediate, unmediated posting of content direct to the open web," said John Sack, director of HighWire Press. "New business models will likely emerge but must be seriously tested over time, not only evangelized, before we can accept as demonstrated fact that they meet the needs of research and society. We and the publishers we support are testing new models continuously; this experimentation includes open-access journals, 'open choice' decisions by individual authors, author manuscript publishing, free access to developing countries and to patients, as well as other models that address access problems and take advantage of the opportunities that the new technology allows."

(PS: Some HighWire content is free online and some isn't. The press supports the access policies adopted by its participating journals. Therefore, it's true that HighWire is a third way between no-OA publishers and all-OA publishers. However, it's misleading to say in the same context that HighWire is "a third way between the extremes of prohibitively expensive publication and immediate, unmediated posting of content direct to the open web." That gives the false impression that OA is all about "unmediated posting of content direct to the open web" --i.e. that OA journals are not peer reviewed or that OA advocates recommend bypassing peer review. OA is not about bypassing peer review. It's about the free online dissemination of peer-reviewed research.)

Chicago's Oriental Institute commits to OA

The University of Chicago's Oriental Institute has committed to open access. (Thanks to Ross Scaife.) Here's the announcement on its electronic publications page (apparently October 7, 2005):
Starting in 2005, the Oriental Institute is committed to digitizing all of its publications and making them available online, without charge. The minimum for each volume, old and new, current and forthcoming, will be a Portable Document Format (PDF) version following current resolution standards. New publications will appear online at or near the same time they appear in print. Older publications will be processed as time and funding permits.

More on OA and increasing citation impact

The Chronicle of Higher Education has posted the transcript of yesterday's online colloquy on impact factors. The questions came from participating readers and the answers from Anurag A. Agrawal, an assistant professor of ecology and evolutionary biology at Cornell University. Agrawal has served on the editorial boards of five journals and recently published a letter in Trends in Ecology and Evolution "decrying some common editorial practices designed to raise citations and journal-impact factors." Since Agrawal uses ellipses in his answers, my own ellipses are in square brackets. Excerpt from the transcript:

Question from Diane Sullenberger, [Executive Editor of the] Proceedings of the National Academy of Sciences: How is online access to research papers, in particular, free online access, affecting research impact?

Anurag A. Agrawal: Not clear... it does seem that the free availability of articles has got to make things a bit more fair. Nonetheless, there is always going to be a prestige factor.... those articles in high impact journals or by authors from powerhouse universities will likely get more attention simply because of the source. Free on-line access should also further bias what is being cited however... some journals do not have the archives available as PDF files (for various reasons) ... given the current climate of scientists going to the physical library less and less, they are more likely to cite papers from journals with free on-line access.

Question from Rebecca Minnillo, Society for Investigative Dermatology: [...] What (if any) effect will open access have?

Anurag A. Agrawal: [...] Open access journals have the flavor of "science for the sake of science" which I applaud. However, I do not think they are immune to the potential abuses.

Richard Monastersky (Moderator): Editors and publishers of new open-access journals say they are hurt by the way impact factors are calculated because you need several years of citation data before an impact factor can be determined. So a start-up journal faces a tough time getting submissions from scientists who are concerned about impact factors. One interesting fact is that the Public Library of Science Biology journal saw its submissions double after it received its first (and quite high) impact factor this summer.


Wednesday, October 12, 2005

Italy's ICTP launches an OA repository

The Abdus Salam International Centre for Theoretical Physics (ICTP) in Trieste has launched an OA eprint archive. (Thanks to Subbiah Arunachalam.) From the announcement:
Because of the relative isolation that still persists in developing countries, scientists are often unaware of the extent and nature of science that is being done in their regions. They also have some difficulty in publicizing their work quickly. To provide assistance in these matters, the Abdus Salam International Centre for Theoretical Physics (ICTP) has organized an Open Access Archive to allow the scientific work of any scientists from any country to be posted free of charge....Authors may upload pre-prints, reprints, conference papers, pre-publication book chapters and CV. Before archiving the previously published work, the authors must review the copyright agreement with the publishers....If you are a scientist subscriber of the free electronic Journal Delivery Service (eJDS), residing in a developing country, and do not have a reasonably fast connection to the Internet, you can get any document from this archive via e-mail.

Presentations at Rome fisheries conference

Abstracts of the presentations at the conference, Information for Responsible Fisheries: Libraries as Mediators (Rome, October 10-14, 2005) are now online. At least four are OA-related: those by Charlotte Hess, Paul Nieuwenhuysen, Pauline Simpson, and Fodé Karim Kaba.

Helping libraries improve their bargaining power with publishers

Marydee Ojala, Pricing/Controlling Costs, Online Insider, October 11, 2005. Her advice: libraries need more bargaining power with publishers and talking about OA is one way to get it. (PS: Probably true, but acting on OA is even better.)

Another German overview of OA

Dominic Tate, Open Access: Eine neue Epoche, a presentation at the conference, Arbeitsgemeinschaft für medizinisches Bibliothekswesen (Graz, September 26-28, 2005). (Thanks to medinfo.)

Making journal web sites more useful (short of OA)

Collin Brooke, Mirror, Mirror on the Web, October 11, 2005. How a journal made its web site more useful (short of adopting OA), by making a database of article metadata and using social or folksonomy software to help users find relevant work.

German overview of OA

Katja Mruck and Günter Mey, "Open Access": Freier Zugang zu wissenschaftlichem Wissen, Linksnet Forum Wissenschaft, September 6, 2005. (Thanks to Klaus Graf.) An overview of OA and an argument for its adoption. In German.

University of Zurich adopts OA mandate

Stevan Harnad reports that the University of Zurich Institute of Pharmacology and Toxicology has adopted a policy to require faculty to deposit their refereed articles in the institution's OA repository. From Stevan's announcement:
The University of Zurich has just registered its Institutional Self-Archiving Policy....The 14th institution with a [self-archiving] policy, Zurich is the 4th (after Southampton, Queensland University of Technology, and CERN) with a mandate rather than merely a recommendation. This is also the second mandate in a Swiss institution.
(PS: Kudos to all involved at the University of Zurich. By my count, however, Zurich is the fifth institution with an OA mandate, not the fourth. Portugal's University of Minho has had such a policy since late 2004.)

More on the Kaufman-Wills study of OA journals

Lila Guterman, Survey of Open-Access and Subscriber-Based Journals Finds Changes Afoot in Both Business Models, Chronicle of Higher Education, October 12, 2005 (accessible only to subscribers). Excerpt:
The first large-scale comparison of open-access journals with traditional, subscription-based journals was published on Tuesday, and it reveals an industry in flux. The study was published in book form as The Facts About Open Access by the Association of Learned and Professional Society Publishers. The association financed the research with three other organizations, all of which are affiliated with traditional journals. The comparison was based on a survey of nearly 500 journals, and the results were first reported in March (The Chronicle, March 25). "It's too early to tell whether full open access is a viable business model," said Cara S. Kaufman, who conducted the survey and is a principal at the Kaufman-Wills Group, publishing consultants in Baltimore. She found that more subscription-access journals break even or produce surpluses than do open-access journals. She also learned that more than half of all journals, whether traditional or openly accessible, were considering changing their business models within three years....And in what may be the report's most surprising finding, Ms. Kaufman discovered that a larger fraction of traditional journals than open-access journals charge authors fees to publish. Because the most visible publishers in the open-access movement, the Public Library of Science and BioMed Central, charge authors fees, the movement itself has often been associated with that business model. Some critics of the movement have charged that, under open access, peer review will suffer. Ms. Kaufman found that more open-access journals than traditional journals forsake external peer review, instead having staff members provide the reviews. Such a review "may be less rigorous," Ms. Kaufman said.

Tuesday, October 11, 2005

More on the Kaufman-Wills report for ALPSP

Stephen Pincock, Open access report published, The Scientist, October 11, 2005. Excerpt:
A report on open access publishing released today (October 11) has raised concerns about peer review, the standard of editing and the financial future of some open access journals. However, open access advocates argue the latest report is biased by its funders, including the Association of Learned and Professional Society Publishers (ALPSP), whose membership includes publishers of subscription journals....According to the report, comparable numbers of open access journals and traditional journals conduct peer review of articles, but more open access journals (28%) appear to rely on internal editorial staff for peer review, not outside experts....Open access publishers may also have impending financial troubles, the report noted. More than 40% of surveyed open access journals reported they were suffering shortfalls, and 24% said they were breaking even, leaving only 35% that report a surplus. In comparison, 81% of the HighWire and AAMC journals reported a profit and 75% of the ALPSP journals were running a surplus. More than half of fully open access journals do not charge author fees, another unexpected finding, [Sally] Morris [of the ALPSP] told The Scientist....The report also found that just 72% of open access journals copyedit manuscripts....In the current investigation, study parameters were defined by the co-sponsors and research and analysis was conducted by Kaufman-Wills Group, a group of independent consultants. The study was carried out in two stages: the first surveyed 128 ALPSP-member journals, 34 journals of the Association of American Medical Colleges, 85 journals hosted by HighWire Press and 248 journals from the Directory of Open Access Journals. In the second phase, 22 publishers were interviewed to provide a series of detailed case studies. Matthew Cockerill, publisher at BioMed Central, The Scientist's sister company, argued that the report seemed to push a specific point of view. "It is well known that ALPSP is taking an antagonistic stance on open access," he told The Scientist. "The report appears to have an agenda…to discredit open access." Cockerill added that the authors "basically miss the point" in many ways. In terms of copyediting of research articles, for example, "it's a very open question in terms of whether the benefits of copyediting outweigh the downsides such as inefficiencies introduced. It's about the quality of the research." It also makes little sense to project the journals' future financial solvency using a "snapshot from a moment in time," he said. Jan Velterop, an advocate of open access with the large German publisher Springer, said he also disagreed with the report's findings. "We are absolutely convinced that with open access we can have good profit margins," he told The Scientist. "But it may be that it is harder to make an excessive profit, that is true." Velterop said he was disappointed that learned societies were taking such a negative view of open access. "I find it astonishing that there are some societies who really should have the promotion of science at the heart of their mission but are seeming to push back the development rather than coming up with imaginative ways to push it forward," Velterop said.

Rollyo search engines on OA and eprints

Klaus Graf has created a Rollyo search engine on OA. Andrew Jones has created one for eprints. Make your own custom search engine or make suggestions to Klaus and Andrew on how to refine theirs.

Springer improves Open Choice

Springer has improved the terms of its Open Choice program. First, open-choice authors may now retain copyright. Second, open-choice articles are released under a license which Springer says is identical to the Creative Commons Attribution-NonCommercial license.

(PS: Congratulations to Jan Velterop who, I presume, was behind these changes.)

New group pushes the OpenDocument Format

The OpenDocument Fellowship is a new advocacy organization pushing the OASIS Open Document format. From the press release (October 10):
The Open Document Fellowship promotes the new internationally agreed standard for digital documents, Open Document Format (OpenDocument). The Fellowship, formed in September, has attracted support and interest from around the globe. The Fellowship's aims include providing factual information about the Open Document Format, such as the degree to which companies and their products are committed to supporting the format, and making sure that OpenDocument can always be supported by any software application or company. The organisation also supports the development of software tools to complement the format....Founding member Adam Moore, Friends of OpenDocument Inc., said: "As a true Open Standard, OpenDocument is available for the benefit of all. It levels the competitive playing field and provides wider opportunities for innovation, diversity and choice. This choice and diversity is a natural evolutionary consequence of the market maturity of general productivity software. We believe all responsible citizens in the digital market place will embrace ODF as the central focus for document production." Users whose data is stored in OpenDocument will never again face the problem of not being able to access data because the application that created it is no longer available to them. Open standards already enable users of different computer systems (both hardware and software) to access the Internet and communicate with each other. ODF enables users of different computer systems to freely exchange and use files.

UK report on CC licenses for public-sector resources

Today the UK Common Information Environment (CIE) released its report and 13 appendices on the use of Creative Commons licenses for UK public-sector resources. The members of CIE are now considering its recommendations. From the executive summary:
The Common Information Environment (CIE) commissioned this study to investigate the potential for Creative Commons licences to clarify and simplify the process of making digital resources available for re-use....During the study, workshops were held for key stakeholders in two groups - rights holders, primarily representatives of CIE organisations, and users of CIE produced digital resources, including the public, teachers at all levels of education, museum and library staff....The study concluded that many resources produced by CIE organisations could be made available under a common licence and that Creative Commons would allow a substantial amount of CIE resources to be made available for reuse. Other existing common licences, such as Creative Archive and Click-Use, could be used if Creative Commons cannot be applied, but their use should be minimised to avoid removing many of the key benefits of the Creative Commons Licences....Recommendations: [1] Resources should be made available for reuse unless there is a justifiable reason why they should not. [2] The reuse of resources should be as unconstrained as possible. For example, resources should be made available for commercial reuse as well as non-commercial reuse wherever possible. [3] The range of permitted uses of resources should be as wide as possible, for example, including the right to modify the resource and produce derivative works from it. [4] Reuse should be encouraged by permitting others to redistribute resources on a world-wide basis. [5] Resources should be made directly available and discoverable electronically whenever possible. [6] The conditions of use for each resource should be linked directly to the resource so that they are reusable at the point of discovery.

The report was prepared by Intrallect and the AHRC Research Centre for Studies in Intellectual Property & Technology Law at the University of Edinburgh.

Update. Also see the JISC press release (October 14) on the new report.

Business models for open businesses including OA journals and publishers

The OA HSRC Press in South Africa has gotten a good write-up at Open Business. Since I've blogged the HSRC press here before (1, 2), what I'd really like to point out is that Open Business is ready to cover the business of scholarly publishing.

OB describes itself as "a platform for sharing innovative Open Business ideas - entrepreneurial ideas which are built around openness, free services and free access." Its motto is Let's share business models. To that end, it has a blog, a summary of the models, a growing bibliography, and a wiki to "help create the authoritative source on open business models".

Ever since Cara Kaufman and Alma Wills showed that only 47% of existing OA journals charge author-side fees, it's been clear that the majority of OA journals use business models that haven't been well-described in the literature. It would be very labor-intensive for researchers to contact all those journals, interview them on their business models, and post the results somewhere. The OB wiki is an ideal solution. How about creating a section in the wiki on peer-reviewed OA journals and slowly fleshing it out with the business models actually in use and some notes (data preferred) on how well they work?


Monday, October 10, 2005

Evolution of scholarly publishing up to OA

Ana Maria Ramalho Correia and Jose C. Teixeira, New Initiatives for Electronic Scholarly Publishing: Academic Information Sources on the Internet, a presentation at the RTO IMC Lecture Series on Electronic Information Management, Sofia, Bulgaria, September 8-10, 2004.
Abstract: This paper will trace the evolution of scholarly communication from the 17th century up to electronic journals, e-prints, e-scripts, electronic theses and dissertations and other digital collections of grey literature that emerge, with different degrees of acceptance in several disciplines, in the context of the new publishing models of the present day. The open access or archiving/depositing of electronic copies of scientific and scholarly research papers, theses and dissertations and other academic materials into networked servers aims to ensure the widest possible dissemination of their contents by making them freely available on the public Internet, facilitating full use by readers without financial, legal or technical barriers, other than those related to gaining access to the Internet itself. The free and unrestricted online availability of this literature gives readers an opportunity to find and make use of relevant literature for the advancement of science and technology. From the point of view of authors, open archiving brings increased visibility and widens the readership and impact of their research work. For researchers in poorly resourced organizations and/or countries, access to this open archive material has great potential value, as it facilitates the retrieval of research results through the Internet. At the same time, it also allows scientists in these organizations/countries to participate in the development of the global knowledge base. The paper will outline the strategic factors that impact on the acceptance of these scholarly materials, available through open access or archiving/depositing. Several international initiatives aiming at the creation of a global network of cross-searchable research materials are underway. Some of the more relevant ones will be reviewed. The impact that the availability of these novel scholarly materials is having on changing roles and responsibilities of information managers will also be highlighted.

AIP's OA Author Select program under way

The American Institute of Physics (AIP) has published the first open-access article in its Author Select program. (Thanks to George Porter.)
Rafael B. Dinner, M. R. Beasley, and Kathryn A. Moler, Cryogenic scanning Hall-probe microscope with centimeter scan range and submicron resolution, Review of Scientific Instruments, October 2005.

From the AIP press release (October 10):

AIP has introduced the Author Select/Open Access option in three of its journals (Journal of Mathematical Physics and Chaos: An Interdisciplinary Journal of Nonlinear Science being the other two), and plans to extend the program to all of its journals in 2006. "Open Access is far from being a 'trend' in physics publishing," said AIP Publisher Martin Burke. "But we are more than happy to offer researchers an open access option within our well-known, high-impact, and heavily cited 'traditional' journals."

Another OA textbook initiative

The Potto Project is a new initiative to produce open-access textbooks. It has published its first two books, Fundamentals of Compressible Flow and Fundamentals of Die Casting Design. Potto books will use either the GNU Free Documentation License or the home-grown Potto license.

Open source and open access in the draft WSIS documents

William New, Open Source Agreed In UN Information Society Summit Preparations, IP-Watch, October 10, 2005. Excerpt:
Encouragement for the use of free and open source software and open standards for science and technology has quietly worked its way into the draft texts being prepared for the November second phase of the World Summit on the Information Society (WSIS). The draft WSIS texts are lengthy and detailed, and intellectual property (IP) issues play a comparatively small role overall, but the stakes are high enough to draw top government IP officials and industry lobbyists to the meetings. Agreement on the issue was reached at the 19-30 September WSIS preparatory committee meeting held in Geneva. The provisions will be included in those forwarded to senior officials at the 16-18 November summit in Tunis. In the introductory chapter of the texts, called the "political chapeau" an agreement was reached on paragraph 21 after bolstering the neutrality of the reference to different types of software models....Paragraph 21 of the political chapeau now reads: “Our conviction that governments, the private sector, civil society, the scientific and academic community, and users can utilize various technologies and licensing models, including those developed under proprietary schemes and those developed under open-source and free modalities, in accordance with their interests and with the needs to have reliable services and implement effective programmes for their people. Taking into account the importance of proprietary software in the markets of the countries, we reiterate the need to encourage and foster collaborative development, inter-operative platforms and free and open source software, in ways that reflect the possibilities of different software models, notably for education, science and digital inclusion programs. (Agreed)”...Also in the political chapeau, paragraph 11 states: [We affirm that the sharing and strengthening of global knowledge for development can be enhanced by removing obstacles to equitable access to information for economic, social, political, health, cultural, educational and scientific activities and by facilitating access to public domain information, including by universal design and use of assistive technologies, in this context we underline that media play an important role.] The paragraph has gone through several changes and remains in brackets.

PS: For the purposes of OA, Paragraph 12 is just as relevant: "[12. We recognise that access to information and sharing and creation of knowledge contributes significantly to strengthening economic, social and cultural development, thus helping all countries to reach the internationally-agreed development goals and objectives, including the Millennium Development Goals.]" It too remains in brackets for the time being.

OA treatise on HIV medicine

Flying Publisher has released an OA medical treatise, HIV Medicine 2005. (Thanks to SciDev.Net.)

For authors without repositories

Heather Morrison, Until we all have institutional repositories: an idea for the transition, Imaginary Journal of Poetic Economics, October 9, 2005. Excerpt:
One of the quandaries for funding agencies wishing to mandate open access to the results of research they have funded, is that not all researchers currently enjoy the services of an institutional repository. There are likely a number of ways of addressing this issue on a temporary basis. Here is one thought: why not investigate whether any of the libraries which already have an IR repository up and running would consider a contract to create a special section for the funding agency? This would provide a means of immediately addressing this situation, which may well be temporary in nature, at less cost than might be incurred if the funding agency were to set up their own central repository.

(PS: Another possibility is for universities with IRs to allow deposits by faculty from selected IR-less institutions. Another is for universities without individual IRs to launch a consortial IR. A third, still forthcoming, is the universal repository I'm helping to launch at the Internet Archive.)

Google Scholar beats competition in social sciences

Susan Gardner and Susanna Eng, Gaga over Google? Scholar in the Social Sciences, Library Hi Tech News, 22, 8 (2005), pp. 42-45. (Thanks to T.J. Sondermann.)
Abstract: Purpose – To provide a summary of the main features of Google Scholar. Design/methodology/approach – Reviews, contextualizes and provides a summary of Google Scholar. Findings – This article compares the results of a sample search on “homeschooling” in Google Scholar against the results in three fee-based article index databases in the social sciences: PsycINFO, Social Science Citation Index, and ERIC. Comparisons are done in the areas of content, currency, relevancy, and overlap. Google Scholar yields more results and a greater variety in its types of sources along with a higher rate of relevancy, but less currency. Ultimately, Scholar’s lack of quality control and inability to let the user manipulate data make it less effective than the fee-based databases at finding scholarly material in the social sciences. Originality/value – Provides a useful summary for information professionals.

Testimonial

A blog posting by KansasLibraryNerd, dated yesterday:
I only wish that we had [OA] sources like this when I was in college. One good thing about these "open access" sources is that even those who do not have access to a college's resources can do research. I have done just a few searches through OAIster, but I think there are some "pearls" to be found as they say.

HyperJournal users guide

The folks behind HyperJournal have released a draft version of HyperJournal for Dummies: A Beginner's Guide to HyperJournal 0.4 (PDF).

Tasini-based class action suit still simmering

Carole Ebbinghouse has a detailed update in the October 10 issue of Information Today on the class-action lawsuit, derived from the Tasini case that went to the Supreme Court, in which authors are suing publishers and database vendors (including Elsevier, Thomson, and Amazon) for unpermitted electronic publication of their work.

More on the OCA

Scott Carlson and Jeffrey Young, Yahoo Works With Academic Libraries on a New Project to Digitize Books, Chronicle of Higher Education, October 14, 2005 (accessible only to subscribers). Excerpt:
Yahoo officials say that the project is not a response to Google's partnership with five major research libraries to scan millions of books, and that some planning for the Yahoo project was under way before Google announced its plans last December. The new archive is called the Open Content Alliance, and it was conceived in part by Brewster Kahle, director of the Internet Archive, a nonprofit digital library. The archive will be doing much of the actual scanning for the project, using a process it has developed in recent years....Leaders of the project stressed that no books that are under copyright will be scanned unless the copyright holders give explicit permission....That plan means the Open Content Alliance will be limited mostly to out-of-copyright works — and to works by publishers who are willing to experiment with giving their content away online. The project will allow generous access to the materials it holds, however — in some cases even allowing users to download the full texts of books. Neither Yahoo nor any other group involved has been given exclusive rights to the content, according to the project's leaders. In fact, the books will be made available in ways that can be searched by other search engines, David Mandelbrot, Yahoo's vice president for search content, said in an interview....To help jump-start the project, Mr. Mandelbrot said, Yahoo will pay for the scanning of an 18,000-volume collection of American literature at the University of California system. Yahoo is also developing the technology to search the books. Yahoo does not expect to profit from the arrangement, said Mr. Mandelbrot, but sees it instead as a "philanthropic effort" to put more content online. "Any monetization that's able to be generated specifically as part of this program would only be used to fund additional digitization of public-domain or copyrighted works," he said....Mr. Greenstein said the University of California would add materials by selecting and scanning certain collections. The project will probably cost the university system $500,000 for the first couple of collections, he said....The Internet Archive has been working with the University of Toronto for the past year in a pilot project to test its scanning process, Carole Moore, the university's chief librarian, said in an interview. So far, she said, about 2,000 books have been scanned, and more than 1,000 of those are already available through a section of the Internet Archive. She said Toronto has coordinated with six other Canadian university libraries, as well as the Library and Archives of Canada, to select books by Canadian authors to be scanned for the project. "We're trying to contribute for everyone a certain amount of Canadian material," she said. Leaders of the project hope that more and more libraries will add unique portions of their collections, so that jointly the new central digital library can one day hold nearly every public-domain work. "We're trying to nail bringing public access to the public domain," said Mr. Kahle. "We want people to be able to do great things with the classics of humankind."

Another defense of Google Library

Jason Fry, Authors' Second Chance, Wall Street Journal, October 10, 2005. Excerpt:
Amazon.com's "Search Inside This Book" feature, which works in a similar way, is an opt-in program, so why is Google insisting publishers opt out? Things make more sense if you start off by thinking about how to build a search engine....Of course Google is storing entire books -- you can't make a usable search engine if you've only got bits and pieces to search against....(Some have noted that Google asks permission to include paid content in its index, but that's another red herring. These days, most creators of paid content on the Web are trying to figure out how to make their content searchable by the wider world and still get paid for it. Being shut out of Google is a problem that needs a solution, not a desired state of affairs.) The important thing isn't what's stored but what's shown -- and what Google's showing shouldn't worry anyone. (Granted, reference books, cookbooks and other works whose small chunks have great value may well be an exception, in which case publishers can remove them.) Google Print bills itself as a tool for finding information, not for reading, and from a few hours playing with it, I'm inclined to agree. I tried to use Google Print searches to cheat my way through two current favorites...Neither attempt worked for very long nor was anything close to a pleasurable reading experience. As for the opt-in model, we're all lucky that Google and its rivals didn't build their Web search engines that way: A search engine built via opt-in would be a failure, demanding a quixotic, ruinously expensive pursuit of ever-multiplying sources of new information. The only sane way to build a search engine is to index everything automatically and let Web-site operators that don't want to be included opt out....Besides, what Google's doing appears to be well within the realm of the "fair use" provisions of copyright law, meaning the company doesn't have to ask permission for the basics of Google Print anyway. I'm not a lawyer, but Jonathan Band is, and his analysis of the copyright implications of Google Print is a fascinating read....Mr. Band's conclusions: Google Print does have commercial purposes, but Google's not looking to profit from book sales; building a search engine requires it to "use" the complete work; and it's highly unlikely that Google's use will hurt demand for the books stored and searched....Many a frustrated author can tell you that being published is just the start of the dream of making it as a writer: If your publisher doesn't back your book, or it doesn't quickly connect with the reading public, it'll soon fall out of print and very few people will ever hear of you or your ideas again. That's exactly the frustration that's driven many writers to the Web, where anyone can publish and be guaranteed a world-wide audience for his or her thoughts. But it's not the Web itself that makes that guarantee -- it's the search engines that tame the Web's terabytes upon terabytes of information by making it all searchable.

In praise of institutional repositories

Richard Gallagher, Why We Need Institutional Repositories, The Scientist, October 10, 2005. Excerpt:
Despite rapid explosion of knowledge in the life sciences, the full promise of digitization, storage and curation is nowhere close to being fully realized. The large-scale discipline-specific repositories that quickly became mainstream in information-intense branches such as genomics and proteomics are just the tip of the iceberg. The other seven-eighths comes in the shape of institutional repositories, such as MIT's DSpace, which provide the most comprehensive mechanism for digital preservation and dissemination. DSpace and other wide-ranging digital archives are truly transforming. They will house research data, journal articles, theses, teaching and learning materials, information for the general public, symposia and lectures, and informal accounts of life in the lab. While they are primarily being developed at universities, there is no reason why customized repositories shouldn't be introduced in other contexts, including industry....This new generation of institutional repositories does not compete with existing databases, it complements and extends them. At the same time, it reaffirms the position of an institution (in the case of a university) as a scholarly center and community hub.

An author defends Google Library

Bill Thompson, Defending Google's licence to print, BBC News, October 10, 2005. Excerpt:
It is a great idea, and the resulting catalogue will rapidly become the starting place for researchers around the world. But it might not happen, because the project is currently stalled after three US authors sued Google for scanning their copyright material....I am an author, so I have an interest in this, and even though I have many doubts about Google's operations and ideology, I have to support them in this one....Google is big enough to stand up to the pressure, and it can afford even more expensive lawyers than the Authors Guild. But there is a danger that now they are a public company obliged to enhance shareholder value, they will instead cut a deal with them to share revenue or pay to license this way of using the text. We would get our catalogue then, but have lost something equally valuable - the ability for anyone else to do what Google has done. After all, why should I not be able to scan the books I own and index them so that I can find out exactly where William Gibson first used the term "cyberspace"?...Rights holders have often tried to use the law to stop innovations that do not benefit them directly. In the 1970s the movie industry tried to kill video recording, and the record industry has only just begun to realise that online distribution is the key to their future survival rather than the devil's handiwork. Now some authors and their trade association are standing in the way of progress, even though it is clear that the main beneficiaries of the project will be publishers and the authors themselves. They will sell more books or, if they are academics, see their research read and cited more often. There will be a general increase in the quality of university research simply because vital sources will be overlooked less often. The world will be a better place....If existing copyright law can stop Google, or anyone else with the inclination, from creating such a phenomenally useful tool, then copyright law is wrong and must be changed. There are three things Google can do now to make it clear that profit is not the only thing motivating them in the project. First, they can change the name. "Google Print" makes publishers and authors think of print on demand and printing extracts, but even the US Authors Guild might have been happy to see the launch of "Google Catalog". Second, and more crucially, they can open the project up to anyone who wants to participate. That means publishing the technical specifications and any relevant APIs so that - for example - I can search the database from within my web application without having to show their ads on my site. Finally, they should let any library in the world have a copy of the electronic versions of all the books they hold that are in the catalogue. Not a link to the Google website but a real copy, to be held locally and used by the library under a very relaxed and permissive licence. Otherwise we will just have to kick off an open source scanning project to create a free catalogue of our own - I have got a few thousand books to get started with - and let the publishers sue ten thousand of us.

New OA repository for Earth science

Earth-prints is a new OA repository for Earth science research. (Thanks to Andrea Marchitelli.) From the web site:
Earth-Prints is an open archive created and maintained by Istituto Nazionale di Geofisica e Vulcanologia with the collaboration of Programma Nazionale Ricerche in Antartide. This digital collection allows users to browse, search and access manuscripts, journal articles, theses, conference materials, books, book-chapters, web products. The goal of our repository is to collect, capture, disseminate and preserve the results of research in the fields of Atmosphere, Cryosphere, Hydrosphere and Solid Earth. Earth-prints is young and growing rapidly. Check back often.

German radio broadcast on OA

Mathias Schulenburg, Informationen für alle, Deutschlandfunk, October 9, 2005. Transcript of a German radio program on OA.

The problems with impact factors

Richard Monastersky, The Number That's Devouring Science, Chronicle of Higher Education, October 1, 2005 (accessible only to subscribers). Excerpt:
Outside academe, few people have even heard of [impact factors]. Mr. [Eugene] Garfield, though, now compares his brainchild to nuclear energy: a force that can help society but can unleash mayhem when it is misused. Indeed, impact factors have assumed so much power, especially in the past five years, that they are starting to control the scientific enterprise. In Europe, Asia, and, increasingly, the United States, Mr. Garfield's tool can play a crucial role in hiring, tenure decisions, and the awarding of grants. "The impact factor may be a pox upon the land because of the abuse of that number," says Robert H. Austin, a professor of physics at Princeton University. Impact-factor fever is spreading, threatening to skew the course of scientific research, say critics. Investigators are now more likely to chase after fashionable topics — the kind that get into high-impact journals — than to follow important avenues that may not be the flavor of the year, says Yu-Li Wang, a professor of physiology at the University of Massachusetts Medical School. "It influences a lot of people's research direction." That influence has also led to a creeping sense of cynicism about the business of science publications. Journal editors have learned how to manipulate the system, sometimes through legitimate editorial choices and other times through deceptive practices that artificially inflate their own rankings. Several ecology journals, for example, routinely ask authors to add citations to previous articles from that same journal, a policy that pushes up its impact factor. Authors who have received such requests say that the practice veers toward extortion and represents a violation of scientific ethics. What's more, investigations into impact factors have revealed problems with the basic data used by ISI, the company that tabulates citation statistics and journals' impact factors. Started by Mr. Garfield in Philadelphia, ISI was bought in 1992 by the Thomson Corporation, which has tried to transform the citation enterprise into a more profitable operation by buying up databases and promoting its products. With alarming frequency, editors are finding fault with the impact factors that Thomson has issued....Mr. Garfield and ISI routinely point out the problems of using impact factors for individual papers or people. "That is something we have wrestled with quite a bit here," says Jim Pringle, vice president for development at Thomson Scientific, the division that oversees ISI. "It is a fallacy to think you can say anything about the citation pattern of an article from the citation pattern of a journal." Such warnings have not helped.

(PS: Monastersky's article doesn't touch on the overlap between IF issues and OA issues. Insofar as funding agencies and university hiring and promotion committees use IFs, they deter publication in journals with low IFs and journals with no IFs at all. Since it takes at least two years for a journal to have an IF, these practices deter publishing in any new journal, regardless of the reasons why it is new --whether it is exploring a new research niche, new methodology, or new business model. Since most OA journals are new, these practices deter publishing in OA journals, even OA journals that might be higher in quality and impact than conventional journals in the same field.)
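(For reference, and as a rough sketch rather than ISI's exact recipe: a journal's impact factor for year Y is computed roughly as

IF(Y) = [citations received in year Y to items the journal published in years Y-1 and Y-2] / [number of citable items the journal published in years Y-1 and Y-2]

which is why a journal needs at least two full publication years behind it before it can have an IF at all.)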

On Wednesday, October 12, at 1:00 pm Eastern Time, the Chronicle will host a live colloquy on the issues raised by Monastersky's article.

Also see Monastersky's companion article in the same issue of the Chronicle, Impact Factors Run Into Competition. Excerpt:

The publisher Current Science Group started an online service three years ago called the Faculty of 1000, which uses a stable of hand-picked specialists to rate new biology papers. A group of about 2,000 scientists evaluates the most important papers they read each month, from some 800 journals, says Vitek Tracz, chairman of Current Science Group, who started the project. The experts assess a paper and give it a numerical rating, which is used to calculate a Faculty of 1000 factor. Unlike impact factors, the Faculty of 1000 factor "is calculated from a value judgment given by identified experts," says Mr. Tracz, "and the reason for their value judgment can be traced." "We know from Faculty of 1000 that there are many important papers that get published in journals with low impact factors," he says....Mr. Tracz's other enterprise, a group of more than 100 open-access journals published by BioMed Central, recently developed another rating model. Last month the company's Web site started labeling papers "highly accessed" if they receive a large number of hits — a distinction that will always apply to the paper and hence could be placed on a CV. Clinicians seeking to stay on top of medical developments can participate in a new service run by McMaster University, in Ontario, called the McMaster Online Rating of Evidence, or MORE. The system comprises 2,100 practicing doctors, who rate articles on the basis of relevance and newsworthiness. If a paper gets a sufficient number of top ratings, MORE sends out an e-mail message about the study to readers who have signed up for news in that particular discipline. The system reaches more than 200,000 people. Among the newest rankings to appear is the h-index. Jorge E. Hirsch, a professor of physics at the University of California at San Diego, devised this method to quantify individual scientists' output by determining the highest number of papers each researcher has published that receive the same number of citations.
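For readers unfamiliar with Hirsch's measure, here is a minimal sketch of the h-index calculation in Python; the function name and the sample citation counts are invented for illustration only.

def h_index(citations):
    # Sort citation counts from highest to lowest.
    counts = sorted(citations, reverse=True)
    # The h-index is the largest h such that at least h papers
    # have h or more citations each.
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Example: five papers cited 10, 7, 3, 1, and 0 times give an h-index of 3,
# because three of the papers have at least 3 citations each.
print(h_index([10, 7, 3, 1, 0]))  # prints 3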

(PS: Monastersky didn't mention one promising initiative that's still under development: the Open Access Citation Index. More later.)


Sunday, October 09, 2005

German review of Google Scholar

Helmut Dollfuß, Google Scholar: Kleiner Fisch oder zukünftiger Hecht im medizinischen Literaturteich? Medizin Bibliothek Information, September 2005. (Thanks to Klaus Graf.) The article is in German but has this English-language abstract:
Google has established itself as one of the most popular web portals. In 2004 Google Scholar was released. The new spin-off should give researchers a better grip on scholarly articles on the internet. The history of Google and its highly successful page ranking explains some strengths and weaknesses of the new search engine. The inability to sort hits by date and unsatisfactory coverage of full-text online resources are the main faults, particularly for medical literature research. Here PubMed will continue to set the standard. However, the new Google [search engine] is speedy, free and easy to use. Moreover, it already provides some special features, such as linking to library holdings and implementation of OpenURL.

Australian presentations on OA

The presentations from Australia's NSCF roundtable, Open Access, Open Archives and Open Source (Sydney, September 27, 2005), are now online. (Thanks to Colin Steele.)