News from the open access movement
Dorothea Salo, Spaghetti that didn't stick, Caveat Lector, February 16, 2006. Excerpt:
Open Access News spread the word today that the first Report on the NIH Public Access Policy...is out. Compliance rate? A desperately pathetic 3.8%. Three point eight percent of the literature that was eligible for archiving under this policy actually got archived. You begin to see what repository rats are up against? The NIH did its level best to communicate the policy to researchers, and they’re decently competent at outreach. As far as I know, publishers didn’t spread much FUD among researchers. Even so, a big fat nothing happened, because the policy had no teeth and researchers don’t understand and don’t care about the economics or socioinformatics of publishing. I part ways with Stevan Harnad on a lot, but he’s dead right about one thing at least: if researchers don’t have to provide open access, they mostly won’t. I can cajole and jolly and educate and reason with them all I want, but I won’t have nearly the impact of a policy with teeth. We can’t coddle researchers on this; it’s tantamount to coddling Elseviley Verlag. Fortunately, it looks as though the NIH policy is likely to sprout teeth. Because of that, I’m actually not at all saddened that this particular spaghetti-strand didn’t stick when thrown at the wall. We now have cogent evidence that “voluntary” open-access policies aren’t worth spit. That removes a fairly big pillar that Elseviley Verlag likes to hide behind.
Sergey Dmitriev, Have We Seen the Back of the Hard Copy Reader? St. Petersburg Times (Russia, not Florida), February 14, 2006. (Thanks to LIS News.) Excerpt:
All of this means that the market of e-content (which at present basically means mobile content) is one of high growth potential. The hard-copy format is doomed to fall in its share of the market. The Internet University of Informational Technologies gives a few of its courses in the traditional way, despite their complete accessibility on the web site. The university’s administration believes that free publication on the web in no way influences book sales, because those who read books and those who read books on a computer are different categories....Sooner or later open source will settle down in the mobile world. Motorola released a mobile with Linux, and Nokia is already selling a Linux-based internet tablet. On such platforms the installation of DRM systems is pointless....In due course much can be expected from the much-hyped Google project called ‘Print.’ It can therefore be said that there is plenty of similar, accessible content. Moving the text to pocket computers and mobile phones by oneself is less and less complicated. A shift in our system of values is now taking shape. When the new generation that now downloads illegal music from the internet and uses open source software grows older, it will more likely change the law than its way of thinking....This means that content in itself is not recognized as a good; from a commercial point of view, it can interest the market only as a contextual part of a more complicated service. Such thinking, and its corresponding business model, is clearly visible in such companies as Google: on one hand their services are impossible to copy, on the other, for end users they are free.
Jim Jacobs, Thoughts on Google Book Search, Diglet, February 16, 2006. Excerpt:
Yesterday, I went to the Stanford EE Computer Systems Colloquium to hear Daniel Clancy, the Engineering Director for the Google Book Search Project....Clancy mentioned that Google was NOT going for archival quality (indeed COULD not) in their scans and were ok with skipped pages, missing content and less than perfect OCR -- he mentioned that the OCR process AVERAGED one word error per page of every book scanned! The key point that I took away from this is that the Google book project IS NOT an alternative to library/archive/archival/preservation scans....When I asked if there would be links to libraries on ALL results pages, he hemmed and hawed a bit and wouldn't say one way or the other. He mentioned the difference between the publisher-supplied content and the library-supplied content and seemed to hint that the publisher-supplied content is subject to stricter licensing agreements....92% of the world's books are not generating revenues for copyright holders or publishers!...Someone asked what had surprised him the most since he started. One thing he was surprised about was that about 70% of the book project use was coming from India.
From the Authors Guild February 14 press release, Trademark Dilution Revision Act Would Weaken Protections for Free Expression:
A bill that would drop express protection for "noncommercial use" of a trademark and would weaken the protections for those who use trademarks in news commentary will be considered by the Senate Judiciary Committee on Thursday. The legislation has already passed the House....Trademarks, including business names, brands, and slogans, are unavoidable and proliferating in daily life. Writers of fiction and nonfiction inevitably incorporate trademarks into their work, sometimes to comment on the particular business using the trademark, but frequently the use is merely incidental to the nonfiction or fiction writer's story ("Tom went to a McDonald's, had a Coke, and waited for the Harley to arrive."). Just as fair use provisions of copyright law permit writers to make certain uses of copyrighted works in their own works, so do fair use and related provisions of trademark law permit writers to use trademarks in their works....The new law would weaken these protections, exposing writers to greater potential liability for their use of trademarks. This would needlessly chill expression.
Comment. Imagine writing up your research on the safety of Vioxx, the security of a new Microsoft browser, or the lawfulness of Google's Library Project, and finding that you have to use the ™ symbol or even pay royalties just to name the product.
Jonathan Kerry-Tyerman, No Analog Analogue: Searchable Digital Archives and Amazon's Unprecedented Search Inside the Book Program, Stanford Technology Law Review, February 2006. (Thanks to Ray Corrigan.)
Abstract: This paper begins with an overview of Amazon's prior experiments with e-books, the way in which the Search Inside the Book database is created, and how that database manifests itself to the Amazon user. Part II analyzes the Search Inside the Book program under current copyright law and concludes that the program does infringe copyrights in the indexed works. Part III argues that programs like Search Inside the Book, though infringing, actually serve the purposes of copyright law, and should not create liability for the providers of such programs. Finally, part IV applies the fair use doctrine to Search Inside the Book, assuming that the existing copy-protection measures are improved as indicated and ultimately finding this unconventional program protected as fair use.
James Boyle, More rights are wrong for webcasters, Financial Times, February 17, 2006. Excerpt:
I teach intellectual property law....Are we doing a good job of writing [IP] rules? The answer is no. Three tendencies stand out. First and most lamentably, intellectual property laws are created without any empirical evidence that they are necessary or that they will help rather than hurt. Second, the policymaking process has failed to keep track of the increasing importance of intellectual property rights to everything from freedom of expression and communications policy to economic development or access to educational materials. We still make law as though it were just a deal brokered between industry groups....The public interest in competition, access, free speech and vigorous technological markets takes a back seat. What matters is making the big boys happy. Finally, communications networks are increasingly built around intellectual property rules, as law regulates technology more and more directly; not always to good effect. The World Intellectual Property Organisation has now managed to combine all three lamentable tendencies at once. The Broadcasting and Webcasting Treaty, currently being debated in Geneva, is an IP hat trick.
The Dutch DARE project has set itself the goal of depositing 100,000 full-text eprints in the DARE network of OA repositories in the year-long period from October 1, 2005, to October 1, 2006. This would triple the number of items on deposit in the national repository network. From yesterday's announcement:
Dubbed 'hunDAREd thousand’, a new major project was launched on 1 October 2005 in which all Dutch universities, the Netherlands Organization for Scientific Research (NWO) and the Royal Netherlands Academy of Arts and Sciences (KNAW) intend adding 100,000 full-text documents to the DARE (‘digital research’) archives. The project runs until 1 October 2006. By that time the DARE partners hope to have a total of 150,000 academic publications, dissertations, pre-prints, datasets and other research material available for all on www.DAREnet.nl. This will make the documents searchable on other search engines as well. The institutions involved are participating in the project in their own ways. Various universities are focusing on certain faculties or, at the request of their Boards, are providing more researchers for Cream of Science. Others hope to stimulate the actual supply of material, encouraged by their chancellors. Researchers’ own websites are being combed and journals published by research institutes and schools are being included. With this added effort the universities want to improve the structure of and streamline the process of supplying research results. To do so, a link is being established with Metis, the research information system used by all universities.
The presentations from the JISC-SURF-CURL International Workshop on e-Theses (Amsterdam, January 19-20, 2006), are now online. Also see the responses to the workshop questionnaire, and Neil Jacobs' report on the questionnaires and workshop in the January Ariadne. SURF reports that the workshop gave rise to a European Taskforce for E-Theses, though it gives few details. The taskforce doesn't seem to have a web site yet.
Two of the three presentations from the SPARC/ACRL Forum at the ALA Midwinter Meeting, Authors and Authority: Perspectives on Negotiating Licenses and Copyright (San Antonio, January 21, 2006), are now online. (The third presentation used no slides, so there is nothing to post.)
100 years of biosciences research captured in digital archive, a press release from JISC, February 17, 2006.
The NLM Board of Regents (BOR) met on February 7-8 to discuss the November 2005 recommendations from the Public Access Working Group for strengthening the NIH public-access policy. The BOR sent a letter to NIH Director Elias Zerhouni on February 8, summarizing its own recommendations. The letter is not yet online. Excerpt:
The report of the November 15 Working Group meeting reveals that the current rate of participation in the voluntary Policy is very low (less than 4%). Since there is evidence that the submission system is relatively easy to use and that the majority of NIH-funded researchers appear to know about the policy, technical difficulties or lack of awareness do not appear to be primary reasons for non-compliance.
Comment. This is important. When Congress first asked NIH to develop an OA policy (July 2004), it asked the agency to mandate OA and limit embargoes to six months. When NIH chose instead (September 2004, May 2005) to request OA without requiring it, and to permit embargoes up to 12 months, it found that it couldn't get even 4% of its grantees to comply with the request. Examining the compliance data, the Public Access Working Group recommended (November 2005) strengthening the policy, and now the NLM Board of Regents has joined the recommendation (February 2006). Both recommendations are merely advisory, but the burden has clearly shifted to the NIH either to strengthen the policy or justify continuing with a weakened policy that doesn't meet its own goals. We're one step closer to an OA mandate for the world's largest funder of medical research.
Daniel A. Nagy, DRM beyond copyright enforcement – alternative models for content distribution, INDICARE, February 15, 2006. Abstract:
In this article, we propose an alternative content distribution framework, which provides the necessary incentives for creating digital content without resorting to copyright enforcement. The proposed business model relies on peer-to-peer digital payment for which technical solutions already exist. Existing DRM technologies may actually be recycled for the purposes of the proposed business model, while removing the incentive misalignments currently plaguing the industry.
The February issue of D-Lib Magazine is now online. Here are the OA-related articles.
Today at OA Librarian, Ilkay Holt has two postings on OA in Turkey (first, second). Joint excerpt:
The Congress of Informatics Technologies IV, Academic Informatics 2006, took place on February 9-11 at Pamukkale University in Denizli, Turkey. It was a very successful conference with a variety of subjects in separate sessions, from e-learning to open access. In the e-library sessions, the ones on open access and institutional repositories were organized around creating awareness among information professionals about open access and its benefits. At the end of the e-library sessions, the Berlin Declaration was accepted and it was decided that a leading committee on open access and institutional repositories would provide research institutions with necessary information. This committee will be formed with participants from the Turkish Librarians' Association and the Turkish Academic Network and Information Center. After the conference, a press release was distributed covering these decisions and the benefits of open access....
Google has posted the Congressional testimony of Elliot Schrage, the company's VP for Global Communications and Public Relations, explaining why Google decided to cooperate with Chinese censorship demands.
(PS: I'm not blogging this story in depth; it's voluminous and slightly off-topic for OAN. But I'm following it personally and will occasionally post a key document or development.)
Declan Butler, Virtual globes: The web-wide world, Nature, February 15, 2006. Google Earth is becoming a platform for OA geospatial data. Butler explores how scientists are using GE and how --because GE is free, fun, and spectacular-- this science is reaching the public.
Also see the accompanying commentary on using GE for humanitarian relief after natural disasters, Nature's editorial on the exciting potential for 'virtual globe' software, and Butler's blog posting describing the cluster of related articles.
In an open letter dated yesterday, Max Steuer resigned as an editor of Emerald's Journal of Economic Studies in part because of its high subscription price. (Thanks to Ted Bergstrom.) Excerpt from his open letter:
You may want to know that I resigned from editing the Journal of Economic Studies on 6 January 2006. Shortly before that date it was suggested to me that the financial policy of the journal is inconsistent with the culture and practices of the academic community. It was careless of me not to look into this before taking on the job. I simply assumed that the fees charged and other aspects of policy were roughly in line with academic conventions. This turns out not to be the case. On the 6th of January I met with a representative of Emerald Publications to discuss the position. I wanted to be sure of the position, and if possible to effect a change in policy. It was clear that the pricing policy was and is very different from that of many well-known economics journals. In particular, the current price of £6,000 plus VAT for six copies is far out of line. It was also clear from our discussion that no change in policy was to be forthcoming. As we know, the contributors and referees of academic journals are on the whole not paid and regard taking on work, particularly refereeing, as part of being members of a scholarly community. I feel badly at having asked many people to devote time to the journal....The policy of the Journal of Economic Studies is not determined by the Board of the journal, but by the owners....Board members should consider their positions.
Richard R. Nelson, The Market Economy, And The Scientific Commons, text of a talk given at the University of Michigan School of Law, January 26, 2006. (Thanks to John Wilbanks.) Nelson does not discuss OA to literature or data, but focuses on threats to the scientific commons from patents, exclusive licenses, and high licensing fees. Excerpt:
[I]t is widely recognized that the power of market stimulated and guided invention and innovation often is dependent on the strength of the science base from which they draw....This science base largely is the product of publicly funded research, and the knowledge produced by that research is largely open and available for potential innovators to use. That is, the market part of the Capitalist engine rests on a publicly supported scientific commons. The message of this essay is that the scientific commons is becoming privatized. While the privatization of the scientific commons up to now has been relatively limited, there are real dangers that unless halted soon important portions of future scientific knowledge will be private property and fall outside the public domain, and that could be bad news for both the future progress of science, and for technological progress. The erosion of the scientific commons will not be easy to stop. Here I want to call the alarm, and to suggest a strategy that has some promise....
Stevan Harnad, OA IRs are not peer-reviewed publications: They are access-providers, Open Access Archivangelism, February 16, 2006. Excerpt:
On Wed, 13 Feb 2006, Sarah Kaufman wrote in JISC-REPOSITORIES:...having spoken to academics within this institution, it has become apparent that potential depositors may be wary of depositing into a digital repository as they fear that a repository that includes pre-prints may not appear 'credible'. Has anyone else dealt with this sort of concern, and how you responded to those that have voiced this concern? Do any repositories exclude items that have not gone through the peer-review process? If you accept items that have not gone through the peer-review process, do you apply any forms of quality control on the item?
The NIH progress report to Congress on the public-access policy is now online (dated January 2006). It's a scanned image, so I can't cut and paste an excerpt and I don't have time to rekey the important parts. Here are the highlights.
The report gives the number of articles deposited in PubMed Central under the policy from its launch on May 2, 2005, to December 31, 2005 (1,636), the total number of articles covered by the policy that should have been deposited in the same period (43,000), and the embargo periods requested by authors (60% authorized release immediately upon publication, 23% requested embargoes of 10-12 months, and 17% requested something in between). The compliance rate is a miserable 3.8%. "Lack of awareness does not appear to be the primary reason for the low submission rate."
PMC usage increases as its size increases.
NIH sees no evidence that its public-access policy "has had any impact on peer review".
The cost of handling submissions and administering the policy was $1 million for fiscal 2005. If the compliance rate grows to 50%, the cost would grow to $2 million/year. If the compliance rate were 100% (65,000 articles/year), the cost would be $3.5 million/year.
The report describes the NIH's outreach efforts to educate stakeholders about the policy: NIH staff, grantees, grantee institutions, and the journals where grantees publish their work.
Finally, the report describes the November 15, 2005, decision of the Public Access Working Group (PAWG). Ten out of 11 members wanted grantees to deposit the final, published versions of their articles. Nine of 11 voted to mandate deposit and public access. Eight of 11 voted to shorten the permissible delay to six months, with some flexibility for rare exceptions. (The minutes of the PAWG meeting are also online.) The report says nothing about the PAWG recommendations except that "[t]he NLM Board of Regents will consider the opinions of the Working Group at its next meeting."
Update. The NLM Board of Regents meeting mentioned in the final sentence took place on February 7-8, 2006. In the meeting, the Board endorsed the PAWG recommendations. For details, see my blog posting later on February 16, above.
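For readers who want to check the arithmetic behind the highlights above, a minimal sketch (the dollar and article figures are the report's own; only the ratios are computed here):

```python
# Back-of-the-envelope check on the NIH progress-report figures.

deposited = 1_636      # articles deposited, May 2 - Dec 31, 2005
eligible = 43_000      # articles covered by the policy in the same period

compliance = deposited / eligible
print(f"Compliance rate: {compliance:.1%}")  # -> 3.8%

# Implied administrative cost per article at full compliance,
# using the report's projections ($3.5M/year for 65,000 articles/year).
full_cost = 3_500_000
full_volume = 65_000
print(f"Cost per article at 100% compliance: ${full_cost / full_volume:.2f}")
```

The reported 3.8% rate matches the raw deposit and eligibility counts, and the projected cost works out to roughly $54 per deposited article at full compliance.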
Neil Jacobs, Digital Repositories in UK universities and colleges, FreePint, February 16, 2006. (Thanks to Garrett Eastman.) Excerpt:
Building on a previous development programme (Focus on Access to Institutional Resources - FAIR), the current Digital Repositories development programme [from JISC] consists of some 25 projects that are exploring the role and operation of repositories. Many of these are concerned with how repositories can help academic researchers both do and share their work more effectively. Open access is a key driver and demands are growing for the outputs of publicly-funded research to be freely available on the web....Repositories have a key role to play, since they both enable open access, and help universities and colleges manage the intellectual output of their researchers....In terms of active development, work is underway to help universities set up and populate repositories (Sherpa), to establish a Scottish research repository infrastructure (IRI Scotland), and to investigate the questions of different versions of academic papers (Versions).
In the rest of the article, Jacobs describes each of the other projects in the Digital Repositories Programme.
Update. Also see the JISC press release on Jacobs' article, February 16, 2006.
The February 5 issue of Groklaw has two articles on DRM and academic publishing, one by Bixler and one by Barton.
The articles are at the same URL. You'll have to scroll down for Barton's; there's no internal link. (Thanks to Ross Scaife.)
Excerpt from Bixler:
Academic publishing is a more interesting case because the market dynamic is different from commercial publishing. Academic publishing generally serves niche markets which are inherently unprofitable. The mission of academic publishing tends to focus on the dissemination of knowledge instead of on pleasing shareholders....Michael Jensen of the National Academies Press says that they have considered the issues of DRM and have decided not to use it in their publications because they would prefer to maximise availability of their content, they do not want to lock it down and also they do not want to deal with the customer service issues that may come with DRM. Jensen also says that they started putting books on their Web site for free reading/browsing in 1994, have more than 3,500 books online and now have a significant amount of traffic at 1.25 million hits per month. In the past, they implemented watermarking on their downloadable PDF (Portable Document Format) files but abandoned that practice 8 months ago since they found so few issues with online copyright infringement that it was not worth the trouble....According to the AAUP, university presses are subsidised and on average make about 85% of their revenues on sales. Given this, it is easy to understand why some university presses with uncertain subsidies are less enthusiastic about the idea of easily available copies which current technology enables. They can ill afford any significant loss to their already pinched revenues. But, at this point, any loss due to unrestricted digital copies is hypothetical....Ultimately, since it costs money to edit and produce print works, the questions are about business models. At the same time, the customers of academic works value free access to print works and would frown on any technological restrictions which make this more difficult. If free copying is available, will current business models still work?
If not, can an alternate model compatible with free copying be found? If DRM is inevitable, then can it at least be made minimally intrusive and user-friendly?
Excerpt from Barton:
With respect to DRM, what strikes me as both interesting and a challenge for university presses is the tension between the mission they serve and the business model under which they operate. Their mission, of course, is the certification and dissemination of scholarship. The business model comprises a number of things, most notably, the direct recovery of costs from readers. The tension follows from the most common cost recovery strategy: by restricting access to scholarship to only those readers who have paid for access, presses limit distribution and therefore, potentially, dissemination. I say "potentially" because in some disciplines I imagine presses reach the 200 people in the world capable of or interested in reading the most arcane of their publications. (Libraries purchase access for the communities they serve. Access nearly always implies that someone has paid for it.)...All would be well if purchasing power were unlimited. It isn't. And consequently scholarship is not thoroughly nor, one should note, equitably distributed. To the extent that this is true, university presses are failing their mission....[I]n the electronic world, both notification and second-copy distribution costs are dropping dramatically. Let's assume for a moment that universities, scholarly societies, or other sources of funding were to pay for certification (managed peer review) and first copy costs. Then there would be no significant costs remaining and no need for DRM as a means of extracting payment in exchange for access. In effect, these funding sources are already paying for the production of scholarship. And compared to those costs, the cost of publication is tiny. There is certainly a precedent for this approach to publishing: a portion of research grants routinely go to paying for the page charges commonly assessed by scientific journal publishers. Moreover, as Steve Izma points out, the same funding sources are paying much of the DRM fees. 
It seems like madness to suffer the transaction costs involved in this cost recovery model. The way out of this? I do not expect to see it coming from within the university press or the library communities. Budgetary expectations are too entrenched. I think that it is more likely that we will see a new generation of scholars organizing peer review and publication amongst themselves and deciding for their peers that this counts towards tenure and promotion. And they will teach their graduate students where to look for the best scholarship (as their teachers taught them). They will publish to whomever can find them and the good stuff by virtue of its citation network will rise to the top of Google's hit list (assuming Google doesn't make you pay to get there).
Tom Coates, Native to a Web of Data, a large slide presentation on the future of the web. (Thanks to Richard Ackerman.) Much of Coates' presentation can be taken as advice for those providing OA to data.
Cory Doctorow, Why Publishing Should Send Fruit-Baskets to Google, BoingBoing, February 14, 2006. Excerpt:
Google's new Book Search promises to save writers' and publishers' asses by putting their books into the index of works that are visible to searchers who get all their information from the Internet. In response, publishers and writers are suing Google, claiming that this ass-saving is in fact a copyright violation. When you look a little closer, though, you see that the writer/publisher objections to Google amount to nothing more than rent-seeking: an attempt to use legal threats to milk Google for some of the money it will make by providing this vital service to us ink-stained scribblers. Opponents of Google Book Search (GBS) argue that publishers should have been consulted before their works were scanned, but it's in the nature of fair use that it does not require permission -- that's what a fair use is, a use you make without permission. They argue that GBS should pay some money to publishers because anyone who makes money off a book should kick some back -- but no one comes after carpenters for a slice of bookshelf revenue. Ford doesn't get money from Nokia every time they sell a cigarette-lighter phone-charger. The mere fact of making money isn't enough to warrant owing something to the company that made the product you're improving....The reality is that the biggest threat to book-writers and publishers is that their works are simply invisible to people who get all their information from the Internet. Google Book Search makes our books visible to those people. In so doing, Google will save our asses from oblivion. Instead of sending legal threats to Google, I think that writers and publishers should be sending them fruit-baskets and thank-you notes....GBS puts books on a near-equal footing with other information resources, the ones that are currently kicking the hell out of us. When a customer performs a Google search, she can get results, right there on her screen, from real, actual books, books that can often be purchased with a single click. 
This is our single best hope for extending our industry's lifespan for a decade or two.
Ross Atkinson, Introduction for the Break-Out Sessions: Six Key Challenges for the Future of Collection Development. A talk at the Janus Conference on Research Library Collections, Cornell University, October 10, 2005. Atkinson is the Associate University Librarian at Cornell. (Thanks to William Walsh.) Excerpt:
Collections attract scholars, graduate students, government support, donor funding --and add prestige to the institution. This rationale for collection building --the collection as institutional capital-- is a primary motivation, even though it is seldom specifically discussed. One point we must bear in mind with respect to this rationale, however, is that it entails or implies the existence of a separate collection at each institution which can in effect compete with all others. The new environment into which we are now moving, on the other hand, is likely to be increasingly characterized by a much more unified collection to which all users would have access. Indeed, what perhaps so fascinates us and unnerves us about open access, I think, is that it might serve as a first, decisive step in the direction of a more unified, less institutionally based collection. While there is no question whatsoever that open access represents a supremely valuable trend ideologically --perhaps the ultimate aim of all collection services-- libraries continue to wrestle with its implementation and implications, including its effect on institutional identity. However, such a concern about identity, if I am correct in sensing it, is a red herring --because of what we might call the “axiom of non-equivalence.” By this I mean the trivially simple fact that individual libraries are not the same, nor will they ever be. Each has vastly different resources --not only financial, but also human and creative resources, including different visions and values. The fact is, therefore, that all scholarly publishing could convert to open access tomorrow --every scholarly publication could be made openly accessible-- and still, the accessibility, the collection service, the ability of the user to find, understand, use and apply the individual object, would vary enormously from one institution to the other. 
Any morbid fear we might harbor, therefore, of becoming mirror images of each other as we move toward a more unified collection is unfounded, and we cannot allow it to deter us from moving in that direction, if we decide that direction is in the best interest of our user communities....
The Electronic Frontier Foundation (EFF) has written an open letter to members of Congress proposing A Code of Conduct for Internet Companies in Authoritarian Regimes. The proposal wouldn't ban doing business in authoritarian regimes and probably wouldn't even ban the controversial practices by Yahoo and Google now making news. But it would ask companies to retain as little identifiable data on users as possible; to notify users when removing or hiding web sites; when forced to suppress information within a country, to make the same information available elsewhere; to document government censorship requests and the laws, if any, that require compliance; to keep these records even if they cannot be made public until after a regime change; to avoid "actively and knowingly providing services that facilitate censorship or repression"; and to offer users encryption and circumvention technologies. This is a very good start.
Becky Hogge, The year of free culture? Open Democracy, February 14, 2006. Excerpt:
As the UK government reviews patent and copyright law to boost Britain's creative economy,...democratic access to knowledge ought to benefit too....Three separate reviews, two conducted by the British government, and one by a leading thinktank, will examine current patent and copyright law, and their findings will have direct implications for the democratic future of the information society....What has this got to do with technology or democracy? The dissemination of intellectual property, or access to knowledge, is one of the key pillars of democracy. As information courses ever more rapidly through the internet, barriers to access are gradually reduced....If Britain is to come out top in the knowledge economy, it needs to understand new ways of managing intellectual property that have emerged with the information society, existing under the controversial umbrella term "free culture". One such development is the GNU project's General Public Licence (or "copyleft"), which nurtures collaborative authorship of software within the open source movement. More recently, Lawrence Lessig's Creative Commons licences have shown how relaxing key areas of traditional copyright can democratise creativity in a new remix culture, facilitated by the wider availability of digital sound and moving image production tools. Within the scientific community, the open access publishing movement is exploring how best to disseminate scientific knowledge in the internet age. And new proposals to decouple research and development from production in the pharmaceutical industry, inspired by the success of the Sanger Institute in mapping the human genome, are gaining gradual support. Will the British government take note of these innovations in their reviews?...What is crucial now is that defenders of the public good vested in the democratic dissemination of information step forward to make their voices heard.
Mallory Bowman and Chris Brown, Libraries face debt: Journal subscriptions may be in jeopardy, Louisville Cardinal Online, February 14, 2006. Excerpt:
University of Louisville students may soon lose access to a number of scholarly journals provided through the library system if funds cannot soon be obtained. Rising costs of subscriptions for both hard copies and electronic versions of scholarly journals and research databases have created budget woes for the library, which is now $500,000 - $600,000 behind in subscription payments, said U of L Dean of Libraries Hannelore Rader. While no subscriptions to scholarly journals have been permanently deactivated, if funds are not identified soon, cuts will have to be made, she said....“This year we thought, ‘Oh my God, can we get any more money?’” she said. “In order to pay for these in years to come, our base budget needs at least another million dollars.”...John Drees, director of U of L’s Office of Communication and Marketing, said the university is underfunded by $52 million compared to its average benchmark institutions....[Rader] said part of the problem is that she knows the exact cost of a subscription only when a bill hits her desk. “We really have no idea how much these journals are going to cost us from year to year,” she said. “Inflation for these journals is usually 10 to 20 percent every year, and we don’t know how much they are going to cost us until we get the invoices.”...Rader said the library’s materials budget was $7 million for the 2005-2006 fiscal year, and that the budget this year wouldn’t cover all the subscriptions that the library holds. She explained that large databases, especially those based outside of the United States, have a monopoly on many of the scholarly journals....Mary Beth Thomson, associate dean for Collections and Technical Services at University of Kentucky Libraries, said her school’s libraries are facing the same situation. “This is a problem that’s not unique to U of L or UK. It’s not unique to the state of Kentucky. It’s a nationwide problem,” Thomson said. 
“The fact is that the cost of scholarly communication is increasing, and our budgets cannot keep up.”...Rader said the U of L and UK are working together to ensure that at least one of the institutions has access to certain journals. “We’re trying to do something, [like] keeping one copy in the state,” she said. Students and faculty will have limited access to journal articles provided through interlibrary loan, however, since Rader says copyright laws strictly limit the number of articles that can be shared between schools.
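The squeeze Rader describes is straightforward compounding: serials prices growing 10-20% a year quickly outrun a flat materials budget. A minimal sketch of that arithmetic follows; the $7 million budget figure is from the article, while the five-year horizon and the flat-budget assumption are illustrative, not from the source.

```python
# Illustrative sketch: how 10-20% annual journal inflation compounds
# against a flat materials budget. Only the $7M figure is from the article.
budget = 7_000_000  # U of L materials budget, FY 2005-2006

for rate in (0.10, 0.20):
    cost = budget
    for year in range(5):  # project five years out (hypothetical horizon)
        cost *= 1 + rate
    print(f"At {rate:.0%}/year, today's $7M of subscriptions "
          f"costs ${cost / 1e6:.1f}M in five years")
```

At 10% annual inflation the same subscriptions cost about $11.3M in five years; at 20%, about $17.4M, which is why a one-time budget increase only defers the problem.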
Hans van Ginkel, Toward a smarter information superhighway, Asahi Shimbun, February 14, 2006. Van Ginkel is Under-Secretary-General of the United Nations and the Rector of the United Nations University. Excerpt:
Universities...are increasingly recognizing that the real expansion today in international knowledge exchange does not relate to physical mobility of individuals, but rather to the mobility of knowledge itself. Openness and sharing knowledge have always been the hallmarks of the most successful universities around the world....MIT offers a rather interesting model of an educational institution responsive to globalizing trends. In particular, I would like to refer to their work with free online learning through the OpenCourseWare program. Many universities use distance and online learning as an important measure to increase student numbers....This represented a revolutionary step forward since up until then the trend had been for universities to restrict access to their knowledge behind password-controlled learning management systems. It is MIT's boldness in promoting this open approach that I believe contributes to its success and high ranking among international educational institutions. This same kind of openness is highly relevant to Japanese universities and would contribute to their climbing up the international university rankings. Hence, I am delighted to see that a number of Japanese universities have decided to work together to promote the idea of OpenCourseWare in Japan. The Japan OCW Alliance includes Keio University, Kyoto University, Osaka University, Tokyo Institute of Technology, Tokyo University and Waseda University. This alliance was launched in May 2005, so it is relatively young, but has already placed a large amount of educational content online. It represents an important showcase of quality educational materials and, as professor Yuichiro Anzai, president of Keio University, has remarked, the Alliance is indicative of a new leadership role for Japanese universities. This view is echoed by President Kazuo Oike of Kyoto University who views the Japan OCW Alliance as contributing to the accumulation of intellectual capital on the World Wide Web. 
Moreover, as President Hiroshi Komiyama of the University of Tokyo remarked, "our goal is to give back to society the fruits of our educational activities. We would also like to assist those who are eager to learn by themselves, and those who would like to take part in a dynamic creative activity."...We at UNU share the same vision. At the 2002 Johannesburg World Summit on Sustainable Development, I called for the creation of a Global Learning Space to support science-based education through various projects, including the U.N. Water Virtual Learning Center, the Global Virtual University and the Asia Pacific Initiative. The latter includes close collaboration with Keio University and a network of universities in Southeast Asia and the Pacific to develop and run educational programs in the region on issues of sustainable development. More recently, at the World Summit on the Information Society held in Tunis in November 2005, I called for an "Information Society Open to All" where the provision of open educational resources would represent a core component....The way forward in this globalizing world, I would argue, is to promote openness within the education system.
Si Chen, When Billions Aren't Enough, Open Source Strategies, January 21, 2006. (Thanks to PhotoSydney via Jean-Claude Bradley.) Excerpt:
Our favorite anti-open source article, "Winning the Linux Wars", suggested that Microsoft partners should be "Playing the R&D card" by emphasizing that "Microsoft invests north of $6 billion a year on R&D. There is nobody in the Linux world that does that." Well, Merck (MRK) invests about $4 billion a year in R&D. Bristol-Myers (BMY) $669 million. Eli Lilly & Co. (LLY) $2.7 billion. Pfizer (PFE) $1.8 billion. Sanofi-Aventis (SNY) a whopping $10.2 billion, or nearly half of its $20.5 billion in revenues. Together, that's about $19.5 billion a year in research and development. Apparently, though, that's not enough. This Friday (January 20, 2006), The Wall Street Journal's "Science Journal" ran an article entitled "In Switch, Scientists Share Data to Develop Useful Drug Therapies" [PS: blogged here 1/2/06] which pointed out that there is a "crisis in 'translational science,' or turning basic discoveries into therapies," and that only twenty new drugs were approved by the US Food and Drug Administration in 2005. One billion a drug, approximately. More importantly, the article points out some interesting trends: The pace of basic biomedical research is outstripping the pace of translational research. In other words, we're learning about genetics and biology faster than we're able to make drugs based on that knowledge. As a result, foundation grants have not produced concrete results in the form of cures for illnesses. In response, the foundations are now shifting funds from basic research into therapies, taking over a role once left to industry....What is the unique collaboration? Sharing knowledge. The foundation which sponsored the research is apparently requiring the scientists it funds to "share results in real time," rather than keep their discoveries proprietary. As a result, it has made the scientists feel more accountable for their work and therefore become more engaged in curing diseases.
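The back-of-envelope totals above can be sanity-checked with a quick sum. This sketch uses only the per-company figures quoted in the post; the sum comes to roughly $19.4 billion, which the post rounds to "about $19.5 billion", and dividing by the twenty 2005 FDA approvals gives the "one billion a drug" figure.

```python
# Back-of-envelope check of the R&D figures quoted in the post
# (all values in billions of USD, as given there).
rnd_spend = {
    "Merck (MRK)": 4.0,
    "Bristol-Myers (BMY)": 0.669,
    "Eli Lilly & Co. (LLY)": 2.7,
    "Pfizer (PFE)": 1.8,
    "Sanofi-Aventis (SNY)": 10.2,
}

total = sum(rnd_spend.values())  # ~19.4B; the post says "about $19.5 billion"
per_drug = total / 20            # 20 FDA approvals in 2005 -> ~$1B per drug

print(f"Total R&D: ${total:.1f}B; per approved drug: ${per_drug:.2f}B")
```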
Sara Schroter, Importance of free access to research articles on decision to submit to the BMJ: a survey of authors, BMJ, February 14, 2006. NB: "This is version 2 of the paper. In this version we have clarified the role of the BMJ in publishing studies carried out by its staff."
Abstract. Objectives. To determine whether free access to research articles on bmj.com is an important factor in authors’ decisions on whether to submit to the BMJ, whether the introduction of access controls to part of the BMJ’s content has influenced authors’ perceptions of the journal, and whether the introduction of further access controls would influence authors’ perceptions. Version 1 of the paper was published on January 9, 2006 (blogged here on January 10).
Heather Morrison, Thomas Krichel: a man with ideas, and drive! OA Librarian, February 13, 2006. Another installment in Heather's celebration of librarians working for OA. Excerpt:
Money doesn't make the world go round. Ideas do! So said Thomas Krichel, Assistant Professor, Palmer School of Library and Information Studies, Long Island University, at the First E-LIS Workshop....May I add: it is people with ideas - and the drive and determination to see their ideas realized - that really make the world go round! People like Thomas Krichel - E-LIS team member, early open access pioneer, and founder of the world's second largest archive (after arXiv), RePEc. Thomas is recognized twice on Peter Suber's Open Access Timeline. On February 1, 1993, Thomas launched the Working Papers in Economics (WoPEc), with the deposit of an open access working paper (not his own), the first in economics. On May 12, 1997, Thomas launched RePEc, Research Papers in Economics, which as of today holds over 362,000 items of interest, 261,000 of which are available online....WoPEc and RePEc both emerge from traditional practices in economics, which has a strong working papers culture. The distributed archives approach also reflects practice in the field. This model works so well because it fits the discipline, rather than the other way around....The American Economic Association has been collecting information about working papers for years, but their collection was not as comprehensive as RePEc's. Now, information about RePEc is being added directly to the Association's EconLit. Why would a volunteer-based organization like RePEc choose to give away their work to a profit-making publisher for free? Because, says Thomas Krichel, this works to the advantage of both: EconLit is more valuable, and placing your work in RePEc is the best way to ensure your working papers are included in EconLit, which enhances the success of RePEc....Thomas' advice on what students and librarians should be learning for the future: open source software!
Elsevier CEO Erik Engstrom gave a presentation on Open Access at the Reed Elsevier Investor Seminar on November 21, 2005. (Scroll to p. 39 of the seminar report.) Since the slides are not easy to cut and paste, I'll link to William Walsh's excerpts; he's already done the work. (Thanks, William.)
Comment. I see no signs of animosity in the talk. But I do see several misunderstandings: (1) Engstrom frequently refers to the "author pays" model, even though this is an inaccurate label for the business model he has in mind. (2) He repeats the inaccurate observation that OA journal launches peaked in 2001 and have declined since. (3) On author attitudes, he cites the CIBER 2005 report, which seems to have asked authors whether they'd be willing to pay a fee out of pocket, and does not cite the Key Perspectives reports, which better understand the way that OA journals actually operate. (4) He relies on the discredited Cornell calculation, which assumes that all OA journals charge author-side fees and that all fees would be paid by universities. (5) He relies on the discredited assumption that OA journals exclude indigent authors, unaware that most OA journals charge no fees at all and most of the others waive them in cases of economic hardship. (6) He cites the Kaufman-Wills report for its retracted conclusion about weak peer review, but seems unaware that the same report also showed that fewer than half of OA journals charge processing fees, and that a greater percentage of non-OA journals charge author-side fees than OA journals.
Liz Lyon, Digital Libraries and e-Research: new horizons, new challenges? A presentation at the 8th Bielefeld conference, Academic Library and Information Services - New Paradigms for the Digital Age (Bielefeld, February 7-9, 2006).
Carol Hixson, If We Build It, Will They Come (Eventually)? Scholarly Communication and Institutional Repositories, The Serials Librarian, 50, 1/2 (2006). Prepublication.
Abstract: Specialists in serials have been dealing with the effects of an imbalance in the scholarly communication process for some time. The increase in scholarly output coupled with the decreasing ability of libraries to provide access to that output due to spiraling journal costs has created tensions for libraries and their communities. By advocating and providing a means to provide open access to scholarly output, institutional repositories have been promoted as one strategy for redefining the scholarly communication model. Since January 2003, the University of Oregon libraries have been exploring this approach. The paper will discuss the challenges and opportunities that such repositories face and examine their effectiveness in changing the nature of scholarly communication. This will be done primarily through a case study of the experience of the University of Oregon Libraries.
John Cox, Access to Scholarly Literature: Publishing for an Extended Readership, The Serials Librarian, 50, 1/2 (2006). Prepublication.
Abstract: Access to scholarship and research has become controversial. It is described in apocalyptic terms as 'open' --or good and moral-- or 'toll-gated' --with the life-blood of the system ebbing away. The real world is more complex. Publishing is not a homogeneous activity, because it reflects the varied needs of scholars. This paper will be based on evidence from surveys and from published inquiries. It will pose, and attempt to answer, some questions about the future of publishing scholarly information, including open access, in the context of what publishers are actually doing. It will describe the challenge that faces publishers and librarians in meeting both scholarly and societal needs.
Ray English, Director of Libraries at Oberlin College and champion of OA, has received the ACRL Academic/Research Librarian of the Year Award. From today's announcement:
“Ray English is an influential librarian,” said award committee chair Les Canterbury. “He is a leader in various organizations on state and national levels including the Oberlin Group of Liberal Arts Colleges, OhioLINK, ACRL, and other units of the American Library Association. Under his direction, Oberlin College has led a Mellon Foundation initiative involving six academic libraries that's designed to attract a more diverse population to the library profession through undergraduate internships. “English's greatest impact as a librarian, perhaps, and the area of his work that stands out to the selection committee, is his advocacy for open access to the results of scholarly research. The breadth and depth of his knowledge of issues related to dissemination of scholarly output, and his commitment to access to information, led to his leadership role in information policy-setting arenas. He has been a primary leader of the ACRL scholarly communications program, has been active in the Scholarly Publishing and Academic Resources Coalition (SPARC), and has fostered close cooperation on scholarly communications issues among ACRL, SPARC, and the Association of Research Libraries. In addition, and on a larger stage, he has influenced, as an expert contributor, national policy on public access to federally-funded research, including the recent National Institutes of Health Public Access Policy.”
PS: Ray is also one of the most active and effective members of the Open Access Working Group. Congratulations, Ray!
Update. Also see Rani Molla, Librarian Ray English Wins National Award, Oberlin Review, February 24, 2006.
Today is the fourth anniversary of the Budapest Open Access Initiative (BOAI), the first initiative to call for OA journals and OA archives as complementary strategies, the first to be accompanied by significant funding, the first to use the term open access, and the first of the major public statements (along with the Bethesda and Berlin statements) to form the consensus definition of OA that now structures the OA movement. The BOAI emerged from a December 2001 meeting in Budapest convened by the Open Society Institute, which committed $3 million to implement the BOAI vision. The BOAI was officially issued on February 14, 2002. (Disclosure: I helped draft the BOAI and receive support from OSI.)
I wish I had time to review the last four years of OA activity and show the influence of the BOAI. I don't, but I can offer these pieces: Open Access in 2005, Open Access in 2004, and Open Access in 2003. I didn't write an OA review for 2002, but I did review OA archiving activity in the first six months after the BOAI launched. And for the details missing from these reviews, there's always my timeline.
Happy birthday, BOAI, and Happy Valentine's Day.
Frederick J. Friend, Google Scholar: Potentially Good for Users of Academic Information, Journal of Electronic Publishing, Winter 2006.
Abstract: Use of the Google search engine is commonplace amongst all sectors of the academic community. The development of the specialist Google Scholar search service will benefit the academic community in bringing to their attention content more relevant to their needs. The vast number of Web sites containing potentially relevant information requires a search engine ranging over many millions of Web sites but with the ability to target very specific types of information. The Google Scholar service has the potential to grow if it develops close contacts with both providers and users of academic information. Use of Google Scholar will benefit the authors and managers of open access content, but there are opportunities for all types of academic content providers in the way Google Scholar is set up. Google Scholar will face competition and have to keep pace with user expectations and technological developments.
Paula Berinstein, Ad-Supported Free Books Arrive, Information Today NewsBreaks, February 13, 2006. Excerpt:
Perhaps information really does want to be free. Citing the desire to create new revenue streams for authors, mega-publisher HarperCollins has announced the first free Web-based, ad-supported, full-text business book. Go It Alone! The Secret to Building a Successful Business on Your Own by Bruce Judson is now available on the author’s Web site, where an affiliate link to Amazon, not the publisher, can also be found. Not only can the book be read at the site, but it can also be searched. HarperCollins Publishers is calling the project a test of a new business model. Some self-published authors also offer ad-supported books online, but HarperCollins’ move is the first by a major publisher. For now, the project is limited to the one book, with publisher and author sharing the advertising revenue. The author’s contract was specially amended to accommodate the arrangement. Company spokesperson Erin Crum said: “We are exploring how online advertising programs can add value for publishers and authors. The results will be measured by the income generated through ads, number of page views and visitors to the site, and by sales of books from the site. If successful, this kind of digital product might be a new format that supplements the paperback edition.”...HarperCollins has previously indicated the desire to control its own digital assets rather than let others scan and store them. (Until now, the company has sent its books to others for digitizing, such as Amazon, which produces Search Inside the Book pages for its online bookstore.) Nevertheless, Go It Alone! is hosted not on HarperCollins’ servers but on the author’s, which are under the control of a Web site company. The company says that they will evaluate where future free books should be served from on a case-by-case basis....On Judson’s site, the book, displayed in HTML rather than as a PDF, is flanked....
Richard Ackerman, Is the research library obsolete? Science Library Pad, February 12, 2006. Excerpt:
Research libraries on the other hand, don't play any of [the] roles [played by public libraries or undergrad-focused academic libraries]. There is no public to serve. There is no community meeting place role. There are no confused or desperate undergrads to help. So shouldn't a research library just: digitize and index all of its current (out of copyright) paper holdings, and then send the paper into storage in some climate-controlled cave somewhere; provide good licensed access to the necessary publisher websites for its researchers; and close down?
Charles W. Bailey, Jr., What Is Open Access? A preprint of a chapter forthcoming in Neil Jacobs (ed.), Open Access: Key Strategic, Technical and Economic Aspects, Chandos Publishing, 2006. From Charles' blog description of it:
This chapter provides a brief overview of open access (around 4,800 words). It examines the three base definitions of open access; notes other key OA statements; defines and discusses self-archiving, self-archiving strategies (author Websites, disciplinary archives, institutional-unit archives, and institutional repositories), and self-archiving copyright practices; and defines and discusses open access journals and the major types of OA publishers (born-OA publishers, conventional publishers, and non-traditional publishers).
Update. See the HTML edition of Charles' chapter, which has more live links than the PDF edition. Also see his 2/20/06 blog posting explaining the differences between this chapter and his other introductory treatments of OA.
Alireza Noruzi, Google Scholar: The New Generation of Citation Indexes, LIBRI 55, 4 (2005) pp. 170-180. Self-archived February 11, 2006.
Abstract: Google Scholar provides a new method of locating potentially relevant articles on a given subject by identifying subsequent articles that cite a previously published article. An important feature of Google Scholar is that researchers can use it to trace interconnections among authors citing articles on the same topic and to determine the frequency with which others cite a specific article, as it has a "cited by" feature. This study begins with an overview of how to use Google Scholar for citation analysis and identifies advanced search techniques not well documented by Google Scholar. This study also compares the citation counts provided by Web of Science and Google Scholar for articles in the field of "Webometrics." It makes several suggestions for improving Google Scholar. Finally, it concludes that Google Scholar provides a free alternative or complement to other citation indexes.
Lori Andrews, The Patent Office as Thought Police, Chronicle of Higher Education, February 17, 2006 (accessible only to subscribers). Excerpt:
The boundaries of academic freedom may be vastly circumscribed by the U.S. Supreme Court this term in a case that is not even on most universities' radar. Laboratory Corporation of America Holdings v. Metabolite Laboratories Inc. is not a traditional case of academic freedom involving professors as parties and raising First Amendment concerns. In fact, nobody from a university is a party in this commercial dispute, a patent case between two for-profit laboratories. But at the heart of the case is the essence of campus life: the freedom to think and publish. The saga began when researchers from Columbia University and the University of Colorado Health Sciences Center developed a test to measure the level of homocysteine, an amino acid, in a patient's body. In research on thousands of people, the investigators learned that a high level of homocysteine is correlated with a vitamin deficiency: low levels of cobalamin or folate. Other tests for homocysteine existed and were used for a variety of medical disorders. But considering theirs to be an improvement, the researchers applied for a patent. In their application, they also claimed that, because they were the first to recognize that a high level of homocysteine is connected to a vitamin deficiency, they should be allowed to patent that basic physiological fact. Thus they would be owed a royalty anytime anyone used any test for homocysteine and concluded that an elevated level signified a vitamin deficiency. They received U.S. Patent No. 4,940,658 — known as the '658 patent — and later licensed it to Metabolite Laboratories....[A]fter LabCorp published an article stating that high homocysteine levels might indicate a vitamin deficiency that could be treated by vitamins, Metabolite sued LabCorp for patent infringement and breach of contract, and was awarded more than $5-million in damages. LabCorp appealed to the U.S. Court of Appeals for the Federal Circuit, which hears all patent appeals. 
Astonishingly, it held that LabCorp had induced doctors to infringe the patent by publishing the biological fact that high homocysteine levels indicate vitamin deficiency. The court also ruled that the doctors had directly infringed the patent by merely thinking about the physiological relationship....By considering publishing and thinking about a law of nature to be actionable under patent law, the Federal Circuit court has severely threatened academic freedom. Professors everywhere should be concerned about the case, and how the Supreme Court will rule on LabCorp's appeal. The decision has set off a rush to the patent office to assert ownership over other scientific facts and methods of scientific and medical inquiry....Upholding the '658 patent would discourage the sharing of scientific information through publication.
Sukhdev Singh, Indian biomedical journals at NIC - issues for migration to OJS, a slide presentation delivered at the National Workshop on Journal Publishing in India (Bangalore, February 10-11, 2006).
D.K. Sahu, MedKnow Publications: open dissemination of research, a slide presentation delivered at the National Workshop on Journal Publishing in India (Bangalore, February 10-11, 2006).
From the front page of the Paleoanthropology Society web site:
The online journal PaleoAnthropology is now being published jointly by the Society and the University of Pennsylvania Museum. As a result, beginning immediately, the journal will be accessible free of charge to everyone, including non-members of the Paleoanthropology Society.
From the Society's Spring 2006 message to members:
Dues for 2006 have been set at $20....Our situation has been complicated by the fact that the Society’s journal PaleoAnthropology will no longer be published by Penn Press [and will be OA]. Student and regular subscriptions through Penn Press had included both access to the journal and Society dues. For those who made payments for 2006 subscriptions to Penn Press, $20 will be credited to dues; those who paid $30 will be contacted concerning a $10 refund.
Here are a couple of blogger comments. First from John Hawks:
I think this is super cool, and I can't overstate the importance of this in my decision about where to send research. Nothing galls me more than having to stick that "subscription only" reminder next to things I link to, because I know there are a lot of interested people who can't get them.
This from Kambiz Kamrani:
When news is posted at 1:03am on Sunday mornings, you know it's important... and if you know what's really important to me (you should by now)... open access makes the list....This is extremely good news for the open access movement, and for anthropology in general. Now published works in PaleoAnthropology will be freely accessible to the public. I wonder if someone has told Open Access News of the good news?
This from Duane Smith:
I'm very glad to see this for two reasons. First, I personally like having access to these articles. But perhaps more important, I think having them publicly available through open access will go a little way to defuse the nonsense about the cultic nature of science. While I'm not so naive as to think that such articles will be widely read, I do think that scientists should make their works more readily available to the public. At a minimum, the public might come to recognize the difference between a work of science and a piece of religious or political propaganda.
Bush Axing Libraries While Pushing For More Research, a press release from the Public Employees for Environmental Responsibility (PEER), February 10, 2006. (Thanks to LIS News.) Excerpt:
Under President Bush’s proposed budget, the U.S. Environmental Protection Agency is slated to shut down its network of libraries that serve its own scientists as well as the public....In addition to the libraries, the agency will pull the plug on its electronic catalog which tracks tens of thousands of unique documents and research studies that are available nowhere else....At the same time, President Bush is proposing to significantly increase EPA research funding for topics such as nanotechnology, air pollution and drinking water system security....“How are EPA scientists supposed to engage in cutting edge research when they cannot find what the agency has already done?” asked PEER Executive Director Jeff Ruch, noting that EPA Administrator Stephen Johnson is moving to implement the proposed cuts as soon as possible. “The President’s plan will not make us more competitive if we have to spend half our time re-inventing the wheel....Access to information is one of the best tools we have for protecting the environment....[C]losing the Environmental Protection Agency libraries actually threatens to subtract from the sum total of human knowledge.”
PS: Also see the OMBWatch report, Dismantling the Public's Right to Know: EPA's Systematic Weakening of the Toxic Release Inventory, December 1, 2005, or this blogged excerpt.
ACRL seeks applicants for the first Scholarly Communication Institute, co-sponsored with the Association for Research Libraries, to be held in Los Angeles, July 12-14, 2006. The deadline for application is March 1, 2006. Acceptance to the Scholarly Communication program is limited to 100 individuals, and selection is on a competitive basis. Details about the program, the online application form and instructions, and an FAQ are available online.
Participants will work with experts in the field to understand how to better engage faculty at their institution around the crisis in the systems of scholarly communication. You will also learn about the emergence of new models for scholarly communication as well as strategies for creating systemic change. These will include:
* Faculty activism (e.g. editorial board control, author rights, copyright management, and self-archiving)
* New publishing models
* Digital repositories
* Legislative and policy advocacy