Open Access News

News from the open access movement


Saturday, August 09, 2008

Video on Google Scholar and OA

Google Scholar and Open Access to Scholarly Literature is a video of a one-hour presentation by Donna Gunter, apparently recorded at the University of North Carolina - Charlotte on November 30, 2007 and posted to blip.tv on August 8, 2008.

Upgrade to Sudan Open Archive

A bigger, better digital library for Sudan - SOA 2.0, Rift Valley Institute, undated but apparently recent. (Thanks to Afrora.com.)
An expanded version of the Sudan Open Archive is now available online. The new version, SOA 2.0, features an improved user interface and open access to over a thousand books and documents. SOA 2.0 also incorporates an internet guide with regularly updated links to several hundred Sudan-related websites. The Archive is a searchable, full-text database that covers all aspects of Sudan, making a wide range of material available in digital form for the first time. SOA 2.0 includes dictionaries, historical material on human rights and environmental issues and an extensive collection of recent reports on local peace meetings. There are also key documents concerning the current political process in Sudan. Among the books and reports in the Archive are "The Dhein Massacre" by Ushari Mahmud and Suliman Baldo, the Report of the Abyei Boundaries Commission, dictionaries of Sudanese and Juba Arabic and F.W.Andrews' three-volume "The Flowering Plants of Sudan".
See also our past posts on the archive.

U of Melbourne plans for OA

Linda O’Brien, Mark Brodsky, and Margaret L. Ruwoldt, Melbourne’s scholarly information future: a ten-year strategy, University of Melbourne, July 2008.  (Thanks to Colin Steele.)  Excerpt:

...In this digital age, sharing our scholarly works -- our teaching, our research, our collections -- has never been easier. By disseminating our scholarship widely the world is now our community; our reach can be broad ranging, can be of greatest impact, touching those who can derive maximum value from it.

Where possible, we will seek to make our research data, teaching materials, creative works and publications available in open and interactive ways....

By 2015 we will know we’re on track if...

  • Management and dissemination of research data and digital collections is painless.
  • Our scholars find it easy to disseminate their scholarly works in open ways, increasing their research impact and contributing to global knowledge.
  • Digital versions of our research output are openly available in interactive ways wherever appropriate, bringing these works to life through engagement for mutual benefit.

By 2020 we will know we’re on track if...

  • Melbourne is known for the depth of its disciplinary research and its innovative interdisciplinary research, profiled through open access to our scholarly output and recognised through improved international research rankings....

For much of the thinking behind the new 10-year strategy, also see a second report from the same authors, Information Futures Commission: final report of the Steering Committee, University of Melbourne, July 2008.

7 OA repositories compared

Francis Deblauwe, OA Academia in Repose, iCommons.org, August 7, 2008. Compares arXiv, arXiv Math, CERN Document Server, CDL eScholarship Repository, Connexions, Directory of Open Access Journals, Dspace@Cambridge, and the Trance Project.
I looked for the number of full-text OA documents and the like available in the repositories, and the date for the numbers in question. ... The repositories' slope, a measure of how fast they have grown on average over their lifespan, is plotted against the start dates ...
Update. See also the comments by Stevan Harnad:
Summary: Re: Deblauwe, F. (2008) OA Academia in Repose: Seven Academic Open-Access Repositories Compared: A useful way to benchmark OA progress would be to focus on OA's target content -- peer-reviewed scientific and scholarly journal articles -- and to indicate, year by year, the proportion of the total annual output of the content-providers, rather than just absolute annual deposit totals. The OA content-providers are universities and research institutions. The denominator for all measures should be the number of articles the institution publishes in a given year, and the numerator should be the number of articles published in that year (full-texts) that are deposited in that institution's Institutional Repository (IR). (If an institution does not know its own annual published articles output -- as is likely, since such record-keeping is one of the many functions that the OA IRs are meant to perform -- an estimate can be derived from the Institute of Scientific Information's (ISI's) annual data for that institution.)
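
Harnad's proposed benchmark is a simple ratio, and a small worked example may make it concrete. The sketch below uses invented figures (not data from the Deblauwe study or from any real institution) to compute the year-by-year deposit rate he describes: full texts deposited in the institutional repository, divided by the institution's total article output for the same year.

    # Illustration of the deposit-rate benchmark described above.
    # All figures are invented; real numbers would come from the institution's
    # publication records (or an ISI-based estimate) and from its IR.

    annual_output = {2005: 1200, 2006: 1250, 2007: 1300}   # articles published per year
    annual_deposits = {2005: 90, 2006: 180, 2007: 410}     # full texts of those articles in the IR

    def deposit_rate(year):
        """Fraction of the year's published articles that are openly deposited in the IR."""
        return annual_deposits[year] / annual_output[year]

    for year in sorted(annual_output):
        print(f"{year}: {deposit_rate(year):.1%} of that year's output is in the repository")

Unlike an absolute deposit count, this ratio shows how much of an institution's annual output is still missing from its repository.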

New edition of university Web rankings

A new edition of the Ranking Web of World Universities has been released; according to the researchers, the rankings reflect universities' relative commitments to OA.

See also previous posts on the Webometrics rankings.

Overview of CC from Digital Curation Centre

The JISC-funded Digital Curation Centre released on August 6 an overview of Creative Commons Licensing as part of its Legal Watch Papers series. Papers in the series are OA from the DCC site. (Thanks to Charles Bailey.)

See also the paper on Sharing Medical Data — the Legal Considerations, released April 25.

Friday, August 08, 2008

OA series on commodities and empires

The Commodities of Empire Project, from the Open University's Ferguson Centre for African and Asian Studies and London Metropolitan University's Caribbean Studies Centre, provides OA to its working papers series. The papers in the series are from 2007 and 2008. (Thanks to Rachel Laudan.)

Cochin University of Science and Technology launches an IR

India's Cochin University of Science and Technology (CUSAT) has launched an institutional repository.  For details, see today's article in The Hindu:

Supported by the Department of Scientific and Industrial Research, the repository is the first of its kind offered by any university in the State, claimed the university authorities.

An institutional repository is an online locus for collecting, preserving and disseminating the intellectual output of an institution, particularly a research institution, in digital form.

Explaining the objectives of the initiative, C. Beena, Assistant Librarian, Cusat, said that the facility would provide pre-prints and post-prints of research journal articles and digital versions of theses and dissertations....

Ms. Beena said that institutions like the Indian Institute of Management- Kozhikode (IIMK), Vikram Sarabhai Space Centre and the Naval Physical and Oceanographic Laboratory (NPOL) have their own institutional repositories....

The main objectives of having an institutional repository include creating global visibility for an institution’s scholarly research and to collect content in a single location.

It also provides open access to institutional research output by self-archiving it. Online repositories also help in storing and preserving other institutional digital assets, including unpublished theses or technical reports.

Will the OECD open data guidelines also support commercial research?

Klaus Hoeyer, Mette N. Svendsen, and Lene Koch, OECD guidelines on open access: commercialization in disguise?  Trends in Biotechnology, August 5, 2008.  Only this abstract is free online, at least so far:

Abstract:   The Organisation for Economic Co-operation and Development (OECD) has published a new set of principles and guidelines to promote open access to datasets and results from publicly funded research. However, there is reason to think twice about the implications of making demands for transparency and open access for publicly funded research only. How will such demands affect incentives and research agendas? Might this new regulation of publicly funded research have undesirable effects on the quality and value of research? Placing the OECD guidelines in a broader context of research regulation, we argue that they might provide a further push toward collaboration with commercially sponsored research and reinforce incentive structures that favour the creation of commercial value.

PS:  For background, see the OECD Declaration on Access to Research Data From Public Funding from January 2004, and its Principles And Guidelines For Access To Research Data From Public Funding from April 2007.

AAUP names new president

Alex Holzman, the successor to Sanford Thatcher, is the new president of the American Association of University Presses (AAUP).  From today's announcement from the AAUP:

...In his inaugural speech to AAUP members in Montreal, Holzman addressed one of the major concerns facing academic scholarly publishers, the debate over open-access publishing. "What we need here - and what I'm happy to say is beginning to show some signs of happening in various quarters - is for the moderates to take back the discussion," he said. "The Ithaka report was right - university presses are first and foremost units of the university and we must work with others in the university to solve our problems. When I see the various experiments presses are trying with open access, often jointly with their libraries, when I see more administrators grasping the fact that abandoning all restrictions on intellectual property can be a double-edged sword for the university, well, then I truly believe at least some optimism is justified."

Comment:  I don't know of any university presses which experimented with "abandoning all restrictions on intellectual property" and I don't know of any OA advocates who recommended it (for books or journals as opposed to data).

More support for OA in the humanities

SCI 6 Focuses on Models for Humanities Research Centers, CLIR Issues, July/August 2008.  Excerpt:

In mid-July the University of Virginia hosted the sixth annual Scholarly Communication Institute (SCI 6). This year's institute focused on humanities research centers --specifically, the models that are best able to support and advance technology-enabled humanities scholarship. Some 30 representatives from higher education, funding agencies, and scholarly societies attended....

Participants agreed on the need for more investment in open access collections, tools, and services. Better cross-campus collaboration is needed among scholars and libraries, media learning centers, information technology (IT) centers, and centers for the arts, humanities, or technology. Improved information sharing among centers is also desirable. Participants discussed the possibilities of new organizational models that would encourage integration of theory and practice, multiple disciplines, and different media and formats. They also discussed the need for an array of institutional support services and expertise.

The SCI will post a full report of SCI 6 later this summer....

Profile of Oxford's Open Access Repository System

Mike Cave, Sean Loughna, and John Pilbeam, Open Access Repository System for Forced Migration Online, ALISS Quarterly, undated but apparently August 2008.  Excerpt:

Based at the Refugee Studies Centre (RSC), University of Oxford, Forced Migration Online (FMO) is a web-based portal providing extensive scholarly resources on the situation of forced migrants worldwide. Established in 2002 it is freely available and is used by academics, researchers, students, policy makers and practitioners. FMO hosts a ‘digital library’ repository that comprises over 220,000 pages of resources on forced migration issues. Many of these resources are rare or unique. In addition, FMO hosts many published items, including, uniquely, the full back runs of the five leading academic journals in the field. FMO is also the sole host of other discrete special collections, such as the full text of materials referenced in the Sphere Project’s handbook Humanitarian Charter and Minimum Standards in Disaster Response. Other resources include an organisations directory, photos, videos, podcasts, and specially commissioned thematic and geographic research guides. These resources have a global focus and while some of the documents date as far back as 40 years, new items are being added constantly. FMO was developed with funding from the Andrew W Mellon Foundation and is currently supported by the Department for International Development (DFID).

With new funding provided by the Joint Information Systems Committee (JISC), the Open Access Repository System (OARS) project aims to enhance and consolidate the software platform on which FMO is built. This 18 month project, due to be completed in March 2009, will make FMO technically easier to manage, as well as rendering the repository globally interoperable with other open systems, including the University of Oxford’s institutional repository. By employing open source software (which was not available when FMO was first constructed), this project seeks to ensure FMO’s long-term preservation....

PS:  For background, see our previous posts on OARS and FMO.

Most publishers demand the maximum embargo under NIH policy

From the August 7 issue of Library Journal Academic Newswire:

...[C]oncerns that the public access policy would basically result in a de facto one year embargo period appear to be proving true so far, as NIH officials confirmed to the LJ Academic Newswire last week that most publishers are opting for the maximum, 12-month embargo period for access in PMC....

Comment.  This should surprise no one.  It happened under the older, voluntary policy (one, two) and it's happening again under the new, mandatory policy.  Remember that the NIH is an outlier:  it's the only OA-mandating funder of medical research in the world which permits a 12 month embargo.  Every other OA-mandating funder of medical research caps the embargo at six months:  the Arthritis Research Campaign (UK), British Heart Foundation, Canadian Breast Cancer Research Alliance, Canadian Institutes of Health Research, European Research Council, Cancer Research UK, Chief Scientist Office of the Scottish Executive Health Department, Department of Health (UK), Fund to Promote Scientific Research (Austria), Genome Canada, Howard Hughes Medical Institute, Joint Information Systems Committee (UK), and the Wellcome Trust (UK).  This matters because delaying public access to publicly-funded research is a compromise with the public interest, and delays are more harmful in medicine than in any other field.  Now that the OA mandate is law, researchers and their institutions are learning how to comply with it, and TA publishers are accommodating it, it's time to talk about stepping down the maximum permissible embargo from 12 months to nine and then six. 


Thursday, August 07, 2008

Interview with Flat World Knowledge

Dian Schaffhauser, Textbook Publishing in a Flat World, Campus Technology, August 6, 2008. (Thanks to Garrett Eastman.) An interview with Eric Frank, co-founder of OA textbook publisher Flat World Knowledge.

See also our past posts on FWK.

OA cell signaling journal adds society sponsorship

Cell Communication & Signaling, an OA journal published by BioMed Central, has been adopted as the official journal of the Signal Transduction Society, according to a blog post August 6 by BMC. See the journal's editorial on the move:
The Signal Transduction Society (STS) is delighted to join BioMed Central with an open access journal. Over the last years, our society members have increasingly appreciated that access to scientific information generated by publicly funded academic research must not be restricted by commercial interests. With overwhelming support of the society members, the presidial council and advisory board of the STS are therefore now taking action and moving from an access-restricted print journal to online open access publishing.

We are convinced that unduly limiting the flow of scientific knowledge has a negative impact on the development of benefits for mankind and believe that BioMed Central is a highly valuable platform that is vital not only to scientists but to society in general and hence deserves our strong support. ...
Update. This post was corrected, per Matthew Cockerill, to clarify that the journal was previously OA and published by BMC; the news here is the society's adoption.

Springer journal participates in PMC

From George Porter on SOAF this morning:

The PMC-News mailing list distributed a note this morning indicating that Genomic Medicine has been added to PubMed Central....

Fulltext v1+ (2007+) 12 month embargo...

Genomic Medicine is a Springer title....

The appearance of this title as a full participant in PubMed Central seems quite noteworthy.  It is the first, to my knowledge, journal owned by one of the major commercial publishers (Elsevier, Wiley, Springer, Taylor & Francis, ...) to participate [fully] in PubMed Central.  Previously, selected author fee supported articles from several publishers have been deposited in the PMC Open Access Subset....

How many published papers could have been self-archived?

Sigbjørn Hernes has calculated that at least 47% of the research articles published in Norwegian journals in 2005-2006 could have been (and could still be) deposited in OA repositories.  The percentage might be much higher:  for 27% of the articles, Hernes couldn't ascertain the publisher's self-archiving policy.

Hernes' paper is in Norwegian.  Unfortunately it's a PDF and I can't link to a machine translation.  My summary is based on Google's English translation of the front page of the OpenAccess.no wiki.  I don't know whether Hernes calculated how many of the papers which could have been deposited were actually deposited.  Perhaps a reader of Norwegian could drop me a line, or post a note to SOAF, with some of the other important details.

Update (8/8/08).  Charles Bailey has painstakingly harvested the text from the PDF, run it through Google Translate, and sent me the result.  (Thanks, Charles.) 

Hernes identified 19,070 articles published in Norwegian journals in 2005-2006.  While 9,110 (47%) were published in green journals which permitted postprint archiving, only between 143 and 408 have been self-archived to date (roughly 1.6% to 4.5% of those eligible). 
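
The percentages are easy to check. Here is a minimal back-of-the-envelope sketch using the figures reported above (the update below revises the deposit counts, but the arithmetic is the same):

    # Self-archiving rate implied by the figures above: 9,110 eligible articles,
    # of which between 143 and 408 had actually been deposited.

    eligible = 9110
    deposited_low, deposited_high = 143, 408

    print(f"lower bound: {deposited_low / eligible:.1%}")   # about 1.6%
    print(f"upper bound: {deposited_high / eligible:.1%}")  # about 4.5%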

Update (9/17/08).  Here are some corrections and updates to my original account of this study.  I'm very grateful to Jan Erik Frantsvåg for his help.

  • The Hernes study didn't focus on Norwegian journals but on the journals, published anywhere, recognized by the Norwegian research-funding system for granting research funds to Norwegian colleges and universities.  Unfortunately, there is no easy way to tell whether articles archived in NORA were published in this special set of journals.  NORA contains 446 articles published in 2005-06, of which 171 are tagged as peer-reviewed.  Because of inconsistent coding of metadata, the number of peer-reviewed articles could lie anywhere between 171 and 446, and the number of those published in the special set of journals could also lie between 171 and 446.  There were 9,110 articles published in those journals during the same period, eligible for self-archiving.  Hence, the percentage actually self-archived lies between 1.9% and 4.9%. 
  • These numbers reflect some new deposits made since Hernes published his report, in part from the lapse of time and in part from the addition of 15 new repositories to the NORA system. 
  • Of the articles in the Hernes study that were not self-archived as postprints, 18% were published in journals which allow only preprint archiving, 7% in journals which allow no self-archiving at all, and 27% in journals with unknown policies.  It's possible, even likely, that many journals in that 27% segment allow postprint archiving. 

Summary of some recent university OA mandates

Eve Gray has summarized some of the major recent university policies to mandate OA for their research output.  The details won't be new to readers of OAN.  But if your institution isn't already considering an OA mandate, then consider sharing Eve's summary with selected colleagues.  It could be a very useful eye-opener.

Some of the OA initiatives from CERN and high-energy physics

Robert Aymar, Scholarly communication in High-Energy Physics:  past, present and future innovations, apparently a preprint, July 2008.  Aymar is the Director-General of CERN.  (Thanks to Charles Bailey.) 

Abstract:   Unprecedented technological advancements have radically changed the way we communicate and, at the same time, are effectively transforming science into e-Science. In turn, this transformation calls for an evolution in scholarly communication. This review describes several innovations, spanning the last decades of scholarly communication in High Energy Physics: the first repositories, their interaction with peer-reviewed journals, a proposed model for Open Access publishing and a next-generation repository for the field. We hope that some of these innovations, which are deeply rooted in the highly-interconnected and world-wide flavour of the High-Energy Physics community can serve as an inspiration to other communities.

From the conclusion:

HEP has spearheaded (open) access to scientific information, with half a century of circulation of preprints and almost two decades of inception and adoption of repositories. The publishing and library landscapes in HEP are now in a new period of change, built on this tradition of successful, user-driven, innovations: the cross road of open access and peer-reviewed literature and the inception of a next-generation repository adapting the current technological advances to the research workflow of HEP scientists.

SCOAP3 is a unique experiment of “flipping” from Toll Access to Open Access all journals covering the literature of a given discipline....

INSPIRE, a new e-infrastructure for information retrieval in HEP is being designed and will soon be deployed. It aims to answer to the evolving needs of HEP scholars, deploying novel text- and data-mining, as well as Web2.0, applications. This new e-infrastructure might provide inspiration to many other communities that are currently exploring ways to improve the dissemination, discovery and organization of research results, primarily focusing on author self-archiving....


Wednesday, August 06, 2008

OA rice info in Vietnamese

The Vietnamese Rice Knowledge Bank is a new OA resource from Vietnam's Ministry of Agriculture and Rural Development. From the August 6 description in Nhan Dan:
... The website offers information and knowledge concerning rice in Vietnam, which are provided by the International Rice Research Institute. It also updates Vietnamese scientists' research results.

The readers get easy access to knowledge of rice varieties, rice processing, nutrition, intensive cultivation and so on.

The website is a result of co-operation between the Vietnam Agricultural Science Institute, the National Agriculture and Fishery Extension Center and some Vietnamese universities with financial and technical assistant from the International Rice Research Institute and the Asian Development Bank.

More on author attitudes on OA

Florian Mann, et al., Open Access Publishing in Science: Why It Is Highly Appreciated But Rarely Used, Communications of the ACM, forthcoming. (Thanks to Glen Newton.) More information on the research is available from the study's Web site. Excerpt:
... We surveyed 481 researchers from three heterogeneous research disciplines of whom more than 85% liked the idea of Open Access publishing. This result is underscored by the widespread international support for Open Access Initiatives ...

Almost 90% of the participants of our study believe that Open Access publishing will serve this purpose [of easier access to scientific knowledge], which more than 96% consider desirable. ... In our study 94% of the respondents agree that Open Access publishing would be helpful in granting better access to developing countries. ...

Three-quarters of the participants answered [that OA could result in greater readership and potentially greater impact for their research] ... 43% believe that their work, when published Open Access, will then be more frequently cited.

Asked for an overall evaluation of the usefulness of Open Access publishing, more than 70% of the respondents answered favorably. ...

In our study, almost two-thirds (66%) of the respondents have used Open Access publication media for accessing research results at least once in their academic career. But only about one quarter (28%) of the researchers have used them for actual publishing the results of their work. ...

Despite their positive general attitude, the majority of the survey participants (61%) fear that Open Access publishing might jeopardize their chances of promotion and tenure. At the same time, 63% worry about Open Access publishing damaging their chances for research funds. A possible explanation for these results is that the current impact factors of Open Access outlets are seen as insufficient by more than 60%. For almost three-quarters (72%) this is reason not to publish their work Open Access. ...

Lastly, an overall increase in productivity through Open Access publishing is only seen by 36%, whereas 44% even think that productivity will decrease. ...

When asked about the publishing behavior of their peer researchers of the same discipline, only 14% agreed that Open Access publishing is a common practice. The picture shifts when asked about researchers of other disciplines. Here, 40% believe that Open Access publishing would be practiced. ... Only 8% state that their direct colleagues use Open Access media for publishing their work. ...

Only 26% held it for likely to publish in the form of Open Access in the coming six months, while 52% did not see this happen. When looking back at the percentage of respondents who state that they have already published in Open Access media (28%), the measured level of intention (26%) does not lead to the conclusion that Open Access publishing will become extraordinarily popular overnight. ...

Distributed chemistry from PubChem data

M. Karthikeyan, et al., Distributed Chemical Computing Using ChemStar: An Open Source Java Remote Method Invocation Architecture Applied to Large Scale Molecular Data from PubChem, Journal of Chemical Information and Modeling, April 11, 2008. See also the abstract in PubMed. (Thanks to Glen Newton.) Abstract:
We present the application of a Java remote method invocation (RMI) based open source architecture to distributed chemical computing. This architecture was previously employed for distributed data harvesting of chemical information from the Internet via the Google application programming interface (API; ChemXtreme). Due to its open source character and its flexibility, the underlying server/client framework can be quickly adopted to virtually every computational task that can be parallelized. Here, we present the server/client communication framework as well as an application to distributed computing of chemical properties on a large scale (currently the size of PubChem; about 18 million compounds), using both the Marvin toolkit as well as the open source JOELib package. As an application, for this set of compounds, the agreement of log P and TPSA between the packages was compared. Outliers were found to be mostly non-druglike compounds and differences could usually be explained by differences in the underlying algorithms. ChemStar is the first open source distributed chemical computing environment built on Java RMI, which is also easily adaptable to user demands due to its “plug-in architecture”. The complete source codes as well as calculated properties along with links to PubChem resources are available on the Internet via a graphical user interface ...

ANU recommends open data

ANU Data Management Manual:  Managing Digital Research Data at the Australian National University, Version 1.0, July 22, 2008.  Excerpt:

3.1.4 Exposure

Creating a website for your research and placing your publications and research data in an archive greatly increases the exposure of your research. Research has shown that Open Access (OA) publications receive 2-3 times as many citations as articles that are only available via journal subscription.

3.2 Benefits of Data Archiving & Sharing

Data sharing makes for good research as it allows for independent verification of results and conclusions and further analysis through the reuse of data.

An excellent list of the benefits of data sharing is given by the ICPSR's Guide to Social Science Data Preparation and Archiving [2005]:

  • Reinforces open scientific inquiry. When data are widely available, the self-correcting features of science work most effectively.
  • Encourages diversity of analysis and opinions. Researchers having access to the same data can challenge each other's analyses and conclusions.
  • Promotes new research and allows for the testing of new or alternative methods. Examples of data being used in ways that the original investigators had not envisioned are numerous.
  • Improves methods of data collection and measurement through the scrutiny of others. Making data publicly available allows the scientific community to reach consensus on methods.
  • Reduces costs by avoiding duplicate data collection efforts. Some standard datasets, such as the General Social Survey and the National Election Studies, have produced literally thousands of papers that could not have been produced if the authors had to collect their own data. Archiving makes known to the field what data have been collected so that additional resources are not spent to gather essentially the same information.
  • Provides an important resource for training in research. Secondary data are extremely valuable to students, who then have access to high-quality data as a model for their own work....

4.3.1 Data Sharing Methods

Data Dissemination is actively making your data accessible to others. Some researchers make their datasets available via their personal or group websites. Data sharing is done in 3 ways:

  • Email request - Interested researchers email and request the dataset. This is the most common way that data is shared.
  • Website - Researchers place datasets on their website that anyone can download.
  • Archiving -  Researchers place their dataset in an archive.

Archiving is the preferred option as most archives serve the dual purpose of data preservation and dissemination. Their archives usually have a search utility and are often indexed by the major web search engines, thus increasing the chances of other researchers using and crediting your datasets and publications. Archiving datasets also means the dataset owner does not need to maintain a website and can specify a wide range of access controls. If your dataset is online, then including the link in your publications will greatly increase its use and exposure....

4.3.2 Copyright & Licencing

The owner of any original data holds copyright over that data from the time the data is created....Licences grant permission for others to use the copyrighted data. Open Content Licences are an easy way for researchers to licence their data for others to use....The most notable open content licences are

  • Creative Commons - most popular open content licences
  • Science Commons - similar to Creative Commons but tailored for scientific data and publications.
  • GNU Free Documentation Licence - used by Wikipedia....

Comment.  Very well-done.  The only weak spot is that the authors don't seem aware that Science Commons now recommends the public domain, rather than open licenses, for data.  See for example its Protocol for Implementing Open Access Data from December 2007.

Bishops block OA to English translations of public-domain Latin texts

The International Commission on English in the Liturgy (ICEL) is angering Catholics with its copyright policy on translations of public-domain Latin texts.  ICEL is an initiative of the Catholic Bishops' Conferences.  (Thanks to Gino D'Oca.)  Jeffrey Tucker explains:

...It’s one thing to claim exclusive rights over presentation, art work, or commentary. But the required texts themselves? Surely they are the property of all....

The Latin texts are age old and gift to the entire world. But by authorizing vernacular translations, the texts of Mass themselves become bound up with national copyright laws in which the state collaborates with private producers to create and protect publishing monopolies....

The incredible fact is that ICEL does not anywhere in its policies explicitly account for digital posting or rendering of its texts....This is a serious issue and a major problem that needs to be addressed, and soon....

Creative Commons offers many opportunities for retaining a proprietary relationship with texts (attribution) while still permitting a generous use, even when it involves a commercial relationship, as the blog Cantemus Domino points out. The Catholic faithful have a very strong interest in making sure that ICEL adopts some new policy in this regard....

Update (8/25/08).  ICEL has proposed free online access for sheet music, although with some provisos that may delay the access for several years or require stripping the texts from the music. (Thanks to Gino D'Oca.)

Big science creates natural pressures for open data

Richard Poynder, In search of the Big Bang, Computer Weekly, August 6, 2008.  Excerpt:

This month (August) the world's biggest particle accelerator, the Large Hadron Collider (LHC), will begin hurling subatomic particles called protons around a 27km circular tunnel running beneath the Swiss-French border, before crashing them into each other. By doing so, particle physicists hope to learn more about the physical universe. At the same time, they are reinventing the way they share their research with each other....

The next challenge will be managing the huge volumes of data generated....[A]bout 15 petabytes of data will be generated annually. If stored on CDs, this would create a 20km-high tower of discs.

Once collected, the data will be processed and used to perform complex theoretical simulations, a task requiring massive computing capacity. The problem, says [Rolf-Dieter Heuer, who becomes director of CERN in January], is that "no science centre, no research institution, and no particle physics lab in the world has enough computer power to do all the work".

CERN will distribute the data to a network of computing centres around the world using a dedicated computing grid. This will allow the workload to be shared....

But the biggest challenge will be how to store the data in a format that allows reuse....[B]ecause it is costing £4.75bn to collect the LHC data, it would be profligate not to allow reuse.

"Ten or 20 years ago we might have been able to repeat an experiment," says Heuer. "They were simpler, cheaper and on a smaller scale. Today that is not the case. So if we need to re-evaluate the data we collect to test a new theory, or adjust it to a new development, we are going to have to be able reuse it. That means we are going to need to save it as open data....

Openness is not an issue for data alone, however. The research papers produced from the LHC experiments will also have to be open - which presents a different kind of challenge.

Today, when scientists publish their papers, they assign copyright to the publisher. Publishers arrange for the papers to be peer-reviewed, and then sell the final version back to the research community in the form of journal subscriptions.

But because of an explosion in research during recent decades, along with rampant journal price inflation, few research institutions can now afford all the journals they need. "Journal prices are rising very strongly," says Heuer. "So the reality today is that lots of researchers can no longer afford access to the papers they need." ...

As the LHC countdown began, the HEP community launched a number of OA initiatives. In 2006, for instance, CERN spearheaded a new project called SCOAP3, which aims to pay publishers to organise peer review on an outsourced basis, thus allowing published research to be made freely available....

A second initiative will see the creation of a free online HEP database called INSPIRE. This will be pre-filled with nearly 2 million bibliographic records and full-text "preprints" harvested from existing HEP databases such as arXiv, SPIRES and the CERN Document Server (CDS).

If SCOAP3 proves successful, the final full-text version of every HEP paper published will be deposited in INSPIRE, making it a central resource containing the entire corpus of particle physics research....

This suggests scholarly publishing is set to migrate from a journal-based to a database model, and one likely consequence will be the development of "overlay journals". Instead of submitting their papers to publishers, researchers will deposit their preprints into online repositories such as INSPIRE. Publishers will then select papers, subject them to peer review (for which they will levy a service charge), and "publish" them as Web-based journals - although, in reality, the journals will be little more than a series of links to repository-based papers.

"INSPIRE would be an ideal test-bed to experiment with overlay journals, because it will contain the entire corpus of the discipline," says Holtkamp....

What is key to current developments is the belief that scientific information must be openly available. Because science is a cumulative process, the greater the number of people who can access research, critique it, check it against the underlying data and then build on it, the sooner new solutions and theories will emerge. And as "Big Science" projects like the LHC become the norm, the need for openness will be even greater because the larger the project, the more complex the task, and the greater the need for collaboration - a concept neatly expressed in the context of Open Source software by Linus' Law: "Given enough eyeballs, all bugs are shallow."

Holtkamp adds, "I am pretty confident that Open Access will be the standard of the future for scientific papers, although it remains unclear when Open Data will become the norm."

Certainly, if the public is asked to fund further multi-billion-pound projects like the LHC, there will be growing pressure on scientists to maximise the value of the data they generate - and that will require greater openness.

OA to digitized special collections

Merrilee Proffitt and Jennifer Schaffner, The Impact of Digitizing Special Collections on Teaching and Scholarship: Reflections on a Symposium about Digitization and the Humanities, OCLC, July 2008.  (Thanks to Charles Bailey.)  Excerpt:

...[W]e have a chance to enlist scholars to contribute the scans they create in the course of their research so that others can access them. Digital versions of unique materials can be “collected” by libraries and archives, along with the scholarly results....

...Call for action:

We can consider compromising on a temporary embargo on universal access to digitized special collections, as long as our contracts ensure that ultimately the content will be openly accessible. The special collections community should come together (much as the moving image community did in the “Lot 49” meeting), to articulate common principles for third-party contracts to digitize special collections materials....

Lack of OA endangers public health

Pesticide Problems Go Unnoticed by EPA, a press release from OMB Watch, August 5, 2008.  Excerpt:

The Center for Public Integrity (CPI) has discovered that two groups of common pesticides, generally considered to be "safer" chemicals, are responsible for one quarter of reported human pesticide poisonings, based on the U.S. Environmental Protection Agency's (EPA) own data. CPI spent several years demanding the release of the data through repeated Freedom of Information Act (FOIA) requests. A trade association representing the interests of the consumer specialty products industry denounced the report....

The CPI report, Perils of the New Pesticides, analyzes the number of reported human health problems, including severe reactions and deaths, linked to two families of pesticides, pyrethrins and pyrethroids, over the past decade....Pesticides made with these chemicals are found in thousands of common household products such as flea and tick poisons, ant and roach killers, delousing shampoos, lawn-care products, and carpet sprays.

The data reveal that reported incidents of fatal, major, and moderate exposures to the two classes of pesticides increased 300 percent since 1998...[and] accounted for more incidents than any other class of pesticide over the last five years. The EPA's reporting system receives up to 6,000 reports of pesticide exposures annually....

As a result of the investigation, the director of the EPA's Office of Pesticide Programs said the agency this year would begin a broad study of the human health effects of these chemicals and examine further any trends. According to CPI, the EPA originally had not planned to review the data until 2010....

CPI produced its analysis using EPA's Pesticide Incident Data System, an aggregation of more than 90,000 pesticide exposure incidents from 1992 through 2007. The data system has been regarded by right-to-know advocates as one of the most important databases to which public access was restricted. The database made the 1999 Top Ten Most Wanted Government Documents list produced by the Center for Democracy and Technology and OMB Watch. After repeated efforts to obtain information under FOIA, the agency finally released the database in early 2008....

Comments

  • Some problems are complicated but this one seems pretty simple.  Public health is more important than pesticide industry profits.  The EPA is a government agency dedicated to the public interest, not a trade association dedicated to the protection of an industry.  The EPA already had the data on the harms caused by these pesticides, but it wasn't releasing them, even to those who went to the trouble of demanding them under the Freedom of Information Act.  The solution:  release the data.  Make the data OA automatically and immediately.  As a nation, we're moving in this direction for clinical drug trial data.  Let's do the same for pesticide data.  Let's do the same for all data collected by the government on the safety of foods, drugs, environmental contaminants, and consumer products. 
  • I commend the EPA for waking up to the problem once CPI pointed it out.  But how does the agency explain its failure to disclose the data in the first place and its resistance to the FOIA requests?  Didn't we have a right to expect that the EPA would notice and act on the information it collected, all by itself?  Should we have to wait for the heroic non-profit investigative reporters at the CPI to unearth the information from the agency's own database?

OA to healthcare innovations

The US Agency for Healthcare Research and Quality has launched the Health Care Innovations Exchange.  (Thanks to Ted Eytan.)  From the about page:

Launched in the spring of 2008...[t]he Innovations Exchange is being designed to address a major challenge in health care delivery: the slow diffusion of innovative strategies and activities across health care providers. Every day, health care practitioners develop new, effective ways to provide better care, but that information does not move easily beyond institutional walls or across health care silos (e.g., from hospitals to nursing homes, or from private physician practices to community health clinics). As a result, great ideas are limited in their implementation—and providers are constantly reinventing the wheel because they are unaware that tested solutions already exist.

AHRQ's solution is to provide health care practitioners with:

  • A standardized way to document innovations,
  • A centrally located and searchable [OA] repository that all health care decision makers can use to quickly identify ideas and tools that meet their needs, and
  • A way to network with like-minded innovation adopters.

Users of the Health Care Innovations Exchange are expected to be a diverse group —physicians, nurses, administrators, allied health professionals, and others— with the common thread being their commitment to improving the delivery of health care.

From the page on submissions, Share Your Innovations:

...Why Participate in the Health Care Innovations Exchange?

Contribute to the Greater Good.  The Innovations Exchange provides an avenue for you to bring about positive changes in patient care far beyond the walls of your own institution. By sharing your innovation, you will be helping to build a rich resource that will complement and enhance the usefulness of the traditional professional literature on health care services....

From the page of criteria, How Innovations Are Selected:

...You must be willing to make enough information freely available to enable a user of the Health Care Innovations Exchange to understand the elements of the innovation and make a decision about adopting it. This requirement does not exclude innovations that incorporate commercial products or other materials for which there may be a fee or licensing requirements. It is not necessary for all information about the innovation to be publicly available, but the Editorial Teams will need access to information with sufficient detail to produce a comprehensive description....

The Innovations Exchange will aim to include innovations that are or were funded by the Agency [but will not limit itself to AHRQ-funded research]....


What makes a successful repository?

Dorothea Salo, Repository tidbits, Caveat Lector, August 5, 2008.  Excerpt:

...Successful repositories have sufficient backing from their libraries and their university administrations to make something work. I can’t make it any simpler than that. Without that support, the best repository-rat in the world will not succeed. With it, you don’t need an Einstein....

Exactly what successful repositories make work varies quite a bit, according to the talents and creativity of the staff involved and the nature of the support provided. This is why it’s impossible to write the “winning recipe” for a successful IR. At Minho they instituted bribes. At Rochester back in the day, they hacked up some researcher pages. At Ohio State they have a well-established mediated-workflow system. At Harvard they’ve got a mandate....

The other thing that successful repositories have is leave and resources to experiment. They have to. The standard repository software package, as I have argued ad nauseam, is wholly inadequate to fuel a successful repository program. This means that the well-dressed repository manager has some combination of elbow room in her job description, developer time, student help, librarian alliances, and administrative weight to work with. Again, the exact combination will differ from institution to institution —but a manager without any of this might want to rewrite her résumé before her current job tarnishes it.

So much for the successful repository. Let’s talk for a moment about the typical repository...[which runs] on a wing and a prayer and the dedicated efforts of one FTE or less....

More on increased visibility and increased citation counts

J. P. Dietrich, Disentangling Visibility and Self-Promotion Bias in the arXiv:astro-ph Positional Citation Effect, a preprint self-archived on June 25, 2008.  (Thanks to Stevan Harnad.)

Abstract:   We established in an earlier study that articles listed at or near the top of the daily arXiv:astro-ph mailings receive on average significantly more citations than articles further down the list. In our earlier work we were not able to decide whether this positional citation effect was due to author self-promotion of intrinsically more citable papers or whether papers are cited more often simply because they are at the top of the astro-ph listing. Using new data we can now disentangle both effects. Based on their submission times we separate articles into a self-promoted sample and a sample of articles that achieved a high rank on astro-ph by chance and compare their citation distributions with those of articles on lower astro-ph positions. We find that the positional citation effect is a superposition of self-promotion and visibility bias.

PS:  Also see my 2005 article, Visibility beyond open access.

Update.  Also see Stevan Harnad's comment:

This interesting paper...[appears] in the physics Arxiv (astrophysics sector), where virtually all current articles in astrophysics are OA in preprint form (with no postprint OA problem in astrophysics either)....

The authors rightly point out that in a high-output field like astrophysics, visibility is an important factor in usage and citations, and authors need alerting and navigation aids based on importance, relevance and quality, rather than on random timing and author self-promotion biasses.

I would add that in fields -- whether high- or low-output -- that, unlike astrophysics, are not yet OA, accessibility itself probably has much the same sort of effect on citations that visibility does in an OA field like astrophysics. (Even maximized visibility cannot make articles accessible to those who cannot afford access to the full-text.)

List of Australian university repositories

Neil Godfrey has blogged a list with links of Australian universities' OA repositories. (Thanks to pintiniblog.)
I have compiled the following from a combination of the ROAR list, the ARROW list and web searches against a list of Australian universities. It is more up to date than the current ROAR list, but I have also restricted my list to university research and publications repositories. ...

Profile of the Personal Genome Project

Thomas Goetz, How the Personal Genome Project Could Unlock the Mysteries of Life, Wired, July 26, 2008.
... To [George] Church, [founder of the Personal Genome Project,] ... all of this [progress in genetics] is unfolding according to the same laws of exponential progress that have propelled digital technologies, from computer memory to the Internet itself ...

Exponentials don't just happen. In Church's work, they proceed from two axioms. The first is automation ... The second is openness, the notion that sharing technologies by distributing them as widely as possible with minimal restrictions on use encourages both the adoption and the impact of a technology. ...

In the past three years, more companies have joined the [genome sequencing] marketplace with their own instruments, all of them driving toward the same goal: speeding up the process of sequencing DNA and cutting the cost. ... This spring, Church's lab undercut them all with the Polonator G.007 ... What's more, both the software and hardware in the Polonator are open source. In other words, any competitor is free to buy a Polonator for $150,000 and copy it. The result, Church hopes, will be akin to how IBM's open-architecture approach in the early '80s fueled the PC revolution. ...

In contrast to the heavy lifting that genetic research requires now — each study starts from scratch with a new hypothesis and a fresh crop of subjects, consent forms, and tissue samples — the PGP will automate the research process. Scientists will simply choose a category of phenotype and a possible genetic correlation, and statistically significant associations should flow out of the data like honey from a hive. ... Genomic discovery won't be a research problem anymore. It'll be a search function. ...

In accordance with Church's principle of openness, all the [project] material will be accessible to any researcher (or lurker) who wants to plunder thousands of details from people's lives. Even the tissue banks will be largely accessible. After Church's lab transforms the skin into stem cells, those new cell lines — which have been in notoriously short supply despite their scientific promise — will be open to outside researchers. This is a significant divergence from most biobanks, which typically guard their materials like holy relics and severely restrict access.

For the PGP volunteers, this means they will have to sign on to a principle Church calls open consent, which acknowledges that, even though subjects' names will be removed to make the data anonymous, there's no promise of absolute confidentiality. As Church sees it, any guarantee of privacy is false; there is no way to ensure that a bad actor won't tap into a system and, once there, manage to extract bits of personal information. ...

To Church, open consent isn't just a philosophical consideration; it's also a practical one. If the PGP were locked down, it would be far less valuable as a data source for research — and the pace of research would accordingly be much slower. By making the information open and available, Church hopes to draw curious scientists to the data to pursue their own questions and reach their own insights. ...

And the openness doesn't serve just researchers alone. PGP members will be seen as not only subjects, but as participants. So, for instance, if a researcher uses a volunteer's information to establish a link between some genetic sequence and a risk of disease, the volunteer would have that information communicated to them. ...

Tuesday, August 05, 2008

Stuart Shieber on OA at Harvard

Interview with Stuart Shieber, Harvard University Library Notes, July 2008.  Shieber is the Welch Professor of Computer Science at Harvard, co-director of the Berkman Center for Internet and Society at Harvard Law School, Director of Harvard's Office of Scholarly Communication, and the architect of the OA mandate at the Harvard Faculty of Arts and Sciences. Excerpt:

LN:  With open-access policies adopted in the Faculty of Arts and Sciences and at Harvard Law School, what are OSC’s next steps among the Harvard faculties?

SS:  We’re talking to several other faculties right now about open-access policies.  But it’s a separate discussion with each faculty, and different faculties may want to work differently. My guess is that it will take some time for schools who want an open-access policy to come to that decision and to figure out exactly how they want to do it.

It took FAS and the Law School a couple of years to do it. Once we have the repository set up, however, any faculty should be able to use it.

LN:  What can you tell us about the organization of the OSC itself?

SS:  We’re in the process of hiring a small number of staff—a program manager and an administrative person—who will be located in Wadsworth House. Initially, the office will be responsible for running the repository, implementing the open-access policies in FAS and the Law School, and doing outreach to other schools to help them with considerations of similar policies. We need to ensure that the repository is successful in getting a large percentage of the articles by our faculties deposited....

LN:  What issues will the [OSC faculty advisory committee] address over time?

SS:  ...[S]upport for faculty publishing in open-access journals, difficult issues in monograph and book publishing in the humanities, access to scientific data, tools for supporting open access, new kinds of scholarly output such as databases and web sites....

LN:  What’s the status of the open-access repository?

SS:  We’ve started setting up a system based on the DSpace software to operate the repository. Once that’s ready, the plan is to get a small number of beta testers —maybe one or two or three of the departments in FAS— to test out the system. Hopefully, by the fall, we’ll have a version that we’re reasonably happy with where we can do a broader rollout. We’ll keep the repository confined to the Harvard campus before making an official launch that is open to everyone. But I would hope that, during the fall, we’d have the repository fully functioning....

LN:  How can Harvard librarians help?

SS:  Librarians are in contact with faculty all the time. They can play a key role in getting individual faculty members to understand the importance of placing articles in the repository. They tend to be much more knowledgeable about these issues of the repositories and open access and the importance not only for mass distribution but for the faculty member him- or herself of making articles available with open access. So that connection with faculty provides a perfect venue to facilitate the process of moving articles from filing cabinets and computers in the faculty member’s office into the repository.

The most valuable thing in the near term that people in the library can do is to take every opportunity where they meet with faculty to remind them about the importance of attending to publication agreements, about using addenda to make them consistent with open-access policies such as the FAS and NIH policies, to get them to use the repository once it’s up and running, and to track and report any kind of difficulties in using the repository or any uncertainties about rights and obligations. By letting the office know what issues are coming up we can try to address those. The librarians will be on the front line of all of the OSC’s efforts.

SciELO's OA experience in Latin America could benefit South Africa

Eve Gray, ASSAF scholarly publishing team visits SciELO in Brazil, Gray Area, August 5, 2008.  Excerpt:

On July 7-11, 2008, a delegation from the Academy of Science of South Africa (ASSAf) visited BIREME in São Paulo, Brazil. The ASSAf delegation was there to review the potential for the adoption of the SciELO (Scientific Electronic Library Online) model as a platform to manage scientific publication in South Africa. Given that there is a wider African Academies of Science project to boost scholarly publishing across Africa, this could be a spearhead for a future regional open access network. (For background, see my blog of 30 April.)

This was an important visit. SciELO is a model of successful regional collaboration to raise the profile of a developing economy region's research publication in the face of an inequitable global system....

We watch the outcome of this initiative with great interest. SciELO could be a powerful partner. [Jean-Claude] Guédon describes it as probably the most successful regional/international [OA] initiative – it includes Portugal and Spain as well as Latin American countries – which has the potential, he argues, 'to play a formidable role in this battle to remove the divide barriers or, at least, lower them'....

More background from the BIREME newsletter:

SciELO has performed successfully in Latin America and the Caribbean, and is an outstanding reference in the process of research, evaluation and adoption of a solution for national scientific communication...The first portal - the SciELO Brazil collection - started operating publicly in 1998. Since then, the SciELO project has developed and is now present in eight countries, adding up to over 550 certified journal titles and more than 180 thousand full-text articles available free online (open access), including original articles, review articles, editorials and other types of communication...

ASSAf expressed interest in putting into practice a pilot with an initial group of five South African publications in order to test the functionality of the SciELO platform. BIREME was invited to make a technical visit to South Africa in September 2008 to demonstrate the system to the members of the Academy Advisory Board....

Case study of an OA portal for medical research

Elke Roesner, Open Access Portale und ihre Etablierung am Markt: die Entwicklung eines Geschäftsmodells für “German Medical Science” [Open access portals and establishing them in the market: developing a business model for “German Medical Science”], Berliner Handreichungen zur Bibliotheks- und Informationswissenschaft, Vol. 230, 2008.  (Thanks to medinfo.)  Because the file is a PDF, I can't link to a machine translation.

Knowledge repositories for geology

K. P. Jaikiran, Knowledge management in geological sciences, Current Science, July 25, 2008. (Thanks to MyNews.in.)
... Scientists and academicians are basically knowledge-workers whose work involves application of multidimensional knowledge in their respective fields. Tremendous scope exists for the creation of knowledge repositories and networks in the scientific world for overall improvement in the quality of work. ...

In geological mapping and exploration programmes, the embedded organizational knowledge, including the tacit knowledge of senior professionals, is seldom put to use. This handicap can be overcome only through the development of knowledge repositories in university departments, organizations such as the Geological Survey of India and research institutions involved in geological studies. ...

Mining of knowledge from repositories using standard tools would pave the way for application of the right kind of knowledge for appropriate situations. ...

[A] good geologist needs to be a good field worker too. ... The skills needed for these develop after a long period of time spent in the field ... By the time a person matures into a professional geologist, he would have reached the end of his career. He would soon leave the organization along with his repertoire of geological skills. Any attempt to capture and store this organizational knowledge in the form of an institutional repository would definitely prove to be beneficial to organizations and their activities. ...

Monday, August 04, 2008

RePEc July update

Christian Zimmermann, RePEc in July 2008, The RePEc blog, August 3, 2008.

While the Summer is usually calm, RePEc has just passed some historic benchmarks in terms of content and traffic: 750 journals listed, 2000 working paper series, half a million works available online, 20 million working paper downloads recorded, 200,000 works that are cited within RePEc. This all reflects the tremendous growth that RePEc continues to enjoy. ...

Oxford Journals to manage NIH deposit

Daniel Griffin, Oxford Journals to place NIH funded articles into PubMed Central, Information World Review, August 4, 2008.
Biomedical articles that have been published in Oxford Journals and funded by the National Institutes of Health (NIH) will now be deposited into PubMed Central (PMC).

Oxford Journals, a division of Oxford University Press (OUP), said the new development will “help authors comply with the public access policies of the NIH”. ...

Commenting on the announcement, Martin Richardson, managing director of Oxford Journals, said, ‘Already, all of our open access articles are being deposited into PMC. Now any NIH-funded authors who publish their articles in one of our journals will not need to deposit them into PMC themselves – Oxford Journals will do so for no charge on their behalf.’

Manuscripts that are NIH-funded and already open access will be available immediately on PMC. However, only the final published articles, rather than the original submitted versions, will be available from the PMC repository. Oxford Journals said it will collaborate with authors to identify other NIH-funded articles.
Update. See also the press release from Oxford Journals.

Update (PS).  Also see Tracey Caldwell's article on the same subject in the same journal (September 4, 2008).  Among other things, Tracey compares and contrasts the Oxford policy with the Nature policy.  Oxford will always deposit the published edition, while Nature will deposit only the final version of the author's peer-reviewed manuscript.  But Nature will (eventually) deposit into institutional repositories as well as central repositories like PMC, while Oxford has no plans to deposit in institutional repositories.

OA as a condition of Type 1 civilization

Michael Shermer, Toward a Type 1 civilization, Los Angeles Times, July 22, 2008.  Excerpt:

Our civilization is fast approaching a tipping point. Humans will need to make the transition from nonrenewable fossil fuels as the primary source of our energy to renewable energy sources that will allow us to flourish into the future. Failure to make that transformation will doom us to the endless political machinations and economic conflicts that have plagued civilization for the last half-millennium....

[What the path to a] Type 1 civilization might be like:

Type 0.1:  Fluid groups of hominids living in Africa. Technology consists of primitive stone tools. Intra-group conflicts are resolved through dominance hierarchy, and between-group violence is common....

Type 0.5: The state as a political coalition with jurisdiction over a well-defined geographical territory and its corresponding inhabitants, with a mercantile economy that seeks a favorable balance of trade in a win-lose game against other states....

Type 0.8:  Liberal democracies that give the vote to all citizens. Markets that begin to embrace a nonzero, win-win economic game through free trade with other states....

Type 1.0:  Globalism that includes worldwide wireless Internet access, with all knowledge digitized and available to everyone. A completely global economy with free markets in which anyone can trade with anyone else without interference from states or governments. A planet where all states are democracies in which everyone has the franchise....

Improvements at Bioinformation

Matt Hodgkinson reports that Bioinformation, a self-described OA journal, formerly charged a publication fee while claiming not to, and formerly blocked all copying while claiming to block only copying for commercial use.  However, the editor quickly fixed these problems when Matt pointed them out.

PS:  Thanks to Matt for taking the time to write to the editor, and thanks to Prof. Kangueane for attending to the problems so promptly.

Article sharing + social networking = Labmeeting

Labmeeting is a new online service combining article sharing and social networking.  The about page is less informative than Erick Schonfeld's description on TechCrunch (July 30, 2008).  From Schonfeld:

...Mark Kaganovich figures that [file sharing] will get [scientists to visit social networking sites]. After graduating from Harvard with undergraduate degrees in biochemistry and computer science two years ago, he set out to create Labmeeting. In May 2008 he closed a $500,000 seed round from Peter Thiel, Kinsey Hills, and other angel investors. And since last week, Labmeeting has been open to anyone with a college e-mail account.

Typically, scientists have stacks of papers, protocols, and notes in their offices that they pass around as PDFs. Labmeeting is designed first and foremost as a document management site that allows scientists and students to easily upload all of those PDFs, organize them, search them, and share them. Scientists can create groups, and invite other members of their labs to create a common repository of papers that can be accessed from anywhere. The PDFs appear inside an embedded Scribd window (Kinsey Hills is also an investor in Scribd).

Says Kaganovich:

What we are trying to do is change the way information in biomedical research and the medical community is distributed and retrieved.

Scientists can recommend papers to colleagues, mark them up, create collections, and follow what other scientists are collecting. Each scientist gets a profile page. By interacting through their research, they are more likely to interact with each other. Labmeeting could also form the basis of a community ranking system for scientific papers, based on who is reading, writing, and sharing them.

Labmeeting is free for individual scientists and students. Eventually, Kaganovich plans to charge subscription fees to corporate users such as drug and biotech companies.

From the terms page:

...labmeeting.com does not claim any ownership rights in the...materials...that you post to the labmeeting Services. After posting your Content to the labmeeting Services, you continue to retain all ownership rights in such Content, and you continue to have the right to use your Content in any way you choose. By displaying or publishing ("posting") any Content on or through the labmeeting Services, you hereby grant to labmeeting.com a limited [and non-exclusive] license to use, modify, publicly perform, publicly display, reproduce, and distribute such Content solely on and through the labmeeting Services....This license will terminate at the time you remove your Content from the labmeeting Services. The license does not grant labmeeting.com the right to sell your Content, nor does the license grant labmeeting.com the right to distribute your Content outside of the labmeeting Services....

Except for Content posted by you, you may not copy, modify, translate, publish, broadcast, transmit, distribute, perform, display, or sell any Content appearing on or through the labmeeting Services....

labmeeting.com may delete any Content that in the sole judgment of labmeeting.com violates this Agreement or which may be offensive, illegal or violate the rights, harm, or threaten the safety of any person....

Prohibited Content includes, but is not limited to Content that, in the sole discretion of labmeeting.com [has any of 14 listed properties]....

More on the Evans study and OA

Stevan Harnad, Are Online and Free Online Access Broadening or Narrowing Research? Open Access Archivangelism, August 4, 2008.

Evans, James A. (2008) Electronic Publication and the Narrowing of Science and Scholarship. Science 321(5887): 395-399. DOI: 10.1126/science.1150473
Excerpt: "[Based on] a database of 34 million articles, their citations (1945 to 2005), and online availability (1998 to 2005),... as more journal issues came online, the articles [cited] tended to be more recent, fewer journals and articles were cited, and more of those citations were to fewer journals and articles... [B]rowsing of print archives may have [led] scientists and scholars to [use more] past and present scholarship. Searching online... may accelerate consensus and narrow the range of findings and ideas built upon." ...

In one of the few fields where this can be and has been analyzed thoroughly, astrophysics, which effectively has 100% Open Access (OA) (free online access) already, Michael Kurtz too found that with free online access to everything, reference lists became (a little) shorter, not longer, i.e., people are citing (somewhat) fewer papers, not more, when everything is accessible to them.

The following seems a plausible explanation:

Before OA, researchers cited what they could afford to access, and that was not necessarily all the best work, so they could not be optimally selective for quality, importance and relevance. (Sometimes -- dare one say it? -- they may even have resorted to citing "blind," going by just the title and abstract, which they could afford, but not the full text, to which they had no subscription.)

In contrast, when everything becomes accessible, researchers can be more selective and can cite only what is most relevant, important and of high quality. (It has been true all along that about 80-90% of citations go to the top 10-20% of articles. Now that the top 10-20% (along with everything else in astrophysics) is accessible to everyone, everyone can cite it, and cull out the less relevant or important 80-90%....

But the trouble is that apart from astrophysics and high energy physics, no other field has anywhere near 100% OA: It's closer to 15% in other fields. So aside from a global correlation (between the growth of OA and the average length of the reference list), the effect of OA cannot be very deeply analyzed in most fields yet.

In addition, insofar as OA is concerned, much of the Evans effect seems to be based on "legacy OA," in which it is the older literature that is gradually being made accessible online or freely accessible online, after a long non-online, non-free interval. Fields differ in their speed of uptake and their citation latencies. In physics, which has a rapid turnaround time, there is already a tendency to cite recent work more, and OA is making the turnaround time even faster. In longer-latency fields, the picture may differ. For the legacy-OA effect especially, it is important to sort fields by their citation turnaround times; otherwise there can be biases....

Are online and free online access broadening or narrowing research? They are broadening it by making all of it accessible to all researchers, focusing it on the best rather than merely the accessible, and accelerating it.
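PS: Harnad's selectivity explanation is easy to see in a toy model. The sketch below is only an illustration of that argument, not Evans's or Kurtz's methodology, and every number in it (literature size, access fraction, quality scores) is invented. When each researcher can read only a different affordable slice of the literature, citations scatter across many articles; when everyone can read everything, citations converge on the same top papers, so fewer distinct works are cited even though the average quality of the cited work rises.

import random

random.seed(1)

N_ARTICLES = 1000          # size of the literature (made up)
N_RESEARCHERS = 500        # citing authors (made up)
CITES_PER_PAPER = 20       # references each author selects
ACCESS_FRACTION = 0.15     # share of the literature affordable without OA

# latent "quality/relevance" score for each article
quality = [random.random() for _ in range(N_ARTICLES)]

def cited_articles(open_access):
    cited = []
    for _ in range(N_RESEARCHERS):
        if open_access:
            readable = list(range(N_ARTICLES))  # everything is accessible
        else:
            readable = random.sample(range(N_ARTICLES),
                                     int(ACCESS_FRACTION * N_ARTICLES))
        # cite the best of what you can actually read
        best = sorted(readable, key=lambda i: quality[i], reverse=True)
        cited.extend(best[:CITES_PER_PAPER])
    return cited

for label, oa in (("toll access", False), ("open access", True)):
    cites = cited_articles(oa)
    distinct = len(set(cites))
    mean_quality = sum(quality[i] for i in cites) / len(cites)
    print("{}: {} distinct articles cited, mean quality {:.2f}"
          .format(label, distinct, mean_quality))

With these made-up numbers, the toll-access run cites roughly ten times as many distinct articles as the open-access run, which concentrates on the same top 20; that is the narrowing Evans measures and the improved selectivity Harnad describes.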

PS:  Also see my comments on the Evans article.


Sunday, August 03, 2008

More blog notes on the Edinburgh Repository Fringe

Stuart Macdonald has posted more blog notes on the Edinburgh Repository Fringe (Edinburgh, July 31-August 1, 2008). See also our earlier post on the conference.

New OA journal on tobacco-related diseases

Tobacco Induced Diseases is a new peer-reviewed OA journal from BioMed Central and the International Society for the Prevention of Tobacco Induced Diseases. See the July 31 announcement. The article-processing charge is £850 (€1080, US$1690), subject to discounts and waivers. Authors retain copyright to their work, and articles are released under the Creative Commons Attribution License.

Update. To clarify, the journal was launched in 2002. What's new is that it moved to BMC. See also the blog post from BMC.

Copyright law now protects failed business models more than innovation

William Patry, whom I quoted here just last week on copyright and the NIH policy, is shutting down his influential copyright blog.  (Thanks to Glyn Moody.)  Patry is the Senior Copyright Counsel at Google and former copyright counsel to the U.S. House of Representatives Judiciary Committee.  From his final post:

...The Current State of Copyright Law is too depressing....

I regard myself as a centrist. I believe very much that in proper doses copyright is essential for certain classes of works, especially commercial movies, commercial sound recordings, and commercial books, the core copyright industries. I accept that the level of proper doses will vary from person to person and that my recommended dose may be lower (or higher) than others. But in my view...we are well past the healthy dose stage and into the serious illness stage. Much like the U.S. economy, things are getting worse, not better. Copyright law has abandoned its reason for being: to encourage learning and the creation of new works. Instead, its principal functions now are to preserve existing failed business models, to suppress new business models and technologies, and to obtain, if possible, enormous windfall profits from activity that not only causes no harm, but which is beneficial to copyright owners. Like Humpty-Dumpty, the copyright law we used to know can never be put back together again: multilateral and trade agreements have ensured that, and quite deliberately.

It is profoundly depressing, after 26 years full-time in a field I love, to be a constant voice of dissent....I have blogged about great articles others have written, or highlighted scholars who have not gotten the attention they deserve; I tried to find cases, even inconsequential ones, that I can fawn over. But after awhile, this wore thin, because the most important stories are too often ones that involve initiatives that are, in my opinion, seriously harmful to the public interest....Being so negative, while deserved on the merits, gives a distorted perspective of my centrist views, and is emotionally a downer....

I intend to spend my free time figuring out a constructive way to talk about the difficult issues we face and how to advance toward their solution.

Comment.  It's sad that we're losing this sane, sage voice on copyright law, and alarming that a copyright centrist would be so deeply depressed by the current state of the law and its continuing trajectory.  If you haven't paid much attention to the severe tilt of copyright law away from its traditional balance of interests toward maximalism, this should be a wake-up call.

Report on the April OA workshop in Nigeria

Ezra Shiloba Gbaje, Advocating for Open Access Model in Disseminating Scholarly Information in Nigeria, Nigerian Library Association, undated.

The version to which I've linked is an image scan of a print original.  I don't have time to rekey excerpts, so I appreciate this summary from eIFL.net:

Ezra Shiloba Gbaje, librarian of the Ahmadu Bello University in Zaria, Nigeria, and eIFL OA country coordinator, gives a detailed account of the first national OA/IR workshop that was held on April 28-29 thanks to an eIFL.net grant. In the article, Ezra highlights the benefits of Open Access for the scholarly community and society at large. The purpose of the workshop was to raise awareness of Open Access journals and institutional repositories in Nigeria, a country that cannot afford to pay the high subscription fees of traditional publishing. eIFL OA program manager Iryna Kuchma and other renowned international specialists discussed IR software solutions, copyright issues, advocacy and policymaking strategies, and promotional and marketing efforts with a large audience of scholars, policy-makers, university and systems librarians and ICT experts. Currently, the Kashim Ibrahim Library at the Ahmadu Bello University is populating its pilot institutional repository, the first of its kind in West Africa, using DSpace.

Rising OA Quotient for biomedical literature

Matthew Cockerill, How open access is your research area? (revisited), BioMed Central Blog, August 2, 2008.  Excerpt:

Just over a year ago, this blog post introduced the concept of the Open Access Quotient, an easy-to-calculate metric for the fraction of the biomedical literature that is immediately freely accessible in full text form, for a given subject area. [PS:  Blogged here 7/23/07.]

Now seems as good a time as any to revisit this metric, to get a sense of what progress has been made in opening up the literature. Has the growth of open access publishing, and the strengthening of the NIH open access policy, had a measurable impact on the accessibility of research?

Looking at the biomedical literature as a whole, there has clearly been progress. The fraction of recent PubMed abstracts that will link users straight through to the full text, with no permission barrier getting in the way, is up from 6.8% to 8.45%. There's still a long way to go, but that's a lot more immediate open access articles (about 1000 more per month, in fact)....[PS:  Omitting two charts.]

It is clear that open access is close to becoming the norm rather than the exception for publications in areas such as malaria [OAQ up from 19.6% last year to 35.6% this year] and genomics [OAQ up from 12.9% last year to 18.5% this year], and the take-up of open access in these areas seems to still be growing fast, rather than showing any signs of plateauing. Just as encouraging is the doubling in the proportion of clinical trials publications that are now open access [up from 4.0% last year to 9.7% this year]....
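PS: For readers who want to reproduce an OAQ figure for their own research area, here is a minimal sketch of the calculation. It assumes PubMed's E-utilities esearch interface and the "free full text"[sb] subset filter; the subject term and date range below are illustrative and not necessarily the queries BMC used.

import urllib.parse
import urllib.request
import xml.etree.ElementTree as ET

# PubMed's esearch E-utility returns an XML document whose <Count> element
# holds the number of records matching a query.
ESEARCH = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def pubmed_count(query):
    # retmax=0 asks for the count only, without any record IDs.
    params = urllib.parse.urlencode({"db": "pubmed", "term": query, "retmax": 0})
    with urllib.request.urlopen(ESEARCH + "?" + params) as response:
        return int(ET.parse(response).findtext("Count"))

def open_access_quotient(subject, period='"2008/01/01"[PDAT] : "2008/06/30"[PDAT]'):
    # OAQ = records with free full text / all records, for one subject and period.
    base = "{} AND ({})".format(subject, period)
    total = pubmed_count(base)
    free = pubmed_count(base + ' AND "free full text"[sb]')
    return free / total if total else 0.0

print("malaria OAQ: {:.1%}".format(open_access_quotient("malaria")))

The same function can be pointed at any subject query, which makes it easy to compare fields the way the BMC post does.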

Portal of OA-related projects at the U of Michigan

Open.Michigan is a new portal of OA-related projects at the University of Michigan.  Also see its blog and wiki.  From the about page:

The Open.Michigan site is a gateway to a wide spectrum of initiatives at the University of Michigan (U-M) and our collaborating institutions. With a common goal of opening resources for teaching, learning and research for use and enhancement by a global community, these projects increase the value of those resources to U-M and the world. Open.Michigan provides a clear view of the many places and ways U-M contributes to our world's knowledge and creates exemplary resources for education and research.

Open.Michigan includes information, updates, discussion, blogs, videos, and podcasts detailing our efforts. You will also find links to projects in open educational resources (OERs) such as the U-M Health Sciences Global Access project and the dScribe project; in open source software, the Sakai and SiteMaker projects; in open archives and publishing, Deep Blue, digitalculturebooks, MBooks, and OAIster projects; and in open standards, the IMS Global Learning Consortium work and conferences.

Open.Michigan is part of an emerging paradigm for participatory education on a global scale, where a wide array of contributors add to and build upon the open knowledge commons....This open view of the U-M scholarly community can help enrich our own experiences for the development, use, and sharing of knowledge....

Comment.  This is a great idea.  It reminds insiders and outsiders alike of the school's many commitments to openness, reminds us of the relatedness of related projects (in open research, open software, open education, and open standards), and reminds us that universities are in the knowledge discovery and sharing business, not the knowledge lock-down and commodification business.