Google is developing a program to help academics around the world exchange huge amounts of data.
The firm's open source team is working on ways to physically transfer data sets of up to 120 terabytes.
"We have started collecting these data sets and shipping them out to other scientists who want them," said Google's Chris DiBona.
Google sends scientists a hard drive system; when it is returned, the firm copies the data before passing it on to other researchers....
"We have a number of machines about the size of brick blocks, filled with hard drives.
"We send them out to people who copy the data on them and ship them back to us. We dump them on to one of our data systems and ship it out to people."
Google keeps a copy, and the data is always in an open format: in the public domain or perhaps covered by a Creative Commons license....
One of the largest data sets copied and distributed so far came from the Hubble telescope: 120 terabytes. One terabyte is equivalent to 1,000 gigabytes.
Mr DiBona said he hoped that Google could one day make the data available to the public....
Posted by Peter Suber at 3/08/2007 12:28:00 PM.