Friday, February 16, 2007

NASA & Google - Part 2

I have always been a Google fan, but this latest announcement with NASA is very cool. NASA is teaming up with Google (again) to tap into Google's massive amounts of computing power and storage.

With the amount of data NASA is generating, it is no wonder they are turning to Google, which runs well over 10,000 servers (estimates put the real figure closer to 450,000) across 13 data centers and spent $1.5 billion on property and equipment in the first nine months of 2006. NASA's Columbia supercomputer recently got an upgrade that added 600 terabytes of storage, 20 StorageTek libraries, and more. Columbia is connected to 1.1 petabytes of storage via an SGI SAN (I love SGI equipment, by the way).

I think Google will be alone for a while in claiming to have exabytes of data in its possession. I wanted to link to something explaining just how much storage an exabyte is for those who didn't know, and came across an interesting 2003 CIO magazine article that states:

"It estimated that in 1999, the total of all human knowledge, music, images and words amounted to about 12 exabytes. About 1.5 of those exabytes were generated during 1999 alone. "

Since I probably won't get anyone at Google to bite on locating a data center where I live, I'll put a different request out to them: divulge what they use for storage infrastructure, management software, and the technologies that connect their storage equipment (their Bigtable publication is an interesting read, by the way; see the sketch below). It would make an amazing lessons-learned story or white paper to hear what they use, how they implemented it, and what they learned in the process.
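For anyone curious what the Bigtable paper actually describes, the core data model is a sparse, distributed, sorted map keyed by (row, column, timestamp). Here is a minimal in-memory sketch of that keying scheme in Python; this is purely illustrative and not Google's implementation, which distributes the map across tablet servers on top of GFS:

```python
from collections import defaultdict

# Toy model of Bigtable's map: (row key, column, timestamp) -> value.
table = defaultdict(dict)

def put(row, column, timestamp, value):
    table[row][(column, timestamp)] = value

def get_latest(row, column):
    """Return the most recent value stored for a row/column pair."""
    versions = [(ts, v) for (col, ts), v in table[row].items() if col == column]
    return max(versions)[1] if versions else None

# Example in the spirit of the paper: web pages keyed by reversed URL,
# with multiple timestamped versions of the page contents.
put("com.cnn.www", "contents:", 1, "<html>...v1...</html>")
put("com.cnn.www", "contents:", 2, "<html>...v2...</html>")
print(get_latest("com.cnn.www", "contents:"))  # prints the latest version
```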

Ok...enough suspense, here is the article at Byteandswitch.com.
