scale this: Google
[OT from cottage renovations]
attention conservation warning: severe geek content
of the very small number of people who ever read this space, I'm not sure how many -- if any -- have ever spent time inside a data center or have much of an idea of what it takes to keep this whole interwebs thing going. if those inner workings register somewhere between "no interest" and "express ticket to sleepyland" on your personal snooze-o-meter, you need read no further.
OK, you're still here. excellent.
consider first what happens when something goes wrong at a data centre (a typical one looks like this inside), such as this weekend's example of an electrical fire and explosion taking out 9,000 servers and affecting 7,500 customers (the number of consumer- or business-level end users impacted by this incident is, of course, much higher).
nine thousand servers, that's a lot of computers, right? if your computer is a tower (case is vertically oriented) imagine putting another one right next to it, and another one next to that, side by side: the row would stretch more than a mile and a quarter. if your computer is a desktop unit (case is horizontally oriented) the stack would be just as tall.
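a rough back-of-envelope version of that in Python, assuming a tower case about 9 inches wide (my guess, not a figure from the incident report):

```python
servers = 9_000
tower_width_in = 9                       # assumed width of a typical tower case

row_ft = servers * tower_width_in / 12   # inches -> feet
row_miles = row_ft / 5_280

print(f"{row_ft:,.0f} ft, about {row_miles:.2f} miles")
# -> 6,750 ft, about 1.28 miles
```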
data center computers come in a rack mount (pizza box) form factor, typically 19 in wide and some multiple of 1¾ in tall. single height = 1U, double height = 2U, etc. a standard rack has about 45U (~6½ ft) of space and in theory could accommodate 45 of the 1U computers, but in practice it is not a good idea to pack them in at such density due to the need to manage electrical load, air flow and thermal output and to accommodate networking gear and whatever else needs to be in the rack.
so let's turn those servers into 1U rackmount boxes and figure on 16 of them per rack, that's a high density but hey, I'm not volunteering to maintain it. you'll need 562½ racks, let's round up to 563, which side by side are 890.6 ft wide, just shy of the length of three football fields (of course in practice many of those servers would be 2U or 4U boxes and many customers will have less than a rack's worth of machines in place, so the actual numbers would tend to be larger -- but you get the idea).
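the same arithmetic sketched out, using the numbers above (16 boxes per rack, 19 in of rail width per rack, 300 ft per playing field):

```python
servers = 9_000
per_rack = 16                    # 1U boxes per rack, the density assumed above
rack_width_in = 19               # rail width; actual cabinets are a bit wider

racks = servers / per_rack               # 562.5, round up to 563
row_ft = racks * rack_width_in / 12      # ~890.6 ft
fields = row_ft / 300                    # 100-yard playing field = 300 ft

print(f"{racks:.1f} racks, {row_ft:.1f} ft, {fields:.2f} football fields")
# -> 562.5 racks, 890.6 ft, 2.97 football fields
```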
yes, that's a lot of computers but it was just one data centre so the incident was a transient blip, like the recent Comcast DNS hack. anybody with a serious online presence has it distributed across multiple locations in order to maintain continuity of service in the face of something like this happening. life goes on: the email gets delivered, the wheels of e-commerce keep turning, the blogs are scribbled and (occasionally) read, the porn gets downloaded, whatever. eventually that node's infrastructure is placed back in service and things are as they were. exposure is limited when online presence is scaled large enough to be distributed.
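a minimal sketch of that multi-site idea, client-side failover only, with made-up host names (real continuity also involves DNS, replication and the like):

```python
import urllib.request

# hypothetical mirrors of one service, one per data centre
MIRRORS = [
    "https://dc-houston.example.com/app",
    "https://dc-dallas.example.com/app",
    "https://dc-seattle.example.com/app",
]

def fetch_from_any(urls, timeout=5):
    """return the first successful response, skipping any site that is down"""
    for url in urls:
        try:
            with urllib.request.urlopen(url, timeout=timeout) as resp:
                return resp.read()
        except OSError:
            continue  # that node is dark -- say, the one that just caught fire
    raise RuntimeError("every site is unreachable")
```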
but some things are bigger than that.
way bigger.
what does it take to keep Google going?
they are pretty closemouthed about it, but this interview is enough to give some idea of the scale involved.
read it and think. if you Get It your head will hurt.
the infrastructure of the Victorian and Edwardian eras was massive: canals, railways, bridges, tunnels, dams, mines, ports. its communications component included telegraph lines and undersea cables linking enterprises that already spanned the globe. it was built to last: of stone, brick and steel; designed with an elegance that is no longer seen. it was all so tangible and, though also highly distributed, a lot easier to conceptualise and to manage than what we have now.
it's a different world now.
we build on the cheap and "functional" is an excuse for "ugly".
our focus has already shifted from the physical to the abstract.
today's movement and interconnection of information is hard enough to grasp.
but look at where Google already is and the question becomes: where is it going?
[lightly edited as a result of posting and then getting on the subway and thinking about it a bit more]

no subject
I can't visualize this many servers! Blades or whatever they're using. Even the photos here boggle the mind.
I heard somewhere that technology doubles every 18 months. I tell my users that all the time; I hope it’s true. If so, and leave it up to Google to prove it, we’ll soon be seeing Nano-Google I/O Conferences. What now sits in warehouses the size of Amsterdam will soon fit in someone’s downtown office.
Amazing stuff. I love that Intel needed to create separate circuit boards just for Google.
Have you heard of Google Search Appliance? Uses Google File System, I guess. I’ll be attending a webinar this coming week; my boss thought it sounded ingenious for our SAP headaches. When he hears the cost, $30K+, I think he’ll change his mind though.
Thanks for the links.