Everything here has to run smoothly, continuously. And it does. During the 2003 Northeast blackout, Equinix’s nearby NY2 facility didn’t miss a beat. “You would never have known there was a blackout,” Poleshuk says. “People working here didn’t want to go back to their [temporary shelters].” For 28 hours, NY2 ran on backup generators, he says. “We were the only place around with power.”
This is why data centers use a lot of energy: They are always running. One cabinet—a 7-foot-tall box housing between 35 and 42 servers—consumes roughly the same amount of power as the average house. A typical data center may contain 1,800 to 3,000 cabinets.
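The article's figures make for a quick back-of-the-envelope estimate. The script below multiplies a cabinet count by an assumed average household draw (roughly 1.2 kW, about 10,500 kWh per year — an assumption, not a figure from the article) to gauge a facility's IT load:

```python
# Back-of-the-envelope estimate of a data center's IT load, using the
# article's cabinet counts. The per-house average draw (~1.2 kW) is an
# assumed figure for illustration.

AVG_HOUSE_KW = 1.2                             # assumed average household draw, kW
CABINETS_LOW, CABINETS_HIGH = 1_800, 3_000     # cabinets in a typical facility

for cabinets in (CABINETS_LOW, CABINETS_HIGH):
    # Each cabinet draws roughly one house's worth of power
    it_load_mw = cabinets * AVG_HOUSE_KW / 1_000
    print(f"{cabinets} cabinets -> ~{it_load_mw:.1f} MW of IT load")
```

Under those assumptions, a typical facility lands in the low-single-digit megawatts for computing alone, before any cooling load is counted.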
But data centers need power for more than just running the computers. Like a personal computer, each server generates heat. Put thousands of servers in one space and you have a lot of hot air; temperatures can reach 90°F. If the cooling system goes down in NY4, Poleshuk says, temperatures in the server area will rise by one degree every minute. Too much heat can mean a server meltdown, and a meltdown of servers handling global-scale transactions would be catastrophic. Consequently, for many years data centers consumed as much power cooling their servers as running them.
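Poleshuk's one-degree-per-minute figure implies a tight window for restoring cooling. The sketch below works out that window; the starting room temperature and the 90°F danger threshold are illustrative assumptions, not figures from the article:

```python
# How long before servers overheat if cooling fails, given the article's
# one-degree-per-minute rise. Starting temperature and danger threshold
# are assumed for illustration.

RISE_F_PER_MIN = 1.0    # from the article: NY4's server area heats ~1 F per minute
START_F = 72.0          # assumed normal server-room temperature
DANGER_F = 90.0         # assumed temperature at which servers risk failure

minutes_to_danger = (DANGER_F - START_F) / RISE_F_PER_MIN
print(f"~{minutes_to_danger:.0f} minutes before the room reaches {DANGER_F:.0f} F")
```

Under those assumptions, operators would have well under half an hour to bring backup cooling online.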
A Typology Unlike Any Other
For architects, data centers pose a unique challenge, technically, spatially, and programmatically. Albert France-Lanord, founder of his eponymous architecture firm in Stockholm, learned this when designing Pionen, a data center for Bahnhof, the Internet service provider that hosts WikiLeaks' data. Bahnhof wanted a welcoming and futuristic space in a bunker deep in the White Mountains underneath Stockholm. The site, a former nuclear bomb shelter, was chosen for its visual impact: The James Bond–esque setting has an immediate effect on prospective clients—building users who might never step foot inside again. “What is different from other projects is that [a data center is] not designed for people; it’s really designed for machines,” France-Lanord says.
Depending on the client and the project, architects can have little or nothing to do with the building’s computing components. For Bahnhof’s newest data center in Stockholm, which opened in mid-October, France-Lanord was given only the server cabinet dimensions, around which he essentially designed the space. In other projects, firms may work with mechanical engineers and technology consultants from day one to design the space and to determine how to marry energy efficiency with immense computing demand.
To address energy efficiency, architects have two primary options: improve server performance through scheduling or incorporate more efficient cooling systems into the building. Using the first tactic, architects can work with the client’s IT team to assess when the servers require the most power, and then batch which servers run at full capacity and which stay dormant. Operations such as routine backups can wait a few hours or even a few days. HP, for one, is trying to optimize power consumption and capitalize on off-peak rates by altering which servers are used and when, Bash says.
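The load-shifting tactic described above can be sketched in a few lines: urgent work runs immediately, while deferrable work such as routine backups waits for off-peak electricity rates. The job names, the off-peak window, and the `schedule` function are all invented for illustration; this is not HP's actual system.

```python
from dataclasses import dataclass

# Assumed cheap-rate window: 10 p.m. to 6 a.m. (hours 22-23 and 0-5)
OFF_PEAK_HOURS = set(range(0, 6)) | set(range(22, 24))

@dataclass
class Job:
    name: str
    deferrable: bool   # can this work wait for off-peak rates?

def schedule(jobs, current_hour):
    """Split jobs into (run_now, deferred) given the hour of day (0-23)."""
    run_now, deferred = [], []
    for job in jobs:
        if job.deferrable and current_hour not in OFF_PEAK_HOURS:
            deferred.append(job)    # hold until rates drop
        else:
            run_now.append(job)     # urgent, or already off-peak
    return run_now, deferred

jobs = [Job("trade-matching", False), Job("nightly-backup", True)]
now, later = schedule(jobs, current_hour=14)   # 2 p.m.: peak rates
print([j.name for j in now], [j.name for j in later])
```

At 2 p.m. the backup is held back and only the urgent job runs; calling `schedule(jobs, current_hour=23)` would release both, since the off-peak window has opened.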
It’s on the cooling side of the equation, however, where architects can have the greatest effect. “A data center is essentially a great big box full of hot air,” says Neil Sheehan, AIA, a principal at Chicago-based Sheehan Partners, which designed NY4. Exhausting the hot air efficiently comes down to design decisions.