Week of July 9, 2007
Snapshot from the Field
A Movable Feast for Data Centers?
by JACK LYNE, Site Selection
Executive Editor of Interactive Publishing
Think outside the box: It's certainly a solid enough idea, but it's one we've all been hammered with so often that now it sometimes spurs only vast eruptions of yawning.
Then there's Sun Microsystems' out-of-the-box idea that is a box. A box that's actually a facility. A box that moves.
It's the Blackbox, a mobile data center that Sun has just started selling. Weighing 24,000 pounds (10,800 kg.), the Blackbox is essentially a shipping container, 20 feet (6.1 meters) long and eight feet (2.4 meters) wide. That configuration, say Sun's designers, maximizes the product's mobility by capitalizing on containerized shipping's low costs.
"Just about every CIO and start-up I meet says they're crippled by data-center energy and space constraints," Sun CEO and President Jonathan Schwartz explained when product development began late last year.
Late last month, the Blackbox center shipped to its first customer: the Stanford Linear Accelerator Center (SLAC).
The question now is how many companies – and what kind – will follow suit. It's clear that reliable data centers are a bone-level workplace essential. It's not clear, though, how the Blackbox is faring in the market, since Sun isn't saying. After widely trumpeting the product during development, the company's now gone strangely mum.
"I am afraid there is nothing more we can disclose at this stage," Sun spokesperson Audrey Lam told the SiteNet Dispatch. "I can only add that the Blackbox will be shipping to Asia next quarter."
PTS Data Center Solutions President Peter Sacco, however, readily voices doubts about Sun's mobile facility signaling some sort of seismic shift in data center site selection.
"I don't think that our company's traditional clientele for data center projects will look at the Blackbox as a feasible alternative," Sacco explains from PTS's Franklin Lakes, N.J., headquarters. "While I assume that Sun has done the necessary marketing to determine a product need, I have not encountered any clients in the 50 or so data centers I visit annually that I thought this solution would be a viable option for."
The idea of a mobile data center isn't a new one, as Sun readily acknowledges.
The SLAC's Capacity Crunch
Rackable Systems, in fact, beat the Blackbox to market, starting sales in March of Concentro, another wrinkle in mobile center solutions. Packaged in a container that's 40 feet (12.2 meters) long, Concentro can house as many as 1,200 servers. By contrast, the Blackbox holds 252 servers.
Sun's product, however, synched up with the needs of the first Blackbox customer, the SLAC – which was one of the product's beta testers.
Operated by Stanford University for the U.S. Dept. of Energy, Menlo Park, Calif.-based SLAC designs and builds electron accelerators used in high-energy physics and radiation research. The research lab's problem was that it had nearly maxed out its computer building's power distribution and cooling capacities. At the same time, though, the organization's computing demands were rising rapidly.
Overhauling the facility's wiring was going to take time, something the center didn't have. With the Blackbox, though, the SLAC all but took time out of the equation: It could quickly acquire extra capacity, roll it onto campus and plug it in.
"We needed to expand quickly this fiscal year, but solving the cooling and power challenges for the [computing] building takes longer," Randy Melen, leader of the center's High Performance Storage and Computing team, explained in announcing the Blackbox purchase on June 23rd. "We worked with Sun to answer the question, 'How do you extend your data center without too much pain?' "
The SLAC is preparing to install its Blackbox – which is painted white for energy efficiency. The mobile center will be located in a parking lot on a concrete pad immediately adjacent to the existing power supply.
The lab hasn't specified the equipment it will use inside its mobile center. Sun officials have said that a Blackbox equipped with the low end of the company's Niagara servers (which sell for about US$3,000 apiece) would be the world's 170th-fastest supercomputer. With seven terabytes of memory, two petabytes of tape storage and 1.5 petabytes of disk storage, a center can simultaneously handle up to 10,000 desktop users, Sun asserts.
But getting a firm fix on the Blackbox's in-the-field performance is a nebulous proposition, given Sun's current reticence. Earlier, though, the company certainly wasn't shy about pumping up the product's potential pluses.
Cost Cuts for Power, Construction?
"When everyone was racing to build the smallest rack-mount servers," Sun Chief Technology Officer Greg Papadopoulos explained in October of last year, "we asked a contrarian question: 'What is the biggest computer we could build?'"
Time savings have frequently been cited as a major Blackbox benefit. The product can be up and running in about a tenth of the time needed to design, build and deploy a traditional brick-and-mortar center, Sun has stated. A center delivered fully equipped with Sun gear supposedly requires only simple hookups for power and data, plus cold water for cooling. (Blackbox buyers can also acquire only the bare-box shell, reportedly priced at about $500,000, and fit it out with whatever equipment they choose.)
Sun has also made much of its new center's cost savings. A fully equipped top-of-the-line Blackbox is reportedly selling for about $1 million. That, Sun contends, is roughly one-hundredth of the initial outlay required for a traditional 10,000-sq.-ft. (929-sq.-m.) data center. In addition, the mobile data center offers reported energy savings of 25 percent.
PTS's Sacco, however, isn't so certain that the Blackbox can deliver all that.
"As far as speed is concerned, maybe," he says. "But as far as construction cost savings, prove it. As far as I can tell, the only savings will come from building the walls. All of the other support infrastructure components – the UPS, generator, air-conditioning, power distribution, etc. – have to be accounted for inside the box.
"In terms of the [claims of] power cost savings, again, prove it," Sacco continued. "Power efficiency is a hot topic. The truth is, though, that small and medium-sized companies' computer room operators don't purchase more energy-efficient solutions if it means a higher initial capital expenditure."
Sun's current no-comment stance on the Blackbox may be directly connected to the product's clientele.
Googling the Blackbox?
A number of tech-industry analysts are convinced that Sun is quietly manufacturing Blackboxes for Google – a company that's legendarily secretive about its technology.
The Blackbox's energy efficiencies would definitely dovetail with current thinking inside the world's number one search engine. Google, for example, has almost completed the installation of 9,212 solar panels atop its Mountain View, Calif., headquarters. The company's 1,600-kilowatt project is the largest solar installation on a U.S. corporate campus.
But Google's larger energy-saving push is aimed at PC and server power usage. Significantly, a number of heavyweight energy users are thinking the same way: Google recently helped launch the Climate Savers Computing Initiative (CSCI), an industry alliance targeting more efficient computing.
"Today, the average desktop PC wastes nearly half of its power, and the average server wastes one-third of its power," Google Senior Vice President of Operations Urs Holzle explained at the CSCI's launch in Mountain View, Calif. "[The CSCI] is setting a new 90-percent efficiency target for power supplies… If achieved, [that] will reduce greenhouse gas emissions by 54 million tons (48.6 million metric tons) per year, saving more than $5.5 billion in energy costs."
The alliance, which also includes Pacific Gas & Electric, the U.S. Environmental Protection Agency and the World Wildlife Fund, intends to expand its membership to include the entire Fortune 500.
The market will decide whether the Blackbox jibes with CSCI-style thinking. For certain, though, the data center world is a top-shelf target for energy conservation.
Already heavy power users, data centers are showing increasingly ravenous appetites. Worldwide energy consumption by data centers doubled between 2000 and 2005, according to an AMD-commissioned study released in February.
U.S. operations consume a very big chunk of the world's data center power, the report noted. In 2005, total electricity consumption by American data centers, including servers, cooling and auxiliary equipment, was approximately 45 billion kilowatt-hours – about 1.2 percent of all U.S. electricity consumption. All told, American data centers used about $2.7 billion worth of electricity in 2005, study author Jonathan Koomey reported. That was about 37 percent of all of the energy that the world's data centers used that year.
"This study demonstrates that unchecked demand for data center energy use can constrain growth and present real business challenges," AMD Server and Workstation Division Vice President Randy Allen said in releasing the report at the LinuxWorld OpenSolutions Summit in New York. "These issues, traditionally thought of as issues reserved for a company's IT department, need to be brought directly into the board room."
That sort of energy-conscious thinking is increasingly apparent in the data center location decisions of power-gobbling Internet companies. Lower-priced power has become a big driver. Google, for example, is building a center in The Dalles, Ore., while Microsoft and Yahoo are both building centers in Quincy, Wash. Nearby, Yahoo opened a data center late last year in Wenatchee, Wash.
The Blackbox obviously can be moved quickly to areas with less expensive energy. In addition, the mobility of Sun's center could strike a market chord for use in military operations, catastrophe areas and developing nations.
Whether mobile or fixed, though, all data centers are facing major challenges, according to a Gartner study released late last year. By 2008, 50 percent of existing U.S. data centers will have insufficient power and cooling capacity, the report predicts.
How Much Motion?
"With the advent of high-density computer equipment such as blade servers, many data centers have maxed out their power and cooling capacity," Gartner Research Vice President Michael Bell said in releasing the study. "It is important for data center managers to focus on the electrical and cooling issue in the near term, and adopt these best practices to mitigate the problem before it results in equipment failure, down time and high remediation costs."
How the Blackbox data center plays into those power and cooling concerns remains to be seen. A number of analysts, though, feel that the product's primary market will consist of heavy power users that utilize Sun's centers much like overcrowded school districts use trailers for classrooms: Blackboxes would be moved in to deal with increased computing loads or to provide temporary storage during data migration.
Sacco agrees with other industry observers who view Sun's new data center as part of the increasing commoditization of corporate infrastructure. At the same time, though, he doesn't see the mobile Blackbox suddenly opening up a boundless palette for data center locations.
"Given limiting factors like distance from headquarters and vendor access," he notes, "I don't think [the Blackbox] will have much real effect on the geographic range of site selections."
For that matter, he doesn't foresee Blackboxes having a dramatic effect on the data center environment.
"As far as I know, they are a captive computing environment, and I have not seen too many computer rooms that did not operate in a heterogeneous computing environment," he says. "At this point in time, I believe the Blackbox will have only a minimal impact on the brick-and-mortar computer room."