An energy irony clouds the work of the powerful U.S. computers that have tracked the retreating Arctic sea ice. While the data servers handle the information behind the ever-worsening satellite images of global warming's impact, they are burning a lot of coal.
In fact, the National Snow and Ice Data Center (NSIDC), funded largely through grants from U.S. government agencies but located at the University of Colorado at Boulder, draws a continuous 100 kilowatts of fossil fuel power to process data on the state of the world's frozen regions. That's roughly the amount of electricity it takes to power about 80 average U.S. homes, according to the latest figures. And about half of that power goes not to crunching data, but simply to cooling the equipment.
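The "80 homes" comparison checks out with a quick calculation. The sketch below assumes an average U.S. household uses roughly 11,000 kilowatt-hours per year, a typical figure in government statistics from this period; the exact value varies by state and year.

```python
# Sanity check on the "80 homes" comparison: a constant 100 kW draw,
# converted to annual kWh and divided by assumed household usage.
DATA_CENTER_KW = 100
HOURS_PER_YEAR = 8760
AVG_HOME_KWH_PER_YEAR = 11_000  # assumed U.S. average; varies by state and year

annual_kwh = DATA_CENTER_KW * HOURS_PER_YEAR            # 876,000 kWh per year
homes_equivalent = annual_kwh / AVG_HOME_KWH_PER_YEAR

print(round(homes_equivalent))  # about 80 homes
```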
"Here we are working on climate research, and our data center is consuming an awful lot of power," said David Gallaher, NSIDC's technical services manager. "Even in the dead of winter, these things are cranking full-tilt, trying to chill off the 100°F-plus (37°C) heat coming off the back of these units."
With the cool air of the Rocky Mountains all around, it didn't make sense. "We said, 'Why are we doing this?' " Gallaher said. "Why don't we dump the warm air outside and pull in the cool air?"
Now, a $600,000 renovation, largely funded by a National Science Foundation (NSF) grant, seeks to do just that. Along with better equipment and an innovative evaporative cooling technology, it aims to make the NSIDC computing center one of the most energy-efficient data centers in the United States.
A Growing Mission
NSIDC's job has grown exponentially since it was established in 1982 as part of the University of Colorado's Cooperative Institute for Research in Environmental Sciences. The center started out as an analog archive and information center, but now manages remote sensing data from NASA's Earth Observing System satellite program. NSIDC currently archives and serves more than 91 terabytes of Earth science data to researchers around the world.
NSIDC is entirely funded by competitive grants, with the largest share of funding from NASA, and smaller shares from NSF and the U.S. Department of Commerce's National Oceanic and Atmospheric Administration. Among the recent work of the scientists there: examining the causes of the loss of sea ice in West Antarctica, and producing the maps that showed that Arctic sea ice extent this past September was the third-lowest in the satellite record (just behind record-setting 2007 and the second-lowest year, 2008).
But those calculations would be impossible without powerful computers and servers to manage the data.
With 70 servers in two rooms, NSIDC's facility was a fraction of the size of the largest data centers in the world, like Switch Communications' huge SuperNAP in Las Vegas, with racks for 7,000 servers. Still, by using more energy-efficient technology, NSIDC found it could do a lot better.
By switching to higher-density storage disks that use about the same amount of energy as the old ones but hold more data, NSIDC was able to consolidate its data center from two rooms to just one. The number of servers also was cut by 60 percent through "virtualization": replacing several dedicated servers running at low average processor utilization with a single "host" server that runs at a higher average utilization.
"A typical server uses about 5 percent of the CPU [central processing unit] that's sitting there," Gallaher explains. "But it's hard to put a whole lot of different processes on it. Certain applications don't play well with others. Virtualization allows us to put many servers on that one physical server, and yet each [virtual] server is custom-built to its own needs. Each application gets exactly what it needs."
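The arithmetic behind that consolidation is simple to sketch. The figures below take the article's 70 servers and 5 percent typical utilization at face value; the 60 percent target ceiling for a virtualization host is an assumption for illustration, not an NSIDC number.

```python
import math

# Illustrative consolidation math, not NSIDC's actual planning figures:
# many lightly loaded machines can share one host if the host runs at a
# higher -- but still safe -- average utilization.
PHYSICAL_SERVERS = 70      # the pre-renovation server count from the article
AVG_UTILIZATION = 0.05     # "a typical server uses about 5 percent of the CPU"
TARGET_UTILIZATION = 0.60  # assumed safe ceiling for a virtualization host

total_load = PHYSICAL_SERVERS * AVG_UTILIZATION            # 3.5 servers' worth of real work
hosts_needed = math.ceil(total_load / TARGET_UTILIZATION)

print(hosts_needed)  # 6 -- CPU load alone would allow far fewer machines;
                     # memory, I/O, and application-isolation needs explain
                     # why the actual cut was "only" 60 percent
```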
The new data center, to be completed next summer, also is being rearranged so the rows of server racks will no longer all face the same direction. Instead, the rows will alternate, so the backs of the servers don't spill heat onto the fronts of the servers in the adjoining aisle. The aisles will be sealed with plastic sheeting so that hot exhaust air doesn't mix back in and force the cooling systems to work overtime.
Using the Cool Air
The project includes a 25-kilowatt rooftop solar array to replace some of that coal power. But perhaps the greatest innovation is how the center will cut its need for traditional air-conditioning simply by taking advantage of the surrounding geography. "This is Boulder," says Gallaher. Filtered outdoor air is expected to meet most of the equipment's cooling needs. For hot days when outdoor air alone isn't enough, the center is installing a new system that uses indirect evaporative cooling, a technology that uses no compressors but instead blows air through water, taking advantage of the temperature drop when the water evaporates.
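A common first-order model shows why evaporative cooling works so well in a dry climate like Boulder's: the supply air is driven toward the wet-bulb temperature, scaled by the cooler's effectiveness. The temperatures and 90 percent effectiveness below are hypothetical illustrations, not NSIDC's engineering figures.

```python
# First-order evaporative-cooler model (an illustration, not NSIDC's spec):
# supply air approaches the wet-bulb temperature, scaled by effectiveness.
def evap_supply_temp_f(dry_bulb_f, wet_bulb_f, effectiveness=0.9):
    """Supply-air temperature (F) leaving an evaporative cooler."""
    return dry_bulb_f - effectiveness * (dry_bulb_f - wet_bulb_f)

# Hypothetical Boulder summer afternoon: 95 F dry-bulb, 60 F wet-bulb.
print(round(evap_supply_temp_f(95, 60), 1))  # 63.5 F -- cool enough for server intake
```

The drier the air, the bigger the gap between dry-bulb and wet-bulb temperatures, and so the more cooling the evaporating water can deliver.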
"Ninety percent of the time, the total power used by this whole cooling system will be about equal to the power your car air conditioner uses," Gallaher says. "We're quite ecstatic about it."
The 90 percent reduction in cooling costs is expected to pay for the system in about three years. The energy savings add up to a 45 percent reduction in NSIDC's annual operating costs.
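Those payback figures imply a rough annual savings number. The calculation below is derived from the article's $600,000 project cost and three-year payback; the resulting dollar figure is an inference, not a number NSIDC has stated.

```python
# Back-of-the-envelope payback implied by the article's figures; the
# annual-savings number is derived here, not stated by NSIDC.
RENOVATION_COST = 600_000  # dollars, the largely NSF-funded project cost
PAYBACK_YEARS = 3          # "in about three years"

implied_annual_savings = RENOVATION_COST / PAYBACK_YEARS
print(int(implied_annual_savings))  # 200000 dollars per year
```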
Although NSIDC settled in this advantageous climate long before the growth of its data center, some energy efficiency experts believe that geography soon will be more of a consideration when server facilities are sited.
"People put data centers in the Pacific Northwest because there is fairly cheap hydropower there," says Otto Van Geet, a National Renewable Energy Laboratory engineer. "You could put data centers in the middle of Texas where the wind resource is good, or in the Midwest, etcetera. I think that is something you'll start to see more of—people taking [energy sources] into account when they choose where to put data centers."
Candace Adorka is a reporter with Medill News Service in Washington, D.C.