Neon lines and dots of aqua, violet, crimson, and pink dissolve into smoky swirls—that's what the burning of fuel looks like when it is simulated on one of the world's most powerful supercomputers.
These psychedelic snapshots could pave the way for the development of cars that use 25 percent to 50 percent less fuel than the autos of today. But the problem of improving upon the 150-year-old internal combustion engine is so complex that the scientists who work on it are eager for a major development in the supercomputing world to occur later this year. The U.S. Department of Energy's Oak Ridge National Laboratory (ORNL) in Tennessee is set to deploy a massive upgrade to Jaguar, the nation's fastest supercomputer and Number 3 in the world. The new system, called Titan, is expected to work at twice the speed of the machine that is currently the fastest supercomputer in the world, Japan's K computer.
Although most news coverage of the supercomputing world focuses on the race among nations for supremacy (China leapfrogged the United States in 2010, and both were surpassed by Japan last year), Oak Ridge convened a conference last month in Washington, D.C., to focus on the real-world problems that high-power supercomputing seeks to address. Tackling the world's energy challenges is high on the list. Scientists are looking forward to bringing Titan's speed and power to calculations that may open the door to viable fusion technology, lead to a better understanding of climate change, and greatly improve that inefficient but ubiquitous energy generator—the internal combustion engine.
"We're at kind of an interesting time," mechanical engineer Jacqueline Chen, of the Sandia National Laboratories' combustion research facility, told the audience of about 100 supercomputing experts from around the world. "While we are still using monolithic fossil fuels—gasoline, diesel, and aviation fuels—there's a wide, diverse stream of new fuels that has emerged and is evolving." At the same time, a new generation of high-efficiency, low-emissions combustion systems is in development. "So we've got two moving targets," she said. The simultaneous change in fuels and engine systems greatly complicates research to reduce petroleum reliance and lower carbon dioxide emissions.
"The only way to get there in a reasonable, timely manner is to really understand the underpinning fuel and combustion science," Chen said. Her work is aimed at developing validated models that will predict how new fuel and engine combinations will work, an effort she says will "greatly enable engine designers to shorten their product design cycles."
Beyond Spark Ignition
The internal combustion engine has been the workhorse of world transportation for more than a century, since innovators like Nikolaus Otto and Gottlieb Daimler developed and perfected the design in the 1870s and 1880s. But in that time, a lot of energy has been squandered. Less than one third of the energy from the fuel put into the gas tank of a car with a typical spark-ignition engine is used to move the vehicle down the road. The rest is lost, mostly as exhaust heat, due to the engine's inherent inefficiencies. Diesel engines, which use compression rather than an electric spark to ignite the fuel, are far more efficient, but researchers believe greater improvements are possible.
One promising new low-temperature combustion technology being researched is called "homogeneous charge compression ignition," or HCCI. Instead of using an electric spark to ignite the fuel, the fuel mixture is compressed until it ignites spontaneously by chemical reaction, in proper phase with the piston's motion. This combustion takes place at lower temperatures, higher pressures, and with a much more dilute fuel mixture than in the spark-ignition engines of most cars today. The payoff of low-temperature compression ignition is that fuel efficiency could increase by 25 to 50 percent. But an HCCI engine is harder to control, and more sensitive to fuel chemistry, than a conventional spark-ignition engine.
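The chemistry-controlled timing is the crux: the ignition delay of a compressed charge falls off steeply as temperature rises. A toy sketch of that sensitivity, using a simple Arrhenius-style expression with made-up constants (not real fuel data):

```python
import math

def ignition_delay(temp_k, a=1e-6, e_over_r=15000.0):
    """Toy Arrhenius-style ignition delay in seconds: a * exp((E/R) / T).

    The pre-factor `a` and activation temperature `e_over_r` are
    illustrative placeholders, not measured values for any real fuel.
    Real HCCI chemistry couples hundreds of species and reactions with
    turbulent mixing, which is why it demands supercomputer simulation.
    """
    return a * math.exp(e_over_r / temp_k)

# A modest rise in charge temperature shortens the delay sharply,
# which is one reason HCCI ignition timing is hard to control.
for temp in (900, 1000, 1100):
    print(f"{temp} K -> {ignition_delay(temp):.3g} s")
```

Because hotter pockets of the charge ignite much sooner than cooler ones, shaping the in-cylinder temperature distribution becomes a lever for pacing how fast pressure rises.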
"We don't understand the coupling of turbulent mixing and ignition chemistry in fine enough detail to help us impact the design," said Chen. "You need to get the correct burn rate, or you get a very noisy engine." In other words, either the fuel mixture needs to be adjusted, or the temperatures in different portions of the mixture need to be stratified, or layered, to control the speed at which pressure rises in the engine.
Even as researchers are looking at overhauling engine technology, fuels are changing too. Far more of the oil on the world market is "heavy"—like the oil that comes from the Canadian oil sands—requiring extra processing steps to convert it into fuel. And renewable fuels like ethanol are also becoming a greater part of the fuel mix, with a great deal of research aimed at increasing fuel from varied plant sources.
"Hundreds of molecules have been proposed as alternative fuels—many of them from biology," said Chen. "How do you assess which are worth pursuing? It's not practical to run them all in comprehensive engine tests, which would require manufacturing a large amount of each proposed new fuel and fuel blend. And with many new engine designs in development, it's not clear which engine to use to test which future fuels."
That's where supercomputers come in.
Chen and her colleagues have turned to supercomputing power to better understand combustion, and to model and predict the behavior of fuels by simulating the conditions found in new types of combustion engines. Using 113 million central processing unit (CPU) hours on the Oak Ridge Leadership Computing Facility's Jaguar supercomputer, Chen and her team simulated fine-scale mixing-chemistry interactions in HCCI combustion under several approaches to mixture stratification, each offering a different option for controlling the rate of combustion in an HCCI engine.
But the calculations taxed Jaguar, even at its speed of 3.3 petaflops (3.3 quadrillion calculations per second). That's why Chen and her colleagues are looking forward to Titan, which will boost Oak Ridge's supercomputing power to 20 petaflops. Titan also will have energy-efficient, high-performance code accelerators called graphics processing units (GPUs). The resulting "hybrid" supercomputer, combining CPU and GPU power, will be able to run code far faster, and Chen and her team believe it will let them simulate conventional and alternative fuels of greater chemical complexity.
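For scale, a back-of-the-envelope comparison using only the figures quoted here (3.3 petaflops for Jaguar, 20 for Titan; one petaflop is 10^15 floating-point operations per second):

```python
# Peak speeds quoted in the article, in petaflops.
jaguar_pflops = 3.3
titan_pflops = 20.0

# One petaflop is 10**15 floating-point operations per second.
titan_ops_per_sec = titan_pflops * 10**15

# Titan's raw speedup over Jaguar is roughly sixfold.
speedup = titan_pflops / jaguar_pflops
print(f"Titan: {titan_ops_per_sec:.0e} ops/s, roughly {speedup:.1f}x Jaguar")
```

The raw factor overstates real-world gains, of course; how much of that peak speed a given simulation captures depends on how well its code exploits the GPUs.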
A Tool for the Energy Future
The combustion work is only one of a number of cutting-edge energy research projects that stand to benefit from the upgrade in supercomputing power. William Tang, head of the Fusion Simulation Program at the U.S. Department of Energy's Princeton Plasma Physics Laboratory, is seeking a breakthrough in fusion energy. The work is funded by the governments of the world's eight largest economies, the G8. In a separate project, a consortium headquartered at Oak Ridge and including scientists from the Massachusetts Institute of Technology, Westinghouse, the electric industry, and other organizations is studying how radiation moves in today's nuclear reactors. Their aim is to understand how nuclear fuel could burn longer and generate less waste. Also at Oak Ridge, scientists are seeking to better understand climate change.
All of these are "multiscale, multiphysics problems," explains James Hack, director of the National Center for Computational Sciences at ORNL and head of the ORNL Climate Change Science Institute. His own work spans physical processes operating at very small scales (the condensing of aerosols, for example) up to climate processes operating on a planetary scale. He and other researchers are hoping that Titan will aid in the ultra-high-resolution simulations needed to improve understanding of the complex interrelationship between climate and energy.
"This next step allows us to achieve a level of fidelity that will help us make choices on the energy systems of the future," he says.