Success Stories
Supercomputer Powers NASA's Climate Research
Client: NASA
Challenge:
- Speed execution of higher-resolution and more-complex global climate models.
- Increase the supercomputing system’s capabilities.
- Improve management of exponentially increasing volumes of data.
Solution:
- Administer the Discover high-performance computing cluster, which has more than 35,000 CPUs.
- Develop, operate and maintain visualization systems.
- Manage more than 25 petabytes of climate science data.
Results:
- Developed the highest-resolution atmospheric simulation of its kind, modeling two years of the Earth’s climate.
- In five years, the computing cluster’s performance increased 130-fold.
- Created the 17-by-6-foot Visualization Wall — a new tool for climate scientists.
Currently, the NCCS Discover computing cluster ranks in the top 100 of the world’s supercomputers and is a leader among those systems focused on climate and weather research.
The center, based at NASA’s Goddard Space Flight Center, integrates supercomputing, visualization and data interaction technologies to support research for more than 500 scientists at NASA centers, and researchers at laboratories and universities around the world.
“The computer is the climate scientist’s tool — the better the tool, the better the scientific results, and the greater the understanding of what’s happening in the complete Earth system,” says Phil Webster, head of Goddard’s Computational and Information Sciences and Technology Office. “A key challenge for us is to build better machines because what we need doesn’t exist.”
Analyzing the world’s climate and creating global climate models demands a supercomputing capability that continually pushes the leading edge. Since 2000, CSC has helped the NASA Center for Climate Simulation (NCCS) operate, maintain and improve its supercomputing systems.
In the past five years, CSC has helped increase Discover’s performance 130-fold. Today, the NCCS computing cluster uses more than 35,000 processing cores to crunch more than 400 trillion floating-point operations per second. By comparison, it would take every person on Earth adding pairs of seven-digit numbers at the rate of one per second for more than 17 hours to do what Discover can do in one second.
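That comparison checks out arithmetically. The sketch below reruns the numbers, assuming a world population of roughly 6.5 billion at the time (the population figure is an assumption; the story gives only the 400-trillion-operations rate):

```python
# Back-of-the-envelope check of the Discover comparison.
# Assumption (not in the story): ~6.5 billion people, each adding
# one pair of seven-digit numbers per second.
discover_ops_per_sec = 400e12   # > 400 trillion floating-point ops/sec
world_population = 6.5e9        # assumed headcount

seconds_needed = discover_ops_per_sec / world_population
print(f"{seconds_needed / 3600:.1f} hours")  # ~17.1 hours, matching the claim
```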
Managing big data
Another challenge for the center, says Webster, is data management, or more accurately, Big Data management. Scientists using the center integrate millions of observations collected daily, reanalyze past observations and perform climate model simulations, each of which can produce massive amounts of data. CSC also helps administer Discover’s archive system, which stores about 28 petabytes¹ of data, with a total capacity of 37 petabytes.
“The Big Data problem is like finding a needle in a needle stack,” says Scott Wallace, CSC NCCS Support program manager. “Finding your needle in a pile of 28 trillion needles is not significantly harder than finding it in a pile of one trillion needles because they're both effectively impossible, unless you build in a way to keep track of where each needle is located.”
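Wallace’s needle-stack point is the classic argument for indexing at write time. A minimal sketch of the idea follows; the names and structure are hypothetical, since the story does not describe the archive’s actual software:

```python
# Hypothetical illustration of Wallace's point: record each item's
# location as it is stored, so lookup cost stays flat as the pile grows.
index: dict[str, str] = {}  # dataset name -> storage location

def archive(name: str, location: str) -> None:
    """Note where each 'needle' lands the moment it is stored."""
    index[name] = location

def locate(name: str) -> str:
    """Constant-time lookup, whether the archive holds one file or billions."""
    return index[name]

archive("jan2012_global_temps", "tape42/slot17")
print(locate("jan2012_global_temps"))  # tape42/slot17
```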
As the center generates and manages increasing quantities of data, it has turned to visualization technologies to help scientists see their research. A recent addition to the center is its Visualization Wall, driven by 16 Linux-based servers. These servers split images across the 17-by-6-foot wall, creating one huge, high-resolution medium on which scientists can display still images, video and animated content from data generated on Discover.
“The wall gives scientists an important new tool because it lets them see their research in incredible detail,” says Fred Reitz, CSC NCCS Support operations and deputy program manager.
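The image splitting behind the wall amounts to simple tile arithmetic: each server renders one rectangle of the full frame. Below is a minimal sketch; the 8-by-2 server layout and pixel dimensions are assumptions for illustration, as the story does not give the wall’s tile geometry:

```python
# Sketch of splitting one large frame across a tiled display wall.
# Assumptions: 16 servers in an 8-column-by-2-row grid, and a made-up
# full-frame pixel size; neither figure comes from the story.
COLS, ROWS = 8, 2
FRAME_W, FRAME_H = 16_320, 6_144   # hypothetical full-wall resolution

def tile_for_server(server_id: int) -> tuple[int, int, int, int]:
    """Return the (x, y, width, height) slice that server_id renders."""
    col, row = server_id % COLS, server_id // COLS
    w, h = FRAME_W // COLS, FRAME_H // ROWS
    return (col * w, row * h, w, h)

for sid in range(COLS * ROWS):
    print(sid, tile_for_server(sid))
```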
A keener focus
Even as the center improves its capabilities, researchers continue to ask for more. For example, several groups of scientists have more than doubled their workload requests because of upcoming deadlines on key projects such as the Fifth Assessment Report for the Intergovernmental Panel on Climate Change (IPCC), the leading international body for climate change assessment.
Today, Discover can compute in one day three simulated days in the life of the Earth at one of the highest resolutions ever attained: about 3.5-km global resolution, or about 3.6 billion grid cells. The center’s current “stretch” goal is to generate in one day a computation that covers 365 days at 1-km global resolution.
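Those two figures are consistent: at 3.5-km spacing the globe divides into about 42 million horizontal columns, implying roughly 86 vertical levels. The sketch below reruns that arithmetic (the level count is inferred here, not stated in the story):

```python
# Rough consistency check: ~3.5 km global resolution vs ~3.6 billion cells.
# The vertical-level count is inferred to reconcile the two figures;
# the story states only the resolution and the total cell count.
EARTH_SURFACE_KM2 = 510e6                    # ~510 million square km
cell_km = 3.5

columns = EARTH_SURFACE_KM2 / cell_km ** 2   # ~41.6 million columns
levels = 3.6e9 / columns                     # ~86 vertical levels
print(f"{columns:.2e} columns x {levels:.0f} levels = ~3.6e9 cells")
```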
“Just in terms of electricity, that one computation would require 16 megawatts² of power the way things are done today,” says Wallace. “This isn’t within reach now, but that’s our distant grail. We’re forever looking for better resolution and faster times.”
Recently, in fact, the center reached a new benchmark when Discover ran the highest-resolution atmospheric simulation of its kind, modeling two years of the Earth’s climate at 10 km globally. To achieve advances like these, NASA also taps CSC’s High Performance Computing Center of Excellence for assistance.
Established in 1999, the CSC center has more than 160 specialists operating systems that collectively provide capacity for more than 110 petabytes of data and have a capability of almost two petaflops³ of computation.
“Climate research continues to stretch computing capabilities,” says Donna Klecka, director of CSC’s High Performance Computing Center of Excellence. “Through our center, we can further support NASA’s center, bringing our deep computing expertise to innovate and create Big Data solutions that its climate scientists need.”
¹ One petabyte equals one quadrillion bytes, or one thousand terabytes, of computer memory.
² One megawatt of electricity powers 1,000 homes for one day.
³ One petaflop equals one thousand trillion floating-point operations per second.

