Throughout the research community, scientists and engineers conduct computer simulations to discover how a new process or technology might perform when put to practical use and to make critical next-step decisions in technology development. It is imperative that researchers have complete confidence in the outcomes of those simulations. They attain that confidence by running numerous simulations to account for the effect that a range of factors may have on the final results.
The practice of running those multiple simulations is called uncertainty quantification analysis: a process that investigates, identifies, and takes into account the uncertainties in the simulations that are due to a number of possible variables. Those uncertainties could be errors caused by instrumentation, methodology, or other factors.
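The workflow described above can be sketched in a few lines of code: perturb the uncertain inputs, run the simulation many times, and summarize the spread of the results. The snippet below is a minimal illustration only; the `simulate` function, the parameter names, and all numeric values are hypothetical stand-ins, not part of any NETL model.

```python
import random
import statistics

def simulate(feed_rate, temperature):
    """Stand-in for one expensive multiphase-flow simulation.
    A real run would be a full CFD computation; this toy response
    exists only to illustrate the sampling workflow."""
    return 0.8 * feed_rate + 0.001 * temperature

def quantify_uncertainty(n_runs=100, seed=42):
    """Run many simulations with perturbed inputs and report the
    mean and spread of the outputs."""
    rng = random.Random(seed)
    results = []
    for _ in range(n_runs):
        # Sample each input within an assumed uncertainty band
        # (values and units here are purely illustrative).
        feed_rate = rng.gauss(10.0, 0.5)       # kg/s, hypothetical
        temperature = rng.gauss(1200.0, 25.0)  # K, hypothetical
        results.append(simulate(feed_rate, temperature))
    return statistics.mean(results), statistics.stdev(results)

mean, spread = quantify_uncertainty()
print(f"mean output: {mean:.2f}, spread: {spread:.2f}")
```

Each iteration of the loop corresponds to one of the "30, 40, even up to 100 separate simulations" the researchers describe, which is why access to many compute cores at once shortens the turnaround so dramatically.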
"The issue we face in multiphase flow modeling [the examination of how two or more distinct phases, such as coal, water, and gas, interact when mixed together] is that the computational time for modeling flows, such as those encountered inside a coal gasifier, is too great," Shahnam explained. "We need to do 30, 40, even up to 100 separate simulations to account for all relevant sources of uncertainty in the results and make decisions on how to proceed. With the supercomputer, we can now turn around in 2 or 3 weeks what would have taken 6 weeks."
The NETL supercomputer is currently number 94 on the Top500 list of the most powerful computers in the world. Capable of performing 503 trillion operations per second, it processes information a million times faster than most high-end desktop computers.
"Computational resources have to be shared among researchers," Shahnam said. "Now, speed is on our side. We can run more simulations with a faster turnaround."
Shahnam explained that the ability to run more simulations in less time allows researchers to perform more accurate analyses of their work and to make more informed changes in technologies or materials, leading to faster, more efficient development of energy innovations.
"NETL's supercomputer provides enough computational cores to allow this work without impacting other researchers," Guenther said. "In the past, the only way to do this work would be to ask other researchers not to run jobs for a given time to free up enough cores and nodes to do uncertainty quantification."