Simulations performed on the Summit supercomputer at the Department of Energy’s Oak Ridge National Laboratory revealed new insights into the role of turbulence in mixing fluids and could open new possibilities for projecting climate change and studying fluid dynamics.
The study, published in the Journal of Turbulence, used Summit to model the dynamics of a roughly 10-meter section of ocean, generating one of the most detailed simulations to date of how turbulence disperses heat through seawater under realistic conditions. The lessons learned can apply to other substances, such as pollution spreading through water or air.
“We’ve never been able to do this type of analysis before, partly because we couldn’t get samples at the necessary size,” said Miles Couchman, co-author and a postdoc at the University of Cambridge. “We needed a machine like Summit that could allow us to observe these details across the vast range of relevant scales.”
Turbulent processes might sound simple, like stirring a cold splash of cream into a hot cup of coffee. They’re not simple. Turbulence occurs at scales that can vary widely — from a huge ocean wave to a tiny ripple, for example — as motion within cascading motion works its way through layer after layer of water, mixing heat and other substances along the way.
“In this case, you have colder fluid sitting on the ocean floor and warmer fluid above,” Couchman said. “One of the big uncertainties for climate modeling and other such applications arises from a lack of understanding of how heat mixes across these layers. The surface waters are being heated from above by the sun, but how does that heat get dispersed? It turns out that turbulent processes play a key role, which is what we’re now trying to better understand. What we’re discovering can be applied not just to the mixing of heat in water but to pollutants mixing in the atmosphere and to a variety of other questions.”
Scientists model the properties of turbulent mixing using the Navier-Stokes equations. These models depend heavily on a fluid’s Prandtl number, named for German physicist Ludwig Prandtl, which describes the ratio of how quickly momentum dissipates relative to heat in a fluid.
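For reference, that description can be sketched in standard textbook notation (incompressible flow with a temperature field; this is the generic form, not equations taken from the study itself):

```latex
% Incompressible Navier-Stokes momentum equation
% (nu = kinematic viscosity, i.e., momentum diffusivity):
\frac{\partial \mathbf{u}}{\partial t} + (\mathbf{u}\cdot\nabla)\mathbf{u}
  = -\frac{1}{\rho}\nabla p + \nu\,\nabla^{2}\mathbf{u}

% Advection-diffusion of temperature (kappa = thermal diffusivity):
\frac{\partial T}{\partial t} + \mathbf{u}\cdot\nabla T = \kappa\,\nabla^{2} T

% The Prandtl number compares the two diffusivities:
\mathrm{Pr} = \frac{\nu}{\kappa}
```

A large Prandtl number means heat diffuses much more slowly than momentum, so thermal structures persist down to finer scales than the velocity field.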
Most previous studies focused on flows with a Prandtl number of roughly 1 to simplify calculations. In reality, ocean flows tend toward a Prandtl number of 7 or greater.
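As a concrete check on those numbers, a minimal sketch of the Prandtl calculation follows, using illustrative textbook values for water near 20 °C (assumed here for illustration, not figures from the study):

```python
# Prandtl number: ratio of momentum diffusivity (kinematic viscosity, nu)
# to thermal diffusivity (kappa), both in m^2/s.

def prandtl(nu_m2_s: float, kappa_m2_s: float) -> float:
    """Pr = nu / kappa."""
    return nu_m2_s / kappa_m2_s

# Approximate textbook values for water near 20 C (assumed, for illustration):
nu_water = 1.0e-6      # kinematic viscosity, ~1.0e-6 m^2/s
kappa_water = 1.4e-7   # thermal diffusivity, ~1.4e-7 m^2/s

print(round(prandtl(nu_water, kappa_water), 1))  # roughly 7, as cited for ocean flows
```

Setting Pr = 1 (as in many earlier studies) amounts to assuming heat and momentum diffuse at the same rate, which is the simplification the realistic value of ~7 removes.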
The computational power of Summit allowed the team to fully resolve the ocean turbulence at realistic ratios for the first time, providing new insight into the ocean’s turbulent dynamics.
“These kinds of turbulent flows are characterized by extreme events,” Couchman said. “You have this massive volume of water, and within that water are concentrated patches of sporadic but vigorous mixing. It’s crucial to perform such detailed simulations to properly identify and characterize the nature of these important mixing events, which requires significant computational power.”
Simulating such a variety of motions at scale generates enormous amounts of data, enough to choke even some of the world’s most powerful supercomputers.
“One typical approach simplifies the results by averaging measurements across the entire domain,” Couchman said. “But that approach smears out the important finer details and just gives you a cloudy picture of what’s happening. We wanted to take this section of water and follow the turbulence from the initial burst all the way until the motion dies out, analyzing all relevant scales. If we can do that in enough detail, we can zoom in on these smaller important patches of turbulence and better understand what’s happening as the mixing takes place.”
The team turned to the Oak Ridge Leadership Computing Facility and received an allocation of compute time on Summit, then the nation’s fastest supercomputer at 200 petaflops, or 200 quadrillion calculations per second. The processing power of Summit and the help of OLCF scientific liaison Murali Gopalakrishnan Meena allowed the researchers to capture every motion — from the largest to the smallest — under conditions closer to those of the ocean to compare the mixing across the layers of water under varying conditions.
“These are snapshots in time,” said Steve de Bruyn Kops, co-author of the study and professor of mechanical and industrial engineering at the University of Massachusetts Amherst. “You can think of this simulation as a cube of water made up of points on a digital grid. The cube contains almost 4 trillion total grid points, so we’re able to replicate conditions down to the centimeter level. We couldn’t have done this in a water tank because there’s no water tank big enough to accommodate 3D measurements at centimeter resolution.”
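To get a feel for that grid size, a back-of-envelope calculation (assuming a cubic grid, which the quote suggests but the article does not specify) gives the number of points along each edge:

```python
# Back-of-envelope: a cubic grid with ~4 trillion total points has roughly
# this many points along each edge. Illustrative arithmetic only; the
# study's actual grid dimensions are not given in the article.
total_points = 4e12
points_per_edge = round(total_points ** (1 / 3))
print(points_per_edge)  # on the order of 16,000 points per edge
```

Nearly 16,000 points along each dimension is what lets the simulation resolve both the largest overturns and the centimeter-scale patches where the vigorous mixing actually happens.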
The unprecedented level of detail offered by the simulations uncovered features that may contradict long-held theories.
“Let’s go back to the example of the coffee and cream,” de Bruyn Kops said. “A basic assumption of turbulence theory has been that the cold cream and the hot coffee should mix at the same rate as you stir and the coffee goes from black to brown. But we’re finding from these simulations that’s not the case. The heat’s mixing at a slower rate than the momentum from the turbulence. That’s a whole new avenue to explore.”
The team expects even more revelations as ORNL’s Frontier, the world’s first exascale supercomputer and the fastest on the planet, opens to full scientific operations.
“We might be able to scale up even further,” de Bruyn Kops said. “More detail means more opportunity for analysis.”
Funding: Support for this research came from the DOE Office of Science’s Advanced Scientific Computing Research program and from the U.S. Office of Naval Research. The OLCF is a DOE Office of Science user facility at ORNL.
Published in journal: Journal of Turbulence
Source/Credit: Oak Ridge National Laboratory | Matt Lakin
Reference Number: es06132301