Using Oak Ridge National Laboratory’s Frontier system, the world’s first exascale supercomputer for open science, researchers have shown how microscopic wear on turbine blades can quietly sap jet engine efficiency, increase fuel burn and shorten engine lifespan, according to a project report published by ORNL.
The work brought together scientists from the University of Melbourne, GE Aerospace and ORNL to study what happens when high-pressure turbine blades develop surface roughness through erosion, oxidation and mechanical wear. These flaws are often microscopic, yet they sit inside engines where temperatures exceed 2,000 degrees Celsius and the airflow is extremely complex.
Engineers have long suspected that even small surface defects can disrupt aerodynamics and heat transfer, but modeling the effect accurately has been nearly impossible. The problem spans vastly different scales, from micrometer-scale pits and scratches to full-size turbine blades spinning at extreme speeds. Traditional simulations could not capture both levels of detail at once.
Frontier changed that. Capable of more than one quintillion calculations per second, the system allowed researchers to run ultra-high-fidelity simulations with 10 to 20 billion grid points, an enormous number of computational degrees of freedom. These models resolved the full physics of turbulence through direct numerical simulation rather than relying on the approximations of conventional turbulence models.
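To get a feel for that scale, a back-of-envelope memory estimate is instructive. In the sketch below, only the grid-point range comes from the article; the per-point storage and GPU count are illustrative assumptions, not figures from the study.

```python
# Rough memory estimate for a DNS grid of the size quoted above.
# Only the grid-point range comes from the article; everything else
# is an illustrative assumption.

grid_points = 15e9        # midpoint of the quoted 10-20 billion range
vars_per_point = 10       # assumed: flow variables plus solver work arrays
bytes_per_value = 8       # double precision

total_tb = grid_points * vars_per_point * bytes_per_value / 1e12
per_gpu_mb = total_tb * 1e12 / 30_000 / 1e6  # assumed ~30,000 GPUs in use

print(f"~{total_tb:.1f} TB of flow state, ~{per_gpu_mb:.0f} MB per GPU")
```

Even this modest estimate lands at roughly a terabyte of flow state, far beyond any single workstation but comfortably spread across a machine of Frontier’s size.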
The results challenged long held assumptions. Much of the industry’s understanding of surface roughness comes from simplified lab geometries, but turbine blades behave very differently. Inside real engines, airflow transitions between laminar and turbulent states, and the presence of roughness accelerates that transition.
That earlier shift to turbulence dramatically increases heat transfer into the blade and boosts aerodynamic losses. In practical terms, that means hotter components, reduced durability, more frequent maintenance and higher fuel consumption. Even small surface damage can therefore compound into significant performance penalties over time.
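A standard way to quantify when roughness trips this transition, drawn from textbook aerodynamics rather than from the study itself, is the roughness Reynolds number:

```latex
% Roughness Reynolds number (textbook background, not from the paper):
%   k    - height of the roughness element
%   u_k  - boundary-layer velocity at height k
%   \nu  - kinematic viscosity of the gas
\[
  Re_k = \frac{u_k \, k}{\nu}
\]
% Transition is promoted once Re_k exceeds a critical value that
% depends on the shape and distribution of the roughness.
```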
To make the simulations possible, the team upgraded and optimized its in-house solver, HiPSTAR, to run efficiently on Frontier’s GPU architecture. Individual cases took weeks to compute. The same workload on a standard laptop would have taken more than a thousand years.
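That claim is easy to sanity-check with rough numbers. None of the performance figures below come from the article; they are assumptions chosen only to show the order of magnitude.

```python
# Order-of-magnitude check of the laptop comparison. All figures are
# illustrative assumptions; the article gives no sustained-performance data.

laptop_gflops = 100         # assumed sustained throughput of a typical laptop
frontier_pflops = 50        # assumed sustained throughput of one Frontier run
weeks_per_case = 3          # "weeks", per the article

speedup = frontier_pflops * 1e15 / (laptop_gflops * 1e9)
laptop_years = weeks_per_case * speedup / 52

print(f"speedup ~{speedup:,.0f}x, ~{laptop_years:,.0f} years on a laptop")
```

Even with conservative assumptions the answer lands in the tens of thousands of years, so “more than a thousand years” is, if anything, an understatement.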
The findings are already influencing next-generation turbine design. Engineers at GE Aerospace are using the data to refine high-pressure turbine performance and cooling strategies, including work connected to NASA’s Hybrid Thermally Efficient Core (HyTEC) project. The goal is straightforward: squeeze more thrust out of every unit of fuel while lowering emissions and operating costs.
By exposing exactly how tiny imperfections ripple into large efficiency losses, the research gives designers a clearer path to longer-lasting, cleaner and more efficient jet engines. In an industry where even a fraction of a percent improvement can save millions in fuel, the ability to see these effects at full scale could be transformative.
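The fuel claim is simple arithmetic. Both figures below are assumptions about airline fuel spending made for the sake of illustration, not data from the report.

```python
# Illustrative scale of a small efficiency gain. Both numbers are
# assumptions for the sake of arithmetic, not figures from the report.

annual_fuel_bill = 10e9     # assumed: $10B/year fuel spend for a large airline
efficiency_gain = 0.005     # a half-percent improvement

print(f"${annual_fuel_bill * efficiency_gain / 1e6:.0f}M saved per year")
```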
The study has been published in the ASME Journal of Turbomachinery.

