Above the Cloud

By Marius Swoboda, Head of Design Systems Engineering, and Hubert Dengg, Thermal Analyst, Rolls-Royce Germany, Dahlewitz, Germany

Cloud computing reduces the time required for a coupled CFD and structural simulation by 80 percent.


Rolls-Royce reduced the wall-clock time to perform the simulation by 80 percent.

Jet Engine

Rolls-Royce uses an in-house, specialized structural code to determine the operating temperature of jet engine components such as turbine disks. The thermal boundary conditions for such an analysis are usually determined by mounting thermal sensors on the components and capturing heat flux measurements while the engine is running. One problem with this approach is that the thermal design of a new engine cannot begin until late in the product development process, when the first prototype becomes available. At this point, changes to the design are expensive, limiting what can be done to optimize thermal performance. 

Rolls-Royce is a leader in the implementation of a new high-performance computing (HPC) cloud approach in which its structural solver is coupled with the ANSYS Fluent computational fluid dynamics (CFD) solver to provide heat flux predictions at many points on component walls without reference to a physical prototype. Performing this coupled simulation requires a high level of computational power because the solution is time-dependent. This means that CFD and structural solutions must be computed to convergence at each time step as the solution progresses. Rolls-Royce reduced the wall-clock time to perform the simulation by 80 percent by running the simulation on a hosted, shared HPC cloud system.

Challenge of higher inlet temperatures

Engine manufacturers continue to increase turbine entry temperatures as they strive to improve engine efficiency. In this process, engineers must often redesign the engine’s cooling and sealing systems to prevent the overheating of critical internal components. Rolls-Royce engineers determine the operating temperatures of these components by performing a thermal analysis with an in-house, specialized structural code. One of the inputs to the thermal analysis is the transient heat flux at an array of points on the walls of the components under study. Engineers believed that they could achieve major improvements in the design process by determining the heat flux with CFD, then coupling the CFD code to the structural code to exchange the data at each computational cycle. The goal was to achieve an iterative loop with smooth exchange of information between the structural and CFD simulations so that the team could ensure consistent temperature and heat flux on the coupled metal–fluid domain interfaces. This continuous update of the heat transfer information to the components gives an accurate representation of the range of temperatures they will experience during startup and steady operation.
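The coupling loop can be pictured with a short sketch. The Python snippet below is a minimal, illustrative outline of the alternating exchange of heat flux and wall temperature, not the Rolls-Royce interface itself; the surrogate solver functions, heat transfer coefficients and tolerance values are invented for illustration.

# Minimal sketch of the coupled CFD-structural iteration described above.
# The two "solver" functions are simple algebraic surrogates, not calls into
# ANSYS Fluent or the Rolls-Royce structural code; all values are illustrative.

def cfd_solve(wall_temp, gas_temp=800.0, h=150.0):
    # Surrogate for the CFD step: given the wall temperature, return the
    # convective heat flux into the metal (W/m^2).
    return h * (gas_temp - wall_temp)

def structural_solve(heat_flux, sink_temp=400.0, conductance=300.0):
    # Surrogate for the structural/thermal step: given the applied heat flux,
    # return the metal wall temperature it implies (K).
    return sink_temp + heat_flux / conductance

def coupled_cycle(wall_temp=500.0, tol=0.01, max_iters=50):
    # Alternate the two solves until the exchanged wall temperature settles,
    # so that temperature and heat flux are consistent on the metal-fluid
    # domain interface.
    for i in range(max_iters):
        flux = cfd_solve(wall_temp)
        new_wall_temp = structural_solve(flux)
        if abs(new_wall_temp - wall_temp) < tol:
            return new_wall_temp, i + 1
        wall_temp = new_wall_temp
    return wall_temp, max_iters

if __name__ == "__main__":
    temp, iters = coupled_cycle()
    print(f"Converged wall temperature {temp:.1f} K after {iters} iterations")

In the real process each "solve" is a full 3-D calculation rather than a single formula, but the structure of the exchange is the same: the fluid side supplies heat flux, the metal side returns wall temperature, and the loop repeats until the two agree.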

The conjugate heat transfer simulation process is very computationally demanding, especially when 3-D CFD models with more than 10 million cells are required. With internal HPC resources at full capacity, Rolls-Royce engineers considered using cloud resources to access HPC capabilities for this application. Engineers had to overcome several challenges.

Contours of heat flux, which are used as boundary conditions for the structural code
Contours of total temperature for the interstage cavity as outputs of the CFD calculation

An interface between the structural and CFD codes was already available but had to be upgraded to support HPC. The other challenge was configuring the ANSYS Fluent process to run across several machines when called from the structural software: while a standalone Fluent calculation spawned flawlessly on multiple cores, only one machine was used when Fluent was launched from within the coupling procedure. Switching to dedicated Fluent licenses in the cloud allowed the process to run independently of the in-house licensing and queuing systems; in the end, the licensing process also ran much faster in the cloud than in-house.
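How the controlling script might start Fluent in parallel across several nodes is sketched below. The node names, core counts and journal file are invented, and although -t, -cnf, -g and -i are typical Fluent launcher options for parallel, batch-mode runs, the exact invocation depends on the Fluent version and cluster environment, so treat this as an assumption-laden illustration rather than the configuration actually used.

# Hedged sketch: launching Fluent across several nodes from a controlling
# Python script. Node names, core counts and file names are hypothetical.

import subprocess

NODES = ["node01", "node02"]        # hypothetical cluster node names
CORES_PER_NODE = 16                 # 2 x 16 = 32 CFD processes in total

def write_hosts_file(path="fluent.hosts"):
    # A common hosts-file layout: one line per node, repeated once per core.
    with open(path, "w") as f:
        for node in NODES:
            f.write((node + "\n") * CORES_PER_NODE)
    return path

def launch_fluent(journal="coupled_cycle.jou"):
    hosts = write_hosts_file()
    cmd = [
        "fluent", "3ddp",                        # 3-D, double-precision solver
        f"-t{len(NODES) * CORES_PER_NODE}",      # total number of parallel processes
        f"-cnf={hosts}",                         # hosts file listing the nodes
        "-g",                                    # run without the GUI
        "-i", journal,                           # journal file driving the CFD side
    ]
    return subprocess.run(cmd, check=True)

if __name__ == "__main__":
    launch_fluent()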

Running the simulation in the cloud

Rolls-Royce selected CPU 24/7 GmbH & Co. KG to provide remote HPC computing power on demand. The computation was performed on an HPC cluster using Intel® Xeon® E5-2690 processors and FDR Infiniband® interconnects. The calculation proceeded in cycles in which the structural solver and the Fluent CFD solver ran alternately, each passing data to the other when its cycle was completed. The CFD solution supplied heat flux at the walls, along with temperature and swirl velocity, as outputs to the structural code; the structural code provided temperature at the walls and inlets as boundary conditions for the CFD code. At each time step, the CFD and structural solvers ran through multiple iterations, exchanging data until their calculated wall temperatures matched. The simulation covered a total of 6,000 seconds of simulation time, which included startup, low-power and high-power engine operation. As expected, the bulk of the computational resources were consumed by the CFD calculation: the CFD part ran on all 32 cores, while the structural part ran on only one.
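A second sketch illustrates the outer transient loop: the same inner exchange as in the earlier sketch, repeated at every time step while the engine moves through startup, low-power and high-power operation. The power schedule, coefficients and step size below are assumptions made for illustration, not the actual run settings.

# Sketch of the transient coupled run: 6,000 s of simulated engine operation,
# with the CFD-structural exchange iterated to convergence inside every time
# step. The schedule, surrogate solves and step size are illustrative only.

def gas_temp_at(t):
    # Hypothetical engine schedule: start-up ramp, low power, then high power.
    if t < 600.0:
        return 400.0 + (800.0 - 400.0) * t / 600.0
    elif t < 3000.0:
        return 800.0
    return 1100.0

def converge_step(wall_temp, gas_temp, tol=0.01, max_iters=50):
    # Inner coupling loop: alternate the surrogate CFD and structural solves
    # until the exchanged wall temperature settles.
    for _ in range(max_iters):
        flux = 150.0 * (gas_temp - wall_temp)      # surrogate CFD solve
        new_wall_temp = 400.0 + flux / 300.0       # surrogate structural solve
        if abs(new_wall_temp - wall_temp) < tol:
            break
        wall_temp = new_wall_temp
    return wall_temp

def run_transient(t_end=6000.0, dt=60.0):
    wall_temp, history, t = 400.0, [], 0.0
    while t < t_end:
        t += dt
        wall_temp = converge_step(wall_temp, gas_temp_at(t))
        history.append((t, wall_temp))
    return history

if __name__ == "__main__":
    for t, temp in run_transient()[::20]:
        print(f"t = {t:6.0f} s  wall temperature ~ {temp:6.1f} K")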

CFD model of the high-pressure turbine interstage cavity

Running the coupled fluid–structural simulation on the HPC cluster in the cloud was five times faster in wall-clock time than running the problem on a local workstation.

CPU 24/7 contributed considerable expertise to the project, including how to set up a cluster, how to run applications in parallel based on a message passing interface (MPI), how to create a host file, how to handle the FlexNet® licenses, and how to prepare everything needed for turnkey access to the cluster. Throughout the process, CPU 24/7 supplied comprehensive and prompt technical support. It took only one month from the initial concept of executing the project in the cloud to the completion of the first calculation on the remote cluster. This rapid startup was possible because of the smooth collaboration between ANSYS and the CPU 24/7 team.
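On the licensing side, a typical arrangement is to point the solvers at a dedicated FlexNet license server before they start. The short snippet below sketches that idea; the server name and port are invented, and ANSYSLMD_LICENSE_FILE is the environment variable commonly used by ANSYS FlexNet licensing, though the exact variables depend on the products and versions installed on the cluster.

# Hedged sketch of preparing the licensing environment for a cloud run.
# Server address is hypothetical; variable names depend on the installation.

import os

def license_environment():
    # Copy the current environment and point ANSYS FlexNet licensing at the
    # dedicated license server provisioned in the cloud (port@host format).
    env = dict(os.environ)
    env["ANSYSLMD_LICENSE_FILE"] = "1055@license-server.example.internal"  # invented address
    return env

if __name__ == "__main__":
    env = license_environment()
    # The coupled run (structural code plus the Fluent launch shown earlier)
    # would then be started with this environment via subprocess.run(..., env=env).
    print("License server:", env["ANSYSLMD_LICENSE_FILE"])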

Cluster load during calculation cycles

The results of the coupled CFD–structural simulation were validated against physical testing results. Because of the near-linear scalability of Fluent, running the coupled fluid–structural simulation on the HPC cluster in the cloud was five times faster in wall-clock time than running the problem on a local workstation. By outsourcing the computational workload to an HPC cloud provider, Rolls-Royce could provision and release HPC resources elastically; engineers were able to expand or shrink HPC capacity as needed, increasing their operational IT efficiency and making better use of HPC resources. For example, the availability of cloud computing resources makes it possible to scale up HPC to run even bigger models that provide more detailed insight into the physical behavior of the system.

There is currently no physical way to determine the performance of a proposed cooling and sealing design until the hardware is built and tested. At that point, so much time and money have been invested in the design that changes are very expensive. It is also impossible to evaluate more than a few alternatives using the build-and-test method. Simulation is the only answer. One significant advantage of running a coupled structural–fluid simulation in the HPC cloud is the potential to iteratively optimize the entire cooling and sealing system design in the early stages of the product development process. An experiment can be designed to explore the complete design space, and then engineers can select the best possible design for prototyping and testing. Rolls-Royce is aggressively pursuing this HPC cloud approach, which has the potential to achieve significant improvements in jet engine performance.
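As a rough illustration of what such a designed experiment might look like, the sketch below sweeps two hypothetical design variables and picks the candidate with the lowest predicted peak temperature. The variables, levels and surrogate cost function are invented; in practice each evaluation would be a full coupled CFD–structural run in the cloud.

# Hedged sketch of a design-space exploration: a full-factorial sweep over
# hypothetical cooling and sealing parameters, with each candidate scored by
# a stand-in for the coupled simulation.

from itertools import product

# Hypothetical design variables and candidate levels.
cooling_flows = [0.8, 1.0, 1.2]       # relative cooling air mass flow
seal_clearances = [0.3, 0.4, 0.5]     # seal clearance in mm

def evaluate_design(cooling_flow, seal_clearance):
    # Stand-in for one coupled CFD-structural run: returns a notional peak
    # disk temperature in K (lower is better).
    return 900.0 - 120.0 * cooling_flow + 80.0 * seal_clearance

def explore_design_space():
    # Evaluate every combination of the candidate levels and keep the best.
    results = [((f, c), evaluate_design(f, c))
               for f, c in product(cooling_flows, seal_clearances)]
    return min(results, key=lambda r: r[1])

if __name__ == "__main__":
    (flow, clearance), temp = explore_design_space()
    print(f"Best candidate: flow={flow}, clearance={clearance} mm, "
          f"peak temperature ~ {temp:.0f} K")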
