Optimisation of Large Eddy Simulation (LES) turbulence modelling within Fluidity
eCSE05-07
Key Personnel
PI/Co-I: Dr Angus Creech - University of Edinburgh, Dr Adrian Jackson - EPCC, University of Edinburgh
Technical: Dr Angus Creech - University of Edinburgh
Relevant documents
eCSE Technical Report: Developing Fluidity for high-fidelity coastal modelling
Paper submitted to Computers & Fluids: Efficient Large Eddy Simulation for the Discontinuous Galerkin Method
Project summary
Energetic tidally-driven flow in coastal areas such as straits represents a challenge for high-fidelity simulation, as the turbulent flow processes to be modelled span length scales from less than 1 m to greater than 1 km (e.g. tidal jets). Furthermore, such turbulence is highly anisotropic, in a domain that may be only 100-200 metres deep but has a horizontal surface area (the sea surface) of thousands of square kilometres. Coastal domains also have an irregular shape, with undulating bathymetry and complex coastlines; this makes them well suited to the unstructured-mesh approaches used in finite-element computational fluid dynamics.
Fluidity is an open source finite-element computational fluid dynamics code that solves the non-hydrostatic Navier-Stokes momentum equation and continuity equation, is capable of modelling free surfaces, and is designed for irregularly shaped, unstructured grids. Fluidity is written in C++ and Fortran, with the majority written in Fortran. It utilises both OpenMP and MPI for parallel computation, and has been the subject of previous development for efficient scaling to thousands of cores on both HECToR and ARCHER. It also supports a stable velocity-pressure element pair, P1DG-P2, which pairs first-order Discontinuous Galerkin (DG) velocity with second-order Continuous Galerkin (CG) pressure. This pair is LBB stable and suitable for high Reynolds number flow.
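For reference, LBB stability refers to the standard inf-sup condition on the velocity-pressure pair; a minimal statement of the condition, as usually given in the finite-element literature rather than anything specific to this project, is:

```latex
% Standard inf-sup (LBB) condition for a discrete velocity space V_h and
% pressure space Q_h: a mesh-independent constant beta > 0 must exist with
\inf_{q_h \in Q_h \setminus \{0\}} \;
\sup_{\mathbf{v}_h \in V_h \setminus \{0\}}
\frac{\int_\Omega q_h \, \nabla \cdot \mathbf{v}_h \, \mathrm{d}x}
     {\lVert \mathbf{v}_h \rVert_{H^1(\Omega)} \, \lVert q_h \rVert_{L^2(\Omega)}}
\;\ge\; \beta > 0
```

Pairs that satisfy this condition, such as P1DG-P2, avoid the spurious pressure modes that afflict many equal-order element choices.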
However, assembly of the momentum equations in DG is expensive, taking a substantial portion of each timestep, so optimising it was necessary to achieve a substantial performance improvement. Each timestep in Fluidity consists of several non-linear Picard iterations, which are very similar in nature to the SIMPLE family of algorithms of Patankar.
Typically Fluidity requires at least two Picard iterations per timestep to achieve satisfactory convergence of the velocity and pressure fields. Substantial performance improvements were made to the DG assembly routines, which alone gave noticeable reductions in overall runtimes, at core counts ranging from tens of cores to over 1000.
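To make the role of the Picard iterations concrete, the toy example below shows the general idea: the non-linear coefficient is frozen ("lagged") at the previous iterate, so each iteration reduces to a linear solve. In Fluidity each Picard iteration similarly re-assembles the DG momentum operator with the lagged velocity, which is why assembly dominates the cost. This is an illustrative sketch, not Fluidity code.

```python
# Minimal, self-contained illustration of Picard (fixed-point) iteration:
# the non-linear coefficient is frozen at the previous iterate, so each
# iteration reduces to a linear solve. Not Fluidity code.
import numpy as np

def picard_solve(b, max_iterations=50, tol=1e-10):
    """Solve (1 + |u|) * u = b component-wise by lagging the |u| coefficient."""
    u = np.zeros_like(b)                  # initial guess
    for _ in range(max_iterations):
        coeff = 1.0 + np.abs(u)           # "assemble" using the lagged solution
        u_new = b / coeff                 # linear solve with frozen coefficient
        converged = np.max(np.abs(u_new - u)) < tol
        u = u_new
        if converged:
            break
    return u

print(picard_solve(np.array([2.0, 6.0, 12.0])))  # approx [1.0, 2.0, 3.0]
```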
Other miscellaneous optimisations and bug fixes were made, in particular: graph reordering of mesh files to produce more coherent meshes; enabling Fluidity to run with 32-bit floating point; load balancing improvements for extruded meshes; faster sub-cycling routines; simplified checkpointing.
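As an example of the kind of graph reordering involved, the sketch below applies the Reverse Cuthill-McKee algorithm to a mesh adjacency graph using SciPy; it is a generic illustration of the technique, not the reordering tool used on Fluidity mesh files.

```python
# Illustration of graph reordering for mesh locality using Reverse
# Cuthill-McKee (a generic sketch, not the Fluidity mesh-reordering tool).
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import reverse_cuthill_mckee

def reorder_nodes(edges, n_nodes):
    """Return a node permutation that reduces the bandwidth of the mesh graph."""
    rows = np.array([a for a, b in edges] + [b for a, b in edges])
    cols = np.array([b for a, b in edges] + [a for a, b in edges])
    data = np.ones(len(rows), dtype=np.int8)
    adjacency = csr_matrix((data, (rows, cols)), shape=(n_nodes, n_nodes))
    # The permutation clusters neighbouring nodes together, improving memory
    # locality (and hence cache behaviour) during assembly.
    return reverse_cuthill_mckee(adjacency, symmetric_mode=True)

# Example: a small set of nodes connected in an arbitrary order.
print(reorder_nodes(edges=[(0, 3), (3, 1), (1, 4), (4, 2)], n_nodes=5))
```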
Secondly, two new Large Eddy Simulation (LES) turbulence models, both utilising the Compact DG method, were developed: one for isotropic grids and one for highly anisotropic grids. The anisotropic model in particular is necessary for coastal modelling, which typically features grids with a horizontal/vertical element aspect ratio of around 20:1. As part of this work, an eddy-viscosity wall model was implemented and verified against experimental data for a well-known test case.
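For context, eddy-viscosity LES closures of this kind are commonly built on a Smagorinsky-type model; the classical isotropic form, given here for orientation rather than as the exact formulation of the new models, is:

```latex
% Classical Smagorinsky eddy viscosity (isotropic form):
\nu_t = (C_s \Delta)^2 \, \lvert \bar{S} \rvert , \qquad
\lvert \bar{S} \rvert = \sqrt{2\, \bar{S}_{ij} \bar{S}_{ij}} , \qquad
\bar{S}_{ij} = \frac{1}{2} \left(
  \frac{\partial \bar{u}_i}{\partial x_j} +
  \frac{\partial \bar{u}_j}{\partial x_i} \right)
```

where C_s is the Smagorinsky coefficient and Δ is the filter width, normally tied to the local element size. On anisotropic coastal grids a single scalar Δ is a poor description of elements that are roughly twenty times wider than they are tall, which is the motivation for the anisotropic variant of the model.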
The third part involved simplifying the configuration of new simulations, particularly those involving DG. Originally in "stock" Fluidity, fields and field options useful to DG simulations all had to be created and set manually, which is a complex and time-consuming task; now many options trigger sensible default behaviour, automatically creating the additional fields and settings that are required. In particular, creating a DG velocity field now automatically creates projected CG velocity fields for results output, which leads to an order-of-magnitude reduction in file size.
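The rough arithmetic behind that saving, under the illustrative assumptions of a typical tetrahedral mesh with roughly five to six elements per vertex and output projected onto a P1 continuous space, is:

```latex
% Illustrative estimate only. A P1DG velocity field stores 4 independent nodal
% values per tetrahedral element, whereas a P1 CG field stores one value per
% mesh vertex. With N_e \approx 5\text{--}6\, N_v on a typical tet mesh:
\frac{\text{P1DG values}}{\text{P1 CG values}}
  = \frac{4 N_e}{N_v}
  \approx \frac{4 \times (5\text{--}6)\, N_v}{N_v}
  \approx 20\text{--}24
```

so writing the projected CG field instead of the raw DG field cuts the stored velocity data by roughly a factor of 20, which is where the order-of-magnitude reduction in output size comes from.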
Lastly, tools and utilities were created to simplify the process of tidal modelling within Fluidity. These varied from additional Fortran routines and configuration-scheme changes to Python utilities designed to be embedded within simulations via the configuration file. Overall we reduced the runtime for representative tidal simulations by between 3.2 and 3.4 times (depending on the number of nodes used to run the simulation), significantly reduced the configuration effort for users, and reduced output file sizes.
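As an illustration of the embedded-Python style of configuration, a tidal elevation forcing on an open boundary could be prescribed with a small function like the one below. This is a hedged sketch assuming Fluidity's usual val(X, t) convention for Python-prescribed fields; the amplitude and phase are placeholder values, and only the M2 period is a physical constant.

```python
# Sketch of a Python-prescribed tidal boundary forcing embedded in the
# configuration file. Assumes Fluidity's val(X, t) convention for Python-set
# fields; amplitude and phase below are placeholders, not project values.
from math import pi, sin

def val(X, t):
    amplitude = 1.2          # free-surface amplitude in metres (placeholder)
    period = 12.42 * 3600.0  # M2 tidal period in seconds
    phase = 0.0              # phase offset in radians (placeholder)
    # Sinusoidal elevation applied on the open boundary at time t.
    return amplitude * sin(2.0 * pi * t / period + phase)
```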
Figure 1: Sound of Islay Large Eddy Simulation (LES) model using the version of Fluidity developed during this project, showing the surface currents. Scottish Power Renewables aims to exploit the strong tidal flow here to generate renewable energy. The unsteady turbulent features, such as the eddies that can be seen near the coastlines, cannot be modelled in such detail with other coastal models such as FVCOM and TELEMAC.
With the new functionality and performance enabled through this project, we have opened up both the Fluidity code specifically, and the use of HPC systems such as ARCHER more generally, to tidal modellers. For instance, the Scottish Association for Marine Science (SAMS) alone has five tidal researchers who will now be able to use Fluidity for their work. The SAMS group have two coastal models that could be adapted to Fluidity.
In coastal modelling, high-fidelity, detailed simulations of areas with strong tidal currents (e.g. > 2 m/s) will enable a greater understanding of turbulent, inertia-dominated flow. This is particularly important for marine renewable energy, where sites selected for tidal turbine deployment (e.g. the Sound of Islay, the Inner Sound in the Pentland Firth, and Bluemull Sound in Shetland) are often characterised by such highly energetic flow. Modelling this flow requires tools capable of simulating both isotropic and anisotropic turbulence in tidal channels with strong vertical shear, vertical boils, irregular bathymetry and a free surface, whilst providing transient information (i.e. time-series data) on the flow for spectral analysis and comparison with Acoustic Doppler Current Profiler (ADCP) measurements.
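As a sketch of the kind of post-processing that this transient output supports, a velocity time series at a probe location can be reduced to a power spectrum with Welch's method. The snippet below is a generic illustration using SciPy, not a utility provided by Fluidity, and the synthetic signal stands in for model or ADCP data.

```python
# Generic sketch of spectral analysis of a velocity time series at a probe
# location (the sort of comparison made against ADCP measurements). Not a
# Fluidity utility; the synthetic signal stands in for real data.
import numpy as np
from scipy.signal import welch

fs = 1.0                                  # sample rate in Hz (placeholder)
t = np.arange(0.0, 3600.0, 1.0 / fs)      # one hour of samples
# Synthetic signal: mean current plus a slow oscillation and random turbulence.
u = 2.0 + 0.3 * np.sin(2.0 * np.pi * t / 600.0) + 0.1 * np.random.randn(t.size)

# Welch power spectral density of the velocity fluctuations.
freq, psd = welch(u - np.mean(u), fs=fs, nperseg=1024)
print(freq[np.argmax(psd)])               # dominant frequency, ~1/600 Hz
```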
Substantially increasing the performance of Fluidity, as we have done, and adding the ability to handle highly anisotropic turbulence and grids using Large Eddy Simulation, including boundary-layer effects, has enabled us to expand the scope of the science that the code can tackle. Whilst some features of this project are specific to coastal modelling, the benefits of the work extend far beyond it.
One example would be the use of computational fluid dynamics (CFD) to investigate the complex interactions between wind turbines, and their associated wakes, within wind farms. As wind passes through an upwind turbine, the turbine generates a wake behind it, which reduces the power output of turbines downwind. The picture is more complex than this, however: the turbine blades also generate turbulence in the wake, which increases mixing in the air behind the turbine and so affects the length of the wake. Other physical phenomena occur, such as "jetting", where the wind actually increases in speed between turbines, and wake meandering, where a wake can move in and out of the path of a downwind turbine. The scale and effect of these phenomena depend upon atmospheric conditions, topography, and the performance of the turbines themselves.
Achievement of objectives
1. Optimisation of assembly routines with emphasis on Discontinuous Galerkin
Success metric: the assembly routines identified for optimisation will be benchmarked on ARCHER throughout the optimisation process, with new medium/long test cases developed for the Fluidity code trunk.
We have heavily optimised the assembly routines, achieving a 3.4-3.6x speedup of those routines for representative problems, including the two test cases we have developed for benchmarking this functionality. Overall, our optimisation work achieved a whole-code speedup of 1.2-1.4x for standard CFD problems and 3.2x for tidal modelling problems.
2. Implementing Large Eddy Simulation (LES) for Discontinuous Galerkin
Success metric: modelling accuracy will be tested on ARCHER with three cases - flow in a rectilinear channel, a backward-facing step, and flow past a cylinder - with the results compared against the literature.
We have implemented a new LES model within Fluidity and validated it against the well-known backward-facing step case. A paper outlining this new model has been submitted to Computers & Fluids.
3. Implementation of boundary conditions suitable for tidal modelling
Success metric: these features will be tested against a previously developed tidal model and measurements, adapted to the new DG LES version of Fluidity.
The tidal boundary forcing and damping conditions have been implemented and confirmed as behaving as expected. Validation of a full tidal model against ADCP and other measurements is ongoing, and the findings are being prepared for publication as a journal paper, for submission in December 2016.
4. Simplification of GUI options and checkpointing
Success metric: this objective's effectiveness will be judged by training a selected set of new users in Fluidity, and judging their response.
We have implemented new GUI default options and optimised the output files, reducing the amount of data that needs to be stored by a factor of 10. We are in the process of organising a workshop for tidal modellers to test and demonstrate this new functionality, although such tasks fall beyond the scope of the eCSE project itself.
5. Enabling next-generation tidal modelling
Success metric: None was defined
We have demonstrated the new, optimised, version of Fluidity on a realistic tidal model and been able to show very significant performance increases (above 3x faster) for these simulations, enabling much finer resolution simulations to be run and thereby enabling new science.
Summary of the Software
Fluidity is an open source, finite-element computational fluid dynamics code, capable of scaling to meshes with millions of elements running on thousands of cores using MPI and OpenMP. It is a feature-rich research code with useful capabilities, such as adaptive meshing and DG element support. It has a well-established track record of running on ARCHER and its predecessor HECToR.