OpenFOAM
This article is a draft

This is not a complete article: This is a draft, a work in progress that is intended to be published into an article, which may or may not be ready for inclusion in the main wiki. It should not necessarily be considered factual or authoritative.

Description
The OpenFOAM (Open Field Operation and Manipulation) CFD Toolbox is a free, open source CFD software package. OpenFOAM has an extensive range of features to solve anything from complex fluid flows involving chemical reactions, turbulence and heat transfer, to solid dynamics and electromagnetics.
Modulefile
openfoam
Documentation
OpenFOAM homepage and Tutorials.
Usage
OpenFOAM requires substantial preparation of your environment. In order to run OpenFOAM commands (such as paraFoam, blockMesh, etc.), you must first load a modulefile and then source a configuration script.
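For interactive work the same two steps apply. A minimal sketch, assuming OpenFOAM 5.0 (the version used in the example scripts below):
$ module load openfoam/5.0
$ source $FOAM_ETC/bashrc
$ blockMesh -help    # any OpenFOAM utility should now be on your PATH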
Here is an example of a serial submission script for OpenFOAM 5.0:
#!/bin/bash
#SBATCH --time=00:01:00
#SBATCH --account=def-someuser

module purge
module load openfoam/5.0
source $FOAM_ETC/bashrc

blockMesh     # generate the mesh
icoFoam       # run the solver in serial
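Assuming the script above is saved under a name of your choice (here, hypothetically, serial_job.sh), it is submitted with sbatch:
$ sbatch serial_job.sh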
Here is an example of a parallel submission script for OpenFOAM 5.0:
#!/bin/bash
#SBATCH --account=def-someuser
#SBATCH --ntasks=4               # number of MPI processes
#SBATCH --mem-per-cpu=1024M      # memory; default unit is megabytes
#SBATCH --time=0-00:10           # time (DD-HH:MM)

module purge
module load openfoam/5.0
source $FOAM_ETC/bashrc

blockMesh                  # generate the mesh
setFields                  # initialize the fields
decomposePar               # decompose the mesh and fields for the MPI processes
srun interFoam -parallel   # run the solver in parallel
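decomposePar reads the number of subdomains from system/decomposeParDict, which should match the --ntasks value requested above. A minimal sketch of that dictionary, assuming the scotch decomposition method (other methods need additional coefficient entries):
FoamFile
{
    version     2.0;
    format      ascii;
    class       dictionary;
    object      decomposeParDict;
}

numberOfSubdomains  4;

method              scotch;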
Mesh preparation (blockMesh) may be fast enough to be done at the command line (see Running jobs). The solver (icoFoam and others) is usually the most expensive step and should always be submitted as a Slurm job except in very small test cases or tutorials.
Performance
OpenFOAM can emit a lot of debugging information in very frequent small writes (e.g. hundreds per second). This may lead to poor performance on our network file systems. If you are in stable production and don't need the debug output, you can reduce or disable it with:
$ mkdir -p $HOME/.OpenFOAM/$WM_PROJECT_VERSION
$ cp $WM_PROJECT_DIR/etc/controlDict $HOME/.OpenFOAM/$WM_PROJECT_VERSION/
followed by editing the DebugSwitches dictionary in $HOME/.OpenFOAM/$WM_PROJECT_VERSION/controlDict, changing any flags with values greater than 0 to 0. (Contribution from C. Lane)
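The relevant part of the copied controlDict looks roughly like the sketch below. The available switch names differ between OpenFOAM versions, so the entries shown are illustrative only:
DebugSwitches
{
    // a value of 0 disables debug output for that class
    fvMesh        0;
    lduMatrix     0;
}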