</tabs>
=== UDFs === <!--T:520-->
The first step is to transfer your User-Defined Function (UDF), namely the sampleudf.c source file and any additional dependency files, to the cluster. When uploading from a Windows machine, be sure your transfer client is set to text mode, otherwise Fluent will not be able to read the file properly on the cluster, since the cluster runs Linux. Place the UDF in the directory where your journal, cas and dat files reside. Next, add one of the following commands to your journal file before the commands that read your simulation cas/dat files.
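For illustration, a minimal UDF source file might look like the following. The profile name and the parabolic velocity expression are assumptions for demonstration only, not part of this documentation, and the file compiles only inside Fluent's UDF build environment, which supplies udf.h.

```c
/* sampleudf.c -- a hypothetical minimal UDF (illustrative only) */
#include "udf.h"

/* Parabolic inlet-velocity profile attached to a boundary zone. */
DEFINE_PROFILE(inlet_velocity, thread, position)
{
  face_t f;
  real x[ND_ND];                       /* face centroid coordinates */

  begin_f_loop(f, thread)
  {
    F_CENTROID(x, f, thread);
    F_PROFILE(f, thread, position) = 1.0 - x[1] * x[1];  /* u(y) = 1 - y^2 */
  }
  end_f_loop(f, thread)
}
```

Note that as written this runs only in serial or shared-memory mode; the Parallel Use section below discusses the changes needed for distributed runs.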
==== Interpreted UDF ==== <!--T:521-->
To tell Fluent to interpret your UDF at runtime, add the following command line to your journal file before the cas/dat files are read or initialized. Replace sampleudf.c with the name of your source file; the command remains the same regardless of whether the simulation is run in serial or parallel. To ensure the UDF is found in the same directory as the journal file, remove any managed definitions from the cas file by opening it in the GUI and resaving it, either before uploading to the Alliance or on a compute node or gra-vdi. Doing this ensures that only the following command will be in control when Fluent runs. To use an interpreted UDF with parallel jobs, it must be parallelized as described in the section below, otherwise it will likely only work when submitted as a serial (1 core) job.
 define/user-defined/interpreted-functions "sampleudf.c" "cpp" 10000 no
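For context, the command order in a journal file using an interpreted UDF might look as follows; the case/data filename and iteration count are hypothetical, and Fluent journal comments begin with a semicolon:

```
; interpret the UDF before the case and data files are read
define/user-defined/interpreted-functions "sampleudf.c" "cpp" 10000 no
; read the simulation files (hypothetical filename)
file/read-case-data sample.cas.h5
solve/iterate 100
```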
==== Compiled UDF ==== <!--T:522-->
To use this approach, the UDF must be compiled on an Alliance cluster at least once to create the libudf subdirectory. If you compiled your UDF on a remote system such as your laptop or a lab machine, the corresponding libudf subdirectory cannot simply be copied onto the Alliance. That said, once you have compiled your UDF on one Alliance cluster, it can be transferred to another Alliance cluster without recompiling, provided you load the same StdEnv environment. The UDF can then simply be loaded at runtime by including only the second line below in your journal file; this is especially important when submitting parallel jobs. If you instead leave both of the following lines uncommented in your journal file, the UDF will be (re)compiled and then loaded every time you run a job on the cluster. This not only adds unnecessary compilation time to every run, but also imposes additional complexity, since all the files required to compile your UDF would need to be present and properly set up on every system. Another way to build the libudf subdirectory structure containing the <code>libudf.so</code> shared library on the Alliance is to open your simulation in the GUI on a cluster compute node (or gra-vdi), navigate to the Compiled UDFs dialog box, add your UDF source file and click Build. Before saving your cas file, however, verify that the UDF is not defined in the Interpreted UDFs dialog box or the UDF Library Manager dialog box, to ensure only the following journal file commands will control it.
To use a compiled UDF with parallel jobs, it must be parallelized as described in the section below, otherwise it will likely only work when submitted as a serial (1 core) job.
 define/user-defined/compiled-functions compile libudf yes sampleudf.c "" ""
 define/user-defined/compiled-functions load libudf
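Putting the two commands together, a journal file for a compiled UDF might look like the following after the first compilation; the compile line is commented out (Fluent journal comments begin with a semicolon) so that libudf is not rebuilt on every run, and the case/data filename is hypothetical:

```
; compile once, then comment this line out for all subsequent runs:
; define/user-defined/compiled-functions compile libudf yes sampleudf.c "" ""
define/user-defined/compiled-functions load libudf
file/read-case-data sample.cas.h5
solve/iterate 100
```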
==== Parallel Use ==== <!--T:523-->
Most UDFs are created and tested in serial mode; however, they must be parallelized before submitting single-node Shared Memory or multi-node MPI parallel jobs, otherwise the simulation will likely crash at worst or run very slowly at best. The topic is well described in <I>Chapter 7: Parallel Considerations</I> of the official Ansys 2024 Fluent documentation, which can be accessed by following the steps given [https://docs.alliancecan.ca/wiki/Ansys#Online_documentation here].
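As a rough sketch of what that chapter describes, a parallelized UDF typically guards its code with Fluent's host/node compilation macros and reduces results across compute-node processes. The example below is illustrative only, not taken from this documentation, and assumes a single-phase case (domain ID 1); it compiles only inside Fluent's UDF build environment.

```c
/* Hypothetical sketch of a parallelized UDF (illustrative only). */
#include "udf.h"

DEFINE_ON_DEMAND(report_total_volume)
{
  real vol = 0.0;

#if !RP_HOST                       /* runs on compute nodes (or in serial) */
  Domain *d = Get_Domain(1);       /* assumes a single-phase case */
  Thread *t;
  cell_t c;

  thread_loop_c(t, d)
  {
    begin_c_loop_int(c, t)         /* interior cells only, to avoid double-
                                      counting cells on partition boundaries */
      vol += C_VOLUME(c, t);
    end_c_loop_int(c, t)
  }
  vol = PRF_GRSUM1(vol);           /* global sum across all node processes */
#endif

  node_to_host_real_1(vol);        /* pass the reduced value to the host;
                                      a no-op in serial mode */
#if !RP_NODE                       /* host process, or serial */
  Message("Total volume: %g\n", vol);
#endif
}
```

Without such guards, code written for serial mode would execute redundantly on every process, or attempt operations (such as I/O or domain access) on processes where they are invalid.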
== Ansys CFX == <!--T:78-->