}}
</tab>
</tabs>
=== UDF Usage === <!--T:520-->
The first step is to transfer your UDF (the sampleudf.c file and any additional files required to build it) to the cluster. When uploading from a Windows machine, be sure to use the text mode setting of your transfer client; otherwise Fluent will not be able to read the file properly on the cluster, since it runs Linux. The UDF should be placed in the directory where your journal, cas and dat files reside. Next, add one of the following commands to your journal file before the commands that read your simulation cas/dat files.
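For reference, a UDF source file such as sampleudf.c is ordinary C code built around Fluent's DEFINE macros. The following is only a minimal sketch; the profile name, the boundary it is applied to and the scaling are hypothetical placeholders rather than part of any particular case:

 #include "udf.h"                      /* Fluent UDF header providing the DEFINE macros */
 
 /* Hypothetical inlet velocity profile that varies linearly with the y coordinate */
 DEFINE_PROFILE(inlet_x_velocity, thread, position)
 {
   real x[ND_ND];                      /* face centroid coordinates */
   face_t f;
 
   begin_f_loop(f, thread)             /* loop over all faces on the boundary */
   {
     F_CENTROID(x, f, thread);
     F_PROFILE(f, thread, position) = 2.0 * x[1];   /* u = 2*y, placeholder scaling */
   }
   end_f_loop(f, thread)
 }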
==== Interpreted UDF ==== <!--T:521-->
To tell Fluent to interpret your UDF at runtime, add the following line to your journal, replacing sampleudf.c with the name of your file. Position the line before the cas/dat files are read or initialized. To ensure the UDF can be found in the same directory as the journal file, remove any managed UDF definitions from the cas file by opening it in the GUI and resaving it, either before uploading it to the Alliance or by opening it in the GUI on a compute node or gra-vdi and then resaving it. Doing this ensures that only the following commands in this section are in control.
define/user-defined/interpreted-functions "sampleudf.c" "cpp" 10000 no | |||
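For example, in the journal file the interpret command is placed ahead of the case/data read, along the lines of the following sketch (the file name and the exact read command are placeholders that depend on your workflow):

 ; interpret the UDF before the case and data files are read
 define/user-defined/interpreted-functions "sampleudf.c" "cpp" 10000 no
 file/read-case-data sample.cas.h5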
==== Compiled UDF ==== <!--T:522-->
To use this approach, the UDF must be compiled on an Alliance cluster at least once to create the libudf subdirectory. If you compiled your UDF on a remote system, such as your laptop or a lab machine, the corresponding libudf subdirectory cannot simply be copied onto the Alliance. However, if you compiled your UDF on one Alliance cluster, such as Cedar, then provided you run the same StdEnv version on another cluster, such as Graham, the libudf subdirectory and its contents can be transferred and simply loaded in the journal file at runtime, as shown in the second line below. Otherwise, if you leave both of the following lines uncommented in your journal file, the UDF will be recompiled and then loaded each time you run your simulation; in that case, all of the files required for your UDF to build successfully must be present. Another way to build the libudf structure on the Alliance is to open your simulation in the GUI on a cluster compute node or gra-vdi and use the pull-down menus. Regardless of whether you use the GUI or the journal-file approach, make sure the UDF is not saved inside the cas file when working in the GUI; otherwise it will additionally be built each time you submit a job on the cluster. Likewise, before uploading your cas file to the Alliance, manage your UDF in the GUI, clear any instructions that define its location or that it should be interpreted or built, and then resave the cas file so that only the journal file is in control.
define/user-defined/compiled-functions compile libudf yes sampleudf.c "" "" | |||
and/or | |||
define/user-defined/compiled-functions load libudf | |||
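Put together, the compiled-UDF portion of a journal file might look like the following sketch; the file names are placeholders, and the compile line can be commented out on later runs once libudf already exists:

 ; build libudf from sampleudf.c (comment out once libudf has been built)
 define/user-defined/compiled-functions compile libudf yes sampleudf.c "" ""
 ; load the compiled libudf library, then read the case and data files
 define/user-defined/compiled-functions load libudf
 file/read-case-data sample.cas.h5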
== Ansys CFX == <!--T:78-->