The first step is to transfer your User-Defined Function (UDF), namely the sampleudf.c source file and any additional dependency files, to the cluster. When uploading from a Windows machine, be sure your transfer client uses text mode; otherwise Fluent will not be able to read the file properly on the cluster, since the cluster runs Linux. The UDF should be placed in the directory where your journal, cas and dat files reside. Next, add one of the following commands into your journal file before the commands that read your simulation cas/dat files. Regardless of whether you use the Interpreted or Compiled UDF approach, before uploading your cas file onto the Alliance please check that neither the Interpreted UDFs Dialog Box nor the UDF Library Manager Dialog Box is configured to use any UDF; this ensures that when jobs are submitted, only the journal file commands will be in control.
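For reference, a minimal sampleudf.c might look like the following boundary-profile sketch, which follows the standard DEFINE_PROFILE pattern; the function name, zone variable index and numerical values here are purely illustrative placeholders:

#include "udf.h"

DEFINE_PROFILE(inlet_x_velocity, thread, position)   /* profile name, boundary thread, variable index */
{
  real x[ND_ND];   /* holds the face centroid position vector */
  real y;
  face_t f;

  begin_f_loop(f, thread)                            /* loop over all faces on this boundary */
  {
    F_CENTROID(x, f, thread);
    y = x[1];
    F_PROFILE(f, thread, position) = 20. - y*y/(.0745*.0745)*20.;  /* parabolic x-velocity */
  }
  end_f_loop(f, thread)
}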
==== Interpreted ==== <!--T:521-->
To tell Fluent to interpret your UDF at runtime, add the following command line into your journal file before the cas/dat files are read or initialized. The filename sampleudf.c should be replaced with the name of your source file. The command remains the same regardless of whether the simulation is run in serial or parallel. To ensure the UDF can be found in the same directory as the journal file, remove any managed definitions from the cas file by opening it in the GUI and resaving it, either before uploading to the Alliance, or by opening it in the GUI on a compute node or gra-vdi and then resaving it. Doing this ensures only the following command/method will be in control when Fluent runs. To use an interpreted UDF with parallel jobs, it will need to be parallelized as described in the section below.
define/user-defined/interpreted-functions "sampleudf.c" "cpp" 10000 no
==== Compiled ==== <!--T:522-->
To use this approach, your UDF must be compiled on an Alliance cluster at least once. Doing so creates a libudf subdirectory structure containing the required <code>libudf.so</code> shared library. The libudf directory cannot simply be copied from a remote system (such as your laptop) to the Alliance, since the library dependencies of the shared library will not be satisfied, resulting in Fluent crashing on startup. That said, once you have compiled your UDF on one Alliance cluster, you can transfer the newly created libudf to any other Alliance cluster, provided your account there loads the same StdEnv environment module version. Once copied, the UDF can be used by uncommenting the second (load) libudf line below in your journal file when submitting jobs to the cluster. Both (compile and load) libudf lines should not be left uncommented in your journal file when submitting jobs on the cluster; otherwise your UDF will be automatically (re)compiled for each and every job. Not only is this highly inefficient, it will also lead to race-like build conflicts if multiple jobs are run from the same directory. Besides configuring your journal file to build your UDF, the Fluent GUI (run on any cluster compute node or gra-vdi) may also be used: navigate to the Compiled UDFs Dialog Box, add the UDF source file and click Build. When using a compiled UDF with parallel jobs, your source file should be parallelized as discussed in the section below.
define/user-defined/compiled-functions load libudf
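The corresponding compile line (the first of the two libudf lines referred to above) takes a form similar to the following, where the trailing empty strings terminate the source and header file lists; the exact argument sequence can vary between Fluent versions, so confirm it against the TUI prompts of the version you load:

define/user-defined/compiled-functions compile libudf yes "sampleudf.c" "" ""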
==== Parallel ==== <!--T:523-->
Before a UDF can be used with a Fluent parallel job (single-node SMP or multi-node MPI), it will need to be parallelized. By doing this, we control how and which processes (host and/or compute) run specific parts of the UDF code when Fluent is run in parallel on the cluster. The instrumenting procedure involves adding compiler directives, predicates and reduction macros into your working serial UDF. Failure to do so will result in Fluent running slowly at best, or crashing immediately at worst. The end result will be a single UDF that runs efficiently when Fluent is used in both serial and parallel mode. The subject is described in detail under <I>Part I: Chapter 7: Parallel Considerations</I> of the Ansys 2024 <I>Fluent Customization Manual</I>, which can be accessed [https://docs.alliancecan.ca/wiki/Ansys#Online_documentation here].
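As a minimal sketch of the pattern (the function name and the quantity computed are illustrative only, not taken from the manual), a serial cell loop can be fenced with RP_HOST/RP_NODE compiler directives, reduced across compute nodes with a reduction macro, and the result passed back to the host:

#include "udf.h"

DEFINE_ON_DEMAND(report_total_volume)
{
  real total_vol = 0.0;

#if !RP_HOST                          /* serial and compute-node processes do the work */
  Domain *d = Get_Domain(1);          /* 1 = mixture domain */
  Thread *t;
  cell_t c;

  thread_loop_c(t, d)
  {
    begin_c_loop_int(c, t)            /* interior cells only, avoids double counting */
      total_vol += C_VOLUME(c, t);
    end_c_loop_int(c, t)
  }
  total_vol = PRF_GRSUM1(total_vol);  /* global sum reduction across compute nodes */
#endif

  node_to_host_real_1(total_vol);     /* send the reduced value to the host process */

#if !RP_NODE                          /* serial and host process print the result */
  Message("Total cell volume: %g\n", total_vol);
#endif
}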
==== DPM ==== <!--T:524-->
UDFs can be used to customize Discrete Phase Models (DPM) as described in <I>Part III: Solution Mode | Chapter 24: Modeling Discrete Phase | 24.2 Steps for Using the Discrete Phase Models | 24.2.6 User-Defined Functions</I> of the <I>2024R2 Fluent Users Guide</I> and section <I>Part I: Creating and Using User Defined Functions | Chapter 2: DEFINE Macros | 2.5 Discrete Phase Model (DPM) DEFINE Macros</I> of the <I>2024R2 Fluent Customization Manual</I>, available [https://ansyshelp.ansys.com/account/secured?returnurl=/Views/Secured/prod_page.html?pn=Fluent&pid=Fluent&lang=en here].

Before a DPM-based UDF can be worked into a simulation, the injection of a set of particles is typically defined by specifying "Point Properties" with variables such as source position, initial trajectory, mass flow rate, time duration, temperature and so forth, depending on the injection specifics/options. This can be done in the GUI by clicking the Physics panel, then Discrete Phase to open the <I>Discrete Phase Model</I> box, and then clicking the <I>Injections</I> button. Doing so will open an <I>Injections</I> dialog box where one or more injections can be created by clicking the <I>Create</I> button. The "Set Injection Properties" dialog which appears contains an "Injection Type" pulldown whose first four types are "single", "group", "surface" and "flat-fan-atomizer". If you select any of these, the "Point Properties" tab can then be selected to input the corresponding Value fields.

Another way to specify the "Point Properties" is to read an injection text file. To do this, select "file" from the Injection Type pulldown, specify the Injection Name to be created, and then click the <I>File</I> button (located beside the <I>OK</I> button at the bottom of the "Set Injection Properties" dialog). Here either an Injection Sample File (with .dpm extension) or a manually created injection text file can be selected. To select the file in the Select File dialog box, change the "Files of type" pulldown to All Files (*), highlight the file (which may have any arbitrary name but commonly has a .inj extension) and click the OK button. Assuming there are no problems with the file, no Console error or warning message will appear in Fluent. You will be returned to the "Injections" dialog box, where you should see the same Injection name you specified in the "Set Injection Properties" dialog and be able to list its Particles and Properties in the console. Next, open the Discrete Phase Model Dialog Box and select Interaction with Continuous Phase, which will enable updating DPM source terms every flow iteration. This setting can be saved in your cas file or added via the journal file as shown. Once the injection is confirmed working in the GUI, the steps can be automated by adding commands to the journal file after solution initialization, for example:
/define/models/dpm/interaction/coupled-calculations yes | |||
/define/models/dpm/injections/delete-injection injection-0:1 | |||
/define/models/dpm/injections/create injection-0:1 no yes file no zinjection01.inj no no no no | |||
/define/models/dpm/injections/list-particles injection-0:1 | |||
/define/models/dpm/injections/list-injection-properties injection-0:1 | |||
where a basic manually created steady-format injection file might look like:
$ cat zinjection01.inj | |||
(z=4 12) | |||
( x y z u v w diameter t mass-flow mass frequency time name ) | |||
(( 2.90e-02 5.00e-03 0.0 -1.00e-03 0.0 0.0 1.00e-04 2.93e+02 1.00e-06 0.0 0.0 0.0 ) injection-0:1 ) | |||
noting that injection files for DPM simulations are generally set up for either steady or unsteady particle tracking, where the format of the former is described in subsection <I>Part III: Solution Mode | Chapter 24: Modeling Discrete Phase | 24.3. Setting Initial Conditions for the Discrete Phase | 24.3.13 Point Properties for File Injections | 24.3.13.1 Steady File Format</I> of the <I>2024R2 Fluent Users Guide</I>.
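Once the injection is working, a DPM DEFINE macro from the Customization Manual can be compiled and hooked in like any other UDF. As a minimal sketch (the function name and force law below are purely illustrative, not from the manual), a custom body force per unit particle mass might look like:

#include "udf.h"

DEFINE_DPM_BODY_FORCE(particle_body_force, p, i)   /* p = tracked particle, i = coordinate index */
{
  real force = 0.0;

  if (i == 0)                                      /* apply along the x direction only */
    force = 100.0 * sin(10.0 * P_TIME(p));         /* illustrative oscillating acceleration, m/s^2 */

  return (force);                                  /* returned per unit particle mass */
}

Such a function would then typically be selected from the Body Force UDF pulldown in the Discrete Phase Model dialog, or hooked through equivalent journal commands.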
== Ansys CFX == <!--T:78-->