
FEM Software

For an up-to-date list of the installed software versions on our cluster, please refer to SoftwareModulesList.


ABAQUS

ABAQUS is a general-purpose finite-element program designed for advanced linear and nonlinear engineering analysis applications, with facilities for linking in user-developed material models, elements, friction laws, etc.
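As a sketch of how such user code is typically linked in: the abaqus command accepts a Fortran user-subroutine file via its user= option. The file and job names below are placeholders, not files that exist on the cluster:

```shell
module load abaqus
# link a user material (UMAT) written in Fortran into the analysis;
# my_model.inp, my_job and my_umat.f are hypothetical names
abaqus interactive job=my_job input=my_model.inp user=my_umat.f cpus=4 mp_mode=mpi
```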

Eike Dohmen (from the Inst. f. Leichtbau und Kunststofftechnik) sent us the attached description of his ABAQUS calculations. Please try to adapt your calculations in that way.
Eike is normally a Windows user, and his description also contains some hints on basic Unix commands. (ABAQUS-SLURM.pdf - only in German)

Please note: Abaqus calculations should be started via a batch script. Please read the information on the BatchSystem SLURM.

The detailed Abaqus documentation can be found at (only accessible from within the TU Dresden campus network).

Example (thanks to Benjamin Groeger, Inst. f. Leichtbau und Kunststofftechnik)

1. Prepare an Abaqus input file (here the input example from Benjamin)


2. Prepare a batch script on Taurus like this:


#!/bin/bash
### Thanks to Benjamin Groeger, Institut fuer Leichtbau und Kunststofftechnik, 38748
### runs on Taurus and takes ca. 20 s with 4 CPUs
### generates the files:
### yyyy.dat
### yyyy.msg
### yyyy.odb
### yyyy.prt
### yyyy.sim
### yyyy.sta

#SBATCH --nodes=1 ### with >1 node, Abaqus needs a node list
#SBATCH --ntasks-per-node=4
#SBATCH --mem=500 ### memory (sum)
#SBATCH --time=00:04:00
### give the job a name, whatever you want
#SBATCH --job-name=yyyy
### you will get emails when the job finishes or fails;
### set your correct email address (via --mail-user)
#SBATCH --mail-type=END,FAIL
### set your project
#SBATCH -A p_xxxxxxx

### Abaqus brings its own MPI

### load and start
module load abaqus
abaqus interactive input=Rot-modell-BenjaminGroeger.inp job=yyyy cpus=4 mp_mode=mpi

3. Submit the batch script (the name of our script is "batch-Rot-modell-BenjaminGroeger")

sbatch batch-Rot-modell-BenjaminGroeger      ---> you will get a job number (JobID), for example 3130522

4. Check the status of the job

squeue -u your_login     --> in the column "ST" (status) you will find R = running or PD = pending (waiting for resources)
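Beyond squeue, the standard SLURM tools scontrol and sacct give more detail on a single job. The job ID below is the example one from above:

```shell
# show the full record of a running or pending job
# (time limit, node list, working directory, ...)
scontrol show job 3130522

# after the job has finished, sacct reports elapsed time and exit state
sacct -j 3130522 --format=JobID,JobName,Elapsed,State
```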


ANSYS

Ansys and Ansys CFX used to be separate packages and are now combined.

Ansys is a general-purpose finite-element program for engineering analysis, and includes preprocessing, solution, and post-processing functions. It is used in a wide range of disciplines for solutions to mechanical, thermal, and electronic problems.

==> Here are a few getting-started tips for Ansys users coming from Windows. (Ansys-SLURM-fuerWindowsUser.pdf)

Using Workbench in Batch Mode

The ANSYS workbench (runwb2) can also be used in a batch script to run calculations from a workbench project in the background.

To do so, you have to specify the -B parameter (for batch mode) and -F for your project file; you can then either add different commands directly via -E parameters, or specify a workbench script file containing commands via -R.

NOTE: Since the MPI library that ANSYS uses internally (Platform MPI) has some problems integrating seamlessly with SLURM, you have to unset the environment variable SLURM_GTIDS in your job environment before running workbench. An example batch script could look like this:
#!/bin/bash
#SBATCH --time=0:30:00
#SBATCH --nodes=1
#SBATCH --ntasks=2
#SBATCH --mem-per-cpu=1000M

# Platform MPI does not integrate well with SLURM (see note above)
unset SLURM_GTIDS

runwb2 -B -F Workbench_Taurus.wbpj -E 'Project.Update' -E 'Save(Overwrite=True)'
# or, if you wish to use a workbench replay file, replace the -E parameters with: -R mysteps.wbjn

Running Workbench in Parallel

Unfortunately, the number of CPU cores you wish to use cannot simply be given as a command-line parameter to your runwb2 call. Instead, you have to enter it into an XML file in your home directory; this setting will then be used for all your runwb2 jobs. While it is also possible to edit this setting via the Mechanical GUI, experience shows that this can be problematic via X forwarding, and we only managed to use the GUI properly via DCV. We therefore recommend you simply edit the XML file directly with a text editor of your choice. It is located under:

$HOME/.mw/Application Data/Ansys/v181/SolveHandlers.xml

(Mind the space in the path.) You might have to adjust the ANSYS version (v181) in the path. In this file, you can find the parameter


that you can simply change to something like 16 or 24. For now, you should stay within single-node boundaries, because multi-node calculations require additional parameters. The number you choose should match the --cpus-per-task parameter in your sbatch script.

COMSOL Multiphysics

"COMSOL Multiphysics (formerly FEMLAB) is a finite element analysis, solver and simulation software package for various physics and engineering applications, especially coupled phenomena, or multiphysics." [Wikipedia]

Comsol may be used remotely on ZIH machines or locally on your desktop, using the ZIH license server.

For using Comsol on ZIH machines, the following operating modes (see Comsol manual) are recommended:
  • Interactive Client Server Mode

In this mode, Comsol runs as a server process on the ZIH machine and as a client process on your local workstation. The client process needs a dummy license for installation, but no license for normal work. Using this mode is almost indistinguishable from working with a local installation, and it works well with Windows clients. For this operation mode to work, you must build an SSH tunnel through the ZIH firewall. For further information, see the Comsol manual.

Example for starting the server process (4 cores, 10 GB RAM, max. 8 hours running time):

module load comsol
srun -c4 -t 8:00:00 --mem-per-cpu=2500 comsol -np 4 server
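The SSH tunnel mentioned above can be sketched as follows; Comsol's default server port (2036) and the login node hostname are assumptions here, so adjust them to your actual setup:

```shell
# forward local port 2036 to the machine running the Comsol server;
# your_login and the hostname are placeholders for your ZIH account
ssh -L 2036:localhost:2036 your_login@taurus.hrsk.tu-dresden.de
# afterwards, connect the local Comsol client to localhost:2036
```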
  • Interactive Job via Batchsystem SLURM
module load comsol
srun -n1 -c4 --mem-per-cpu=2500 -t 8:00:00 --pty --x11=first comsol -np 4

You should also check that the rendering under Options -> Preferences -> Graphics and Plot Windows is set to software rendering; then you should be able to work from within the campus network.
  • Background Job via Batchsystem SLURM
#!/bin/bash
#SBATCH --time=24:00:00
#SBATCH --nodes=2
#SBATCH --ntasks-per-node=2
#SBATCH --cpus-per-task=12
#SBATCH --mem-per-cpu=2500

module load comsol
srun comsol -mpi=intel batch -inputfile ./MyInputFile.mph

Submit via: sbatch <filename>


LS-DYNA

Both the shared memory version and the distributed memory version (mpp) are installed on all machines.

To run the MPI version on Taurus or Venus you need a batch file (submit with sbatch <filename>) like:
#!/bin/bash
#SBATCH --time=01:00:00 # walltime
#SBATCH --ntasks=16 # number of processor cores (i.e. tasks)
#SBATCH --mem-per-cpu=1900M # memory per CPU core

module load ls-dyna
srun mpp-dyna i=neon_refined01_30ms.k memory=120000000
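For the shared memory version, a single task with several cores on one node is the natural fit. This is only a sketch: the binary name below is an assumption (check the module's documentation for the actual command on our cluster); ncpu is LS-DYNA's usual SMP core-count option:

```shell
#!/bin/bash
#SBATCH --time=01:00:00
#SBATCH --ntasks=1
#SBATCH --cpus-per-task=4 # SMP parallelism within one node
#SBATCH --mem-per-cpu=1900M

module load ls-dyna
# hypothetical binary name; the smp executable may be called differently here
ls-dyna i=neon_refined01_30ms.k ncpu=4 memory=120000000
```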