Talk:GROMACS


This section was imported from WestGrid and does not directly apply to the new national systems.

Installation

Version 4.6 Compilation

Modules for the Intel-based GROMACS build:

module purge
module load intel/2016
module load openmpi/1.6.5-intel
module load fftw/3.3.4-intel
module load gromacs/4.6.7-intel

Modules for the GNU-based GROMACS build:

module purge
module load openmpi/1.6.5-gnu
module load fftw/3.3.4-gnu
module load gromacs/4.6.7-gnu
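
After loading either set of modules, it is worth confirming that the expected binaries are actually on the PATH. A minimal check, assuming the gromacs module exposes the standard 4.6 binary names (grompp, mdrun):

module list             # confirm which compiler, MPI, FFTW and gromacs modules are active
which grompp mdrun      # should resolve to the gromacs/4.6.7 installation
mdrun -version          # prints the GROMACS version and build configuration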

my_cmake_intel.sh:

#! /bin/bash
rm CMakeCache.txt

PREF="-DCMAKE_INSTALL_PREFIX=/global/software/gromacs/gromacs-4.6.7-intel/"
OPTS="$PREF"

module purge
module load intel/2016
module load fftw/3.3.4-intel
module load cmake

CC=icc CXX=icpc cmake $OPTS ../gromacs-4.6.7

my_cmake_mpi_intel.sh:

#! /bin/bash
rm CMakeCache.txt

PREF="-DCMAKE_INSTALL_PREFIX=/global/software/gromacs/gromacs-4.6.7-intel/"
MPI="-DGMX_MPI=on -DGMX_THREAD_MPI=off -DGMX_OPENMP=off"

OPTS="$PREF $MPI"

module purge
module load intel/2016
module load fftw/3.3.4-intel
module load cmake
module load openmpi/1.6.5-intel

CC=icc CXX=icpc cmake $OPTS ../gromacs-4.6.7
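
These scripts are meant to be run from an empty build directory placed next to the unpacked gromacs-4.6.7 source, since they pass ../gromacs-4.6.7 to cmake. A sketch of the full sequence, where /global/software/src/gromacs is only an illustrative location for the source, scripts and build directory (the install prefix comes from the scripts above):

cd /global/software/src/gromacs        # hypothetical source location
mkdir build-gromacs-4.6.7-intel
cd build-gromacs-4.6.7-intel

../my_cmake_intel.sh                   # or ../my_cmake_mpi_intel.sh for the MPI build
make -j 5
make install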

Installation using CMake

For GROMACS 4.6 and up, CMake is required for building. There are several different ways to run CMake.

Generally, it is better to use an out-of-source build: create a separate build directory, change into it, and point cmake at the source tree. CMake takes the path to the source directory on its command line, so from inside the build directory the source is given relative to it, as "../dir_name". From the command line, non-interactively:

cmake ../gromacs-4.6.7

Text mode interactive:

ccmake ../gromacs-4.6.7

GUI interactive:

cmake-gui ../gromacs-4.6.7

A complete non-interactive example:

 cd /usr/apps/src/gromacs/
 mkdir build-gromacs-4.6.7
 cd build-gromacs-4.6.7
 
 cmake ../gromacs-4.6.7
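
If FFTW is not available on the system, GROMACS 4.6 can be asked to download and build its own copy via the GMX_BUILD_OWN_FFTW option, and the install location can be set at the same time. A hedged variant of the example above (the build directory name is only illustrative; the install prefix is the one used later for the GNU build):

 cd /usr/apps/src/gromacs/
 mkdir build-gromacs-4.6.7-own-fftw
 cd build-gromacs-4.6.7-own-fftw

 # let GROMACS download and build its own copy of FFTW,
 # and install under the given prefix
 cmake -DGMX_BUILD_OWN_FFTW=ON \
       -DCMAKE_INSTALL_PREFIX=/usr/apps/gromacs/gromacs-4.6.7-gnu/ \
       ../gromacs-4.6.7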

Prerequisites

  • A recent compiler: gcc 4.7 and up, or Intel 12 and up;
  • FFTW;
  • For the new native GPU support in GROMACS, NVIDIA's CUDA http://www.nvidia.com/object/cuda_home_new.html software development kit, version 3.2 or later, is required, and the latest version is strongly encouraged. NVIDIA GPUs with at least compute capability 2.0 are required, e.g. Fermi or Kepler cards;
  • OpenMPI http://www.open-mpi.org/ version 1.4.1 or higher;
  • CMake version 2.8.0 or higher;
  • BLAS and LAPACK benefit some of the GROMACS tools but do not provide any benefit for GROMACS itself;
  • A few GROMACS tools gain extra functionality when linked against the GNU Scientific Library (GSL). A quick check of these prerequisite versions is sketched below.
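
A quick way to confirm the versions of these prerequisites on the build node (assuming the relevant modules are already loaded, and that nvcc is only present where the CUDA toolkit is installed):

gcc --version        # or icc --version for the Intel build
cmake --version
mpirun --version     # OpenMPI reports its version this way
nvcc --version       # CUDA toolkit version, GPU builds only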

Building non-MPI version

The configuration step is done with the my_cmake.sh script:

#! /bin/bash
rm CMakeCache.txt

FFTW="/usr/apps/fftw/fftw-3.3.4-gcc447/"
ATLAS="/usr/apps/atals/atlas-3.10.2-gnu/"

export CMAKE_PREFIX_PATH=$FFTW:$ATLAS:$CMAKE_PREFIX_PATH

PREF="-DCMAKE_INSTALL_PREFIX=/usr/apps/gromacs/gromacs-4.6.7-gnu/"

OPTS="$PREF" 

cmake $OPTS ../gromacs-4.6.7

The compilation is then done as follows:

mkdir build-gromacs-4.6.7
cd build-gromacs-4.6.7

./my_cmake.sh
make -j 5
make install

The ATLAS (BLAS) library did not work, though. This is not a problem; I just ignored it.
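
To check that the resulting installation works, one can source the GMXRC file that GROMACS installs under its bin/ directory and query mdrun; a minimal sketch, assuming the install prefix set in my_cmake.sh above:

source /usr/apps/gromacs/gromacs-4.6.7-gnu/bin/GMXRC
mdrun -version       # reports the GROMACS version, precision and FFT library
grompp -h            # quick check that the preprocessor starts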

Building MPI version

The OpenMPI library built with the Intel compiler did not work; I had to use a version built with the GNU compilers.

 module unload openmpi
 module load openmpi/1.8.4

 ./my_cmake_mpi.sh
 make -j 5
 make install

This worked.

With the default OpenMPI module, which was probably compiled with the Intel compiler, the make step gave an error like

Could not find -lmpi
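
One way to check which compiler a given OpenMPI module was built with, before starting a long build, is to ask OpenMPI itself (ompi_info and the mpicc wrapper are standard OpenMPI commands; the grep pattern is only illustrative):

module load openmpi/1.8.4
ompi_info | grep -i "compiler"       # lists the C/C++/Fortran compilers OpenMPI was built with
mpicc --version                      # version of the compiler the wrapper invokes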

The configuration step with cmake is done in the my_cmake_mpi.sh script:

#! /bin/bash
rm CMakeCache.txt

FFTW="/usr/apps/fftw/fftw-3.3.4-gcc447/"
ATLAS="/usr/apps/atals/atlas-3.10.2-gnu/"

export CMAKE_PREFIX_PATH=$FFTW:$ATLAS:$CMAKE_PREFIX_PATH

PREF="-DCMAKE_INSTALL_PREFIX=/usr/apps/gromacs/gromacs-4.6.7-gnu/"
MPI="-DGMX_MPI=on -DGMX_THREAD_MPI=off -DGMX_OPENMP=off"

OPTS="$PREF $MPI" 

cmake $OPTS ../gromacs-4.6.7

After this compilation, the proper way to invoke the program is:

module unload openmpi
module load openmpi/1.8.4

module load gromacs
mdrun_mpi

Note that when the MPI build is selected, the default _mpi suffix is appended to the compiled binaries, so the same installation path can be used. Single-precision binaries have no suffix, double-precision binaries have the _d suffix, and MPI-enabled binaries have the _mpi suffix.
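
As a usage sketch, the MPI binary is normally started through the MPI launcher; the process count and the input name topol below are placeholders, not values taken from this page:

module load openmpi/1.8.4
module load gromacs

# run the MPI-enabled single-precision binary on 8 processes
mpiexec -np 8 mdrun_mpi -deffnm topol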

Translation

Additions edited 2018-04-15; 2018-06 revision requested from AliKerrache 2018-06-08.