General build instructions are available in the top-level README. The commands used for the continuous integration builds and for the official releases are available in .gitlab-ci.yml.
This page contains more detailed instructions for compiling a full-featured version of GetDP, including instructions to compile common dependencies: the Gmsh library as well as PETSc. In addition to CMake and C++, C, and Fortran compilers, you should have (preferably optimized) versions of the BLAS and LAPACK libraries on your system (e.g. OpenBLAS, ATLAS or the MKL).
Minimal Gmsh library
The Gmsh library is used in GetDP to read MSH4 meshes and perform efficient field interpolation on general meshes (mesh to mesh interpolation), through the ScalarField, VectorField and TensorField family of functions, in conjunction with the GmshRead operation.
To build a small static Gmsh library:
git clone https://gitlab.onelab.info/gmsh/gmsh.git
cd gmsh
mkdir lib
cd lib
cmake -DDEFAULT=0 -DENABLE_PARSER=1 -DENABLE_POST=1 -DENABLE_ANN=1 -DENABLE_BLAS_LAPACK=1 -DENABLE_BUILD_LIB=1 -DENABLE_PRIVATE_API=1 ..
# Notes:
# * replace -DENABLE_BUILD_LIB=1 with -DENABLE_BUILD_SHARED=1 to build a shared library
# * if you don't have root access, add -DCMAKE_INSTALL_PREFIX=path-to-install
# * for a list of all available configuration options see http://getdp.info/doc/texinfo/getdp.html#Compiling-the-source-code
make lib
# Notes:
# * use `make shared` for the shared library
sudo make install/fast
# Notes:
# * remove "sudo" if you don't have root access
cd ../..
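To illustrate what this library enables, here is a minimal, hypothetical .pro fragment that reads a Gmsh post-processing view and interpolates it in an expression; the file name field.pos and the names MyRes and MyFormulation are invented for the example, and the exact GmshRead/ScalarField tag syntax should be checked against the GetDP reference manual:

```
Function {
  // Interpolate the view loaded with tag 1 (via GmshRead below) at point XYZ[]
  source[] = ScalarField[XYZ[]]{1};
}

Resolution {
  { Name MyRes;
    System {
      { Name A; NameOfFormulation MyFormulation; }
    }
    Operation {
      GmshRead["field.pos", 1]; // load a Gmsh post-processing view as view tag 1
      Generate[A]; Solve[A]; SaveSolution[A];
    }
  }
}
```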
Shared memory GetDP in real arithmetic
PETSc is the standard linear algebra toolkit used by GetDP (GetDP can also use linear solvers from Sparskit -- see the configuration options for more information).
To compile a sequential real arithmetic PETSc library (here for PETSc 3.7.4):
curl -O http://ftp.mcs.anl.gov/pub/petsc/release-snapshots/petsc-3.7.4.tar.gz
tar zxvf petsc-3.7.4.tar.gz
cd petsc-3.7.4
export PETSC_DIR=$PWD
export PETSC_ARCH=real_mumps_seq
./configure --with-clanguage=cxx --with-debugging=0 --with-mpi=0 --with-mpiuni-fortran-binding=0 --download-mumps=yes --with-mumps-serial --with-shared-libraries=0 --with-x=0 --with-ssl=0 --with-scalar-type=real
# Notes:
# * if your compilers are not found, specify them explicitly, e.g. --FC=/path/to/f90 for the Fortran compiler
# * if the BLAS/LAPACK libraries are not installed in standard locations, you will have to specify their location by hand with the options --with-blas-lib=/path/to/libblas and --with-lapack-lib=/path/to/liblapack
# * as a last resort (but this will severely degrade performance), you can use generic non-optimized BLAS/LAPACK libraries with the option --download-fblaslapack=1
make
cd ..
If you also wish to solve eigenvalue problems, you will want to install SLEPc, an eigensolver library built on top of PETSc (here SLEPc 3.7.1), before compiling GetDP.
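The SLEPc build commands are not shown on this page; as a rough sketch, assuming the standard SLEPc configure procedure and a hypothetical download URL (check the SLEPc website for the actual location):

```shell
# Hypothetical URL -- verify on the SLEPc website
curl -O http://slepc.upv.es/download/distrib/slepc-3.7.1.tar.gz
tar zxvf slepc-3.7.1.tar.gz
cd slepc-3.7.1
export SLEPC_DIR=$PWD
# SLEPc reuses the PETSC_DIR and PETSC_ARCH environment variables set above
./configure
make
cd ..
```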
Then, to compile GetDP itself:

git clone https://gitlab.onelab.info/getdp/getdp.git
cd getdp
mkdir bin
cd bin
cmake -DENABLE_BLAS_LAPACK=0 ..
# Notes:
# * -DENABLE_BLAS_LAPACK=0 forces GetDP to use the same BLAS/LAPACK libraries as the ones used by PETSc
# * use the option -DCMAKE_PREFIX_PATH="non-standard-install-path;other-non-standard-install-path" (quoted, since ";" is special in the shell) if you have libraries installed in non-standard locations
# * use the options -DPETSC_DIR=... -DPETSC_ARCH=... if the corresponding environment variables are not set properly
make
cd ../..
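Once the build finishes, a quick sanity check can be run on the resulting binary; the problem file and resolution/post-operation names in the commented invocation are hypothetical:

```shell
cd getdp/bin
# Print version information to confirm the binary runs
./getdp --version
# A typical invocation on a problem definition file would look like:
# ./getdp problem.pro -solve MyResolution -pos MyPostOperation
```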
Shared memory GetDP in complex arithmetic
Follow the same steps as in the previous section, but replace --with-scalar-type=real with --with-scalar-type=complex when configuring PETSc.
Distributed memory GetDP with MPI support for running on computer clusters
For MPI (distributed memory capable) GetDP versions you need to compile the MPI version of PETSc (here for PETSc 3.7.4):
curl -O http://ftp.mcs.anl.gov/pub/petsc/release-snapshots/petsc-3.7.4.tar.gz
tar zxvf petsc-3.7.4.tar.gz
cd petsc-3.7.4
export PETSC_DIR=$PWD
export PETSC_ARCH=complex_mumps_mpi
./configure --with-debugging=0 --with-clanguage=cxx --with-shared-libraries=0 --with-x=0 --download-mumps=1 --download-metis=1 --download-parmetis=1 --download-scalapack=1 --download-blacs=1 --with-scalar-type=complex
# Notes:
# * remove the option --with-scalar-type=complex to build in real arithmetic
# * if the BLAS/LAPACK libraries are not installed in standard locations, you will have to specify their location by hand with the options --with-blas-lib=/path/to/libblas and --with-lapack-lib=/path/to/liblapack
make
cd ..
If you also wish to solve eigenvalue problems, compile SLEPc against this MPI-enabled PETSc build (here SLEPc 3.7.1), as in the sequential case.
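Again, the SLEPc commands are not shown on this page; a minimal sketch, assuming the standard SLEPc configure procedure and a hypothetical download URL (verify on the SLEPc website), with the MPI PETSc build selected through the environment:

```shell
# Hypothetical URL -- verify on the SLEPc website
curl -O http://slepc.upv.es/download/distrib/slepc-3.7.1.tar.gz
tar zxvf slepc-3.7.1.tar.gz
cd slepc-3.7.1
export SLEPC_DIR=$PWD
# PETSC_DIR and PETSC_ARCH=complex_mumps_mpi must still be set from the PETSc build
./configure
make
cd ..
```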
Then compile GetDP with MPI support:

git clone https://gitlab.onelab.info/getdp/getdp.git
cd getdp
mkdir bin
cd bin
cmake -DENABLE_MPI=1 -DENABLE_BLAS_LAPACK=0 ..
# Notes:
# * use the option -DCMAKE_PREFIX_PATH=non-standard-install-path if you have libraries installed in non-standard locations
# * use the options -DPETSC_DIR=... -DPETSC_ARCH=... if the corresponding environment variables are not set properly
make
cd ../..
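The resulting binary is then launched through the MPI runtime; the problem file and resolution/post-operation names below are hypothetical, and the exact launcher (mpirun, mpiexec, srun, ...) depends on your MPI installation and cluster:

```shell
# Run GetDP on 4 MPI processes (names after -solve/-pos are placeholders)
mpirun -np 4 ./getdp problem.pro -solve MyResolution -pos MyPostOperation
```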