Carputils - Installation

Installation of the CARP examples requires cloning several git repositories, setting an environment variable and editing a settings file.

Git Access

First, you will need access to the repositories hosted at https://bitbucket.org/carpusers/. Create a Bitbucket account and contact gernot.plank@medunigraz.at to be added to the carpusers team.

Note

In case you haven't already set up SSH on your computer, it's time to do so now. Create a key pair consisting of a private key (saved on your local computer) and a public key (uploaded to your Bitbucket account's SSH keys). Bitbucket uses the key pair to authenticate anything the associated account can access.
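If you need to create one, a minimal example using the standard OpenSSH tools is shown below; the email address is just a comment labelling the key, and the file names are the ssh-keygen defaults. Upload the contents of the .pub file to Bitbucket.

ssh-keygen -t rsa -b 4096 -C "you@example.com"   # generate the key pair
cat ~/.ssh/id_rsa.pub                            # public key to paste into Bitbucket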

Once these steps are complete, you can download and install the examples using the instructions below.

Automatic Installation

The easiest way to get the correct repositories is to use the automatic installation script. To use it, run the following in the terminal:

# You may need to escape the '!':
wget https://bitbucket.org/!api/2.0/snippets/carpusers/z9gpg/files/carp-examples-setup.sh
bash carp-examples-setup.sh

The script will prompt you to select which components to install and where to install them, and will finally advise you to add some lines to your .bashrc and to edit the generated settings file. Once this is done, and your .bashrc has been refreshed, you can move on to running test examples.
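For example, to refresh your .bashrc in the current shell without opening a new terminal:

source ~/.bashrc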

Manual Installation

First, clone this repository to your system, in a location of your choice:

cd ~/software
git clone git@bitbucket.org:carpusers/carputils.git

Create a directory for examples, and clone the example repo(s) there:

mkdir ~/software/carp-examples
cd ~/software/carp-examples
git clone git@bitbucket.org:carpusers/devtests.git
git clone git@bitbucket.org:carpusers/benchmarks.git
git clone git@bitbucket.org:carpusers/tutorials.git
git clone git@bitbucket.org:carpusers/shellfun.git
git clone git@bitbucket.org:carpusers/limpetgui.git
git clone git@bitbucket.org:carpusers/pvprocess.git

Add both the carputils repo and the directory you made for examples to your PYTHONPATH in .bashrc:

export PYTHONPATH="$PYTHONPATH:$HOME/software/carputils:$HOME/software/carp-examples"
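As a quick sanity check (assuming the repository layout above), open a new shell or source your .bashrc and confirm that the package can be imported:

python -c "import carputils"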

Finally, use the cusettings executable in carputils/bin to generate a carputils settings file. This can be placed in one of:

  • A location specified by the CARPUTILS_SETTINGS environment variable
  • ./settings.yaml (i.e. in the current working directory)
  • ~/.config/carputils/settings.yaml
  • settings.yaml in the root of the carputils git repository

The order above is also the order in which the settings file is searched for, so a path set with the environment variable, for example, overrides all the others. The third option, under ~/.config, is recommended in most cases. To generate the settings file there run:

cusettings ~/.config/carputils/settings.yaml

Finally, you will need to modify this file so that the paths match your system installation. Note that indentation is significant in YAML, so take care when editing!
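If you would rather keep the settings file somewhere else entirely, you can point the CARPUTILS_SETTINGS environment variable (the first option in the list above) at it, for example in your .bashrc; the path below is just an illustration:

export CARPUTILS_SETTINGS=$HOME/my-cluster-settings.yaml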

You may well run into version issues with Python. If this is the case, you'll find the auto_pip shell script in shellfun/updating; it is recommended to add or upgrade Python packages in your local userspace by running this script once, which upgrades all relevant packages to a sufficiently recent revision:

./auto_pip

The previous approach of directly updating Python's system packages has in some cases produced system inconsistencies and is no longer recommended.
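If you prefer to manage packages manually rather than via the script, the equivalent userspace installation with pip would look something like the following (the package list is illustrative; see Additional Libraries below):

pip install --user --upgrade numpy scipy matplotlib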

Documentation

The documentation is based on Sphinx and can be generated using the Makefile in the carputils/doc folder. Several output formats are supported:

Please use `make <target>' where <target> is one of
html       to make standalone HTML files
dirhtml    to make HTML files named index.html in directories
singlehtml to make a single large HTML file
pickle     to make pickle files
json       to make JSON files
htmlhelp   to make HTML files and a HTML help project
qthelp     to make HTML files and a qthelp project
devhelp    to make HTML files and a Devhelp project
epub       to make an epub
latex      to make LaTeX files, you can set PAPER=a4 or PAPER=letter
latexpdf   to make LaTeX files and run them through pdflatex
latexpdfja to make LaTeX files and run them through platex/dvipdfmx
text       to make text files
man        to make manual pages
texinfo    to make Texinfo files
info       to make Texinfo files and run them through makeinfo
gettext    to make PO message catalogs
changes    to make an overview of all changed/added/deprecated items
xml        to make Docutils-native XML files
pseudoxml  to make pseudoxml-XML files for display purposes
linkcheck  to check all external links for integrity
doctest    to run all doctests embedded in the documentation (if enabled)

To generate an HTML version of the documentation:

cd carputils/doc
make clean
make html

The HTML pages can then be found in the folder carputils/doc/build/html. To view them, point your favorite browser to the index page:

firefox ${CARPUTILS_DIR}/doc/build/html/index.html


.. _carputils-testing-system:

Testing System

To run regression tests, you will need to acquire the reference solutions. Make a directory to store them and clone the reference solution repositories there:

mkdir /data/carp-regression-reference
cd /data/carp-regression-reference
git clone git@bitbucket.org:carpusers/devtests-reference.git devtests
git clone git@bitbucket.org:carpusers/benchmarks-reference.git benchmarks

Make sure that each cloned repository name matches that of the corresponding test repository above. Then, set the parent directory as the REGRESSION_REF setting in settings.yaml:

REGRESSION_REF: /data/carp-regression-reference

Optionally, set the REGRESSION_TEMP and REGRESSION_PKG settings in settings.yaml, as described in the file template.
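For illustration, the regression-related block of settings.yaml might then look as follows; the REGRESSION_TEMP and REGRESSION_PKG paths are placeholders, see the file template for their exact meaning:

REGRESSION_REF:  /data/carp-regression-reference
REGRESSION_TEMP: /tmp/carp-regression
REGRESSION_PKG:  /data/carp-regression-packages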

Usage

With the installation complete, you should be able to cd into a test directory and run the examples there using their run.py scripts. Run with:

./run.py

or:

python run.py

Add the --help or -h option to see a list of command line options and inputs defined for the specific experiment. The options shown by --help fall into four distinct sections: optional arguments (including experiment-specific input arguments), execution options, output options, and debugging and profiling options.
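To display this help text, run:

./run.py --help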

usage: run.py [-h] [--np NP] [--tpp TPP] [--runtime HH:MM:SS]
              [--build {CPU,GPU}]
              [--flavor {petsc,boomeramg,parasails,pt,direct}]
              [--platform {desktop,archer,archer_intel,archer_camel,curie,medtronic,mephisto,smuc_f,smuc_t,smuc_i,vsc2,vsc3,wopr}]
              [--queue QUEUE] [--vectorized-fe VECTORIZED_FE] [--dry-run]
              [--CARP-opts CARP_OPTS] [--gdb [PROC [PROC ...]]]
              [--ddd [PROC [PROC ...]]] [--ddt] [--valgrind [OUTFILE]]
              [--valgrind-options [OPT=VAL [OPT=VAL ...]]] [--map]
              [--scalasca] [--ID ID] [--suffix SUFFIX]
              [--overwrite-behaviour {prompt,overwrite,delete,error}]
              [--silent] [--visualize] [--mech-element {P1-P1,P1-P0}]
              [--postprocess {phie,optic,activation,axial,filament,efield,mechanics}
                 [{phie,optic,activation,axial,filament,efield,mechanics} ...]]
              [--tend TEND] [--clean]

optional arguments:

  -h, --help            show this help message and exit
  --CARP-opts CARP_OPTS
                        arbitrary CARP options to append to command
  --mech-element {P1-P1,P1-P0}
                        CARP default mechanics finite element (default: P1-P0)
  --postprocess {phie,optic,activation,axial,filament,efield,mechanics} [{phie,optic,activation,axial,filament,efield,mechanics} ...]
                        postprocessing mode(s) to execute
  --tend TEND           Duration of simulation (ms)
  --clean               clean generated output files and directories

execution options:

  --np NP               number of processes
  --tpp TPP             threads per process
  --runtime HH:MM:SS    max job runtime
  --build {CPU,GPU}     CARP build to use (default: CPU)
  --flavor {petsc,boomeramg,parasails,pt,direct}
                        CARP flavor
  --platform {desktop,archer,archer_intel,archer_camel,curie,medtronic,mephisto,smuc_f,smuc_t,smuc_i,vsc2,vsc3,wopr}
                        pick a hardware profile from available platforms
  --queue QUEUE         select a queue to submit job to (batch systems only)
  --vectorized-fe VECTORIZED_FE
                        vectorized FE assembly (default: on with FEMLIB_CUDA,
                        off otherwise)
  --dry-run             show command line without running the test

debugging and profiling:

  --gdb [PROC [PROC ...]]
                        start (optionally specified) processes in gdb
  --ddd [PROC [PROC ...]]
                        start (optionally specified) processes in ddd
  --ddt                 start in Allinea ddt debugger
  --valgrind [OUTFILE]  start in valgrind mode, use in conjunction with --gdb
                        for interactive mode
  --valgrind-options [OPT=VAL [OPT=VAL ...]]
                        specify valgrind CLI options, without preceding '--'
                        (default: leak-check=full track-origins=yes)
  --map                 start using Allinea map profiler
  --scalasca            start in scalasca profiling mode

output options:

  --ID ID               manually specify the job ID (output directory)
  --suffix SUFFIX       add a suffix to the job ID (output directory)
  --overwrite-behaviour {prompt,overwrite,delete,error}
                        behaviour when output directory already exists
  --silent              toggle silent output
  --visualize           toggle test results visualisation

To understand how the command line of an individual experiment is built, it may be insightful to inspect the command line assembled and executed by the script. This is achieved by launching the run script with the additional --dry option. For instance, running a simple experiment like

./run.py --visualize --dry

yields the following output (marked up with additional explanatory comments)

--------------------------------------------------------------------------------------------------------------------------------
                                     Launching CARP Simulation 2017-06-13_simple_20.0_pt_np8
--------------------------------------------------------------------------------------------------------------------------------

# pick launcher, core number and executable
mpiexec -n 8 /home/plank/src/carp-dcse-pt/bin/carp.debug.petsc.pt \
\
# feed in static parameter sets compiled in the par-file
  +F simple.par \
\
# provide petsc and pt solver settings for elliptic/parabolic PDE, mechanics PDE and Purkinje system
  +F /home/plank/src/carputils/carputils/resources/options/pt_ell_amg \
  +F /home/plank/src/carputils/carputils/resources/options/pt_para_amg \
  +F /home/plank/src/carputils/carputils/resources/options/pt_mech_amg \
  -ellip_options_file /home/plank/src/carputils/carputils/resources/options/pt_ell_amg \
  -parab_options_file /home/plank/src/carputils/carputils/resources/options/pt_para_amg \
  -purk_options_file /home/plank/src/carputils/carputils/resources/petsc_options/pastix_opts \
  -mechanics_options_file /home/plank/src/carputils/carputils/resources/options/pt_mech_amg \
  \
# pick numerical settings, toggle between pt or petsc, Purkinje always uses petsc
  -ellip_use_pt 1 \
  -parab_use_pt 1 \
  -purk_use_pt 0 \
  -mech_use_pt 1 \
  \
# simulation ID is the name of the output directory
  -simID 2017-06-13_simple_20.0_pt_np1 \
  \
# pick the mesh we use and define simulation duration
  -meshname meshes/2016-02-20_aTlJAwvpWB/block \
  -tend 20.0 \
  \
# define electrical stimulus
  -stimulus[0].x0 450.0 \
  -stimulus[0].xd 100.0 \
  -stimulus[0].y0 -300.0 \
  -stimulus[0].yd 600.0 \
  -stimulus[0].z0 -300.0 \
  -stimulus[0].zd 600.0 \
  \
# some extra output needed for visualization purposes
  -gridout_i 3 \
  -gridout_e 3

-------------------------------------------------------------------------------------------------------------------------------
                                     Launching Meshalyzer
-------------------------------------------------------------------------------------------------------------------------------

/home/plank/src/meshalyzer/meshalyzer 2017-06-13_simple_20.0_pt_np1/block_i \
                                      2017-06-13_simple_20.0_pt_np1/vm.igb.gz simple.mshz

It is always possible to change general global settings such as the solvers to be used or the target computing platform for which the command line is being assembled.

For instance, switching from the default flavor pt set in settings.yaml to flavor petsc is achieved by

./run.py --visualize --flavor petsc --dry

Compared to the above, the selection of numerical settings has changed:

...
# whenever possible, we use petsc solvers now
-ellip_use_pt 0 \
-parab_use_pt 0 \
-purk_use_pt 0 \
-mech_use_pt 0 \
...

Setting a different default platform in the settings.yaml file, or at the command line through the --platform input parameter, changes the launcher and command line, and also generates a submission script when running on a cluster. For instance, we can easily generate a submission script for the UK national supercomputer ARCHER by adding the appropriate platform string

./run.py --np 528 --platform archer --runtime 00:30:00 --dry

which writes a submission script and, without the --dry option, would directly submit it to the queuing system:

Requested mesh already exists, skipping generation.
---------------------------------------------------------------------------------------------------------------------------
                                Batch Job Submission
---------------------------------------------------------------------------------------------------------------------------

qsub 2017-06-14_simple_20.0_pt_np528.pbs

with the generated submission script

#!/bin/bash --login
#PBS -N 2017-06-14_simp
#PBS -l select=22
#PBS -l walltime=00:30:00
#PBS -A e348

# Make sure any symbolic links are resolved to absolute path
export PBS_O_WORKDIR=$(readlink -f $PBS_O_WORKDIR)

# Change to the directory that the job was submitted from
# (remember this should be on the /work filesystem)
cd $PBS_O_WORKDIR

# Set the number of threads to 1
#   This prevents any system libraries from automatically
#   using threading.
export OMP_NUM_THREADS=1

################################################################################
# Summary

# Run script executed with options:
# --np 528 --platform archer --runtime 00:30:00 --dry

################################################################################
# Execute Simulation

mkdir -p 2017-06-14_simple_20.0_pt_np528

aprun -n 528 /compute/src/carp-dcse-pt/bin/carp.debug.petsc.pt \
  +F simple.par \
  +F /compute/src/carputils/carputils/resources/options/pt_ell_amg_large \
  +F /compute/src/carputils/carputils/resources/options/pt_para_amg_large \
  +F /compute/src/carputils/carputils/resources/options/pt_mech_amg_large \
  -ellip_use_pt 1 \
  -parab_use_pt 1 \
  -purk_use_pt 0 \
  -mech_use_pt 1 \
  -ellip_options_file /compute/src/carputils/carputils/resources/options/pt_ell_amg_large \
  -parab_options_file /compute/src/carputils/carputils/resources/options/pt_para_amg_large \
  -purk_options_file /compute/src/carputils/carputils/resources/petsc_options/pastix_opts \
  -mechanics_options_file /compute/src/carputils/carputils/resources/options/pt_mech_amg_large \
  -vectorized_fe 0 \
  -mech_finite_element 0 \
  -simID 2017-06-14_simple_20.0_pt_np528 \
  -meshname meshes/2015-12-04_qdHibkmous/block \
  -tend 20.0 \
  -stimulus[0].x0 450.0 \
  -stimulus[0].xd 100.0 \
  -stimulus[0].y0 -300.0 \
  -stimulus[0].yd 600.0 \
  -stimulus[0].z0 -300.0 \
  -stimulus[0].zd 600.0

See Running Regression Tests for details on usage of the regression testing system.

Installing a Local Python Environment

Many HPC systems use older operating systems with out-of-date Python distributions, so you may find yourself without the required Python 2.7 version. Follow the instructions below to build and install a local Python environment.

First, get the latest Python 2 source from https://www.python.org/downloads/ - carputils does not run under Python 3. Then, make a separate directory to contain the installed Python distribution (including the binaries and any extra modules you will install), e.g.:

cd $HOME
mkdir python-distribution

Extract the Python source tarball and run the configure script, setting the --prefix and --exec-prefix options to the path of the directory created above:

tar xvf Python-2.7.10.tar.gz
cd Python-2.7.10
./configure --prefix $HOME/python-distribution --exec-prefix $HOME/python-distribution

Then, build and install Python to the specified folder:

make
make install

The built Python interpreter should now be present in $HOME/python-distribution/bin. To make this interpreter the default when running python on the command line, add the following line to the bottom of the file ~/.bashrc:

export PATH=$HOME/python-distribution/bin:$PATH
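After opening a new shell (or sourcing ~/.bashrc), verify that the local interpreter is the one found first:

which python
python --version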

Finally, install the pip Python package manager to make installing Python modules easier later. This is done with:

python -m ensurepip
pip install --upgrade pip # To update to latest version

Additional Libraries

carputils uses some additional common numerical libraries in Python for pre- and post-processing of simulations. To install, run the following pip command:

pip install numpy scipy matplotlib

If the scipy installation fails because BLAS/LAPACK cannot be found, run pip with the bash variables BLAS and LAPACK set to the paths of the shared library files:

BLAS=/path/to/libblas.so LAPACK=/path/to/liblapack.so pip install scipy

If no system BLAS/LAPACK is available, they are easily built from source. Download the latest LAPACK from http://www.netlib.org/lapack/, unpack it, and copy the configuration template:

tar xvf lapack-3.5.0.tgz
cd lapack-3.5.0
cp make.inc.example make.inc

Then, edit make.inc and add the options -fPIC and -m64 (on 64-bit systems) to OPTS and NOOPT; an example is shown after the build commands below. Build BLAS and LAPACK with:

make blaslib
make
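For reference, the edited compiler flag lines in make.inc might then look like the following; the base flags come from make.inc.example and may differ between LAPACK versions and compilers, only the added -fPIC and -m64 matter here:

OPTS    = -O2 -frecursive -fPIC -m64
NOOPT   = -O0 -frecursive -fPIC -m64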

The library files librefblas.a and liblapack.a are then generated in the root of the source tree, and their paths can be passed to pip as the BLAS and LAPACK variables, as above.
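For example, from the root of the LAPACK source tree:

BLAS=$PWD/librefblas.a LAPACK=$PWD/liblapack.a pip install scipy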