Changes between Version 26 and Version 27 of pleiadesbash


Timestamp: 10/06/20 09:39:37
Author: Mathieu Morlighem
Page modified:
 • pleiadesbash
'''Do NOT install mpich'''. We have to use the one provided by NAS. Pleiades will ''only'' be used to run the code: you will use your local machine for pre- and post-processing, and you will never use MATLAB on Pleiades. You can check out ISSM and install the following packages:
 - m1qn3

Removed in v27:

 - PETSc (see instructions below)

Installing PETSc on a cluster is challenging because PETSc cannot call `mpiexec` without formally submitting a job. First, run the installation script `install-3.7-pleiades.sh`. PETSc will then stop with the following message:
{{{
#!sh
=================================================================================
    Since your compute nodes require use of a batch system or mpiexec you must:

 1) cd /home3/mmorligh/issm/trunk-jpl/externalpackages/petsc/src

 2) Submit ./conftest-arch-linux2-c-opt to 1 processor of your batch system or system you are
    cross-compiling for; this will generate the file reconfigure-arch-linux2-c-opt.py

 3) Run ./reconfigure-arch-linux2-c-opt.py (to complete the configure process).
=================================================================================
}}}

To make your life easier, a script is ready to go. Go to the `src` folder, where PETSc is being installed, open `script.queue`, and change `group_list` to `s1507` if you are in Mathieu's group. Then submit the job using `qsub`:
{{{
#!sh
qsub script.queue
}}}
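While the conftest job sits in the queue, you can keep an eye on it. A minimal sketch, assuming PBS's standard `qstat` utility is on your PATH (the fallback branch is only a hedge so the snippet also runs off-cluster):

```shell
# Query the batch system for your jobs; `qstat -u <user>` is the
# standard PBS status command. The else branch handles machines
# that do not have PBS installed.
if command -v qstat >/dev/null 2>&1; then
    status=$(qstat -u "$USER")   # job still listed -> queued or running
else
    status="qstat not found: run this on a Pleiades front end"
fi
printf '%s\n' "$status"
```

Once the job no longer appears in the `qstat` listing, look for the `reconfigure-arch-linux2-c-opt.py` file described below.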

A job ID (similar to `3513616.pbspl1.nas.nasa.gov`) will be printed to the screen. Once the job is done, you should see a new file in the current directory: `./reconfigure-arch-linux2-c-opt.py`. Once it appears, run:
{{{
#!sh
./reconfigure-arch-linux2-c-opt.py
}}}
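If you script this step, it is worth guarding against the conftest job not having finished yet. A small sketch (the file name is the one PETSc prints in the message above):

```shell
# Only launch the reconfigure script once the conftest batch job has
# generated it; otherwise report that the job is still pending.
cfg=./reconfigure-arch-linux2-c-opt.py
if [ -f "$cfg" ]; then
    "$cfg"
else
    echo "conftest job has not finished yet: $cfg is missing"
fi
```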
PETSc will compile its packages, and at the end you will need to copy and paste the `make` commands:
{{{
#!sh
xxx=========================================================================xxx
 Configure stage complete. Now build PETSc libraries with (gnumake build):
   make PETSC_DIR=/home1/mmorligh/issm/trunk-jpl//externalpackages/petsc/src PETSC_ARCH=arch-linux2-c-opt all
xxx=========================================================================xxx
}}}
Copy and execute the `make PETSC_DIR=...` command. Then:
{{{
#!sh
=========================================
gmake[1]: Leaving directory '/home3/mmorligh/issm/trunk-jpl/externalpackages/petsc/src'
Now to install the libraries do:
make PETSC_DIR=/home1/mmorligh/issm/trunk-jpl//externalpackages/petsc/src PETSC_ARCH=arch-linux2-c-opt install
=========================================
}}}
Copy and execute the `make PETSC_DIR=... PETSC_ARCH=arch-linux2-c-opt install` command. And that's it!

Added in v27:

 - PETSc (use `install-3.13-pleiades.sh` or newer)

For documentation of Pleiades, see here: http://www.nas.nasa.gov/hecc/support/kb/

{{{
 --with-mpi-include=" " \
 --with-mpi-libflags=" -lmpi" \
- --with-mkl-libflags="-L/nasa/intel/Compiler/2016.2.181/compilers_and_libraries_2016.2.181/linux/mkl/lib/intel64 -lmkl_intel_lp64 -lmkl_sequential -lmkl_core -lpthread -lm " \
+ --with-mkl-libflags="-L/nasa/intel/Compiler/2016.2.181/mkl/lib/intel64/ -mkl=cluster " \
 --with-metis-dir="$ISSM_DIR/externalpackages/petsc/install" \
 --with-mumps-dir="$ISSM_DIR/externalpackages/petsc/install" \
 --with-scalapack-dir="$ISSM_DIR/externalpackages/petsc/install" \
- --with-cxxoptflags="-O3 -axAVX" \
+ --with-cxxoptflags="-g -O3 -axCORE-AVX2,AVX -xSSE4.2 -ipo -no-inline-min-size -inline-max-size=345 -no-inline-max-total-size -no-inline-max-per-routine -no-inline-max-per-compile " \
 --with-fortran-lib="-L/nasa/intel/Compiler/2016.2.181/compilers_and_libraries_2016.2.181/linux/compiler/lib/intel64/ -lifcore -lifport" \
 --with-vendor="intel-pleiades" \
}}}
     
{{{
 --with-mpi-include=" " \
 --with-mpi-libflags=" -lmpi" \
- --with-mkl-libflags="-L/nasa/intel/Compiler/2016.2.181/mkl/lib/intel64/ -lmkl_intel_lp64 -lmkl_sequential -lmkl_core -lpthread -lm " \
+ --with-mkl-libflags="-L/nasa/intel/Compiler/2016.2.181/mkl/lib/intel64/ -mkl=cluster " \
 --with-metis-dir="$ISSM_DIR/externalpackages/petsc/install" \
 --with-mumps-dir="$ISSM_DIR/externalpackages/petsc/install" \
 --with-scalapack-dir="$ISSM_DIR/externalpackages/petsc/install" \
- --with-cxxoptflags="-g -O3 -axCORE-AVX2,AVX -xSSE4.2" \
+ --with-cxxoptflags="-g -O3 -axCORE-AVX2,AVX -xSSE4.2 -ipo -no-inline-min-size -inline-max-size=345 -no-inline-max-total-size -no-inline-max-per-routine -no-inline-max-per-compile " \
 --with-fortran-lib="-L/nasa/intel/Compiler/2016.2.181/compilers_and_libraries_2016.2.181/linux/compiler/lib/intel64/ -lifcore -lifport" \
 --with-gsl-dir="$ISSM_DIR/externalpackages/gsl/install" \
}}}