Changes between Version 15 and Version 16 of pleiadescsh


Timestamp: 01/21/22 12:57:36 (3 years ago)
Author: schlegel
Modified: pleiadescsh (v15 → v16)
 #Packages
 module load pkgsrc/2020Q4
-module load comp-intel/2016.2.181
-module load mpi-hpe/mpt
+module load comp-intel/2018.3.222
+module load mpi-intel/2018.3.222
+
+#Set compilers
+setenv CC mpicc
+setenv CXX mpicxx
+setenv F77 mpif77
 }}}

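For reference, the v16 version of this `~/.cshrc` fragment, assembled from the added lines above, reads as follows (a sketch; only this excerpt of the file appears in the diff):

{{{
#Packages
module load pkgsrc/2020Q4
module load comp-intel/2018.3.222
module load mpi-intel/2018.3.222

#Set compilers
setenv CC mpicc
setenv CXX mpicxx
setenv F77 mpif77
}}}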
     
 
 '''Do NOT install mpich'''. We have to use the one provided by NAS. Pleiades will ''only'' be used to run the code: you will use your local machine for pre- and post-processing, and you will never use Pleiades' MATLAB. You can check out ISSM and install the following packages:
- - PETSc (use the pleiades script install-3.13-pleiades.sh)
+ - PETSc (use the pleiades script install-3.14-pleiades.sh or newer)
  - m1qn3
 
     
 ./configure \
  --prefix=$ISSM_DIR \
+ --enable-standalone-libraries \
  --with-wrappers=no \
- --with-petsc-dir="$ISSM_DIR/externalpackages/petsc/install" \
- --with-m1qn3-dir="$ISSM_DIR/externalpackages/m1qn3/install" \
+ --with-m1qn3-dir=$ISSM_DIR/externalpackages/m1qn3/install \
+ --with-metis-dir=$ISSM_DIR/externalpackages/petsc/install \
+ --with-petsc-dir=$ISSM_DIR/externalpackages/petsc/install \
+ --with-scalapack-lib="-L/nasa/intel/Compiler/2018.3.222/compilers_and_libraries_2018.3.222/linux/mkl/lib/intel64/libmkl_scalapack_lp64.so" \
  --with-mpi-include=" " \
  --with-mpi-libflags=" -lmpi" \
- --with-mkl-libflags="-L/nasa/intel/Compiler/2016.2.181/compilers_and_libraries_2016.2.181/linux/mkl/lib/intel64 -lmkl_intel_lp64 -lmkl_sequential -lmkl_core -lpthread -lm " \
- --with-metis-dir="$ISSM_DIR/externalpackages/petsc/install" \
- --with-mumps-dir="$ISSM_DIR/externalpackages/petsc/install" \
- --with-scalapack-lib="-L/nasa/intel/Compiler/2016.2.181/compilers_and_libraries_2016.2.181/linux/mkl/lib/intel64/libmkl_scalapack_lp64.so" \
- --with-cxxoptflags="-O3 -axAVX" \
- --with-fortran-lib="-L/nasa/intel/Compiler/2016.2.181/compilers_and_libraries_2016.2.181/linux/compiler/lib/intel64/ -lifcore -lifport" \
- --with-vendor="intel-pleiades" \
+ --with-mkl-libflags="-L/nasa/intel/Compiler/2018.3.222/compilers_and_libraries_2018.3.222/linux/mkl/lib/intel64 -lmkl_intel_lp64 -lmkl_sequential -lmkl_core -lpthread -lm" \
+ --with-mumps-dir=$ISSM_DIR/externalpackages/petsc/install \
+ --with-fortran-lib="-L/nasa/intel/Compiler/2018.3.222/compilers_and_libraries_2018.3.222/linux/compiler/lib/intel64_lin/ -lifcore -lifport -lgfortran" \
+ --with-cxxoptflags="-O3 " \
+ --with-vendor="intel-pleiades-mpi" \
  --enable-development
 }}}
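Before rerunning `./configure` with the v16 options, it may be worth confirming that the Intel toolchain is what the shell actually sees; a small sketch (the exact `module list` output format varies by system):

{{{
module list
which mpicc mpicxx mpif77
}}}

If `mpi-hpe/mpt` still shows up, log out and back in so the updated `~/.cshrc` takes effect.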
     
 == Installing ISSM on Pleiades with Dakota ==
 
-For Dakota to run, you will still need to make PETSc and m1qn3, but you will need to make sure you are using the Intel MPI and that the externalpackages are built with the MPI compilers.
+For Dakota to run, you will not need to remake PETSc and m1qn3, as long as you made them using the Intel MPI and the MPI compilers, as described above.
 
-In your `~/.cshrc`, add the following lines:
-
-{{{
-#Set compilers
-setenv CC mpicc
-setenv CXX mpicxx
-setenv F77 mpif77
-}}}
-
-And change your loaded packages to (note the removal of the pkgsrc):
+In your `~/.cshrc`, change your loaded packages to remove the pkgsrc. Since pkgsrc loads an incompatible boost, it needs to be removed to build dakota:
 
 {{{
 
 }}}
 
-Then ''log out and log back in'', and reinstall the following packages:
- - PETSc (use the pleiades script install-3.14-pleiades.sh)
- - m1qn3
-
-In addition, will need to build the external package:
+Then ''log out and log back in'', and build the external packages:
  - gsl, install-pleiades.sh
  - boost, install-1.55-pleiades.sh
  - dakota, install-6.2-pleiades.sh
 
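The three builds above can be sketched in order, assuming the usual `$ISSM_DIR/externalpackages` layout (the directory names are assumptions; the script names are the ones listed above):

{{{
cd $ISSM_DIR/externalpackages/gsl    && ./install-pleiades.sh
cd $ISSM_DIR/externalpackages/boost  && ./install-1.55-pleiades.sh
cd $ISSM_DIR/externalpackages/dakota && ./install-6.2-pleiades.sh
}}}

boost must be in place before dakota, which links against it.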
-Finally, you will need to use the following configuration script:
+Finally, you will need to use the following configuration script (i.e. adding the "with" lines for dakota and boost):
 
 {{{
     
  --with-mumps-dir=$PETSC_ROOT \
  --with-fortran-lib="-L/nasa/intel/Compiler/2018.3.222/compilers_and_libraries_2018.3.222/linux/compiler/lib/intel64_lin/ -lifcore -lifport -lgfortran" \
- --with-cxxoptflags="-O3 -limf" \
+ --with-cxxoptflags="-O3 " \
  --with-vendor="intel-pleiades-mpi" \
  --enable-development
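After changing configure options, a clean rebuild is the safe path; a minimal sketch, assuming the standard make-driven ISSM build run from `$ISSM_DIR`:

{{{
cd $ISSM_DIR
make clean
make install
}}}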
     
 use your username for the `login` and enter your code path and execution path. These settings will be picked up automatically by MATLAB when you do `md.cluster=pfe()`.
 
-Without dakota, make sure your module list includes scicon/app-tools, comp-intel/2016.2.181, and mpi-hpe/mpt.
-With dakota, make sure your module list includes scicon/app-tools, comp-intel/2018.3.222, and mpi-intel/2018.3.222. You can specify your own list of modules by adding to `pfe_settings.m`, for example:
+Make sure your module list includes scicon/app-tools, comp-intel/2018.3.222, and mpi-intel/2018.3.222. You can specify your own list of modules by adding to `pfe_settings.m`, for example:
 
 {{{