For Dakota to run, you will still need to make PETSc and m1qn3, but you must make sure that you are using the Intel MPI and that the external packages are built with the MPI compilers.

In your `~/.cshrc`, add the following lines:

{{{
#Set compilers
setenv CC mpicc
setenv CXX mpicxx
setenv F77 mpif77
}}}

And change your loaded packages to (note the removal of pkgsrc):

{{{
#Packages
module load comp-intel/2018.3.222
module load mpi-intel/2018.3.222
}}}
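These compiler variables and modules only take effect in a fresh login shell. Once they are in place, you can sanity-check the environment before rebuilding anything (a generic interactive check, not ISSM-specific):

{{{
# Confirm the Intel compiler and MPI modules are loaded
module list

# The MPI compiler wrappers should resolve on your PATH
which mpicc mpicxx mpif77

# The build variables set in ~/.cshrc should point at them
echo $CC $CXX $F77
}}}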

Then ''log out and log back in'', and reinstall the following packages:
- PETSc (use the pleiades script install-3.14-pleiades.sh)
- m1qn3

In addition, you will need to build the following external packages:
- gsl (use install-pleiades.sh)
- boost (use install-1.55-pleiades.sh)
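The rebuilds above all follow the same external-package pattern: change into the package directory and run its install script. As a sketch (assuming `$ISSM_DIR` is set; the m1qn3 script name `install.sh` is an assumption — check the actual script names in each package directory):

{{{
cd $ISSM_DIR/externalpackages/petsc && ./install-3.14-pleiades.sh
cd $ISSM_DIR/externalpackages/m1qn3 && ./install.sh   # script name assumed
cd $ISSM_DIR/externalpackages/gsl   && ./install-pleiades.sh
cd $ISSM_DIR/externalpackages/boost && ./install-1.55-pleiades.sh
}}}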
--with-m1qn3-dir=$ISSM_DIR/externalpackages/m1qn3/install \
--with-triangle-dir=$ISSM_DIR/externalpackages/triangle/install \
--with-metis-dir=$PETSC_ROOT \
--with-petsc-dir=$PETSC_ROOT \
--with-scalapack-lib="-L/nasa/intel/Compiler/2018.3.222/compilers_and_libraries_2018.3.222/linux/mkl/lib/intel64/libmkl_scalapack_lp64.so" \
--with-boost-dir=$ISSM_DIR/externalpackages/boost/install \
--with-mkl-libflags="-L/nasa/intel/Compiler/2018.3.222/compilers_and_libraries_2018.3.222/linux/mkl/lib/intel64 -lmkl_intel_lp64 -lmkl_sequential -lmkl_core -lpthread -lm" \
--with-mumps-dir=$PETSC_ROOT \
--with-fortran-lib="-L/nasa/intel/Compiler/2018.3.222/compilers_and_libraries_2018.3.222/linux/compiler/lib/intel64_lin/ -lifcore -lifport -lgfortran" \
--with-cxxoptflags="-O3" \
use your username for the `login` and enter your code path and execution path. Be sure to create the final execution directory (`mkdir`) within the nobackup folder. These settings will be picked up automatically by MATLAB when you run `md.cluster=pfe()`.

Without Dakota, make sure your module list includes scicon/app-tools, comp-intel/2016.2.181, and mpi-hpe/mpt.
With Dakota, make sure your module list includes scicon/app-tools, comp-intel/2018.3.222, and mpi-intel/2018.3.222. You can specify your own list of modules by adding it to `pfe_settings.m`, for example:

{{{
cluster.modules = {'comp-intel/2018.3.222' 'mpi-intel/2018.3.222' 'scicon/app-tools'};
}}}

To determine your `grouplist`, on Pleiades run: