Changes between Version 15 and Version 16 of pleiadescsh
- Timestamp: 01/21/22 12:57:36
pleiadescsh
'''Modified''' — `~/.cshrc` packages and compilers:

Old (v15):
{{{
#Packages
module load pkgsrc/2020Q4
module load comp-intel/2016.2.181
module load mpi-hpe/mpt
}}}

New (v16):
{{{
#Packages
module load pkgsrc/2020Q4
module load comp-intel/2018.3.222
module load mpi-intel/2018.3.222

#Set compilers
setenv CC mpicc
setenv CXX mpicxx
setenv F77 mpif77
}}}

'''Unmodified''' context:

'''Do NOT install mpich'''. We have to use the one provided by NAS. Pleiades will ''only'' be used to run the code; you will use your local machine for pre- and post-processing, and you will never use Pleiades' matlab. You can check out ISSM and install the following packages:

'''Modified''' — PETSc install script:

Old (v15):
 - PETSc (use the pleiades script install-3.13-pleiades.sh)
 - m1qn3

New (v16):
 - PETSc (use the pleiades script install-3.14-pleiades.sh or newer)
 - m1qn3

'''Modified''' — ISSM configuration script:

Old (v15):
{{{
./configure \
 --prefix=$ISSM_DIR \
 --with-wrappers=no \
 --with-petsc-dir="$ISSM_DIR/externalpackages/petsc/install" \
 --with-m1qn3-dir="$ISSM_DIR/externalpackages/m1qn3/install" \
 --with-mpi-include=" " \
 --with-mpi-libflags=" -lmpi" \
 --with-mkl-libflags="-L/nasa/intel/Compiler/2016.2.181/compilers_and_libraries_2016.2.181/linux/mkl/lib/intel64 -lmkl_intel_lp64 -lmkl_sequential -lmkl_core -lpthread -lm" \
 --with-metis-dir="$ISSM_DIR/externalpackages/petsc/install" \
 --with-mumps-dir="$ISSM_DIR/externalpackages/petsc/install" \
 --with-scalapack-lib="-L/nasa/intel/Compiler/2016.2.181/compilers_and_libraries_2016.2.181/linux/mkl/lib/intel64/libmkl_scalapack_lp64.so" \
 --with-cxxoptflags="-O3 -axAVX" \
 --with-fortran-lib="-L/nasa/intel/Compiler/2016.2.181/compilers_and_libraries_2016.2.181/linux/compiler/lib/intel64/ -lifcore -lifport" \
 --with-vendor="intel-pleiades" \
 --enable-development
}}}

New (v16):
{{{
./configure \
 --prefix=$ISSM_DIR \
 --enable-standalone-libraries \
 --with-wrappers=no \
 --with-m1qn3-dir=$ISSM_DIR/externalpackages/m1qn3/install \
 --with-metis-dir=$ISSM_DIR/externalpackages/petsc/install \
 --with-petsc-dir=$ISSM_DIR/externalpackages/petsc/install \
 --with-scalapack-lib="-L/nasa/intel/Compiler/2018.3.222/compilers_and_libraries_2018.3.222/linux/mkl/lib/intel64/libmkl_scalapack_lp64.so" \
 --with-mpi-include=" " \
 --with-mpi-libflags=" -lmpi" \
 --with-mkl-libflags="-L/nasa/intel/Compiler/2018.3.222/compilers_and_libraries_2018.3.222/linux/mkl/lib/intel64 -lmkl_intel_lp64 -lmkl_sequential -lmkl_core -lpthread -lm" \
 --with-mumps-dir=$ISSM_DIR/externalpackages/petsc/install \
 --with-fortran-lib="-L/nasa/intel/Compiler/2018.3.222/compilers_and_libraries_2018.3.222/linux/compiler/lib/intel64_lin/ -lifcore -lifport -lgfortran" \
 --with-cxxoptflags="-O3" \
 --with-vendor="intel-pleiades-mpi" \
 --enable-development
}}}

== Installing ISSM on Pleiades with Dakota ==

'''Modified''' — opening paragraph:

Old (v15):
 For Dakota to run, you will still need to make PETSc and m1qn3, but you will need to make sure you are using the intel mpi and that the external packages are built with the mpi compilers.

New (v16):
 For Dakota to run, you will not need to remake PETSc and m1qn3, as long as you made them using the intel mpi and the mpi compilers, as described above.

'''Modified''' — `~/.cshrc` instructions (the compiler variables are now part of the main setup above):

Old (v15):
 In your `~/.cshrc`, add the following lines:
{{{
#Set compilers
setenv CC mpicc
setenv CXX mpicxx
setenv F77 mpif77
}}}
 And change your loaded packages to (note the removal of the pkgsrc):

New (v16):
 In your `~/.cshrc`, change your loaded packages to remove the pkgsrc. Since pkgsrc loads an incompatible boost, it needs to be removed to build dakota:

(The module list block that follows is unchanged in this diff.)

'''Modified''' — packages to build after logging out and back in:

Old (v15):
 Then ''log out and log back in'', and reinstall the following packages:
 - PETSc (use the pleiades script install-3.14-pleiades.sh)
 - m1qn3

 In addition, you will need to build the external packages:
 - gsl, install-pleiades.sh
 - boost, install-1.55-pleiades.sh
 - dakota, install-6.2-pleiades.sh

New (v16):
 Then ''log out and log back in'', and build the external packages:
 - gsl, install-pleiades.sh
 - boost, install-1.55-pleiades.sh
 - dakota, install-6.2-pleiades.sh

'''Modified''' — introduction to the Dakota configuration script:

Old (v15):
 Finally, you will need to use the following configuration script:

New (v16):
 Finally, you will need to use the following configuration script (i.e. adding the "with" lines for dakota and boost):

'''Modified''' — in the Dakota configuration script, only one flag changed (the neighboring lines, `--with-mumps-dir=$PETSC_ROOT \`, `--with-fortran-lib=...`, `--with-vendor="intel-pleiades-mpi" \`, and `--enable-development`, are unchanged):

Old (v15):
{{{
 --with-cxxoptflags="-O3 -limf" \
}}}

New (v16):
{{{
 --with-cxxoptflags="-O3" \
}}}

'''Modified''' — module list notes in the `pfe_settings.m` section (after the unchanged context "use your username for the `login` and enter your code path and execution path. These settings will be picked up automatically by matlab when you do `md.cluster=pfe()`"):

Old (v15):
 Without dakota, make sure your module list includes scicon/app-tools, comp-intel/2016.2.181, and mpi-hpe/mpt. With dakota, make sure your module list includes scicon/app-tools, comp-intel/2018.3.222, and mpi-intel/2018.3.222. You can specify your own list of modules by adding to `pfe_settings.m`, for example:

New (v16):
 Make sure your module list includes scicon/app-tools, comp-intel/2018.3.222, and mpi-intel/2018.3.222. You can specify your own list of modules by adding to `pfe_settings.m`, for example:

(The `pfe_settings.m` example block that follows is unchanged.)
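
After switching `~/.cshrc` to the v16 modules and compiler variables, it can be worth sanity-checking the environment before running the configure script. This is only a sketch, not part of the page's own instructions: the module and compiler names come from the v16 text above, and the commands are standard csh/tcsh and environment-modules usage on a Pleiades front end:

{{{
# After logging back in (so ~/.cshrc is re-read):
module list        # should include comp-intel/2018.3.222 and mpi-intel/2018.3.222
echo $CC $CXX $F77 # should print: mpicc mpicxx mpif77
which mpicc        # the MPI compiler wrapper must resolve on your PATH
}}}

If any of these are missing, the ISSM configure step will pick up the wrong compilers.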