
== Selecting MPI ==

First, select a message passing interface (MPI) implementation on Greenplanet. Because Greenplanet is a shared cluster, we do not build our own MPICH; we use one of the MPI installations Greenplanet already provides. Log in and enter the following command:

{{{
#!sh
mpi-selector --set openmpi_intel-1.5.4_psm
}}}

''Log out and log back in'' to apply this change.
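
To confirm that the selection took effect after logging back in, you can query mpi-selector and check that the compiler wrappers resolve to the selected Open MPI (a minimal sanity check using standard mpi-selector options):

{{{
#!sh
mpi-selector --query   # should report openmpi_intel-1.5.4_psm as the default
which mpicc mpirun     # wrappers should resolve to the selected Open MPI
}}}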

== Environment ==

On Greenplanet, add the following lines to your shell startup file (e.g. `~/.bashrc`), where `PATHTOTRUNK` is the path to your ISSM checkout:
{{{
#!sh
export ISSM_DIR=PATHTOTRUNK
source $ISSM_DIR/etc/environment.sh
module load gcc/4.7.3
module load cmake/2.8.10.2
}}}

''Log out and log back in'' to apply this change.
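
After logging back in, you can verify that the environment is in place (a quick sanity check; `module list` is the standard way to show loaded modules):

{{{
#!sh
echo $ISSM_DIR   # should print the path to your ISSM trunk
module list      # gcc/4.7.3 and cmake/2.8.10.2 should appear
}}}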

== Installing ISSM on Greenplanet ==

Greenplanet will ''only'' be used to run the code: all pre- and post-processing happens on your local machine, and you will never use MATLAB on Greenplanet. Check out ISSM and install the following packages (a hedged sketch of the checkout and install sequence follows this list):
- PETSc (use the Greenplanet script and follow the instructions; you will need to submit a job and compile PETSc manually. Do not run `make test`: it will not work on the cluster)
- m1qn3
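
For reference, here is a hedged sketch of the checkout and install sequence. The repository URL and the m1qn3 install script name are assumptions based on the usual ISSM external-package layout; check the script names that actually ship in `$ISSM_DIR/externalpackages` before running anything:

{{{
#!sh
# Check out ISSM (URL is an assumption; use your project's actual repository)
svn checkout https://issm.ess.uci.edu/svn/issm/issm/trunk $ISSM_DIR

# PETSc: list the install scripts to find the Greenplanet one, then follow
# its instructions (submit a job and compile manually; skip "make test")
cd $ISSM_DIR/externalpackages/petsc
ls install-*.sh

# m1qn3: a plain install script is the usual layout (assumed name)
cd $ISSM_DIR/externalpackages/m1qn3
./install.sh
}}}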