Hi everyone,

Sorry to bother you again. I used the Intel 2017 compilers and GCC 8.3 to compile PETSc 3.17 without much trouble; however, the ISSM build did not go as well. The system is running CentOS 7.6.
I used to compile the same stack without issue using Intel 2018 + GCC 7, but this new cluster does not have Intel 2018 available.

Cheers,
Wade

$ make
make all-recursive
make[1]: Entering directory '/work1/xdsc0571/ISSM_src/ISSM_4.21'
Making all in src
make[2]: Entering directory '/work1/xdsc0571/ISSM_src/ISSM_4.21/src'
Making all in c
make[3]: Entering directory '/work1/xdsc0571/ISSM_src/ISSM_4.21/src/c'
CXX modules/Bamgx/libISSMCore_la-Bamgx.lo
In file included from /public/home/xdsc0571/work/ISSM_src/ISSM_4.21/externalpackages/petsc/install/include/petscsys.h(1454),
from /public/home/xdsc0571/work/ISSM_src/ISSM_4.21/externalpackages/petsc/install/include/petscvec.h(9),
from /public/home/xdsc0571/work/ISSM_src/ISSM_4.21/externalpackages/petsc/install/include/petscmat.h(6),
from /public/home/xdsc0571/work/ISSM_src/ISSM_4.21/externalpackages/petsc/install/include/petscpc.h(6),
from /public/home/xdsc0571/work/ISSM_src/ISSM_4.21/externalpackages/petsc/install/include/petscksp.h(6),
from ./modules/Bamgx/../../toolkits/./petsc/petscincludes.h(11),
from ./modules/Bamgx/../../toolkits/toolkits.h(15),
from ./modules/Bamgx/Bamgx.cpp(7):
/public/home/xdsc0571/work/ISSM_src/ISSM_4.21/externalpackages/petsc/install/include/petsclog.h(394): error: identifier "PETSC_FUNCTION_NAME_CXX" is undefined
PetscCallMPI(MPI_Type_size(type,&typesize));
^

In file included from /public/home/xdsc0571/work/ISSM_src/ISSM_4.21/externalpackages/petsc/install/include/petscsys.h(1454),
from /public/home/xdsc0571/work/ISSM_src/ISSM_4.21/externalpackages/petsc/install/include/petscvec.h(9),
from /public/home/xdsc0571/work/ISSM_src/ISSM_4.21/externalpackages/petsc/install/include/petscmat.h(6),
from /public/home/xdsc0571/work/ISSM_src/ISSM_4.21/externalpackages/petsc/install/include/petscpc.h(6),
from /public/home/xdsc0571/work/ISSM_src/ISSM_4.21/externalpackages/petsc/install/include/petscksp.h(6),
from ./modules/Bamgx/../../toolkits/./petsc/petscincludes.h(11),
from ./modules/Bamgx/../../toolkits/toolkits.h(15),
from ./modules/Bamgx/Bamgx.cpp(7):
/public/home/xdsc0571/work/ISSM_src/ISSM_4.21/externalpackages/petsc/install/include/petsclog.h(404): error: identifier "PETSC_FUNCTION_NAME_CXX" is undefined
PetscCallMPI(MPI_Comm_size(comm,&size));
^

PETSc configure command:
./configure \
--COPTFLAGS="-Ofast -march=core-avx2 -I${MKLROOT}/include" \
--CXXOPTFLAGS="-Ofast -march=core-avx2 -I${MKLROOT}/include" \
--FOPTFLAGS="-Ofast -march=core-avx2" \
--prefix=/work1/xdsc0571/ISSM_src/ISSM_4.21/externalpackages/petsc/install \
--PETSC_DIR=/work1/xdsc0571/ISSM_src/ISSM_4.21/externalpackages/petsc/src \
--download-fblaslapack \
--with-debugging=0 \
--with-valgrind=0 \
--with-x=0 \
--with-ssl=0 \
--with-shared-libraries=1 \
--download-metis=1 \
--download-parmetis=1 \
--download-mumps=1 \
--with-blas-lapack-dir="${MKLROOT}" \
--with-scalapack-include="${MKLROOT}/include" \
--with-scalapack-lib="-L${MKLROOT}/lib/intel64 -lmkl_scalapack_lp64 -lmkl_cdft_core -lmkl_intel_lp64 -lmkl_intel_thread -lmkl_core -lmkl_blacs_intelmpi_lp64 -liomp5 -lpthread -lm -ldl" \
--with-mpi-dir=/opt/hpc/software/mpi/hpcx/v2.7.4/intel-2017.5.239 \
--known-mpi-shared-libraries=1
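
As a side check, a prefix install of PETSc records the compilers it was configured with under lib/petsc/conf; something along these lines shows which wrappers were actually picked up (the path simply reuses the --prefix above, and the exact file layout and variable names can differ between PETSc versions):

# Sketch: inspect the compilers recorded by the PETSc install
# (layout assumed from a typical --prefix install).
PETSC_PREFIX=/work1/xdsc0571/ISSM_src/ISSM_4.21/externalpackages/petsc/install
grep -E '^(CC|CXX|FC) ' "${PETSC_PREFIX}/lib/petsc/conf/petscvariables"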

ISSM configure command:
CC=mpicc CXX=mpicxx F77=mpifort F90=mpifort ./configure \
--prefix=$ISSM_DIR/ \
--with-matlab-dir=/public/home/xdsc0571/matlab-test/MATLAB-install/R2021b \
--with-triangle-dir=$ISSM_DIR/externalpackages/triangle/install \
--with-mpi-include="/opt/hpc/software/mpi/hpcx/v2.7.4/intel-2017.5.239/include" \
--with-mpi-libflags="-L/opt/hpc/software/mpi/hpcx/v2.7.4/intel-2017.5.239/lib/ -lmpi" \
--with-petsc-dir=$ISSM_DIR/externalpackages/petsc/install \
--with-metis-dir=$ISSM_DIR/externalpackages/petsc/install \
--with-blas-lapack-dir=$ISSM_DIR/externalpackages/petsc/install \
--with-scalapack-include="$MKLROOT/include" \
--with-scalapack-lib="-L${MKLROOT}/lib/intel64 -lmkl_scalapack_lp64 -lmkl_cdft_core -lmkl_intel_lp64 -lmkl_intel_thread -lmkl_core -lmkl_blacs_intelmpi_lp64 -liomp5 -lpthread -lm -ldl" \
--with-mkl-libflags="$MKLROOT/lib/intel64/libmkl_blas95_ilp64.a $MKLROOT/lib/intel64/libmkl_lapack95_ilp64.a -L$MKLROOT/lib/intel64 -lmkl_scalapack_ilp64 -lmkl_cdft_core -lmkl_intel_ilp64 -lmkl_intel_thread -lmkl_core -lmkl_blacs_intelmpi_ilp64 -liomp5 -lpthread -lifcore -lm -ldl" \
--with-mumps-dir=$ISSM_DIR/externalpackages/petsc/install \
--with-m1qn3-dir=$ISSM_DIR/externalpackages/m1qn3/install \
--with-semic-dir=$ISSM_DIR/externalpackages/semic/install \
--with-cxxoptflags="-Ofast -march=core-avx2 -ansi" \
--with-numthreads=16
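
A quick sanity check before running this configure, assuming the HPC-X wrappers shown in the MPI paths above: confirm that the mpicc/mpicxx on the PATH are the same wrappers PETSc was built against, for example:

# Sketch: verify the MPI wrappers ISSM will use match the MPI PETSc was built with.
which mpicc mpicxx mpifort
mpicc --version     # underlying C compiler the wrapper invokes
mpicc --showme      # Open MPI / HPC-X wrapper expansion ("mpicc -show" on MPICH or Intel MPI)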

9 days later

justinquinn:

Hi Wade,
I would guess that you need to ensure that PETSc is compiled with the same MPI implementation that ISSM is being built with. If you run which mpicc in a new terminal, does it return /opt/hpc/software/mpi/hpcx/v2.7.4/intel-2017.5.239/bin/mpicc (or something similar)? If so, I would try adding

export CC=mpicc
export CXX=mpicxx
export FC=mpifort

to the top of your PETSc install script and rerunning it.
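
A minimal sketch of what that might look like, assuming the HPC-X path quoted earlier in this thread (adjust to your module setup):

# Sketch of the top of the PETSc install script (paths are assumptions from this thread).
export MPI_HOME=/opt/hpc/software/mpi/hpcx/v2.7.4/intel-2017.5.239
export PATH="${MPI_HOME}/bin:${PATH}"
export CC=mpicc
export CXX=mpicxx
export FC=mpifort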

10 days later

wadewei:

Hi Justin,

Thank you for your kind reply. That is exactly what I suspected: something must have gotten mixed up along the way. I recompiled everything from scratch, and it all went smoothly.
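
For later readers, a from-scratch rebuild would look roughly like the sketch below; the PETSc install script name is a guess and may differ in your ISSM tree:

# Rough outline of a clean rebuild (the install script name is a guess; check
# externalpackages/petsc/ in your ISSM tree for the actual one).
cd $ISSM_DIR/externalpackages/petsc
rm -rf src install
export CC=mpicc CXX=mpicxx FC=mpifort
./install-3.17.sh            # rerun the PETSc install script with the exports in place
cd $ISSM_DIR
# rerun ./configure with the same options as above, then:
make clean && make && make install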