Hello,

I am attempting to use PETSc with CUDA awareness in ISSM. PETSc does compile successfully but, when I try to make ISSM, I get these errors:

CXX main/kriging-kriging.o
CXXLD libISSMCore.la
CXXLD libISSMOverload.la
CXXLD libISSMModules.la
CXXLD issm.exe
CXXLD issm_slc.exe
CXXLD kriging.exe
/u/kmosley/issm-files/ISSM/externalpackages/petsc/install/lib/libpetsc.so: undefined reference to `cusparseSpMV_preprocess@libcusparse.so.12'
/u/kmosley/issm-files/ISSM/externalpackages/petsc/install/lib/libpetsc.so: undefined reference to `cusparseSpMV_preprocess@libcusparse.so.12'
/u/kmosley/issm-files/ISSM/externalpackages/petsc/install/lib/libpetsc.so: undefined reference to `cusparseSpMV_preprocess@libcusparse.so.12'
collect2: error: ld returned 1 exit status
make[3]: *** [Makefile:5236: issm.exe] Error 1
make[3]: *** Waiting for unfinished jobs....
collect2: error: ld returned 1 exit status
make[3]: *** [Makefile:5266: kriging.exe] Error 1
collect2: error: ld returned 1 exit status
make[3]: *** [Makefile:5260: issm_slc.exe] Error 1
make[3]: Leaving directory '/u/kmosley/issm-files/ISSM/src/c'
make[2]: *** [Makefile:465: all-recursive] Error 1
make[2]: Leaving directory '/u/kmosley/issm-files/ISSM/src'
make[1]: *** [Makefile:520: all-recursive] Error 1
make[1]: Leaving directory '/u/kmosley/issm-files/ISSM'
make: *** [Makefile:452: all] Error 2

Is there something I should add to the configure script to make this work? I will attach my PETSc and ISSM configure scripts. Thanks

install-322-linux.sh
1kB
configure.sh
807B

    Hi Ken, you have to find which library defines this symbol and add it to your configuration script. Can you check whether /u/kmosley/issm-files/ISSM/externalpackages/petsc/install/lib/libcusparse.so.12 defines it?
    Cheers
    Mathieu
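
    A quick way to run that check is `nm -D --defined-only`, which lists the dynamic symbols a shared object exports. The sketch below greps the two candidate libcusparse locations that appear in this thread (the PETSc install directory and the Spack CUDA toolkit); both paths are site-specific examples, so adjust them to your machine.

```shell
#!/bin/sh
# Check which shared object, if any, exports the missing symbol.
# Both candidate paths are copied from this thread; adjust for your site.
SYMBOL=cusparseSpMV_preprocess
for so in \
    /u/kmosley/issm-files/ISSM/externalpackages/petsc/install/lib/libcusparse.so.12 \
    /sw/spack/deltas11-2023-03/apps/linux-rhel8-zen3/gcc-11.4.0/cuda-11.8.0-vfixfmc/lib64/libcusparse.so.12
do
    [ -e "$so" ] || continue   # skip paths that do not exist on this machine
    if nm -D --defined-only "$so" | grep -q "$SYMBOL"; then
        echo "$so defines $SYMBOL"
    fi
done
echo "symbol check finished"
```

    If neither file exists or neither defines the symbol, the library that does define it is simply not on the link line yet, which is what the workaround below adds.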

    kmosley Hi Ken, I think we may have to modify the Automake files in order to make this work. You may be able to get away with it temporarily by adding,

    export LDFLAGS="${LDFLAGS} -Wl,-L${ISSM_DIR}/externalpackages/petsc/install/lib -Wl,-lcusparse"

    to your config. By the way, what do you get from,

    cd $ISSM_DIR/externalpackages/petsc/src; make getlinklibs

      justinquinn Hi, this is what I get

      [kmosley@dt-login01 petsc]$ cd $ISSM_DIR/externalpackages/petsc; make getlinklibs
      -Wl,-rpath,/u/kmosley/issm-files/ISSM/externalpackages/petsc/src/arch-linux-c-opt/lib -L/u/kmosley/issm-files/ISSM/externalpackages/petsc/src/arch-linux-c-opt/lib -Wl,-rpath,/u/kmosley/issm-files/ISSM/externalpackages/petsc/install/lib -L/u/kmosley/issm-files/ISSM/externalpackages/petsc/install/lib -Wl,-rpath,/sw/spack/deltas11-2023-03/apps/linux-rhel8-zen3/gcc-11.4.0/cuda-11.8.0-vfixfmc/lib64 -L/sw/spack/deltas11-2023-03/apps/linux-rhel8-zen3/gcc-11.4.0/cuda-11.8.0-vfixfmc/lib64 -L/sw/spack/deltas11-2023-03/apps/linux-rhel8-zen3/gcc-11.4.0/cuda-11.8.0-vfixfmc/lib64/stubs -Wl,-rpath,/sw/spack/deltas11-2023-03/apps/linux-rhel8-x86_64/gcc-8.5.0/gcc-11.4.0-yycklku/lib64 -L/sw/spack/deltas11-2023-03/apps/linux-rhel8-x86_64/gcc-8.5.0/gcc-11.4.0-yycklku/lib64 -Wl,-rpath,/sw/spack/deltas11-2023-03/apps/linux-rhel8-x86_64/gcc-8.5.0/gcc-11.4.0-yycklku/lib/gcc/x86_64-pc-linux-gnu/11.4.0 -L/sw/spack/deltas11-2023-03/apps/linux-rhel8-x86_64/gcc-8.5.0/gcc-11.4.0-yycklku/lib/gcc/x86_64-pc-linux-gnu/11.4.0 -Wl,-rpath,/sw/spack/deltas11-2023-03/apps/linux-rhel8-zen3/gcc-11.4.0/openmpi-4.1.6-lranp74/lib -L/sw/spack/deltas11-2023-03/apps/linux-rhel8-zen3/gcc-11.4.0/openmpi-4.1.6-lranp74/lib -Wl,-rpath,/sw/spack/deltas11-2023-03/apps/linux-rhel8-x86_64/gcc-8.5.0/gcc-11.4.0-yycklku/lib -L/sw/spack/deltas11-2023-03/apps/linux-rhel8-x86_64/gcc-8.5.0/gcc-11.4.0-yycklku/lib -lpetsc -ldmumps -lmumps_common -lpord -lpthread -lscalapack -lflapack -lfblas -lparmetis -lmetis -lm -lz -lcudart -lnvToolsExt -lcufft -lcublas -lcusparse -lcusolver -lcurand -lcuda -ldl -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lmpi -lgfortran -lm -lgfortran -lm -lgcc_s -lquadmath -lpthread -lstdc++ -ldl

        kmosley Okay, so to link in CUDA, you can try,

        export LDFLAGS="${LDFLAGS} -Wl,-L${ISSM_DIR}/externalpackages/petsc/install/lib -Wl,-lcudart -Wl,-lnvToolsExt -Wl,-lcufft -Wl,-lcublas -Wl,-lcusparse -Wl,-lcusolver -Wl,-lcurand -Wl,-lcuda"

        I can't promise it will work, though. I'll work on incorporating this into our build system and will update you when I've committed the changes.
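
        Once the link succeeds, one hedged sanity check is to run `ldd` on the resulting executable and confirm the CUDA libraries actually resolve. The paths below are an assumption based on the make log above (issm.exe is built in src/c; with libtool the real binary sometimes lives under src/c/.libs instead).

```shell
#!/bin/sh
# Confirm a freshly linked issm.exe resolves the CUDA runtime libraries.
# Executable locations are guesses based on the build log; adjust as needed.
for exe in "$ISSM_DIR/src/c/issm.exe" "$ISSM_DIR/src/c/.libs/issm.exe"; do
    if [ -x "$exe" ]; then
        ldd "$exe" | grep -E 'cusparse|cublas|cudart'
        break
    fi
done
echo "ldd check finished"
```

        Each CUDA entry should show a resolved path rather than "not found"; a "not found" here usually means an rpath or LD_LIBRARY_PATH entry is still missing at run time even though the link succeeded.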

          justinquinn Would something like this work for OpenMPI? I'm trying to use the OpenMPI installed on my cluster, but when I run make I get the following:

          CXXLD libISSMCore.la
          /usr/bin/ld: cannot find -ludev
          /usr/bin/ld: cannot find -levent_core
          /usr/bin/ld: cannot find -levent_pthreads
          collect2: error: ld returned 1 exit status
          make[3]: *** [Makefile:4885: libISSMCore.la] Error 1
          make[3]: Leaving directory '/home/kenneth.mosley/issm-files/ISSM/src/c'
          make[2]: *** [Makefile:466: all-recursive] Error 1
          make[2]: Leaving directory '/home/kenneth.mosley/issm-files/ISSM/src'
          make[1]: *** [Makefile:521: all-recursive] Error 1
          make[1]: Leaving directory '/home/kenneth.mosley/issm-files/ISSM'
          make: *** [Makefile:453: all] Error 2
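
          Those `-ludev`/`-levent_core`/`-levent_pthreads` errors usually mean the libudev and libevent libraries (or their development packages) are either not installed or sit outside the linker's default search path. A rough way to hunt for them is sketched below; the search directories are common Linux defaults, not anything specific to this cluster.

```shell
#!/bin/sh
# Look for the libraries the linker reports as missing. The directories
# searched are common defaults; extend the list for your system.
for name in libudev libevent_core libevent_pthreads; do
    find /usr/lib /usr/lib64 /usr/local/lib -name "${name}.so*" 2>/dev/null
done
echo "library search finished"
```

          If they turn up in a nonstandard directory, adding it via export LDFLAGS="${LDFLAGS} -L/path/to/that/dir" (mirroring the CUDA workaround above) may get past the link errors; if nothing turns up at all, the libevent/libudev development packages likely need to be installed first.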

            kmosley Hi Ken, let's try to do one thing at a time. Besides, you installed MPICH via PETSc and supplied its path in your ISSM configuration.

            Did you get ISSM linking with CUDA already?