[Trilinos-Users] Problem Building Teuchos

Matt G mgoodman at email.arizona.edu
Wed Mar 17 10:52:13 MDT 2010


I need to compile Trilinos with MPI support using the Intel compiler chain,
and I am getting hung up while linking a Teuchos BLAS test which, as far as I
can tell, doesn't use MPI itself but still complains about missing MPI symbols:

[  3%] Building CXX object
packages/teuchos/test/BLAS/CMakeFiles/Teuchos_BLAS_test.dir/cxx_main.cpp.o
cd /scr6/mgoodman/trilinos-10.0.5-Source/packages/teuchos/test/BLAS &&
/usr/local/uaopt/intel/cc/11.1/064/bin/intel64/icpc    -g
-I/scr6/mgoodman/trilinos-10.0.5-Source
-I/scr6/mgoodman/trilinos-10.0.5-Source/packages/teuchos/src   -o
CMakeFiles/Teuchos_BLAS_test.dir/cxx_main.cpp.o -c
/scr6/mgoodman/trilinos-10.0.5-Source/packages/teuchos/test/BLAS/cxx_main.cpp
Linking CXX executable Teuchos_BLAS_test.exe
cd /scr6/mgoodman/trilinos-10.0.5-Source/packages/teuchos/test/BLAS &&
/homeA/home3/u27/mgoodman/POST/bin/cmake -E cmake_link_script
CMakeFiles/Teuchos_BLAS_test.dir/link.txt --verbose=1
/usr/local/uaopt/intel/cc/11.1/064/bin/intel64/icpc   -g
CMakeFiles/Teuchos_BLAS_test.dir/cxx_main.cpp.o  -o Teuchos_BLAS_test.exe
-rdynamic -L/scr6/mgoodman/trilinos-10.0.5-Source/packages/teuchos/src
../../src/libteuchos.so /usr/local/lib/liblapack.so
/usr/local/lib/libblas.so
-Wl,-rpath,/scr6/mgoodman/trilinos-10.0.5-Source/packages/teuchos/src:/usr/local/lib
../../src/libteuchos.so: undefined reference to `MPI_Allgather'
../../src/libteuchos.so: undefined reference to `MPI_Bcast'
../../src/libteuchos.so: undefined reference to `MPI_Op_create'
../../src/libteuchos.so: undefined reference to `MPI_Alltoall'
../../src/libteuchos.so: undefined reference to `MPI_Initialized'
../../src/libteuchos.so: undefined reference to `MPI_Barrier'
../../src/libteuchos.so: undefined reference to `MPI_Finalize'
../../src/libteuchos.so: undefined reference to `MPI_Gather'
../../src/libteuchos.so: undefined reference to `MPI_Get_processor_name'
../../src/libteuchos.so: undefined reference to `MPI_Comm_size'
../../src/libteuchos.so: undefined reference to `MPI_Allreduce'
../../src/libteuchos.so: undefined reference to `MPI_Allgatherv'
../../src/libteuchos.so: undefined reference to `MPI_Wtime'
../../src/libteuchos.so: undefined reference to `MPI_Comm_rank'
../../src/libteuchos.so: undefined reference to `MPI_Init'
../../src/libteuchos.so: undefined reference to `MPI_Alltoallv'
../../src/libteuchos.so: undefined reference to `MPI_Gatherv'
make[2]: *** [packages/teuchos/test/BLAS/Teuchos_BLAS_test.exe] Error 1
make[2]: Leaving directory `/scr6/mgoodman/trilinos-10.0.5-Source'
make[1]: ***
[packages/teuchos/test/BLAS/CMakeFiles/Teuchos_BLAS_test.dir/all] Error 2
make[1]: Leaving directory `/scr6/mgoodman/trilinos-10.0.5-Source'
make: *** [all] Error 2
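
In case it helps, this is the sort of check I have been running to confirm what the log already suggests: libteuchos.so was built expecting MPI, but the generated link command for the test never pulls in an MPI library (paths copied from the log above):

# libteuchos.so carries undefined MPI_* symbols:
nm -D /scr6/mgoodman/trilinos-10.0.5-Source/packages/teuchos/src/libteuchos.so | grep ' U MPI_'

# ...but the generated link command for the test never mentions an MPI library:
grep -i mpi /scr6/mgoodman/trilinos-10.0.5-Source/packages/teuchos/test/BLAS/CMakeFiles/Teuchos_BLAS_test.dir/link.txt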

Here is my CMake configure script as well:
cmake \
    -D CMAKE_BUILD_TYPE:STRING=DEBUG \
    -D Trilinos_ENABLE_TESTS:BOOL=OFF \
    -D CMAKE_Fortran_COMPILER:FILEPATH=ifort \
    -D CMAKE_CXX_COMPILER:FILEPATH=icpc \
    -D CMAKE_C_COMPILER:FILEPATH=icc \
    -D MPI_USE_COMPILER_WRAPPERS:BOOL=TRUE   \
    -D MPI_C_COMPILER:FILEPATH=/uaopt/intel/impi/3.2.2.006/bin64/mpiicc \
    -D MPI_CXX_COMPILER:FILEPATH=/uaopt/intel/impi/3.2.2.006/bin64/mpiicpc \
    -D MPI_Fortran_COMPILER:FILEPATH=/uaopt/intel/impi/3.2.2.006/bin64/mpiifort \
    -D MPI_BASE_DIR:PATH=/uaopt/intel/impi/3.2.2.006/ \
    -D Trilinos_EXTRA_LINK_FLAGS:STRING="-L/uaopt/intel/impi/3.2.2.006/lib64-lmpi -lmpiif -lmpigi -lrt -lpthread -ldl" \
    -D BUILD_SHARED_LIBS=ON   \
    -D Trilinos_ENABLE_DEFAULT_PACKAGES:BOOL=ON \
    -D Trilinos_ENABLE_Epetra:BOOL=ON \
    -D Trilinos_ENABLE_EpetraExt:BOOL=ON \
    -D Trilinos_ENABLE_LOCA:BOOL=ON \
    -D Trilinos_ENABLE_Zoltan:BOOL=ON \
    -D Trilinos_ENABLE_Isorropia:BOOL=ON \
    -D Trilinos_ENABLE_Ifpack:BOOL=ON \
    -D Trilinos_ENABLE_Teuchos:BOOL=ON \
    -D Trilinos_ENABLE_ML:BOOL=ON \
    -D Trilinos_ENABLE_AztecOO:BOOL=ON \
    -D Trilinos_ENABLE_Anasazi:BOOL=ON \
    -D Trilinos_ENABLE_ForTrilinos:BOOL=ON \
    -D Trilinos_ENABLE_PyTrilinos:BOOL=ON \
    -D Trilinos_ENABLE_TESTS:BOOL=ON \
    -D Teuchos_ENABLE_MPI:BOOL=ON \
    -D Teuchos_ENABLE_ABC:BOOL=ON \
    -D Teuchos_ENABLE_COMPLEX:BOOL=ON \
    -D Teuchos_ENABLE_EXTENDED:BOOL=ON \
    -D DART_TESTING_TIMEOUT:STRING=600 \
    -D TPL_ENABLE_MPI:BOOL=ON \
    -D Trilinos_EXTRA_LINK_FLAGS:STRING="$FORTRAN_LIBRARIES" \
    -D CMAKE_VERBOSE_MAKEFILE:BOOL=TRUE       \
    -D Trilinos_VERBOSE_CONFIGURE:BOOL=TRUE   \
    .
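
Looking at it again, I also notice that the -L path in the first Trilinos_EXTRA_LINK_FLAGS seems to have lost the space before -lmpi, and that I pass Trilinos_EXTRA_LINK_FLAGS twice; I believe the second definition (the $FORTRAN_LIBRARIES one) overrides the first, so the Intel MPI libraries listed there may never reach the link line at all. One thing I have been tempted to try is a stripped-down configure that hands the Intel MPI wrappers to CMake directly as the compilers, something along these lines (just a sketch reusing the paths and options from the script above, not something I have verified yet):

cmake \
    -D CMAKE_BUILD_TYPE:STRING=DEBUG \
    -D CMAKE_C_COMPILER:FILEPATH=/uaopt/intel/impi/3.2.2.006/bin64/mpiicc \
    -D CMAKE_CXX_COMPILER:FILEPATH=/uaopt/intel/impi/3.2.2.006/bin64/mpiicpc \
    -D CMAKE_Fortran_COMPILER:FILEPATH=/uaopt/intel/impi/3.2.2.006/bin64/mpiifort \
    -D TPL_ENABLE_MPI:BOOL=ON \
    -D MPI_BASE_DIR:PATH=/uaopt/intel/impi/3.2.2.006 \
    -D Trilinos_ENABLE_Teuchos:BOOL=ON \
    -D Teuchos_ENABLE_MPI:BOOL=ON \
    -D Trilinos_ENABLE_TESTS:BOOL=ON \
    -D BUILD_SHARED_LIBS:BOOL=ON \
    .

If that links the Teuchos tests cleanly I would re-enable the rest of the packages, but I don't know whether dropping the icc/icpc settings in favor of the wrappers is the recommended route.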


Any thoughts?  My feeling is that CMake isn't invoking mpiicpc properly, but
I'm not sure where to go from here.  The GNU compiler/OpenMPI chain works
perfectly, but alas my HPC support team doesn't like it...
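
For what it's worth, this is the quick check I have been using to see which compilers CMake actually recorded in the cache of my in-source build; I'm not sure it tells the whole story:

grep -E 'CMAKE_CXX_COMPILER|MPI_CXX_COMPILER' /scr6/mgoodman/trilinos-10.0.5-Source/CMakeCache.txt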

Any help is greatly appreciated!
Thanks.

--Matthew Goodman

=====================
Find me on LinkedIn: http://tinyurl.com/d6wlch
Follow me on twitter: http://twitter.com/meawoppl