[Trilinos-Users] mpirun must be used to launch all MPI applications

Daniel Wheeler daniel.wheeler2 at gmail.com
Thu Mar 19 14:05:56 MDT 2009


I finally realized that Trilinos was linking against -lmpi rather than
-lmpich by default. So, the following command configures and builds
Trilinos correctly on the Altix, at least for my uses (PyTrilinos,
AztecOO, Epetra and ML).

   $ ../configure CC=icc CXX=icc F77=ifort \
         CFLAGS="-O3 -fPIC" \
         CXXFLAGS="-O3 -fPIC -LANG:std -LANG:ansi -DMPI_NO_CPPBIND" \
         FFLAGS="-O3 -fPIC" \
         --prefix=${USR} --with-install="/usr/bin/install -p" \
         --with-blas="-L/opt/intel/Compiler/11.0/081/mkl/lib/64 -lmkl_intel_lp64 -lmkl_intel_thread -lmkl_lapack -lmkl_core -liomp5 -lpthread" \
         --with-lapack="-L/opt/intel/Compiler/11.0/081/mkl/lib/64 -lmkl_intel_lp64 -lmkl_intel_thread -lmkl_lapack -lmkl_core -liomp5 -lpthread" \
         --with-libs="-lstdc++" \
         --enable-mpi --enable-amesos --enable-ifpack --enable-shared \
         --enable-aztecoo --enable-epetra --enable-epetraext --enable-external \
         --enable-ml --enable-threadpool --enable-thyra --enable-stratimikos \
         --enable-triutils --enable-galeri --enable-pytrilinos \
         --cache-file=config.cache \
         --with-mpi-libdir=/opt/mpich/ch-p4/lib \
         --with-mpi-incdir=/opt/mpich/ch-p4/include \
         --with-mpi-libs=-lmpich \
     && make && make install
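
For a quick sanity check that the parallel build really works, a minimal
sketch along the following lines (the file name check_mpi.py is just an
example, and it assumes the usual Epetra.PyComm interface in PyTrilinos)
should print one line per process when launched through mpirun:

   #!/usr/bin/env python
   # check_mpi.py -- hypothetical example, not part of the build above
   from PyTrilinos import Epetra

   # PyComm() returns an MpiComm when Trilinos is built with --enable-mpi,
   # so each rank reports its own process ID and the total process count
   comm = Epetra.PyComm()
   print "process %d of %d" % (comm.MyPID(), comm.NumProc())

Running it with something like "/opt/mpich/ch-p4/bin/mpirun -np 4
./check_mpi.py" should give four distinct process IDs rather than the
"mpirun must be used to launch all MPI applications" error.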

I kept a record of my tomfoolery here
<http://matforge.org/wd15/blog/TrilinosAltix> for anyone who is
interested.

Now I need to figure out whether I actually get a performance boost
from using the Altix.

Cheers

On Tue, Mar 17, 2009 at 6:01 PM, Daniel Wheeler
<daniel.wheeler2 at gmail.com> wrote:
> I finally have a version of trilinos on the altix that does not have
> any symbol errors when I do "from PyTrilinos import Epetra" at the
> python command prompt. I used the following configure command to build
> trilinos.
>
>   ../configure CC=icc CXX=icc F77=ifort CFLAGS="-O3 -fPIC"
> CXXFLAGS="-O3 -fPIC -LANG:std -LANG:ansi -DMPI_NO_CPPBIND" FFLAGS="-O3
> -fPIC" --prefix=${USR} --with-install="/usr/bin/install -p"
> --with-blas="-L/opt/intel/Compiler/11.0/081/mkl/lib/64
> -lmkl_intel_lp64 -lmkl_intel_thread -lmkl_lapack -lmkl_core -liomp5
> -lpthread" --with-lapack="-L/opt/intel/Compiler/11.0/081/mkl/lib/64
> -lmkl_intel_lp64 -lmkl_intel_thread -lmkl_lapack -lmkl_core -liomp5
> -lpthread" --with-libs="-lstdc++" --enable-mpi --enable-amesos
> --enable-ifpack --enable-shared --enable-aztecoo --enable-epetra
> --enable-epetraext --enable-external --enable-ml --enable-threadpool
> --enable-thyra --enable-stratimikos --enable-triutils --enable-galeri
> --enable-pytrilinos --cache-file=config.cache
> --with-mpi-libdir=/opt/mpich/ch-p4/lib
> --with-mpi-incdir=/opt/mpich/ch-p4/include && make && make install
>
> It seems that mpirun is now always required:
>
>    $ python
>    Python 2.4.2 (#1, Dec  2 2008, 00:06:21)
>    [GCC 4.1.2 20070115 (SUSE Linux)] on linux2
>    Type "help", "copyright", "credits" or "license" for more information.
>    >>> from PyTrilinos import Epetra
>   mpirun must be used to launch all MPI applications
>
> However, if I launch a simple script that imports Epetra using mpirun
> I still get the same error.
>
>   $ /opt/mpich/ch-p4/bin/mpirun ./script.py
>   mpirun must be used to launch all MPI applications
>
> script.py is just:
>
>  #!/usr/bin/env python
>  from PyTrilinos import Epetra
>  print Epetra
>
> Any ideas?
>
> Thanks
>
> --
> Daniel Wheeler
>



-- 
Daniel Wheeler


