[Trilinos-Users] [EXTERNAL] Re: scaling with trilinos tpetra and anasazi solvers

Heroux, Michael A maherou at sandia.gov
Fri Jan 30 08:12:03 MST 2015


Excellent. That is what these sanity checks are for.  I am glad it worked!

________________________________
From: Mike Atambo <mikeat4999 at gmail.com>
Sent: Friday, January 30, 2015 9:06 AM
To: Heroux, Mike
Cc: Heroux, Michael A; Trilinos-Users at software.sandia.gov
Subject: [EXTERNAL] Re: [Trilinos-Users] scaling with trilinos tpetra and anasazi solvers

Mike (Heroux),
I have had a look at the code you suggested, and it led me to find a problem with the MPI environment here.
I have corrected that, thank you; I believe we are seeing the correct behaviour now.
Mike.


On Thu, Jan 29, 2015 at 5:15 PM, Mike Atambo <mikeat4999 at gmail.com> wrote:
Mike (Heroux),
Thanks, I will do this and see what kind of information I get. I will post back what I find to
make it easier for one of you to help (or explain what I may be missing).

Mike

On Thu, Jan 29, 2015 at 4:29 PM, Heroux, Mike <MHeroux at csbsju.edu> wrote:
Mike,

Just to follow up a bit more: the reason I suggest this manual approach is to determine whether your MPI environment can scale properly.  If your cores are on a single workstation, you will not get full speedup on this kind of solver, but I am surprised you are not seeing any.  This test would help diagnose the issue.

Mike

From: "Heroux, Michael A" <maherou at sandia.gov>
Date: Thursday, January 29, 2015 at 1:03 PM
To: Mike Atambo <mikeat4999 at gmail.com>, "Trilinos-Users at software.sandia.gov" <Trilinos-Users at software.sandia.gov>
Subject: Re: [Trilinos-Users] scaling with trilinos tpetra and anasazi solvers


Mike,


It is hard to say for sure why you are seeing the results you have.  One thing you could do is write a simple MPI code that does a vector operation such as:


int localn = 50000000/size; // size is the number of MPI ranks

// define arrays x, y, w appropriately
// define ntrials so that the job runs a significant amount of time

for (int j = 0; j < ntrials; ++j)
    for (int i = 0; i < localn; ++i)
        w[i] += alpha*x[i] + y[i];


With a tridiagonal system and the power method, your computation and data access should be similar to the above operation.  This should give you some sense of what to expect.


Mike


________________________________
From: trilinos-users-bounces at software.sandia.gov on behalf of Mike Atambo <mikeat4999 at gmail.com>
Sent: Monday, January 26, 2015 7:29 AM
To: Trilinos-Users at software.sandia.gov
Subject: [EXTERNAL] [Trilinos-Users] scaling with trilinos tpetra and anasazi solvers

My apologies, I sent an earlier email to an older thread; this is a correction.

I'm trying to solve a system using Tpetra and Anasazi's Krylov-Schur, but there seems to
be no scaling with the number of processors. I tried to run some of the examples, and
they reflect the same thing I saw in my code.

Attached (if the list accepts attachments... hoping) is the lesson_03 power method, changed to create a tridiagonal sparse matrix
with about 50 000 000 global indices. The run time seems the same for 2 to 16 processors;
is there something I'm missing?



$ time  mpirun -np 2  ./powerm.x
real 3m16.014s
user 3m5.610s
sys 0m4.452s

$ time  mpirun -np 8  ./powerm.x | grep  real
real 3m14.229s
user 3m7.938s
sys 0m4.939s


$ time  mpirun -np 16  ./powerm.x
real 3m18.153s
user 3m10.202s
sys 0m6.108s

Mike.

matambo at ictp.it
Ext .139





--
M. O. Atambo
mikeat4999 at gmail.com
matambo at ictp.it
Ext .139
Room 209.





-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://software.sandia.gov/pipermail/trilinos-users/attachments/20150130/dd35f959/attachment.html>

