[Trilinos-Users] PETSc interface

Jed Brown jed at 59A2.org
Thu May 1 10:07:13 MDT 2014


Lucie Freret <lfreret at arrow.utias.utoronto.ca> writes:

> Hello Jed,
>
> Thanks for your answer.
> First of all, I must say that I'm new to PETSc and have been working
> with the Trilinos library for almost 6 months. I already have a solver
> which works fine with Aztec (GMRES) + ML, and I'd be curious to
> compare it with PETSc + ML (I have successfully used PETSc with GMRES
> preconditioned by ASM).
>
> Since I have 4 dof per node, I wanted to specify that to ML (as "PDE
> equations"), but I couldn't find the corresponding keyword in PETSc.

Use MatSetBlockSize() to declare the 4 dof per node (and/or
MatSetNearNullSpace() if you want the coarsening to include some extra
functions).
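
As a concrete illustration, here is a minimal sketch (the Mat A is
assumed to be your already-created system matrix; the near-null-space
part below attaches only the constant function, so pass your own Vecs
for anything richer):

  #include <petscmat.h>

  /* Sketch: declare the block structure so ML sees 4 dof per node,
     analogous to ML's "PDE equations" parameter. */
  PetscErrorCode SetupBlockInfo(Mat A)
  {
    MatNullSpace   nullsp;
    PetscErrorCode ierr;

    PetscFunctionBegin;
    /* Set before assembly/preallocation if possible. */
    ierr = MatSetBlockSize(A, 4);CHKERRQ(ierr);

    /* Optional: attach a near-null space so aggregation-based
       coarsening preserves these modes.  PETSC_TRUE with no vectors
       attaches only the constant function. */
    ierr = MatNullSpaceCreate(PetscObjectComm((PetscObject)A),
                              PETSC_TRUE, 0, NULL, &nullsp);CHKERRQ(ierr);
    ierr = MatSetNearNullSpace(A, nullsp);CHKERRQ(ierr);
    ierr = MatNullSpaceDestroy(&nullsp);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }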

> However, this is not my major concern. Actually I'm not able to
> reproduce the way ML is building the different coarse levels. To make
> it clear, when I use Petsc + ML, I get (for a given linear system) 7
> coarse levels while I only get 4 with Aztec + ML. I couldn't find any
> examples online to help me to setup the ML parameters properly.

Send the output from -ksp_view -pc_ml_PrintLevel 2 and compare it with
the output of ML_Set_PrintLevel(2) in Trilinos.  Most likely the coarse
grid size limit is different.  It looks like your smoothers are also
configured differently.  Finally, I recommend configuring the options
outside of the source file (if your specification is static, like what
you show here) or via the function APIs (if your application wants to
take active control of the configuration).
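
For example (a sketch only; the option names come from PETSc's PCML
interface, and the values here are placeholders that you would adjust
to match your Trilinos setup), an options file could look like:

  -ksp_type gmres
  -pc_type ml
  -pc_ml_maxNlevels 4        # cap the depth of the hierarchy
  -pc_ml_maxCoarseSize 128   # stop coarsening below this many rows
  -pc_ml_PrintLevel 2
  -mg_levels_ksp_type richardson   # smoother on each level
  -mg_levels_pc_type sor
  -ksp_view

Since PCML builds a PCMG underneath, the usual -mg_levels_* and
-mg_coarse_* options control the smoothers and the coarse solve.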

Feel free to move this discussion to petsc-users at mcs.anl.gov or
petsc-maint at mcs.anl.gov.