
We now report and comment on an example of usage of ML_Epetra::MultiLevelPreconditioner. The source code can be found in ml_preconditioner.cpp.

First, we need to include several header files. ml_include.h is listed first, so that ML's configuration macros (such as HAVE_MPI) are available to the preprocessor guards below. Note that the example works for both MPI and non-MPI configurations; however, some coarse solvers (for instance, Amesos-Superludist and Amesos-MUMPS) require MPI. Trilinos_Util_CrsMatrixGallery.h is required by this example, not by ML itself.

#include "ml_include.h"
#ifdef HAVE_MPI
#include "mpi.h"
#include "Epetra_MpiComm.h"
#else
#include "Epetra_SerialComm.h"
#endif
#include <iostream>
#include "Epetra_Map.h"
#include "Epetra_SerialDenseVector.h"
#include "Epetra_Vector.h"
#include "Epetra_CrsMatrix.h"
#include "Epetra_LinearProblem.h"
#include "AztecOO.h"
#include "Trilinos_Util_CrsMatrixGallery.h"
#include "ml_epetra_preconditioner.h"

The following namespaces will be used quite often:

using namespace std;
using namespace Teuchos;
using namespace Trilinos_Util;

We can now start with the main() function. We create the linear problem using the class Trilinos_Util::CrsMatrixGallery. Several matrix examples are supported; please refer to the Trilinos tutorial for more details. Most of the examples using the ML_Epetra::MultiLevelPreconditioner class are based on Epetra_CrsMatrix. Example ml_EpetraVbr.cpp shows how to define an Epetra_VbrMatrix.
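
The details are given in that example; as a minimal sketch (assuming the Trilinos_Util::VbrMatrixGallery class declared alongside CrsMatrixGallery in triutils, and a hypothetical number of PDE equations per node), a VBR matrix could be obtained inside main(), after the communicator Comm has been created, as follows:

// hedged sketch, not part of this example; requires Epetra_VbrMatrix.h
Trilinos_Util::VbrMatrixGallery VbrGallery("laplace_2d", Comm);
VbrGallery.Set("problem_size", 128 * 128);
int NumPDEEqns = 3;  // hypothetical number of equations per node
Epetra_VbrMatrix* VbrA = VbrGallery.GetVbrMatrix(NumPDEEqns);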

laplace_2d is a symmetric matrix; an example of a non-symmetric matrix is recirc_2d (advection-diffusion in a box, with recirculating flow); a sketch of that variation follows the code below. The number of nodes must be a square number.

int main(int argc, char *argv[])
{
#ifdef HAVE_MPI
MPI_Init(&argc, &argv);
Epetra_MpiComm Comm(MPI_COMM_WORLD);
#else
Epetra_SerialComm Comm;
#endif
CrsMatrixGallery Gallery("laplace_2d", Comm);
int nx = 128;
Gallery.Set("problem_size", nx*nx);
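
As mentioned above, the non-symmetric variant only requires a different gallery name; a minimal sketch (note that, with a non-symmetric matrix, the CG option used later in this example must be replaced by a method such as GMRES):

CrsMatrixGallery Gallery("recirc_2d", Comm);  // non-symmetric advection-diffusion problem
Gallery.Set("problem_size", nx*nx);
// ...and later, instead of AZ_cg:
solver.SetAztecOption(AZ_solver, AZ_gmres);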

The following methods of CrsMatrixGallery are used to get pointers to the internally stored Epetra_RowMatrix and Epetra_LinearProblem. Then, since we wish to use AztecOO, we construct a solver object for this problem.

Epetra_RowMatrix * A = Gallery.GetMatrix();
Epetra_LinearProblem * Problem = Gallery.GetLinearProblem();
AztecOO solver(*Problem);

This is the beginning of the ML part. We create a Teuchos parameter list that specifies the multilevel hierarchy, the aggregation scheme, the smoother, and the coarse solver:

ParameterList MLList;
MLList.set("max levels", 6);                          // maximum number of levels
MLList.set("increasing or decreasing", "decreasing"); // level indices decrease from fine to coarse
MLList.set("aggregation: type", "Uncoupled");         // uncoupled aggregation scheme
MLList.set("aggregation: threshold", 0.0);            // no dropping of weak connections
MLList.set("smoother: type", "Gauss-Seidel");         // Gauss-Seidel smoother...
MLList.set("smoother: pre or post", "both");          // ...applied before and after coarse correction
MLList.set("coarse: type", "Amesos-KLU");             // direct solver (KLU via Amesos) on the coarsest level
MLList.set("coarse: maximum size", 30);               // maximum size of the coarsest-level problem

Now, we create the preconditioning object. We suggest using new and delete, because the destructor makes some MPI calls (as required by ML and possibly Amesos); this is an issue only if the destructor is called after MPI_Finalize(). Then, we instruct AztecOO to use this preconditioner and to solve with CG, at most 500 iterations, and a tolerance of 1e-8 on the unscaled residual.

// create the ML preconditioner (the final 'true' asks for the hierarchy to be computed now)
ML_Epetra::MultiLevelPreconditioner* MLPrec =
  new ML_Epetra::MultiLevelPreconditioner(*A, MLList, true);
solver.SetPrecOperator(MLPrec);
Problem->GetLHS()->PutScalar(0.0);   // zero starting solution
Problem->GetRHS()->PutScalar(1.0);   // right-hand side of all ones
solver.SetAztecOption(AZ_solver, AZ_cg);
solver.SetAztecOption(AZ_conv, AZ_noscaled);
solver.SetAztecOption(AZ_output, 1);
solver.Iterate(500, 1e-8);
// destroy the preconditioner before MPI_Finalize() is called
delete MLPrec;
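
After Iterate() returns, AztecOO can report how the solve went; an optional check, sketched here using AztecOO's NumIters() and TrueResidual() accessors:

if (Comm.MyPID() == 0) {
  cout << "AztecOO iterations    = " << solver.NumIters() << endl;
  cout << "AztecOO true residual = " << solver.TrueResidual() << endl;
}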

At this point, we can compute the true residual, and quit.

double residual, diff;
Gallery.ComputeResidual(residual);
Gallery.ComputeDiffBetweenStartingAndExactSolutions(diff);
if (Comm.MyPID() == 0) {
cout << "||b-Ax||_2 = " << residual << endl;
cout << "||x_exact - x||_2 = " << diff << endl;
}
#ifdef HAVE_MPI
MPI_Finalize();
#endif
return 0;
}