[Trilinos-Users] [EXTERNAL] small question concerning ML

Riccardo Rossi rrossi at cimne.upc.edu
Thu Mar 21 11:45:14 MDT 2013


Oops, that was a false statement.

It did work, but the IFPACK statements were also commented out, so ML was
running with its default settings.

libifpack is present and correctly resolved (I use libifpack.so).
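
(As a side note, finding libifpack at link time does not by itself prove that
ML was configured with IFPACK support. A minimal compile-time sanity check,
assuming ML's usual configuration header and the HAVE_ML_IFPACK macro it
defines when IFPACK support is enabled, could be:)

            #include "ml_config.h"  // ML's configure-time settings

            // assumption: HAVE_ML_IFPACK is defined when ML was configured
            // with IFPACK; if this fires, the "smoother: type" = "IFPACK"
            // request cannot be honored and Trilinos must be rebuilt
            #ifndef HAVE_ML_IFPACK
            #error "ML was built without IFPACK support"
            #endif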

The complete call I use is the following:

            int rank;
            MPI_Comm_rank(MPI_COMM_WORLD, &rank);

            Epetra_LinearProblem AztecProblem(&A, &Dx, &b);

            // left-scale the linear problem with the inverse column sums of A
            Epetra_Vector scaling_vect(A.RowMap());
            A.InvColSums(scaling_vect);
            AztecProblem.LeftScale(scaling_vect);

            AztecOO solver(AztecProblem);

            // create an empty parameter list for ML options
            Teuchos::ParameterList MLList;

            int options[AZ_OPTIONS_SIZE];
            double params[AZ_PARAMS_SIZE];
            AZ_defaults(options, params);
            // options[AZ_precond] = AZ_dom_decomp;   // <-- these lines seem to have no visible effect
            // options[AZ_subdomain_solve] = AZ_ilu;
            // options[AZ_graph_fill] = 0;
            // options[AZ_overlap] = 0;

            ML_Epetra::SetDefaults("NSSA", MLList, options, params);

            MLList.set("PDE equations", mNumDof);
            MLList.set("null space: add default vectors", true);
            MLList.set("aggregation: type", "Uncoupled");

            // request an IFPACK ILU(0) smoother with no overlap
            MLList.set("smoother: type", "IFPACK");
            MLList.set("smoother: ifpack type", "ILU");
            MLList.set("smoother: ifpack overlap", 0);
            MLList.sublist("smoother: ifpack list").set("fact: level-of-fill", 0);

            MLList.set("ML output", 5);
            MLList.set("print unused", 1);

            ML_Epetra::MultiLevelPreconditioner* MLPrec =
                new ML_Epetra::MultiLevelPreconditioner(A, MLList, true);

            // tell AztecOO to use this preconditioner, then solve
            solver.SetPrecOperator(MLPrec);

            // ====================== end of ML part ======================
            // Instruct AztecOO to use GMRES with estimation of the condition
            // number, printing the residual at every iteration. Then solve
            // with at most 500 iterations and 1e-8 as tolerance on the
            // relative residual.

            solver.SetAztecOption(AZ_solver, AZ_gmres_condnum);
            solver.SetAztecOption(AZ_output, 1); // use AZ_none to silence output
            solver.SetAztecOption(AZ_kspace, 200);
            solver.Iterate(500, 1e-8);

            // delete the preconditioner. Do it BEFORE MPI_Finalize
            delete MLPrec;
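
For reference, here is a minimal post-solve sanity check (a sketch; it assumes
<iostream> is included and uses only the standard AztecOO accessors NumIters(),
TrueResidual() and ScaledResidual(), which stay valid after the preconditioner
has been deleted):

            // sketch: report what the solve actually achieved
            if (rank == 0) {
                std::cout << "iterations:      " << solver.NumIters()       << "\n"
                          << "true residual:   " << solver.TrueResidual()   << "\n"
                          << "scaled residual: " << solver.ScaledResidual() << std::endl;
            }

If the scaled residual is still near 1.0 after the solve (as in the log quoted
below), the preconditioner is effectively not reducing the residual at all.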


Thanks in advance for any help.

Riccardo



On Thu, Mar 21, 2013 at 6:33 PM, Riccardo Rossi <rrossi at cimne.upc.edu>wrote:

> Dear Prof.
>
> It looks like I was messing up the Aztec parameters.
>
> I had the following code, which I had not noticed, and which was certainly
> interfering with the list settings:
>
>         //    options[AZ_precond] = AZ_dom_decomp;
>         //    options[AZ_subdomain_solve] = AZ_ilu;
>         //    options[AZ_graph_fill] = 0;
>         //    options[AZ_overlap] = 0;
>
> After commenting it out, everything now appears to work.
>
> Sorry for the noise.
>
> Riccardo
>
>
> On Thu, Mar 21, 2013 at 3:58 PM, Tuminaro, Raymond S <rstumin at sandia.gov>wrote:
>
>>  Riccardo,
>>    It looks like the ILU smoother is never "really" set. I'm not sure why
>> this would be, because it looks like you've set up the parameter list
>> correctly. Is there any chance that IFPACK is not configured in your
>> build? You should have a libifpack.a somewhere in your compiled Trilinos.
>>
>> -Ray
>>
>>  ------------------------------
>> *From:* trilinos-users-bounces at software.sandia.gov [
>> trilinos-users-bounces at software.sandia.gov] on behalf of Riccardo Rossi [
>> rrossi at cimne.upc.edu]
>> *Sent:* Thursday, March 21, 2013 2:40 AM
>> *To:* trilinos-users at software.sandia.gov
>> *Subject:* [EXTERNAL] [Trilinos-Users] small question concerning ML
>>
>>  Dear list,
>>
>> I am trying to use an ILU(0) smoother together with ML to see if it can
>> improve convergence, for a problem with a PDE block size of 4.
>>
>>
>> The thing is that if I use the following
>>
>>             ML_Epetra::SetDefaults("NSSA", MLList, options, params);
>>             MLList.set("PDE equations", mNumDof);
>>             MLList.set("null space: add default vectors", true);
>>             MLList.set("aggregation: type", "Uncoupled");
>>
>> it works correctly,
>>
>> but if I use the following settings
>>
>>         MLList.set("PDE equations", mNumDof);
>>         MLList.set("null space: add default vectors", true);
>>         MLList.set("aggregation: type", "Uncoupled");  // <----- identical up to this point
>>
>>         MLList.set("smoother: type", "IFPACK");
>>         MLList.set("smoother: ifpack type", "ILU");
>>         MLList.set("smoother: ifpack overlap", 0);
>>         MLList.sublist("smoother: ifpack list").set("fact: level-of-fill", 0);
>>
>>         MLList.set("ML output", 5);
>>         MLList.set("print unused", 1);
>>
>>
>> then it does not even try to iterate, and gives the output posted at the
>> end of this email (ultimately producing a wrong result):
>>
>> Can anyone give a hint about what I might be doing wrong? Changing the
>> overlap and the level of fill does not improve the situation.
>> Commenting out the line
>>     MLList.set("null space: add default vectors", true);
>> does not help either.
>>
>> Any suggestion is welcome.
>>
>> Thank you in advance,
>> Riccardo
>>
>> System Solve Time : 5.44
>> Building Time : 2.86
>> ML_Aggregate_Coarsen (level 0) begins
>> ML_Aggregate_CoarsenUncoupled : current level = 0
>> ML_Aggregate_CoarsenUncoupled : current eps = 0.000000e+00
>> Aggregation(UVB) : Total nonzeros = 25542464 (Nrows=438016)
>> Aggregation(UVB) : Amalgamated matrix done
>> Aggregation(UC) : Phase 0 - no. of bdry pts  = 0
>> Aggregation(UC) : Phase 1 - nodes aggregated = 71810 (109504)
>> Aggregation(UC) : Phase 1 - total aggregates = 5698
>> Aggregation(UC_Phase2_3) : Phase 1 - nodes aggregated = 71027
>> Aggregation(UC_Phase2_3) : Phase 1 - total aggregates = 5698
>> Aggregation(UC_Phase2_3) : Phase 2a- additional aggregates = 237
>> Aggregation(UC_Phase2_3) : Phase 2 - total aggregates = 5935
>> Aggregation(UC_Phase2_3) : Phase 2 - boundary nodes   = 1226
>> Aggregation(UC_Phase2_3) : Phase 3 - leftovers = 0 and singletons = 0
>> ML_Aggregate_Coarsen (level 1) begins
>> ML_Aggregate_CoarsenUncoupled : current level = 1
>> ML_Aggregate_CoarsenUncoupled : current eps = 0.000000e+00
>> Aggregation(UVB) : Total nonzeros = 4840413 (Nrows=23740)
>> Aggregation(UVB) : Amalgamated matrix done
>> Aggregation(UC) : Phase 0 - no. of bdry pts  = 0
>> Aggregation(UC) : Phase 1 - nodes aggregated = 3526 (5935)
>> Aggregation(UC) : Phase 1 - total aggregates = 97
>> Aggregation(UC_Phase2_3) : Phase 1 - nodes aggregated = 3526
>> Aggregation(UC_Phase2_3) : Phase 1 - total aggregates = 97
>> Aggregation(UC_Phase2_3) : Phase 2a- additional aggregates = 25
>> Aggregation(UC_Phase2_3) : Phase 2 - total aggregates = 122
>> Aggregation(UC_Phase2_3) : Phase 2 - boundary nodes   = 2
>> Aggregation(UC_Phase2_3) : Phase 3 - leftovers = 0 and singletons = 0
>> ML_Aggregate_Coarsen (level 2) begins
>> ML_Aggregate_CoarsenUncoupled : current level = 2
>> ML_Aggregate_CoarsenUncoupled : current eps = 0.000000e+00
>> Aggregation(UVB) : Total nonzeros = 97790 (Nrows=488)
>> Aggregation(UVB) : Amalgamated matrix done
>> Aggregation(UC) : Phase 0 - no. of bdry pts  = 0
>> Aggregation(UC) : Phase 1 - nodes aggregated = 94 (122)
>> Aggregation(UC) : Phase 1 - total aggregates = 5
>> Aggregation(UC_Phase2_3) : Phase 1 - nodes aggregated = 94
>> Aggregation(UC_Phase2_3) : Phase 1 - total aggregates = 5
>> Aggregation(UC_Phase2_3) : Phase 2a- additional aggregates = 1
>> Aggregation(UC_Phase2_3) : Phase 2 - total aggregates = 6
>> Aggregation(UC_Phase2_3) : Phase 2 - boundary nodes   = 0
>> Aggregation(UC_Phase2_3) : Phase 3 - leftovers = 0 and singletons = 0
>> Smoothed Aggregation : operator complexity = 1.193356e+00.
>> Amesos (level 3) : NumGlobalRows = 24
>> Amesos (level 3) : NumGlobalNonzeros = 576
>> Amesos (level 3) : Fill-in = 100 %
>> Amesos (level 3) : Building KLU
>> Amesos (level 3) : Time for factorization  = 0.000393867 (s)
>>
>>         *******************************************************
>>         ***** Problem: Epetra::CrsMatrix
>>         ***** Preconditioned GMRES (with condnum) solution
>>         ***** ML (L=4, ~/IFPACK_post0, ~/Amesos_KLU_3)
>>         ***** No scaling
>>         *******************************************************
>>
>>                 iter:    0           residual = 1.000000e+00
>>
>>
>>     ***************************************************************
>>
>>     Warning: the GMRES Hessenberg matrix is ill-conditioned.  This may
>>     indicate that the application matrix is singular. In this case, GMRES
>>     may have a least-squares solution.
>>
>>     Solver:            gmres_condnum
>>     number of iterations:    1
>>
>>     Actual residual =  3.0313e+00    Recursive residual =  3.0313e+00
>>
>>     Calculated Norms                Requested Norm
>>     --------------------------------------------    --------------
>>
>>     ||r||_2 / ||r0||_2:        1.000000e+00    1.000000e-08
>>
>>                 -----------------------------------------------------
>>
>>                 Warning : The Hessenberg matrix is too small
>>
>>                 Analysis of the Hessenberg matrix:
>>
>>                 smallest eigenvalue (in module) = 1.000000e+00
>>                 largest eigenvalue (in module)  = 1.000000e+00
>>
>>
>>     ***************************************************************
>>
>>                 estimated condition number      = 1.000000e+00
>>
>>                 -----------------------------------------------------
>>
>>
>>         Solution time: 0.594218 (sec.)
>>         total iterations: 1
>> Linear solver
>>
>>
>>
>>
>
>
>



-- 

Dr. Riccardo Rossi, Civil Engineer

Member of Kratos Team

International Center for Numerical Methods in Engineering - CIMNE
Campus Norte, Edificio C1

c/ Gran Capitán s/n

08034 Barcelona, España

Tel:        (+34) 93 401 56 96

Fax:       (+34) 93.401.6517
web:       www.cimne.com