[Trilinos-Users] Moocho breaking at unimplemented code

GUEDON Stéphane stephane.guedon at doceapower.com
Tue Jan 20 01:21:29 MST 2015


Hello,

When I solve a linear problem in MOOCHO, overriding
NLPSerialPreprocessExplJac,
I get the following error:
"MoochoSolver: Caught an std::exception of type class std::logic_error 
described as : 
..\..\..\..\packages\moocho\src\AbstractLinAlgPack\src\abstract\interfaces\AbstractLinAlgPack_MatrixOp.cpp:372:"
In fact, the code at that location is not implemented. The path that 
reaches it is the test "if( jd < 0 ) {" at line 1740 of file 
"ConstrainedOptPack_QPSchur.cpp" evaluating to true; the unimplemented 
code is then called at line 1792.
My question is: should that test always be false, or is the 
implementation missing?

I attach the journal file to this mail for further information. The 
version of Trilinos is 11.12.

Thank you for your help.



-------------- next part --------------

********************************************************************
*** Algorithm iteration detailed journal output                  ***
***                                                              ***
*** Below, detailed information about the SQP algorithm is given ***
*** while it is running.  The amount of information that is      ***
*** produced can be specified using the option                   ***
*** NLPSolverClientInterface::journal_output_level (the default ***
*** is PRINT_NOTHING and produces no output                      ***
********************************************************************

*** Echoing input options ...

begin_options

options_group CalcFiniteDiffProd {
    fd_method_order = FD_ORDER_ONE;
}

options_group DecompositionSystemStateStepBuilderStd
 {
    null_space_matrix = EXPLICIT;
    range_space_matrix = ORTHOGONAL;
}

options_group NLPAlgoConfigMamaJama
 {
    line_search_method = FILTER;
    quasi_newton = BFGS;
}

options_group NLPSolverClientInterface
 {
    calc_conditioning = true;
    calc_matrix_info_null_space_only = true;
    calc_matrix_norms = true;
    feas_tol = 1e-7;
    journal_output_level = PRINT_ALGORITHM_STEPS;
    journal_print_digits = 10;
    max_iter = 500;
    max_run_time = 10.0;
    null_space_journal_output_level = PRINT_ITERATION_QUANTITIES;
    opt_tol = 1e-2;
}

end_options


*** Setting up to run MOOCHO on the NLP using a configuration object of type 'class MoochoPack::NLPAlgoConfigMamaJama' ...

*****************************
*** MoochoSolver::solve() ***
*****************************

test_nlp = true: Testing the NLP! ...

Testing the supported NLPFirstOrder interface ...

*********************************
*** test_nlp_first_order(...) ***
*********************************

Testing the vector spaces ...

Testing nlp->space_x() ...
nlp->space_x() checks out!

Testing nlp->space_c() ...
nlp->space_c() checks out!

**************************************
*** NLPTester::test_interface(...) ***
**************************************

nlp->force_xinit_in_bounds(true)
nlp->initialize(true)

*** Dimensions of the NLP ...

nlp->n()  = 81
nlp->m()  = 8

*** Validate the dimensions of the vector spaces ...

check: nlp->space_x()->dim() = 81 == nlp->n() = 81: true

check: nlp->space_c()->dim() = 8 == nlp->m() = 8: true

*** Validate that the initial starting point is in bounds ...

||nlp->xinit()||inf = 1.00000000e+000

check: xl <= x <= xu : true
xinit is in bounds with { max |u| | xl <= x + u <= xu } -> 9.90024481e+001

check: num_bounded(nlp->xl(),nlp->xu()) = 81 == nlp->num_bounded_x() = 81: true

Getting the initial estimates for the Lagrange multipliers ...

||lambda||inf  = 0.00000000e+000
||nu||inf      = 0.00000000e+000
nu.nz()        = 0

*** Evaluate the point xo ...

||xo||inf = 1.00000000e+000

f(xo) = 1.00000000e+000
||c(xo)||inf = 2.51346737e+000

*** Report this point to the NLP as suboptimal ...

*** Print the number of evaluations ...

nlp->num_f_evals() = 1
nlp->num_c_evals() = 1

Calling nlp->calc_Gc(...) at nlp->xinit() ...

Calling nlp->calc_Gf(...) at nlp->xinit() ...

Comparing directional products Gf'*y and/or Gc'*y with finite difference values  FDGf'*y and/or FDGc'*y for random y's ...

****
**** Random directional vector 1 ( ||y||_1 / n = 9.89700568e-001 )
***
--------------------------- DEBUG -----------------------------------
Gradient of constraint Matrix Gc in NLPFirstDerivTester::fd_directional_check:
Matrix with permuted view:
mat_orig =
Sparse 81 x 8 matrix with 648 nonzero entries:
2.04830170e-002:1:1 2.12144852e-002:2:1 2.11305618e-002:3:1 1.89027786e-002:4:1 2.06022263e-002:5:1 2.13022232e-002:6:1 2.12097168e-002:7:1 1.89847946e-002:8:1 2.07357407e-002:9:1 2.13918686e-002:10:1 2.12850571e-002:11:1 1.90687180e-002:12:1 2.08826065e-002:13:1 2.14786530e-002:14:1 2.13623047e-002:15:1 1.91507339e-002:16:1 2.10475922e-002:17:1 2.15721130e-002:18:1 2.14414597e-002:19:1 1.92365646e-002:20:1 2.12411880e-002:21:1 2.16627121e-002:22:1 2.15158463e-002:23:1 1.93223953e-002:24:1 2.14738846e-002:25:1 2.17542648e-002:26:1 2.15883255e-002:27:1 1.94072723e-002:28:1 2.17657089e-002:29:1 2.18486786e-002:30:1 2.16617584e-002:31:1 1.94940567e-002:32:1 2.21385956e-002:33:1 2.19421387e-002:34:1 2.17304230e-002:35:1 1.95798874e-002:36:1 2.26278305e-002:37:1 2.20346451e-002:38:1 2.17962265e-002:39:1 1.96676254e-002:40:1 2.32934952e-002:41:1 2.21242905e-002:42:1 2.18553543e-002:43:1 1.97534561e-002:44:1 2.42061615e-002:45:1 2.22110748e-002:46:1 2.19087601e-002:47:1 1.98364258e-002:48:1 2.54869461e-002:49:1 2.22921371e-002:50:1 2.19545364e-002:51:1 1.99203491e-002:52:1 2.73027420e-002:53:1 2.23588943e-002:54:1 2.19860077e-002:55:1 1.99975967e-002:56:1 2.99110413e-002:57:1 2.24103928e-002:58:1 2.20079422e-002:59:1 2.00729370e-002:60:1 3.36866379e-002:61:1 2.24370956e-002:62:1 2.20155716e-002:63:1 2.01377869e-002:64:1 3.91969681e-002:65:1 2.24380493e-002:66:1 2.20117569e-002:67:1 2.01911926e-002:68:1 4.73070145e-002:69:1 2.24180222e-002:70:1 2.20098495e-002:71:1 2.02293396e-002:72:1 5.93423843e-002:73:1 2.24151611e-002:74:1 2.20260620e-002:75:1 2.02484131e-002:76:1 7.73839951e-002:77:1 2.25629807e-002:78:1 2.20937729e-002:79:1 2.02484131e-002:80:1 7.96743393e-001:81:1 8.70358944e-003:1:2 5.15415072e-002:2:2 5.62558174e-002:3:2 -1.11496449e-003:4:2 8.44138861e-003:5:2 5.30897975e-002:6:2 5.80014586e-002:7:2 -1.55162811e-003:8:2 8.10152292e-003:9:2 5.46968579e-002:10:2 5.98155260e-002:11:2 -2.00974941e-003:12:2 7.65269995e-003:13:2 5.63657880e-002:14:2 
6.17017746e-002:15:2 -2.49022245e-003:16:2 7.04908371e-003:17:2 5.81007004e-002:18:2 6.36645555e-002:19:2 -2.99298763e-003:20:2 6.22463226e-003:21:2 5.99058867e-002:22:2 6.57087564e-002:23:2 -3.51887941e-003:24:2 5.08189201e-003:25:2 6.17873073e-002:26:2 6.78404570e-002:27:2 -4.06777859e-003:28:2 3.47924232e-003:29:2 6.37527704e-002:30:2 7.00674057e-002:31:2 -4.63974476e-003:32:2 1.20830536e-003:33:2 6.58121109e-002:34:2 7.23987818e-002:35:2 -5.23394346e-003:36:2 -2.03561783e-003:37:2 6.79796934e-002:38:2 7.48466253e-002:39:2 -5.84852695e-003:40:2 -6.70039654e-003:41:2 7.02745914e-002:42:2 7.74260163e-002:43:2 -6.48117065e-003:44:2 -1.34445429e-002:45:2 7.27239251e-002:46:2 8.01559687e-002:47:2 -7.12722540e-003:48:2 -2.32374668e-002:49:2 7.53661394e-002:50:2 8.30587149e-002:51:2 -7.77983665e-003:52:2 -3.75094414e-002:53:2 7.82553554e-002:54:2 8.61577392e-002:55:2 -8.42881203e-003:56:2 -5.83756566e-002:57:2 8.14681649e-002:58:2 8.94711614e-002:59:2 -9.05936956e-003:60:2 -8.89736414e-002:61:2 8.51134062e-002:62:2 9.29930210e-002:63:2 -9.65029001e-003:64:2 -1.33974433e-001:65:2 8.93457532e-002:66:2 9.66519713e-002:67:2 -1.01726651e-002:68:2 -2.00364590e-001:69:2 9.43865776e-002:70:2 1.00216329e-001:71:2 -1.05881691e-002:72:2 -2.98653126e-001:73:2 1.00561380e-001:74:2 1.03088975e-001:75:2 -1.08492970e-002:76:2 -4.44752574e-001:77:2 1.08374894e-001:78:2 1.03861272e-001:79:2 -1.09045506e-002:80:2 1.26963919e+000:81:2 1.95280463e-003:1:3 1.47089586e-002:2:3 1.76537335e-002:3:3 8.81433487e-004:4:3 1.62871927e-003:5:3 1.48880109e-002:6:3 1.81543529e-002:7:3 8.01317394e-004:8:3 1.26869231e-003:9:3 1.50351301e-002:10:3 1.86709389e-002:11:3 7.19994307e-004:12:3 8.76188278e-004:13:3 1.51424482e-002:14:3 1.92034617e-002:15:3 6.37985766e-004:16:3 4.59365547e-004:17:3 1.52003840e-002:18:3 1.97517946e-002:19:3 5.55895269e-004:20:3 3.44067812e-005:21:3 1.51976198e-002:22:3 2.03159228e-002:23:3 4.74363565e-004:24:3 -3.70435417e-004:25:3 1.51203200e-002:26:3 2.08955929e-002:27:3 
3.94187868e-004:28:3 -7.07738101e-004:29:3 1.49516836e-002:30:3 2.14910433e-002:31:3 3.16143036e-004:32:3 -9.00730491e-004:33:3 1.46712884e-002:34:3 2.21037492e-002:35:3 2.41182745e-004:36:3 -8.27394426e-004:37:3 1.42536983e-002:38:3 2.27384493e-002:39:3 1.70312822e-004:40:3 -2.96585262e-004:41:3 1.36662722e-002:42:3 2.34077200e-002:43:3 1.04539096e-004:44:3 9.89519060e-004:45:3 1.28651038e-002:46:3 2.41423324e-002:47:3 4.52548265e-005:48:3 3.49146128e-003:49:3 1.17855445e-002:50:3 2.50143409e-002:51:3 -6.06477261e-006:52:3 7.91811198e-003:53:3 1.03196874e-002:54:3 2.61889473e-002:55:3 -4.68268991e-005:56:3 1.53510049e-002:57:3 8.25995207e-003:58:3 2.80417874e-002:59:3 -7.30454922e-005:60:3 2.74141431e-002:61:3 5.15640527e-003:62:3 3.14272642e-002:63:3 -7.73742795e-005:64:3 4.64663729e-002:65:3 -4.91961837e-005:66:3 3.82981896e-002:67:3 -4.76986170e-005:68:3 7.56959766e-002:69:3 -9.96033102e-003:70:3 5.31542525e-002:71:3 3.36617231e-005:72:3 1.18696623e-001:73:3 -3.11232731e-002:74:3 8.64652768e-002:75:3 1.82218850e-004:76:3 1.77250177e-001:77:3 -8.01493004e-002:78:3 1.62844010e-001:79:3 3.72990966e-004:80:3 3.70457619e-001:81:3 -1.94299519e-002:1:4 -4.29900885e-002:2:4 -4.46090400e-002:3:4 -9.32568312e-003:4:4 -1.99223757e-002:5:4 -4.41277027e-002:6:4 -4.56707776e-002:7:4 -9.19926167e-003:8:4 -2.05248296e-002:9:4 -4.53242958e-002:10:4 -4.67669964e-002:11:4 -9.05865431e-003:12:4 -2.12781727e-002:13:4 -4.65851128e-002:14:4 -4.78978455e-002:15:4 -8.90254974e-003:16:4 -2.22397447e-002:17:4 -4.79148626e-002:18:4 -4.90632951e-002:19:4 -8.72927904e-003:20:4 -2.34917402e-002:21:4 -4.93195057e-002:22:4 -5.02631664e-002:23:4 -8.53705406e-003:24:4 -2.51514316e-002:25:4 -5.08040190e-002:26:4 -5.14954925e-002:27:4 -8.32334161e-003:28:4 -2.73882747e-002:29:4 -5.23730218e-002:30:4 -5.27569056e-002:31:4 -8.08507204e-003:32:4 -3.04469764e-002:33:4 -5.40301800e-002:34:4 -5.40418625e-002:35:4 -7.81828165e-003:36:4 -3.46826315e-002:37:4 -5.57771325e-002:38:4 -5.53413630e-002:39:4 
-7.51793385e-003:40:4 -4.06115055e-002:41:4 -5.76119721e-002:42:4 -5.66406250e-002:43:4 -7.17717409e-003:44:4 -4.89873886e-002:45:4 -5.95291257e-002:46:4 -5.79162836e-002:47:4 -6.78706169e-003:48:4 -6.09136820e-002:49:4 -6.15183413e-002:50:4 -5.91302514e-002:51:4 -6.33493066e-003:52:4 -7.80129433e-002:53:4 -6.35689497e-002:54:4 -6.02180958e-002:55:4 -5.80465794e-003:56:4 -1.02684200e-001:57:4 -6.56839907e-002:58:4 -6.10629916e-002:59:4 -5.17284870e-003:60:4 -1.38504446e-001:61:4 -6.79274797e-002:62:4 -6.14374876e-002:63:4 -4.40979004e-003:64:4 -1.90873832e-001:65:4 -7.05530643e-002:66:4 -6.08605742e-002:67:4 -3.47629189e-003:68:4 -2.68103808e-001:69:4 -7.43488073e-002:70:4 -5.82465529e-002:71:4 -2.32648849e-003:72:4 -3.83357882e-001:73:4 -8.15300345e-002:74:4 -5.10229766e-002:75:4 -9.18030739e-004:76:4 -5.58358610e-001:77:4 -9.80281234e-002:78:4 -3.29008698e-002:79:4 7.53343105e-004:80:4 -1.03487262e+000:81:4 -4.05086130e-002:1:5 -1.96813941e-002:2:5 -1.75170377e-002:3:5 -3.92956212e-002:4:5 -4.09694165e-002:5:5 -1.90779418e-002:6:5 -1.68355182e-002:7:5 -4.00355458e-002:8:5 -4.14078534e-002:9:5 -1.84290186e-002:10:5 -1.61111578e-002:11:5 -4.07974124e-002:12:5 -4.18117419e-002:13:5 -1.77319124e-002:14:5 -1.53421238e-002:15:5 -4.15822193e-002:16:5 -4.21635881e-002:17:5 -1.69837624e-002:18:5 -1.45270005e-002:19:5 -4.23910990e-002:20:5 -4.24384028e-002:21:5 -1.61815882e-002:22:5 -1.36642158e-002:23:5 -4.32253629e-002:24:5 -4.25997600e-002:25:5 -1.53224766e-002:26:5 -1.27525255e-002:27:5 -4.40863371e-002:28:5 -4.25953865e-002:29:5 -1.44037008e-002:30:5 -1.17910653e-002:31:5 -4.49757650e-002:32:5 -4.23490703e-002:33:5 -1.34227648e-002:34:5 -1.07794404e-002:35:5 -4.58955243e-002:36:5 -4.17499691e-002:37:5 -1.23779103e-002:38:5 -9.71803814e-003:39:5 -4.68479246e-002:40:5 -4.06362265e-002:41:5 -1.12682879e-002:42:5 -8.60828906e-003:43:5 -4.78358343e-002:44:5 -3.87709588e-002:45:5 -1.00947395e-002:46:5 -7.45297968e-003:47:5 -4.88627702e-002:48:5 -3.58059108e-002:49:5 
-8.85991752e-003:50:5 -6.25684857e-003:51:5 -4.99333069e-002:52:5 -3.12276408e-002:53:5 -7.56842643e-003:54:5 -5.02681732e-003:55:5 -5.10531813e-002:56:5 -2.42749155e-002:57:5 -6.22512400e-003:58:5 -3.77351046e-003:59:5 -5.22299260e-002:60:5 -1.38112232e-002:61:5 -4.82842326e-003:62:5 -2.51346827e-003:63:5 -5.34730330e-002:64:5 1.87699497e-003:65:5 -3.35020572e-003:66:5 -1.27779692e-003:67:5 -5.47941476e-002:68:5 2.54091918e-002:69:5 -1.68122351e-003:70:5 -1.37954950e-004:71:5 -5.62063530e-002:72:5 6.08700365e-002:73:5 5.12130558e-004:74:5 7.16239214e-004:75:5 -5.77210709e-002:76:5 1.14806302e-001:77:5 4.15385514e-003:78:5 7.50802457e-004:79:5 -5.93390614e-002:80:5 -1.17719274e+000:81:5 -1.45127624e-003:1:6 -9.84221697e-005:2:6 -1.04084611e-004:3:6 -5.01779839e-004:4:6 -1.39864907e-003:5:6 -8.20849091e-005:6:6 -9.33613628e-005:7:6 -5.12396917e-004:8:6 -1.32069737e-003:9:6 -6.48293644e-005:10:6 -8.27312469e-005:11:6 -5.23675233e-004:12:6 -1.20591745e-003:13:6 -4.67635691e-005:14:6 -7.25444406e-005:15:6 -5.35745174e-004:16:6 -1.03775971e-003:17:6 -2.83177942e-005:18:6 -6.33690506e-005:19:6 -5.48779964e-004:20:6 -7.92028382e-004:21:6 -1.00601465e-005:22:6 -5.57769090e-005:23:6 -5.63060865e-004:24:6 -4.33258712e-004:25:6 6.93090260e-006:26:6 -5.09843230e-005:27:6 -5.78947365e-004:28:6 9.03252512e-005:29:6 2.10478902e-005:30:6 -5.06769866e-005:31:6 -5.97024336e-004:32:6 8.54667276e-004:33:6 2.97874212e-005:34:6 -5.70118427e-005:35:6 -6.17943704e-004:36:6 1.97103806e-003:37:6 2.93385237e-005:38:6 -7.34776258e-005:39:6 -6.42845407e-004:40:6 3.60256620e-003:41:6 1.40164047e-005:42:6 -1.04714185e-004:43:6 -6.73267990e-004:44:6 5.98910078e-003:45:6 -2.47024000e-005:46:6 -1.57305971e-004:47:6 -7.11312518e-004:48:6 9.48312879e-003:49:6 -9.90759581e-005:50:6 -2.40264460e-004:51:6 -7.60242343e-004:52:6 1.46039128e-002:53:6 -2.26780772e-004:54:6 -3.65389511e-004:55:6 -8.24410468e-004:56:6 2.21177638e-002:57:6 -4.32491302e-004:58:6 -5.47809526e-004:59:6 -9.10259783e-004:60:6 
3.31591666e-002:61:6 -7.49129802e-004:62:6 -8.05424526e-004:63:6 -1.02660991e-003:64:6 4.94141188e-002:65:6 -1.21712685e-003:66:6 -1.15740858e-003:67:6 -1.18575804e-003:68:6 7.34022856e-002:69:6 -1.87662058e-003:70:6 -1.62126310e-003:71:6 -1.40411779e-003:72:6 1.08919285e-001:73:6 -2.74052471e-003:74:6 -2.20908411e-003:75:6 -1.70260295e-003:76:6 1.61746584e-001:77:6 -3.71697173e-003:78:6 -2.93228775e-003:79:6 -2.10548379e-003:80:6 -6.05039001e-002:81:6 -2.90507823e-003:1:7 1.50948390e-003:2:7 4.02688049e-003:3:7 -6.41778111e-004:4:7 -3.37523408e-003:5:7 1.27132796e-003:6:7 4.16023284e-003:7:7 -6.50528818e-004:8:7 -3.90856154e-003:9:7 9.74919647e-004:10:7 4.29584458e-003:11:7 -6.54717907e-004:12:7 -4.50918823e-003:13:7 6.09429553e-004:14:7 4.43298183e-003:15:7 -6.53367490e-004:16:7 -5.17908484e-003:17:7 1.62063166e-004:18:7 4.57046926e-003:19:7 -6.45328313e-004:20:7 -5.91635145e-003:21:7 -3.82438302e-004:22:7 4.70657274e-003:23:7 -6.29359856e-004:24:7 -6.71278685e-003:25:7 -1.04220770e-003:26:7 4.83842567e-003:27:7 -6.04011118e-004:28:7 -7.55042769e-003:29:7 -1.83900818e-003:30:7 4.96131927e-003:31:7 -5.67767769e-004:32:7 -8.39639269e-003:33:7 -2.79895216e-003:34:7 5.06721251e-003:35:7 -5.18988818e-004:36:7 -9.19603184e-003:37:7 -3.95348668e-003:38:7 5.14165498e-003:39:7 -4.56146896e-004:40:7 -9.86364484e-003:41:7 -5.34036942e-003:42:7 5.15804999e-003:43:7 -3.78021970e-004:44:7 -1.02704130e-002:45:7 -7.00404868e-003:46:7 5.06557152e-003:47:7 -2.84098089e-004:48:7 -1.02301054e-002:49:7 -8.99358280e-003:50:7 4.76430543e-003:51:7 -1.75558031e-004:52:7 -9.48252156e-003:53:7 -1.13532171e-002:54:7 4.05256078e-003:55:7 -5.66616654e-005:56:7 -7.67547637e-003:57:7 -1.40910298e-002:58:7 2.51310319e-003:59:7 6.25066459e-005:60:7 -4.34122235e-003:61:7 -1.70883480e-002:62:7 -7.36691058e-004:63:7 1.61487609e-004:64:7 1.15482323e-003:65:7 -1.98504366e-002:66:7 -7.60327280e-003:67:7 2.01907009e-004:68:7 9.78025608e-003:69:7 -2.08411776e-002:70:7 -2.22864486e-002:71:7 
1.16288662e-004:72:7 2.34060939e-002:73:7 -1.57380030e-002:74:7 -5.42083792e-002:75:7 -2.05622986e-004:76:7 4.66572195e-002:77:7 7.09376484e-003:78:7 -1.24868069e-001:79:7 -9.23378393e-004:80:7 -5.55185452e-002:81:7 -9.78646800e-004:1:8 -2.02451460e-003:2:8 -1.74782239e-003:3:8 5.70174307e-005:4:8 -1.10706501e-003:5:8 -2.14413367e-003:6:8 -1.80262886e-003:7:8 7.84788281e-005:8:8 -1.26673654e-003:9:8 -2.27729045e-003:10:8 -1.85921974e-003:11:8 1.02141872e-004:12:8 -1.46715902e-003:13:8 -2.42580101e-003:14:8 -1.91726722e-003:15:8 1.28366053e-004:16:8 -1.72122568e-003:17:8 -2.59164348e-003:18:8 -1.97609328e-003:19:8 1.57551840e-004:20:8 -2.04651244e-003:21:8 -2.77687050e-003:22:8 -2.03435309e-003:23:8 1.90248713e-004:24:8 -2.46676244e-003:25:8 -2.98345648e-003:26:8 -2.08953209e-003:27:8 2.27181241e-004:28:8 -3.01392376e-003:29:8 -3.21294554e-003:30:8 -2.13691033e-003:31:8 2.69310549e-004:32:8 -3.72996740e-003:33:8 -3.46570276e-003:34:8 -2.16756016e-003:35:8 3.18052247e-004:36:8 -4.66819108e-003:37:8 -3.73936631e-003:38:8 -2.16489844e-003:39:8 3.75468284e-004:40:8 -5.89152239e-003:41:8 -4.02550399e-003:42:8 -2.09789537e-003:43:8 4.44870442e-004:44:8 -7.46315345e-003:45:8 -4.30252217e-003:46:8 -1.90894864e-003:47:8 5.31723723e-004:48:8 -9.41874087e-003:49:8 -4.51971591e-003:50:8 -1.49199925e-003:51:8 6.45468011e-004:52:8 -1.16957296e-002:53:8 -4.56152670e-003:54:8 -6.54591247e-004:55:8 8.03178176e-004:56:8 -1.39672030e-002:57:8 -4.16533835e-003:58:8 9.44061205e-004:59:8 1.03665330e-003:60:8 -1.52638610e-002:61:8 -2.72953138e-003:62:8 3.88371758e-003:63:8 1.40682794e-003:64:8 -1.31285693e-002:65:8 1.14106573e-003:66:8 9.08300467e-003:67:8 2.03314424e-003:68:8 -1.73482485e-003:69:8 1.07245725e-002:70:8 1.77985616e-002:71:8 3.15451436e-003:72:8 3.23075727e-002:73:8 3.38185690e-002:74:8 3.11273541e-002:75:8 5.25758415e-003:76:8 1.19360752e-001:77:8 8.91731512e-002:78:8 4.77759149e-002:79:8 9.35069658e-003:80:8 -1.90430228e-002:81:8 
row_perm =
Serial 81 x 81 permutation matrix:
perm =
81
1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 44 45 46 47 48 49 50 51 52 53 54 55 56 57 58 59 60 61 62 63 64 65 66 67 68 69 70 71 72 73 74 75 76 77 78 79 80 81 
inv_perm =
81
1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 44 45 46 47 48 49 50 51 52 53 54 55 56 57 58 59 60 61 62 63 64 65 66 67 68 69 70 71 72 73 74 75 76 77 78 79 80 81 
col_perm =
Serial 8 x 8 permutation matrix:
perm =
8
1 2 3 4 5 6 7 8 
inv_perm =
8
1 2 3 4 5 6 7 8 
mat_perm =
Matrix with permuted view:
mat_orig =
Sparse 81 x 8 matrix with 648 nonzero entries:
2.04830170e-002:1:1 2.12144852e-002:2:1 2.11305618e-002:3:1 1.89027786e-002:4:1 2.06022263e-002:5:1 2.13022232e-002:6:1 2.12097168e-002:7:1 1.89847946e-002:8:1 2.07357407e-002:9:1 2.13918686e-002:10:1 2.12850571e-002:11:1 1.90687180e-002:12:1 2.08826065e-002:13:1 2.14786530e-002:14:1 2.13623047e-002:15:1 1.91507339e-002:16:1 2.10475922e-002:17:1 2.15721130e-002:18:1 2.14414597e-002:19:1 1.92365646e-002:20:1 2.12411880e-002:21:1 2.16627121e-002:22:1 2.15158463e-002:23:1 1.93223953e-002:24:1 2.14738846e-002:25:1 2.17542648e-002:26:1 2.15883255e-002:27:1 1.94072723e-002:28:1 2.17657089e-002:29:1 2.18486786e-002:30:1 2.16617584e-002:31:1 1.94940567e-002:32:1 2.21385956e-002:33:1 2.19421387e-002:34:1 2.17304230e-002:35:1 1.95798874e-002:36:1 2.26278305e-002:37:1 2.20346451e-002:38:1 2.17962265e-002:39:1 1.96676254e-002:40:1 2.32934952e-002:41:1 2.21242905e-002:42:1 2.18553543e-002:43:1 1.97534561e-002:44:1 2.42061615e-002:45:1 2.22110748e-002:46:1 2.19087601e-002:47:1 1.98364258e-002:48:1 2.54869461e-002:49:1 2.22921371e-002:50:1 2.19545364e-002:51:1 1.99203491e-002:52:1 2.73027420e-002:53:1 2.23588943e-002:54:1 2.19860077e-002:55:1 1.99975967e-002:56:1 2.99110413e-002:57:1 2.24103928e-002:58:1 2.20079422e-002:59:1 2.00729370e-002:60:1 3.36866379e-002:61:1 2.24370956e-002:62:1 2.20155716e-002:63:1 2.01377869e-002:64:1 3.91969681e-002:65:1 2.24380493e-002:66:1 2.20117569e-002:67:1 2.01911926e-002:68:1 4.73070145e-002:69:1 2.24180222e-002:70:1 2.20098495e-002:71:1 2.02293396e-002:72:1 5.93423843e-002:73:1 2.24151611e-002:74:1 2.20260620e-002:75:1 2.02484131e-002:76:1 7.73839951e-002:77:1 2.25629807e-002:78:1 2.20937729e-002:79:1 2.02484131e-002:80:1 7.96743393e-001:81:1 8.70358944e-003:1:2 5.15415072e-002:2:2 5.62558174e-002:3:2 -1.11496449e-003:4:2 8.44138861e-003:5:2 5.30897975e-002:6:2 5.80014586e-002:7:2 -1.55162811e-003:8:2 8.10152292e-003:9:2 5.46968579e-002:10:2 5.98155260e-002:11:2 -2.00974941e-003:12:2 7.65269995e-003:13:2 5.63657880e-002:14:2 
6.17017746e-002:15:2 -2.49022245e-003:16:2 7.04908371e-003:17:2 5.81007004e-002:18:2 6.36645555e-002:19:2 -2.99298763e-003:20:2 6.22463226e-003:21:2 5.99058867e-002:22:2 6.57087564e-002:23:2 -3.51887941e-003:24:2 5.08189201e-003:25:2 6.17873073e-002:26:2 6.78404570e-002:27:2 -4.06777859e-003:28:2 3.47924232e-003:29:2 6.37527704e-002:30:2 7.00674057e-002:31:2 -4.63974476e-003:32:2 1.20830536e-003:33:2 6.58121109e-002:34:2 7.23987818e-002:35:2 -5.23394346e-003:36:2 -2.03561783e-003:37:2 6.79796934e-002:38:2 7.48466253e-002:39:2 -5.84852695e-003:40:2 -6.70039654e-003:41:2 7.02745914e-002:42:2 7.74260163e-002:43:2 -6.48117065e-003:44:2 -1.34445429e-002:45:2 7.27239251e-002:46:2 8.01559687e-002:47:2 -7.12722540e-003:48:2 -2.32374668e-002:49:2 7.53661394e-002:50:2 8.30587149e-002:51:2 -7.77983665e-003:52:2 -3.75094414e-002:53:2 7.82553554e-002:54:2 8.61577392e-002:55:2 -8.42881203e-003:56:2 -5.83756566e-002:57:2 8.14681649e-002:58:2 8.94711614e-002:59:2 -9.05936956e-003:60:2 -8.89736414e-002:61:2 8.51134062e-002:62:2 9.29930210e-002:63:2 -9.65029001e-003:64:2 -1.33974433e-001:65:2 8.93457532e-002:66:2 9.66519713e-002:67:2 -1.01726651e-002:68:2 -2.00364590e-001:69:2 9.43865776e-002:70:2 1.00216329e-001:71:2 -1.05881691e-002:72:2 -2.98653126e-001:73:2 1.00561380e-001:74:2 1.03088975e-001:75:2 -1.08492970e-002:76:2 -4.44752574e-001:77:2 1.08374894e-001:78:2 1.03861272e-001:79:2 -1.09045506e-002:80:2 1.26963919e+000:81:2 1.95280463e-003:1:3 1.47089586e-002:2:3 1.76537335e-002:3:3 8.81433487e-004:4:3 1.62871927e-003:5:3 1.48880109e-002:6:3 1.81543529e-002:7:3 8.01317394e-004:8:3 1.26869231e-003:9:3 1.50351301e-002:10:3 1.86709389e-002:11:3 7.19994307e-004:12:3 8.76188278e-004:13:3 1.51424482e-002:14:3 1.92034617e-002:15:3 6.37985766e-004:16:3 4.59365547e-004:17:3 1.52003840e-002:18:3 1.97517946e-002:19:3 5.55895269e-004:20:3 3.44067812e-005:21:3 1.51976198e-002:22:3 2.03159228e-002:23:3 4.74363565e-004:24:3 -3.70435417e-004:25:3 1.51203200e-002:26:3 2.08955929e-002:27:3 
3.94187868e-004:28:3 -7.07738101e-004:29:3 1.49516836e-002:30:3 2.14910433e-002:31:3 3.16143036e-004:32:3 -9.00730491e-004:33:3 1.46712884e-002:34:3 2.21037492e-002:35:3 2.41182745e-004:36:3 -8.27394426e-004:37:3 1.42536983e-002:38:3 2.27384493e-002:39:3 1.70312822e-004:40:3 -2.96585262e-004:41:3 1.36662722e-002:42:3 2.34077200e-002:43:3 1.04539096e-004:44:3 9.89519060e-004:45:3 1.28651038e-002:46:3 2.41423324e-002:47:3 4.52548265e-005:48:3 3.49146128e-003:49:3 1.17855445e-002:50:3 2.50143409e-002:51:3 -6.06477261e-006:52:3 7.91811198e-003:53:3 1.03196874e-002:54:3 2.61889473e-002:55:3 -4.68268991e-005:56:3 1.53510049e-002:57:3 8.25995207e-003:58:3 2.80417874e-002:59:3 -7.30454922e-005:60:3 2.74141431e-002:61:3 5.15640527e-003:62:3 3.14272642e-002:63:3 -7.73742795e-005:64:3 4.64663729e-002:65:3 -4.91961837e-005:66:3 3.82981896e-002:67:3 -4.76986170e-005:68:3 7.56959766e-002:69:3 -9.96033102e-003:70:3 5.31542525e-002:71:3 3.36617231e-005:72:3 1.18696623e-001:73:3 -3.11232731e-002:74:3 8.64652768e-002:75:3 1.82218850e-004:76:3 1.77250177e-001:77:3 -8.01493004e-002:78:3 1.62844010e-001:79:3 3.72990966e-004:80:3 3.70457619e-001:81:3 -1.94299519e-002:1:4 -4.29900885e-002:2:4 -4.46090400e-002:3:4 -9.32568312e-003:4:4 -1.99223757e-002:5:4 -4.41277027e-002:6:4 -4.56707776e-002:7:4 -9.19926167e-003:8:4 -2.05248296e-002:9:4 -4.53242958e-002:10:4 -4.67669964e-002:11:4 -9.05865431e-003:12:4 -2.12781727e-002:13:4 -4.65851128e-002:14:4 -4.78978455e-002:15:4 -8.90254974e-003:16:4 -2.22397447e-002:17:4 -4.79148626e-002:18:4 -4.90632951e-002:19:4 -8.72927904e-003:20:4 -2.34917402e-002:21:4 -4.93195057e-002:22:4 -5.02631664e-002:23:4 -8.53705406e-003:24:4 -2.51514316e-002:25:4 -5.08040190e-002:26:4 -5.14954925e-002:27:4 -8.32334161e-003:28:4 -2.73882747e-002:29:4 -5.23730218e-002:30:4 -5.27569056e-002:31:4 -8.08507204e-003:32:4 -3.04469764e-002:33:4 -5.40301800e-002:34:4 -5.40418625e-002:35:4 -7.81828165e-003:36:4 -3.46826315e-002:37:4 -5.57771325e-002:38:4 -5.53413630e-002:39:4 
-7.51793385e-003:40:4 -4.06115055e-002:41:4 -5.76119721e-002:42:4 -5.66406250e-002:43:4 -7.17717409e-003:44:4 -4.89873886e-002:45:4 -5.95291257e-002:46:4 -5.79162836e-002:47:4 -6.78706169e-003:48:4 -6.09136820e-002:49:4 -6.15183413e-002:50:4 -5.91302514e-002:51:4 -6.33493066e-003:52:4 -7.80129433e-002:53:4 -6.35689497e-002:54:4 -6.02180958e-002:55:4 -5.80465794e-003:56:4 -1.02684200e-001:57:4 -6.56839907e-002:58:4 -6.10629916e-002:59:4 -5.17284870e-003:60:4 -1.38504446e-001:61:4 -6.79274797e-002:62:4 -6.14374876e-002:63:4 -4.40979004e-003:64:4 -1.90873832e-001:65:4 -7.05530643e-002:66:4 -6.08605742e-002:67:4 -3.47629189e-003:68:4 -2.68103808e-001:69:4 -7.43488073e-002:70:4 -5.82465529e-002:71:4 -2.32648849e-003:72:4 -3.83357882e-001:73:4 -8.15300345e-002:74:4 -5.10229766e-002:75:4 -9.18030739e-004:76:4 -5.58358610e-001:77:4 -9.80281234e-002:78:4 -3.29008698e-002:79:4 7.53343105e-004:80:4 -1.03487262e+000:81:4 -4.05086130e-002:1:5 -1.96813941e-002:2:5 -1.75170377e-002:3:5 -3.92956212e-002:4:5 -4.09694165e-002:5:5 -1.90779418e-002:6:5 -1.68355182e-002:7:5 -4.00355458e-002:8:5 -4.14078534e-002:9:5 -1.84290186e-002:10:5 -1.61111578e-002:11:5 -4.07974124e-002:12:5 -4.18117419e-002:13:5 -1.77319124e-002:14:5 -1.53421238e-002:15:5 -4.15822193e-002:16:5 -4.21635881e-002:17:5 -1.69837624e-002:18:5 -1.45270005e-002:19:5 -4.23910990e-002:20:5 -4.24384028e-002:21:5 -1.61815882e-002:22:5 -1.36642158e-002:23:5 -4.32253629e-002:24:5 -4.25997600e-002:25:5 -1.53224766e-002:26:5 -1.27525255e-002:27:5 -4.40863371e-002:28:5 -4.25953865e-002:29:5 -1.44037008e-002:30:5 -1.17910653e-002:31:5 -4.49757650e-002:32:5 -4.23490703e-002:33:5 -1.34227648e-002:34:5 -1.07794404e-002:35:5 -4.58955243e-002:36:5 -4.17499691e-002:37:5 -1.23779103e-002:38:5 -9.71803814e-003:39:5 -4.68479246e-002:40:5 -4.06362265e-002:41:5 -1.12682879e-002:42:5 -8.60828906e-003:43:5 -4.78358343e-002:44:5 -3.87709588e-002:45:5 -1.00947395e-002:46:5 -7.45297968e-003:47:5 -4.88627702e-002:48:5 -3.58059108e-002:49:5 
-8.85991752e-003:50:5 -6.25684857e-003:51:5 -4.99333069e-002:52:5 -3.12276408e-002:53:5 -7.56842643e-003:54:5 -5.02681732e-003:55:5 -5.10531813e-002:56:5 -2.42749155e-002:57:5 -6.22512400e-003:58:5 -3.77351046e-003:59:5 -5.22299260e-002:60:5 -1.38112232e-002:61:5 -4.82842326e-003:62:5 -2.51346827e-003:63:5 -5.34730330e-002:64:5 1.87699497e-003:65:5 -3.35020572e-003:66:5 -1.27779692e-003:67:5 -5.47941476e-002:68:5 2.54091918e-002:69:5 -1.68122351e-003:70:5 -1.37954950e-004:71:5 -5.62063530e-002:72:5 6.08700365e-002:73:5 5.12130558e-004:74:5 7.16239214e-004:75:5 -5.77210709e-002:76:5 1.14806302e-001:77:5 4.15385514e-003:78:5 7.50802457e-004:79:5 -5.93390614e-002:80:5 -1.17719274e+000:81:5 -1.45127624e-003:1:6 -9.84221697e-005:2:6 -1.04084611e-004:3:6 -5.01779839e-004:4:6 -1.39864907e-003:5:6 -8.20849091e-005:6:6 -9.33613628e-005:7:6 -5.12396917e-004:8:6 -1.32069737e-003:9:6 -6.48293644e-005:10:6 -8.27312469e-005:11:6 -5.23675233e-004:12:6 -1.20591745e-003:13:6 -4.67635691e-005:14:6 -7.25444406e-005:15:6 -5.35745174e-004:16:6 -1.03775971e-003:17:6 -2.83177942e-005:18:6 -6.33690506e-005:19:6 -5.48779964e-004:20:6 -7.92028382e-004:21:6 -1.00601465e-005:22:6 -5.57769090e-005:23:6 -5.63060865e-004:24:6 -4.33258712e-004:25:6 6.93090260e-006:26:6 -5.09843230e-005:27:6 -5.78947365e-004:28:6 9.03252512e-005:29:6 2.10478902e-005:30:6 -5.06769866e-005:31:6 -5.97024336e-004:32:6 8.54667276e-004:33:6 2.97874212e-005:34:6 -5.70118427e-005:35:6 -6.17943704e-004:36:6 1.97103806e-003:37:6 2.93385237e-005:38:6 -7.34776258e-005:39:6 -6.42845407e-004:40:6 3.60256620e-003:41:6 1.40164047e-005:42:6 -1.04714185e-004:43:6 -6.73267990e-004:44:6 5.98910078e-003:45:6 -2.47024000e-005:46:6 -1.57305971e-004:47:6 -7.11312518e-004:48:6 9.48312879e-003:49:6 -9.90759581e-005:50:6 -2.40264460e-004:51:6 -7.60242343e-004:52:6 1.46039128e-002:53:6 -2.26780772e-004:54:6 -3.65389511e-004:55:6 -8.24410468e-004:56:6 2.21177638e-002:57:6 -4.32491302e-004:58:6 -5.47809526e-004:59:6 -9.10259783e-004:60:6 
3.31591666e-002:61:6 -7.49129802e-004:62:6 -8.05424526e-004:63:6 -1.02660991e-003:64:6 4.94141188e-002:65:6 -1.21712685e-003:66:6 -1.15740858e-003:67:6 -1.18575804e-003:68:6 7.34022856e-002:69:6 -1.87662058e-003:70:6 -1.62126310e-003:71:6 -1.40411779e-003:72:6 1.08919285e-001:73:6 -2.74052471e-003:74:6 -2.20908411e-003:75:6 -1.70260295e-003:76:6 1.61746584e-001:77:6 -3.71697173e-003:78:6 -2.93228775e-003:79:6 -2.10548379e-003:80:6 -6.05039001e-002:81:6 -2.90507823e-003:1:7 1.50948390e-003:2:7 4.02688049e-003:3:7 -6.41778111e-004:4:7 -3.37523408e-003:5:7 1.27132796e-003:6:7 4.16023284e-003:7:7 -6.50528818e-004:8:7 -3.90856154e-003:9:7 9.74919647e-004:10:7 4.29584458e-003:11:7 -6.54717907e-004:12:7 -4.50918823e-003:13:7 6.09429553e-004:14:7 4.43298183e-003:15:7 -6.53367490e-004:16:7 -5.17908484e-003:17:7 1.62063166e-004:18:7 4.57046926e-003:19:7 -6.45328313e-004:20:7 -5.91635145e-003:21:7 -3.82438302e-004:22:7 4.70657274e-003:23:7 -6.29359856e-004:24:7 -6.71278685e-003:25:7 -1.04220770e-003:26:7 4.83842567e-003:27:7 -6.04011118e-004:28:7 -7.55042769e-003:29:7 -1.83900818e-003:30:7 4.96131927e-003:31:7 -5.67767769e-004:32:7 -8.39639269e-003:33:7 -2.79895216e-003:34:7 5.06721251e-003:35:7 -5.18988818e-004:36:7 -9.19603184e-003:37:7 -3.95348668e-003:38:7 5.14165498e-003:39:7 -4.56146896e-004:40:7 -9.86364484e-003:41:7 -5.34036942e-003:42:7 5.15804999e-003:43:7 -3.78021970e-004:44:7 -1.02704130e-002:45:7 -7.00404868e-003:46:7 5.06557152e-003:47:7 -2.84098089e-004:48:7 -1.02301054e-002:49:7 -8.99358280e-003:50:7 4.76430543e-003:51:7 -1.75558031e-004:52:7 -9.48252156e-003:53:7 -1.13532171e-002:54:7 4.05256078e-003:55:7 -5.66616654e-005:56:7 -7.67547637e-003:57:7 -1.40910298e-002:58:7 2.51310319e-003:59:7 6.25066459e-005:60:7 -4.34122235e-003:61:7 -1.70883480e-002:62:7 -7.36691058e-004:63:7 1.61487609e-004:64:7 1.15482323e-003:65:7 -1.98504366e-002:66:7 -7.60327280e-003:67:7 2.01907009e-004:68:7 9.78025608e-003:69:7 -2.08411776e-002:70:7 -2.22864486e-002:71:7 
1.16288662e-004:72:7 2.34060939e-002:73:7 -1.57380030e-002:74:7 -5.42083792e-002:75:7 -2.05622986e-004:76:7 4.66572195e-002:77:7 7.09376484e-003:78:7 -1.24868069e-001:79:7 -9.23378393e-004:80:7 -5.55185452e-002:81:7 -9.78646800e-004:1:8 -2.02451460e-003:2:8 -1.74782239e-003:3:8 5.70174307e-005:4:8 -1.10706501e-003:5:8 -2.14413367e-003:6:8 -1.80262886e-003:7:8 7.84788281e-005:8:8 -1.26673654e-003:9:8 -2.27729045e-003:10:8 -1.85921974e-003:11:8 1.02141872e-004:12:8 -1.46715902e-003:13:8 -2.42580101e-003:14:8 -1.91726722e-003:15:8 1.28366053e-004:16:8 -1.72122568e-003:17:8 -2.59164348e-003:18:8 -1.97609328e-003:19:8 1.57551840e-004:20:8 -2.04651244e-003:21:8 -2.77687050e-003:22:8 -2.03435309e-003:23:8 1.90248713e-004:24:8 -2.46676244e-003:25:8 -2.98345648e-003:26:8 -2.08953209e-003:27:8 2.27181241e-004:28:8 -3.01392376e-003:29:8 -3.21294554e-003:30:8 -2.13691033e-003:31:8 2.69310549e-004:32:8 -3.72996740e-003:33:8 -3.46570276e-003:34:8 -2.16756016e-003:35:8 3.18052247e-004:36:8 -4.66819108e-003:37:8 -3.73936631e-003:38:8 -2.16489844e-003:39:8 3.75468284e-004:40:8 -5.89152239e-003:41:8 -4.02550399e-003:42:8 -2.09789537e-003:43:8 4.44870442e-004:44:8 -7.46315345e-003:45:8 -4.30252217e-003:46:8 -1.90894864e-003:47:8 5.31723723e-004:48:8 -9.41874087e-003:49:8 -4.51971591e-003:50:8 -1.49199925e-003:51:8 6.45468011e-004:52:8 -1.16957296e-002:53:8 -4.56152670e-003:54:8 -6.54591247e-004:55:8 8.03178176e-004:56:8 -1.39672030e-002:57:8 -4.16533835e-003:58:8 9.44061205e-004:59:8 1.03665330e-003:60:8 -1.52638610e-002:61:8 -2.72953138e-003:62:8 3.88371758e-003:63:8 1.40682794e-003:64:8 -1.31285693e-002:65:8 1.14106573e-003:66:8 9.08300467e-003:67:8 2.03314424e-003:68:8 -1.73482485e-003:69:8 1.07245725e-002:70:8 1.77985616e-002:71:8 3.15451436e-003:72:8 3.23075727e-002:73:8 3.38185690e-002:74:8 3.11273541e-002:75:8 5.25758415e-003:76:8 1.19360752e-001:77:8 8.91731512e-002:78:8 4.77759149e-002:79:8 9.35069658e-003:80:8 -1.90430228e-002:81:8 
row_perm =
Serial 81 x 81 permutation matrix:
perm =
81
1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 44 45 46 47 48 49 50 51 52 53 54 55 56 57 58 59 60 61 62 63 64 65 66 67 68 69 70 71 72 73 74 75 76 77 78 79 80 81 
inv_perm =
81
1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 44 45 46 47 48 49 50 51 52 53 54 55 56 57 58 59 60 61 62 63 64 65 66 67 68 69 70 71 72 73 74 75 76 77 78 79 80 81 
col_perm =
Serial 8 x 8 permutation matrix:
perm =
8
1 2 3 4 5 6 7 8 
inv_perm =
8
1 2 3 4 5 6 7 8 
mat_perm = NULL
------------------------- END DEBUG ---------------------------------
--------------------------- DEBUG -----------------------------------
Gc'*y:
 8
  -2.63007:1 -2.87092:2 -1.61917:3 5.44874:4 2.88726:5 -0.372076:6 0.346699:7 -0.22059:8
------------------------- END DEBUG ---------------------------------
--------------------------- DEBUG -----------------------------------
Compute finite difference values, with following inputs:
xo:
 81
  0.678569:1 0.946009:2 0.0915581:3 0.908438:4 0.509953:5 0.614904:6 0.316071:7 0.0774875:8 0.850614:9 0.144527:10 0.370486:11 0.622391:12 0.997552:13 0.517344:14 0.990511:15 0.226534:16 0.398005:17 0.696569:18 0.0646408:19 0.747662:20 0.4204:21 0.811317:22 0.379605:23 0.319068:24 0.986051:25 0.718181:26 0.413183:27 0.0986302:28 0.734559:29 0.637306:30 0.0738419:31 0.120508:32 0.981596:33 0.496799:34 0.0224137:35 0.0538315:36 0.140874:37 0.893474:38 0.46582:39 0.560857:40 0.494456:41 0.0677855:42 0.897647:43 0.288565:44 0.269047:45 0.594194:46 0.475879:47 0.368311:48 0.655611:49 0.9382:50 0.620425:51 0.28284:52 0.205181:53 0.439134:54 0.0272502:55 0.876184:56 0.610092:57 0.203592:58 0.519917:59 0.0538243:60 0.862187:61 0.442935:62 0.548009:63 0.566861:64 0.680395:65 0.371379:66 0.0782287:67 0.456351:68 0.0478438:69 0.738257:70 0.0380015:71 0.954244:72 0.742372:73 0.93745:74 0.513364:75 0.240905:76 0.259965:77 0.758974:78 0.993343:79 0.356706:80 1:81
xl:
 81
  0:1 0:2 0:3 0:4 0:5 0:6 0:7 0:8 0:9 0:10 0:11 0:12 0:13 0:14 0:15 0:16 0:17 0:18 0:19 0:20 0:21 0:22 0:23 0:24 0:25 0:26 0:27 0:28 0:29 0:30 0:31 0:32 0:33 0:34 0:35 0:36 0:37 0:38 0:39 0:40 0:41 0:42 0:43 0:44 0:45 0:46 0:47 0:48 0:49 0:50 0:51 0:52 0:53 0:54 0:55 0:56 0:57 0:58 0:59 0:60 0:61 0:62 0:63 0:64 0:65 0:66 0:67 0:68 0:69 0:70 0:71 0:72 0:73 0:74 0:75 0:76 0:77 0:78 0:79 0:80 0:81
xu:
 81
  100:1 100:2 100:3 100:4 100:5 100:6 100:7 100:8 100:9 100:10 100:11 100:12 100:13 100:14 100:15 100:16 100:17 100:18 100:19 100:20 100:21 100:22 100:23 100:24 100:25 100:26 100:27 100:28 100:29 100:30 100:31 100:32 100:33 100:34 100:35 100:36 100:37 100:38 100:39 100:40 100:41 100:42 100:43 100:44 100:45 100:46 100:47 100:48 100:49 100:50 100:51 100:52 100:53 100:54 100:55 100:56 100:57 100:58 100:59 100:60 100:61 100:62 100:63 100:64 100:65 100:66 100:67 100:68 100:69 100:70 100:71 100:72 100:73 100:74 100:75 100:76 100:77 100:78 100:79 100:80 1000:81
y:
 81
  -0.997681:1 -0.997497:2 -0.997253:3 -0.99707:4 -0.996887:5 -0.996704:6 -0.99646:7 -0.996277:8 -0.996094:9 -0.995849:10 -0.995666:11 -0.995483:12 -0.9953:13 -0.995056:14 -0.994873:15 -0.99469:16 -0.994507:17 -0.994263:18 -0.994079:19 -0.993896:20 -0.993713:21 -0.993469:22 -0.993286:23 -0.993103:24 -0.99292:25 -0.992676:26 -0.992492:27 -0.992309:28 -0.992065:29 -0.991882:30 -0.991699:31 -0.991516:32 -0.991272:33 -0.991089:34 -0.990905:35 -0.990722:36 -0.990478:37 -0.990295:38 -0.990112:39 -0.989929:40 -0.989685:41 -0.989502:42 -0.989319:43 -0.989074:44 -0.988891:45 -0.988708:46 -0.988525:47 -0.988281:48 -0.988098:49 -0.987915:50 -0.987732:51 -0.987487:52 -0.987304:53 -0.987121:54 -0.986938:55 -0.986694:56 -0.986511:57 -0.986328:58 -0.986084:59 -0.9859:60 -0.985717:61 -0.985534:62 -0.98529:63 -0.985107:64 -0.984924:65 -0.984741:66 -0.984497:67 -0.984313:68 -0.98413:69 -0.983947:70 -0.983703:71 -0.98352:72 -0.983337:73 -0.983093:74 -0.98291:75 -0.982727:76 -0.982543:77 -0.982299:78 -0.982116:79 -0.981933:80 -0.98175:81
------------------------- END DEBUG ---------------------------------
--------------------------- DEBUG -----------------------------------
Compute finite difference values, with following outputs:
FDGc vector computed:
 8
  -2.63003:1 -2.87093:2 -1.61917:3 5.44875:4 2.88726:5 -0.372078:6 0.346702:7 -0.220587:8
------------------------- END DEBUG ---------------------------------

rel_err(Gf'*y,FDGf'*y) = rel_err(-9.81749931e-001,-9.81749929e-001) = 9.39920451e-010

rel_err(sum(Gc'*y),sum(FDGc'*y)) = rel_err(9.69863326e-001,9.69914436e-001) = 2.63466735e-005

Warning, rel_err(sum(Gc'*y),sum(FDGc'*y)) = rel_err(9.69863326e-001,9.69914436e-001) = 2.63466735e-005
exceeded warning_tol = 1.00000000e-008

Congratulations!  All of the computed errors were within the specified error tolerance!

Successful end of testing of the nlp

************************************
*** MoochoSolver::solve_nlp()    ***
************************************

*** Starting iterations ...


(0) 1: "EvalNewPoint"

x is not updated for any k so set x_k = nlp.xinit() ...

||x_k||inf            = 1.000000e+000

Updating the decomposition ...

DecompositionSystemVarReductPerm object currently does not have a basis so we must select one ...

The decomposition system object is selecting the basis (k = 0)...

****************************************************************
*** DecompositionSystemVarReductImp::get_basis_matrices(...) ***
****************************************************************

Must allocate a new matrix object for D = -inv(C)*N since one has not been allocated yet ...

Allocated a new explicit matrix object for D = -inv(C)*N of type 'class AbstractLinAlgPack::MultiVectorMutableDense' ...

Must allocate a new basis matrix object for C since one has not been allocated yet ...

Allocated a new basis matrix object C of type 'class AbstractLinAlgPack::MatrixOpNonsingAggr' ...

End DecompositionSystemVarReductImp::get_basis_matrices(...)

Using a direct sparse solver to select a new basis ...

Using LAPACK xGETRF to analyze and factor a new matrix ...

Selected a new basis

bs.var_dep()            = [1,8]
ds.var_indep()          = [9,81]
ds.equ_decomp()         = [1,8]
ds.equ_undecomp()       = [1,0]

***********************************************************
*** DecompositionSystemVarReductImp::update_decomp(...) ***
***********************************************************

Warning!!! mat_rel != MATRICES_INDEP_IMPS; The decomposition matrix objects may not be independent of each other!

End DecompositionSystemVarReductImp::update_decomp(...)

Printing the updated iteration quantities ...

f_k                      = 1.000000e+000
||Gf_k||inf              = 1.000000e+000
||Gf_k(var_dep)_k||inf   = 1.000000e+000
||Gf_k(var_indep)_k||inf = 0.000000e+000
||c_k||inf               = 2.513467e+000

(0) 2: "QuasiNormalStep"

||py||   = 4.698774e+001

||Ypy||2 = 1.473040e+002

(0) 2.1: "CheckDecompositionFromPy"

beta = ||py||/||c|| = 1.869439e+001

num_basis_k was updated so the basis changed so we will skip this check
    reset min ||py||/||c|| to current value + 1

(0) 2.2: "CheckDecompositionFromRPy"

beta = ||R*py_k + c_k(decomp)||inf / (||c_k(decomp)||inf + small_number)
     = 1.152411e-013 / (2.513467e+000 + 2.225074e-308)
     = 4.584947e-014

num_basis_k was updated so the basis changed so we will skip this check
    reset min ||R*py+c||/||c|| to current value + epsilon(2.220446e-016)

(0) 3: "ReducedGradient"

||rGf||inf = 6.598924e-002

(0) 4.-1: "CheckSkipBFGSUpdate"

(0) 4: "ReducedHessian"

Initializing rHL = eye(n-r) (k = 0)...

(0) 5.-1: "SetDBoundsStd"

(0) 5: "TangentialStep"

Determine if we can use simple bounds on pz ...
    m = 8
    dynamic_cast<const MatrixIdentConcat*>(&Z_k) = 0000000000304E20
    ||Ypy_k(var_indep)||inf = 4.041664e+001

Using bounds on full Z*pz ...

There are finite bounds on dependent variables.  Adding extra inequality constraints for D*pz ...

*** Entering QPSchur::solve_qp(...)

*** Warm start info

Number of variables                                          = 74
Number of initially fixed variables (not in Ko)              = 1
Number of changes to the initial KKT system (num_act_change) = 0

    Number of initially fixed variables freed from a bound   = 0
    Number of initially free variables fixed to a bound      = 0
    Number of general equality constraints added             = 0
    Number of general inequality constraints added           = 0

Solution to the initial KKT system, vo = inv(Ko)*fo:

||vo||inf = 6.598924e-002

***
*** Removing constraints until we are dual feasible
***

*** Start by removing constraints within the Schur complement first

There were 0 constraints dropped from the Schur complement from the initial guess of the active set.

Current guess for unknowns x:

||x||inf = 6.598924e-002
-------------- next part --------------

********************************************************************
*** Algorithm iteration summary output                           ***
***                                                              ***
*** Below, a summary table of the SQP iterations is given as     ***
*** well as a table of the CPU times for each step (if the       ***
*** option MoochoSolver::algo_timing = true is set).             ***
********************************************************************

*** Echoing input options ...

begin_options

options_group CalcFiniteDiffProd {
    fd_method_order = FD_ORDER_ONE;
}

options_group DecompositionSystemStateStepBuilderStd
 {
    null_space_matrix = EXPLICIT;
    range_space_matrix = ORTHOGONAL;
}

options_group NLPAlgoConfigMamaJama
 {
    line_search_method = FILTER;
    quasi_newton = BFGS;
}

options_group NLPSolverClientInterface
 {
    calc_conditioning = true;
    calc_matrix_info_null_space_only = true;
    calc_matrix_norms = true;
    feas_tol = 1e-7;
    journal_output_level = PRINT_ALGORITHM_STEPS;
    journal_print_digits = 10;
    max_iter = 500;
    max_run_time = 10.0;
    null_space_journal_output_level = PRINT_ITERATION_QUANTITIES;
    opt_tol = 1e-2;
}

end_options


