[Trilinos-Users] ML & memory usage

Charles Boivin charles.boivin at mayahtt.com
Tue Nov 20 13:45:52 MST 2012


Hello,

Using ML as a preconditioner, I see a larger-than-expected hit on memory usage. The particular case I have investigated has a sparse matrix (an Epetra_CrsMatrix, to be precise) with about 136M non-zeros over 2.4M rows. From tracking memory usage, instantiating this matrix requires about 1600 MB of RAM, which is in line with what I would expect for CRS storage (a hand calculation gives about 1560 MB).
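
For reference, here is the back-of-the-envelope estimate behind that hand calculation. It assumes 8-byte double values, 4-byte int column indices, and a 4-byte int row-pointer array, which is my understanding of the local Epetra_CrsMatrix storage (an assumption on my part, not a verified figure):

    // Rough CRS storage estimate: values + column indices + row pointers.
    // Assumes 8-byte doubles and 4-byte local int indices (my assumption
    // about Epetra_CrsMatrix's local storage).
    #include <cstdio>

    int main() {
      const double nnz  = 136e6;   // non-zeros in the fine-level matrix
      const double rows = 2.4e6;   // rows in the fine-level matrix

      const double bytes = nnz * (8.0 + 4.0)   // values + column indices
                         + (rows + 1) * 4.0;   // row-pointer array
      std::printf("~%.0f MB\n", bytes / (1024.0 * 1024.0));
      return 0;
    }
    // Prints ~1566 MB, which is where my "about 1560 MB" figure comes from.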

The preconditioner requires about 1070 MB of additional memory, which is more than I expected. I checked, and it ends up with a total of 5 levels; the second-finest level drops to slightly more than 15M non-zeros over 100k rows, and the remaining levels are negligible compared to the two finest (they drop to about 2300, then 40, and even fewer on the final level).

Using CRS storage, I would expect 15M non-zeros over 100k rows to require roughly 180 MB; the next level down should only need about 6 MB, and so on. What kind of storage does ML use for the different levels it computes? Of course, I realize that additional data is required (aggregation and smoother information, etc.), but the difference seems a bit much to me, so I am wondering whether I might be doing something wrong.
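
In case it matters, this is roughly how I construct the preconditioner and how I obtained the per-level figures above. It is a simplified sketch of my setup, not my exact code, and the parameter values are just what I happen to use:

    #include "ml_include.h"
    #include "ml_MultiLevelPreconditioner.h"
    #include "Teuchos_ParameterList.hpp"
    #include "Epetra_CrsMatrix.h"

    // Simplified sketch of my ML setup; 'A' is the fine-level matrix
    // described above (2.4M rows, ~136M non-zeros).
    ML_Epetra::MultiLevelPreconditioner* BuildPrec(const Epetra_CrsMatrix& A)
    {
      Teuchos::ParameterList MLList;
      ML_Epetra::SetDefaults("SA", MLList);   // smoothed-aggregation defaults
      MLList.set("ML output", 10);            // verbose output: per-level rows/nnz

      // Construct the hierarchy immediately (third argument = true).
      return new ML_Epetra::MultiLevelPreconditioner(A, MLList, true);
    }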

I did set "use crs matrix storage" to true, but I'm not sure whether that has any impact on the coarser levels.

Also, while sifting through the code, I saw a "low memory usage" flag. I tried activating it but ran into errors while building the aggregation hierarchy - is this flag supposed to work, and is it still supported? What does it do, exactly?
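
Concretely, these are the two settings I experimented with, added to the parameter list in the sketch above (I am only guessing at their intended effect):

    // The two flags mentioned above, as I set them in my parameter list.
    MLList.set("use crs matrix storage", true);  // hoping this affects level-matrix storage
    MLList.set("low memory usage", true);        // -> errors while building the aggregation hierarchy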

Any insight into what is happening would be much appreciated.

Thank you,

Charles Boivin