[Trilinos-Users] Write a non-contiguous Tpetra::Vector

Wen Yan wenyan4work at gmail.com
Thu Nov 7 11:44:36 EST 2019


Hi Trilinos users & developers,

Could you please help me understand how Tpetra::MatrixMarket::Writer works
for a Vector with a non-contiguous Map?
I have a vector partitioned like this across two ranks:
[0,1,0,0,1,1]
Here each entry is the rank that owns it. I then write the vector and its
map to files using:
    Tpetra::MatrixMarket::Writer::writeMapFile();
    Tpetra::MatrixMarket::Writer::writeDenseFile();
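
For reference, with two ranks this partition corresponds to the per-rank
global index lists below (inferred from the map output and the attached test
code; the snippet is only a sketch of the intent, not part of the writer call):

    // GIDs owned by each rank for the [0,1,0,0,1,1] ownership pattern
    std::vector<int> myGIDs = (rank == 0) ? std::vector<int>{0, 2, 3}
                                          : std::vector<int>{1, 4, 5};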

The results I get are as follows (first the map file, then the vector file):
%%MatrixMarket matrix array integer general
% Format: Version 2.0
%
% This file encodes a Tpetra::Map.
% It is stored as a dense vector, with twice as many
% entries as the global number of GIDs (global indices).
% (GID, PID) pairs are stored contiguously, where the PID
% is the rank of the process owning that GID.
12 1
0
0
2
0
3
0
1
1
4
1
5
1
%%MatrixMarket matrix array real general
6 1
0.00000000000000000e+00
0.00000000000000000e+00
0.00000000000000000e+00
1.00000000000000000e+00
1.00000000000000000e+00
1.00000000000000000e+00

The map written to the file is what I expected, but the vector is not. I
expected [0,1,0,0,1,1], but it looks like all entries owned by one rank are
grouped together before being written, so the vector file is ordered by
owning process rather than by global index.
Is this how Tpetra::MatrixMarket::Writer::writeDenseFile() is designed to
work, or did I do something wrong?
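
In case this is the intended behavior, here is a minimal sketch of the
workaround I am considering: import the vector onto a uniformly contiguous
map (so entries are laid out in GID order) and write that copy instead. The
names map, vec, commRcp, TMAP, TV, and the sizes come from the attached test
code, and I have not verified that this is the recommended approach:

    // Needs #include <Tpetra_Import.hpp>.
    // Build a contiguous map with the same global size and index base,
    // so that entries are distributed in GID order across ranks.
    auto contigMap = Teuchos::rcp(new TMAP(globalSize1 + globalSize2, 0, commRcp));

    // Redistribute the vector from the non-contiguous map onto the contiguous one.
    Tpetra::Import<int, int> importer(map, contigMap);
    auto vecContig = Teuchos::rcp(new TV(contigMap, true));
    vecContig->doImport(*vec, importer, Tpetra::INSERT);

    // Writing this copy should produce entries ordered by global index.
    Tpetra::MatrixMarket::Writer<TV>::writeDenseFile("vec_gid_order.mtx", vecContig);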
The short test code is attached to this email.

Thank you very much!
Wen Yan

// Teuchos utility
#include <Teuchos_ArrayViewDecl.hpp>
#include <Teuchos_GlobalMPISession.hpp>
#include <Teuchos_SerialDenseMatrix.hpp>
#include <Teuchos_TimeMonitor.hpp>
#include <Teuchos_oblackholestream.hpp>

// Tpetra container
#include <MatrixMarket_Tpetra.hpp>
#include <Tpetra_Core.hpp>
#include <Tpetra_CrsMatrix.hpp>
#include <Tpetra_Map.hpp>
#include <Tpetra_MultiVector.hpp>
#include <Tpetra_Operator.hpp>
#include <Tpetra_RowMatrixTransposer_decl.hpp>
#include <Tpetra_Vector.hpp>
#include <Tpetra_Version.hpp>

void test() {

    using TMAP = Tpetra::Map<int, int>;          ///< default Tpetra::Map type
    using TV = Tpetra::Vector<double, int, int>; ///< default Tpetra::Vector type

    int rank, nprocs;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &nprocs);
    const int localSize1 = 1;
    const int localSize2 = 2;
    const int globalSize1 = localSize1 * nprocs;
    const int globalSize2 = localSize2 * nprocs;

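    // Build the non-contiguous global index list: each rank owns localSize1 GIDs
    // from the first contiguous block and localSize2 GIDs from the second block,
    // which produces the interleaved [0,1,0,0,1,1] ownership pattern on two ranks.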
    std::vector<int> vecGlobalIndexOnLocal(localSize1 + localSize2);
    for (int i = 0; i < localSize1; i++) {
        vecGlobalIndexOnLocal[i] = rank * localSize1 + i;
    }
    for (int i = 0; i < localSize2; i++) {
        vecGlobalIndexOnLocal[i + localSize1] = rank * localSize2 + i + globalSize1;
    }
    auto commRcp = Teuchos::rcp(new Teuchos::MpiComm<int>(MPI_COMM_WORLD));
    auto map = Teuchos::rcp(new TMAP(globalSize1 + globalSize2, vecGlobalIndexOnLocal.data(),
                                     localSize1 + localSize2, 0, commRcp));

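    // Create the vector on the non-contiguous map and set every locally owned
    // entry to the owning rank, so the file output reveals the write ordering.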
    auto vec = Teuchos::rcp(new TV(map, true));
    for (int i = 0; i < localSize1 + localSize2; i++) {
        vec->replaceLocalValue(i, rank);
    }

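    // Write the map and the vector to MatrixMarket files
    // (writeMapFile and writeDenseFile are static member functions).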
    Tpetra::MatrixMarket::Writer<TV> writer;
    writer.writeMapFile("map.mtx", *map);
    writer.writeDenseFile("vec.mtx", vec);
}

int main(int argc, char **argv) {
    MPI_Init(&argc, &argv);

    test();

    MPI_Finalize();
    return 0;
}