This message finally turned up in the spam filters for the libmesh-devel mailing list, but I believe it has already been discussed.
-- John

On Mon, Jan 9, 2017 at 2:57 PM, Nestola Maria Giuseppina Chiara <maria.giuseppina.chiara.nest...@usi.ch> wrote:

> Dear all,
>
> we are trying to build parallel meshes. What we need is for each
> processor to hold just its own piece of the mesh.
>
> We are coupling an FEM code with an FD code, so we also need to
> guarantee the same partitioning in both.
>
> We have already implemented a routine to do this, but we have some
> questions.
>
> 1) If our element setup already includes the elements across processor
> boundaries, do we still need to call gather_neighboring_elements()?
> I added the following line to our code:
>
>   MeshCommunication().gather_neighboring_elements(cast_ref<DistributedMesh &>(mesh));
>
> but it seems to take forever to execute in dbg mode (even with very few
> grid points, i.e. ~100 [8 procs]).
>
> 2) The mesh looks OK if I output it, even without the above line added,
> but I trip different asserts in dbg mode depending on the type of
> element I use (I have both implemented).
>
> If I use Tri3 I trip:
>
>   Assertion `oldvar == static_cast<Told>(static_cast<Tnew>(oldvar))' failed.
>   oldvar = 18446744073709551615
>   static_cast<Told>(static_cast<Tnew>(oldvar)) = 4294967295
>
> If I use Quad4 I trip:
>
>   Assertion `i < _points.size()' failed.
>   i = 0
>   _points.size() = 0
>
> The messages are identical on each process.
>
> Best,
> Barna & Maria
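For anyone finding this in the archives: below is a minimal sketch of hand-building a DistributedMesh along these lines. It is not Barna & Maria's code; the one-Quad4-per-rank strip layout, the id scheme, and the "lowest touching rank owns shared nodes" convention are all illustrative assumptions. The point it tries to make is that every node and element needs a globally consistent id and a valid processor_id before prepare_for_use().

  #include "libmesh/libmesh.h"
  #include "libmesh/distributed_mesh.h"
  #include "libmesh/mesh_communication.h"
  #include "libmesh/face_quad4.h"
  #include "libmesh/node.h"
  #include "libmesh/point.h"

  using namespace libMesh;

  int main (int argc, char ** argv)
  {
    LibMeshInit init (argc, argv);

    // 2D mesh; each rank adds only the one Quad4 it owns.
    DistributedMesh mesh (init.comm(), 2);

    const processor_id_type pid = mesh.processor_id();

    // Shared boundary nodes exist as local copies on both touching
    // ranks; by convention here the lowest touching rank owns them.
    // Ids left at their default (invalid_id, i.e. (dof_id_type)(-1))
    // are one way to trip the cast assert quoted above.
    const processor_id_type left_owner = (pid == 0) ? 0 : pid - 1;

    Node * n0 = mesh.add_point (Point(pid,   0.), 2*pid,   left_owner);
    Node * n1 = mesh.add_point (Point(pid+1, 0.), 2*pid+2, pid);
    Node * n2 = mesh.add_point (Point(pid+1, 1.), 2*pid+3, pid);
    Node * n3 = mesh.add_point (Point(pid,   1.), 2*pid+1, left_owner);

    Elem * elem = mesh.add_elem (new Quad4);
    elem->set_id (pid);
    elem->processor_id() = pid;
    elem->set_node(0) = n0;
    elem->set_node(1) = n1;
    elem->set_node(2) = n2;
    elem->set_node(3) = n3;

    // Keep the hand-made ids and partitioning.
    mesh.allow_renumbering (false);
    mesh.skip_partitioning (true);

    // Pull ghost copies of off-processor neighbors onto each rank so
    // that find_neighbors() (run inside prepare_for_use()) can stitch
    // the processor boundaries together.
    MeshCommunication().gather_neighboring_elements (mesh);

    mesh.prepare_for_use ();

    return 0;
  }

At this size gather_neighboring_elements() should only have to ship one ghost element per processor boundary, so if it still appears to hang in dbg mode, I would suspect inconsistent ids or node ownership rather than the call itself.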
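On the Tri3 assert: 18446744073709551615 is 2^64-1, i.e. a 64-bit -1, and 4294967295 is 2^32-1, so the failure looks like a never-assigned (invalid) id being narrowed to 32 bits. Here is a paraphrase of the round-trip check the assert enforces, with the exact values from the report; this is a standalone sketch, not libMesh's actual code:

  #include <cassert>
  #include <cstdint>

  int main ()
  {
    // A 64-bit "invalid id" sentinel, as printed in the assert:
    std::uint64_t oldvar = static_cast<std::uint64_t>(-1);     // 18446744073709551615

    // Narrowing it to 32 bits loses information...
    std::uint32_t newvar = static_cast<std::uint32_t>(oldvar); // 4294967295

    // ...so the lossless round-trip check fails, matching the report.
    assert (oldvar == static_cast<std::uint64_t>(newvar));
    return 0;
  }

No guess on the Quad4 assert without a backtrace.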