Hi all,

In my quest to understand how to use Trilinos with MPI and Metis (with deal.II version 7.1.0), I am trying to convert step-17 from using PETSc to Trilinos. What I've done seems to work when running on a single process, but if I start, say, 2 processes, I hit a snag when creating a sparsity pattern:

    GridTools::partition_triangulation (n_mpi_processes, triangulation);
    dof_handler.distribute_dofs (fe);
    DoFRenumbering::subdomain_wise (dof_handler);

    const IndexSet local_dofs
      = DoFTools::dof_indices_with_subdomain_association (dof_handler,
                                                          this_mpi_process);

    TrilinosWrappers::SparsityPattern sp (local_dofs,
                                          mpi_communicator);

    DoFTools::make_sparsity_pattern (dof_handler,
                                     sp,
                                     hanging_node_constraints, true,
                                     this_mpi_process);

Additional information: the number of DoFs at this point is 162, and the local_dofs index sets on the two processes contain 92 and 94 elements, respectively.
The last line results in the error:

source/dofs/dof_tools.cc in function
void dealii::DoFTools::make_sparsity_pattern(const DH&, SparsityPattern&, const dealii::ConstraintMatrix&, bool, dealii::types::subdomain_id_t) [with DH = dealii::DoFHandler<2, 2>, SparsityPattern = dealii::TrilinosWrappers::SparsityPattern]
The violated condition was:
    sparsity.n_rows() == n_dofs
The name and call sequence of the exception was:
    ExcDimensionMismatch (sparsity.n_rows(), n_dofs)
Additional Information:
Dimension 186 not equal to 162
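
I notice that 92 + 94 = 186, exactly the reported dimension, so presumably the constructor sums the sizes of the per-process index sets, and those sets overlap on the DoFs sitting on the subdomain interface, double-counting them. A minimal plain-C++ sketch (no deal.II; the index values are made up) of that double-counting:

    #include <cassert>
    #include <cstddef>
    #include <set>

    int main()
    {
      // Hypothetical per-process index sets that share the
      // "interface" indices 4 and 5, mimicking what
      // dof_indices_with_subdomain_association() can return.
      std::set<int> proc0 = {0, 1, 2, 3, 4, 5};
      std::set<int> proc1 = {4, 5, 6, 7, 8, 9};

      // Summing the local sizes counts the shared indices twice...
      const std::size_t summed = proc0.size() + proc1.size();

      // ...while the number of distinct global indices is smaller.
      std::set<int> global(proc0);
      global.insert(proc1.begin(), proc1.end());

      assert(summed == 12);
      assert(global.size() == 10);
      assert(summed > global.size()); // overlap => inflated row count
      return 0;
    }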

Can someone explain why the sparsity pattern constructor creates a sparsity pattern with more rows than there are DoFs, and suggest what I should do to rectify the problem?

Thanks for the help!
Jean-Paul

_______________________________________________
dealii mailing list http://poisson.dealii.org/mailman/listinfo/dealii