Sorry, I forgot to include the link to MUST. Here it is:
https://doc.itc.rwth-aachen.de/display/CCP/Project+MUST

On Friday, 17 November 2017 14:12:18 UTC+1, Lucas Campos wrote:
>
> Dear all,
>
> First of all, a bit of context:
> I am trying to debug an error in my application in which I randomly 
> start seeing NaNs. The probability of this increases with the number of 
> MPI processes I use, so it looks like a data race of some sort. Any 
> advice on the best way to find the error?
>
> My current approach is to use project MUST[1] to help me find the 
> issues. When I ran MUST with the debug version of my code on the local 
> cluster, it returned errors related to the MPI internals of 
> dealii/petsc(/MUMPS?). An example of the output can be seen in 
> errors.txt. The fact that the output stops at "Solving... " suggests 
> that the error lies between the following lines of my code:
>
>> PetscPrintf(mpi_communicator, "Solving... \n");
>> computing_timer.enter_section("solve");
>>
>> SolverControl cn;
>> PETScWrappers::SparseDirectMUMPS solver(cn, mpi_communicator);
>> solver.set_symmetric_mode(false);
>> solver.solve(system_matrix, distributed_dU, system_rhs);
>>
>> computing_timer.exit_section("solve");
>> PetscPrintf(mpi_communicator, "Solved! \n");
>
> Indeed, when I comment out the "solver.solve(system_matrix, 
> distributed_dU, system_rhs);" line, it runs with no errors at all.
>
> Could this be the source of my issues? Also, how can I solve this specific 
> issue?
>
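
One generic way to localize the very first NaN, independent of MUST, is to make the program trap on invalid floating-point operations. Below is a minimal sketch, assuming a glibc-based Linux system where the non-standard feenableexcept() is available; with it enabled, every rank aborts with SIGFPE at the statement that produces the first NaN, which a debugger or core dump can then point at directly.

  #include <fenv.h>   // feenableexcept() is a GNU/glibc extension, not portable
  #include <mpi.h>

  int main(int argc, char* argv[])
  {
    MPI_Init(&argc, &argv);

    // Turn invalid operations (0/0, inf-inf, sqrt(-1), ...) and divisions
    // by zero into SIGFPE, so the run stops at the statement that creates
    // the first NaN instead of failing much later with a mysterious result.
    feenableexcept(FE_INVALID | FE_DIVBYZERO);

    // ... set up and run the usual deal.II/PETSc program here ...

    MPI_Finalize();
    return 0;
  }

Running such a build under mpirun with core dumps enabled (or under a parallel debugger) shows the exact line on the rank that first produced a NaN.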
