> On Aug 20, 2020, at 11:21 AM, Stefano Zampini <stefano.zamp...@gmail.com> wrote:
> 
> 
> 
>> On Aug 20, 2020, at 5:59 PM, Manav Bhatia <bhatiama...@gmail.com> wrote:
>> 
>> 
>> 
>>> On Aug 20, 2020, at 8:31 AM, Stefano Zampini <stefano.zamp...@gmail.com> wrote:
>>> 
>>> ((Mat_SeqAIJ*)aij->B->data)->nonew
>>> mat->was_assembled
>>> aij->donotstash
>>> mat->nooffprocentries
>>> 
>> 
>> The values for the last three variables are all False on all 8 processes. 
> 
> Thanks, it seems to be a bug either in MPI or on our side. As Matt said, can 
> you run with -matstash_legacy?

I did that, and with -matstash_legacy a smaller case that had previously been 
running fine got stuck in the MatAssemblyEnd_MPIAIJ routine. Not sure what this 
implies. 
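
For reference, these runs use 8 MPI processes, invoked roughly as below (the 
executable name is just a placeholder for the actual example binary):

  mpiexec -n 8 ./example_6 -matstash_legacy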

> Also, make sure to run with a debug version of PETSc (configure using 
> --with-debugging=1).

Trying that now. 
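
(The reconfigure would be roughly the line below; the remaining configure 
options are whatever the existing build used and are omitted here.)

  ./configure --with-debugging=1 <existing options unchanged>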

> How feasible is it to write a driver code that runs with the same mesh, same 
> discretization, and same equations, but does the matrix assembly only? Does it 
> hang in that case too?

My code is on GitHub (https://github.com/MASTmultiphysics/MAST3), including the 
specific example that is producing this error 
(https://github.com/MASTmultiphysics/MAST3/tree/master/examples/structural/example_6),
 but it has multiple dependencies, including libMesh, PETSc, SLEPc, and Eigen. 

The mesh generation and sparsity pattern creation are currently done by libMesh. 

If needed, I may be able to store all the necessary information (mesh, nnz, 
onz) in a text file, remove the dependency on libMesh, and try to initialize 
the matrix directly from that file. Would that help? 
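
To make that concrete, the driver I have in mind would look roughly like the 
untested sketch below. The sizes, fill counts, and the hard-coded insertion 
pattern are placeholders for the actual example_6 data, which the real driver 
would read from the stored file instead.

/* Untested sketch of an assembly-only reproducer; placeholder sizes and
   insertion pattern instead of the real example_6 data. Run in parallel,
   e.g. with 8 MPI processes. */
#include <petscmat.h>

int main(int argc, char **argv)
{
  Mat            A;
  PetscInt       n_local = 1000;                 /* local rows; placeholder */
  PetscInt       *dnnz, *onnz, i, rstart, rend;
  PetscMPIInt    rank, size;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;
  ierr = MPI_Comm_rank(PETSC_COMM_WORLD, &rank); CHKERRQ(ierr);
  ierr = MPI_Comm_size(PETSC_COMM_WORLD, &size); CHKERRQ(ierr);

  /* In the real driver the nnz/onz arrays would be read from the file written
     out by the libMesh-based code; here they are just constants. */
  ierr = PetscMalloc2(n_local, &dnnz, n_local, &onnz); CHKERRQ(ierr);
  for (i = 0; i < n_local; i++) { dnnz[i] = 30; onnz[i] = 10; }

  ierr = MatCreate(PETSC_COMM_WORLD, &A); CHKERRQ(ierr);
  ierr = MatSetSizes(A, n_local, n_local, PETSC_DETERMINE, PETSC_DETERMINE); CHKERRQ(ierr);
  ierr = MatSetType(A, MATMPIAIJ); CHKERRQ(ierr);
  ierr = MatMPIAIJSetPreallocation(A, 0, dnnz, 0, onnz); CHKERRQ(ierr);

  /* Insert one entry per locally owned row, plus one entry in a row owned by
     the next rank so that the off-process stash path is exercised; the real
     driver would replay the (row, col, value) triples from the application. */
  ierr = MatGetOwnershipRange(A, &rstart, &rend); CHKERRQ(ierr);
  for (i = rstart; i < rend; i++) {
    PetscScalar v = 1.0;
    ierr = MatSetValues(A, 1, &i, 1, &i, &v, ADD_VALUES); CHKERRQ(ierr);
  }
  if (size > 1) {
    PetscInt    offrow = rend % (n_local * size); /* first row of next rank */
    PetscScalar v      = 1.0;
    ierr = MatSetValues(A, 1, &offrow, 1, &offrow, &v, ADD_VALUES); CHKERRQ(ierr);
  }

  ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);
  ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);

  ierr = MatDestroy(&A); CHKERRQ(ierr);
  ierr = PetscFree2(dnnz, onnz); CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}

If something along these lines reproduces the hang, it would be a much smaller 
test case to debug than the full application.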

> 
>> 
>> Regards,
>> Manav
