Hi PETSc-developers,

In our application we call KSPSolve as part of a step to propagate a beam
through a lattice. I am observing a crash within KSPSolve only after the 43rd
call, and only when both the application and PETSc are built in debug mode;
full logs are attached to this email (1 MPI rank and 4 OpenMP threads were
used, but the crash also occurs with multiple MPI ranks). I am also including
the last few lines of the configuration for this build. The crash does not
occur when the application and PETSc are built in release mode.

Could someone tell me what causes this crash and whether anything can be done
to prevent it? Thanks in advance.

The configuration of this solver is here:
https://github.com/fnalacceleratormodeling/synergia2/blob/sajid/features/openpmd_basic_integration/src/synergia/collective/space_charge_3d_fd_utils.cc#L273-L292

Thank You,
Sajid Ali (he/him) | Research Associate
Scientific Computing Division
Fermi National Accelerator Laboratory
s-sajid-ali.github.io


Attachment: ksp_crash_log