The reference implementation of LAPACK deliberately performs a divide by zero 
during its setup, to check whether it can do so safely, and that division is 
what is trapping for you.

  Hence the PETSc code has

  ierr = PetscFPTrapPush(PETSC_FP_TRAP_OFF);CHKERRQ(ierr);
#if !defined(PETSC_USE_COMPLEX)
  PetscStackCallBLAS("LAPACKgesvd",LAPACKgesvd_("N","N",&bn,&bn,R,&bN,realpart,&sdummy,&idummy,&sdummy,&idummy,work,&lwork,&lierr));
#else
  PetscStackCallBLAS("LAPACKgesvd",LAPACKgesvd_("N","N",&bn,&bn,R,&bN,realpart,&sdummy,&idummy,&sdummy,&idummy,work,&lwork,realpart+N,&lierr));
#endif
  if (lierr) SETERRQ1(PETSC_COMM_SELF,PETSC_ERR_LIB,"Error in SVD Lapack routine %d",(int)lierr);
  ierr = PetscFPTrapPop();CHKERRQ(ierr);

which is supposed to turn off the trapping around the LAPACK call. The code 
that turns off the trapping is OS dependent, so perhaps it does not work on 
your system.

The current release has somewhat better code for this than 3.11, so I 
recommend you upgrade first.

What system are you running on?

Barry




> On Oct 22, 2020, at 2:12 PM, baikadi pranay <[email protected]> wrote:
> 
> Hello,
> 
> I am trying to find the condition number of the A matrix for a linear system 
> I am solving. I have used the following commands.
> ./a.out -ksp_monitor_singular_value -ksp_type gmres -ksp_gmres_restart 1000 
> -pc_type none
> However, the execution comes to a halt after a few iterations with the 
> following error.
> [0]PETSC ERROR: 
> ------------------------------------------------------------------------
> [0]PETSC ERROR: Caught signal number 8 FPE: Floating Point Exception,probably 
> divide by zero
> [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
> [0]PETSC ERROR: or see 
> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
> [0]PETSC ERROR: or try http://valgrind.org on 
> GNU/linux and Apple Mac OS X to find memory corruption errors
> [0]PETSC ERROR: likely location of problem given in stack below
> [0]PETSC ERROR: ---------------------  Stack Frames 
> ------------------------------------
> [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
> [0]PETSC ERROR:       INSTEAD the line number of the start of the function
> [0]PETSC ERROR:       is given.
> [0]PETSC ERROR: [0] LAPACKgesvd line 40 
> /packages/7x/petsc/3.11.1/petsc-3.11.1/src/ksp/ksp/impls/gmres/gmreig.c
> [0]PETSC ERROR: [0] KSPComputeExtremeSingularValues_GMRES line 22 
> /packages/7x/petsc/3.11.1/petsc-3.11.1/src/ksp/ksp/impls/gmres/gmreig.c
> [0]PETSC ERROR: [0] KSPComputeExtremeSingularValues line 59 
> /packages/7x/petsc/3.11.1/petsc-3.11.1/src/ksp/ksp/interface/itfunc.c
> [0]PETSC ERROR: [0] KSPMonitorSingularValue line 130 
> /packages/7x/petsc/3.11.1/petsc-3.11.1/src/ksp/ksp/interface/iterativ.c
> [0]PETSC ERROR: [0] KSPMonitor line 1765 
> /packages/7x/petsc/3.11.1/petsc-3.11.1/src/ksp/ksp/interface/itfunc.c
> [0]PETSC ERROR: [0] KSPGMRESCycle line 122 
> /packages/7x/petsc/3.11.1/petsc-3.11.1/src/ksp/ksp/impls/gmres/gmres.c
> [0]PETSC ERROR: [0] KSPSolve_GMRES line 225 
> /packages/7x/petsc/3.11.1/petsc-3.11.1/src/ksp/ksp/impls/gmres/gmres.c
> [0]PETSC ERROR: [0] KSPSolve line 678 
> /packages/7x/petsc/3.11.1/petsc-3.11.1/src/ksp/ksp/interface/itfunc.c
> [0]PETSC ERROR: --------------------- Error Message 
> --------------------------------------------------------------
> [0]PETSC ERROR: Signal received
> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html 
> for trouble shooting.
> [0]PETSC ERROR: Petsc Release Version 3.11.1, Apr, 12, 2019 
> [0]PETSC ERROR: ./a.out on a linux-gnu-c-debug named cg17-9.agave.rc.asu.edu 
> by pbaikadi Thu Oct 22 12:07:11 2020
> [0]PETSC ERROR: Configure options 
> [0]PETSC ERROR: #1 User provided function() line 0 in  unknown file
> --------------------------------------------------------------------------
> MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
> with errorcode 59.
> 
> NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
> You may or may not see output from other processes, depending on
> exactly when Open MPI kills them.
> --------------------------------------------------------------------------
> Is the error because the A matrix is singular (causing the max/min to be 
> undefined)? Please let me know.
> 
> Thank you,
> Sincerely,
> Pranay.
