On 06.05.2012 14:27, Matthew Knepley wrote:
> On Sun, May 6, 2012 at 7:28 AM, Alexander Grayver 
> <agrayver at gfz-potsdam.de> wrote:
>
>     Hello,
>
>     I use KSP and random rhs to compute largest singular value:
>
>
> 1) Is this the whole program? If not, this can be caused by memory 
> corruption somewhere else. This is what I suspect.

Matt,

I can reproduce the error using the attached test program and this matrix (7 MB):
http://dl.dropbox.com/u/60982984/A.dat
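The attachment is C, but in the Fortran style of the snippet below the
matrix is loaded along these lines (a sketch, not the attached code
verbatim; 'viewer' is a placeholder name):

      ! read the binary matrix that was written earlier with MatView
      call PetscViewerBinaryOpen(PETSC_COMM_WORLD,'A.dat',FILE_MODE_READ,viewer,ierr);CHKERRQ(ierr)
      call MatCreate(PETSC_COMM_WORLD,A,ierr);CHKERRQ(ierr)
      call MatLoad(A,viewer,ierr);CHKERRQ(ierr)
      call PetscViewerDestroy(viewer,ierr);CHKERRQ(ierr)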

>
> 2) You can put CHKMEMQ; statements throughout the code to find exactly 
> where the memory corruption happens.
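From Fortran the same instrumentation looks like this (a sketch; it assumes
a debug build or -malloc_debug, since CHKMEMQ relies on PETSc's malloc
tracing):

      ! validate the traced heap between suspect calls; CHKMEMQ aborts
      ! with a traceback at the first corrupted block it finds
      CHKMEMQ
      call KSPSolve(ksp,b,x,ierr);CHKERRQ(ierr)
      CHKMEMQ
      call KSPComputeExtremeSingularValues(ksp,smax,smin,ierr);CHKERRQ(ierr)
      CHKMEMQ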
>
>    Matt
>
>      ! create solver and set options for singular value estimation
>      call KSPCreate(MPI_COMM_WORLD,ksp,ierr);CHKERRQ(ierr)
>      call KSPSetType(ksp,KSPGMRES,ierr);CHKERRQ(ierr)
>      call KSPSetTolerances(ksp,solvertol,PETSC_DEFAULT_DOUBLE_PRECISION,PETSC_DEFAULT_DOUBLE_PRECISION,its,ierr);CHKERRQ(ierr)
>      call KSPGMRESSetRestart(ksp, its, ierr);CHKERRQ(ierr)
>      call KSPSetComputeSingularValues(ksp, flg, ierr);CHKERRQ(ierr)
>      call KSPSetFromOptions(ksp,ierr);CHKERRQ(ierr)
>
>      ! generate random RHS, then free the random context
>      call PetscRandomCreate(PETSC_COMM_WORLD,rctx,ierr);CHKERRQ(ierr)
>      call PetscRandomSetFromOptions(rctx,ierr);CHKERRQ(ierr)
>      call VecSetRandom(b,rctx,ierr);CHKERRQ(ierr)
>      call PetscRandomDestroy(rctx,ierr);CHKERRQ(ierr)
>
>      ! no preconditioning
>      call KSPGetPC(ksp,pc,ierr);CHKERRQ(ierr)
>      call PCSetType(pc,PCNONE,ierr);CHKERRQ(ierr)
>      call KSPSetOperators(ksp,A,A,SAME_PRECONDITIONER,ierr);CHKERRQ(ierr)
>      ! solve system and extract extreme singular value estimates
>      call KSPSolve(ksp,b,x,ierr);CHKERRQ(ierr)
>      call KSPComputeExtremeSingularValues(ksp,smax,smin,ierr);CHKERRQ(ierr)
>
>      call KSPDestroy(ksp,ierr);CHKERRQ(ierr)
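For completeness, the declarations the snippet above assumes are roughly the
following (the include paths match this petsc-dev vintage, so treat them as
an approximation):

      #include <finclude/petscsys.h>
      #include <finclude/petscvec.h>
      #include <finclude/petscmat.h>
      #include <finclude/petscksp.h>
      #include <finclude/petscpc.h>

      KSP            ksp
      PC             pc
      Mat            A
      Vec            b, x
      PetscRandom    rctx
      PetscBool      flg
      PetscInt       its
      PetscReal      solvertol, smax, smin
      PetscErrorCode ierr

with flg = PETSC_TRUE so that KSPComputeExtremeSingularValues has the
Hessenberg data it needs, and its set to the restart length.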
>
>     However it crashes:
>
>     [1]PETSC ERROR:
>     ------------------------------------------------------------------------
>     [1]PETSC ERROR: Caught signal number 11 SEGV: Segmentation
>     Violation, probably memory access out of range
>     [1]PETSC ERROR: Try option -start_in_debugger or
>     -on_error_attach_debugger
>     [1]PETSC ERROR: or see
>     http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
>     [1]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple
>     Mac OS X to find memory corruption errors
>     [1]PETSC ERROR: PetscMallocValidate: error detected at
>      PetscDefaultSignalHandler() line 157 in
>     /home/lib/petsc-dev1/src/sys/error/signal.c
>     [1]PETSC ERROR: Memory at address 0x4aa3f00 is corrupted
>     [1]PETSC ERROR: Probably write past beginning or end of array
>     [1]PETSC ERROR: Last intact block allocated in KSPSetUp_GMRES()
>     line 73 in /home/lib/petsc-dev1/src/ksp/ksp/impls/gmres/gmres.c
>     [1]PETSC ERROR: --------------------- Error Message
>     ------------------------------------
>     [1]PETSC ERROR: Memory corruption!
>     [1]PETSC ERROR:  !
>     [1]PETSC ERROR:
>     ------------------------------------------------------------------------
>     [1]PETSC ERROR: Petsc Development HG revision:
>     f3c119f7ddbfee243b51907a90acab15127ccb39  HG Date: Sun Apr 29
>     21:37:29 2012 -0500
>     [1]PETSC ERROR: See docs/changes/index.html for recent updates.
>     [1]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
>     [1]PETSC ERROR: See docs/index.html for manual pages.
>     [1]PETSC ERROR:
>     ------------------------------------------------------------------------
>     [1]PETSC ERROR: /home/prog on a openmpi-i named node207 by user
>     Sun May  6 12:58:24 2012
>     [1]PETSC ERROR: Libraries linked from
>     /home/lib/petsc-dev1/openmpi-intel-complex-debug-f/lib
>     [1]PETSC ERROR: Configure run at Mon Apr 30 10:20:49 2012
>     [1]PETSC ERROR: Configure options
>     --with-blacs-include=/opt/intel/Compiler/11.1/072/mkl/include
>     
> --with-blacs-lib=/opt/intel/Compiler/11.1/072/mkl/lib/em64t/libmkl_blacs_openmpi_lp64.a
>     
> --with-blas-lapack-lib="[/opt/intel/Compiler/11.1/072/mkl/lib/em64t/libmkl_intel_lp64.a,/opt/intel/Compiler/11.1/072/mkl/lib/em64t/libmkl_intel_thread.a,/opt/intel/Compiler/11.1/072/mkl/lib/em64t/libmkl_core.a,/opt/intel/Compiler/11.1/072/lib/intel64/libiomp5.a]"
>     --with-fortran-interfaces=1
>     --with-mpi-dir=/opt/mpi/intel/openmpi-1.4.2
>     --with-petsc-arch=openmpi-intel-complex-debug-f
>     --with-precision=double
>     --with-scalapack-include=/opt/intel/Compiler/11.1/072/mkl/include
>     
> --with-scalapack-lib=/opt/intel/Compiler/11.1/072/mkl/lib/em64t/libmkl_scalapack_lp64.a
>     --with-scalar-type=complex --with-x=0
>     PETSC_ARCH=openmpi-intel-complex-debug-f
>     [1]PETSC ERROR:
>     ------------------------------------------------------------------------
>     [1]PETSC ERROR: PetscMallocValidate() line 138 in
>     /home/lib/petsc-dev1/src/sys/memory/mtr.c
>     [1]PETSC ERROR: PetscDefaultSignalHandler() line 157 in
>     /home/lib/petsc-dev1/src/sys/error/signal.c
>
>
>     Call stack from debugger:
>
>     opal_memory_ptmalloc2_int_free, FP=7fffd4765300
>     opal_memory_ptmalloc2_free_hook, FP=7fffd4765330
>     PetscFreeAlign,      FP=7fffd4765370
>     PetscTrFreeDefault,  FP=7fffd4765520
>     KSPReset_GMRES,      FP=7fffd4765740
>     KSPReset,            FP=7fffd4765840
>     KSPDestroy,          FP=7fffd47659a0
>     kspdestroy_,         FP=7fffd47659d0
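The stack shows the SEGV fires while KSPReset_GMRES() frees the GMRES work
arrays, which is consistent with the "Last intact block allocated in
KSPSetUp_GMRES()" line above: something appears to write past the end of a
block allocated during KSPSetUp_GMRES(), and the corruption is only detected
at destroy time. Running the reproducer under valgrind, as the error message
suggests, should point at the original out-of-bounds write, e.g. (per the
PETSc FAQ; exact flags may vary):

      mpiexec -n 2 valgrind --tool=memcheck -q --num-callers=20 ./test -malloc off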
>
>
>     Any ideas?
>
>     Thanks.
>
>     -- 
>     Regards,
>     Alexander
>
>
>
>
> -- 
> What most experimenters take for granted before they begin their 
> experiments is infinitely more interesting than any results to which 
> their experiments lead.
> -- Norbert Wiener


-- 
Regards,
Alexander

-------------- next part --------------
An embedded and charset-unspecified text was scrubbed...
Name: test.c
URL: 
<http://lists.mcs.anl.gov/pipermail/petsc-dev/attachments/20120506/b76d0547/attachment.c>
