I attached the complete output.
I used 2 processors for this run. I wanted to use only one to get a cleaner output, but I could not because of this error:

[0]PCSetData_AGG bs=1 MM=32768
[0]PETSC ERROR: --------------------- Error Message ------------------------------------
[0]PETSC ERROR: Arguments are incompatible!
[0]PETSC ERROR: MatMatMult requires A, mpiaij, to be compatible with B, seqaij!
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012
[0]PETSC ERROR: See docs/changes/index.html for recent updates.
[0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
[0]PETSC ERROR: See docs/index.html for manual pages.
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: ./hit on a named nid15343 by Unknown Wed May 29 18:15:24 2013
[0]PETSC ERROR: Libraries linked from
[0]PETSC ERROR: Configure run at
[0]PETSC ERROR: Configure options
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: MatMatMult() line 8614 in src/mat/interface/matrix.c
[0]PETSC ERROR: PCGAMGOptprol_AGG() line 1358 in src/ksp/pc/impls/gamg/agg.c
[0]PETSC ERROR: PCSetUp_GAMG() line 673 in src/ksp/pc/impls/gamg/gamg.c
[0]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c
[0]PETSC ERROR: KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c
hit: gamg.c:568: PCSetUp_GAMG: Assertion `pc->setupcalled' failed.

On 05/29/2013 02:29 PM, Matthew Knepley wrote:
On Wed, May 29, 2013 at 5:25 PM, Michele Rosso <[email protected]> wrote:

    Hi,

    I proceeded as Matt suggested. I am running without the null space
    (Dirichlet BCs) with the following options:

    -pc_type gamg  -pc_mg_cycle_type v -pc_gamg_agg_nsmooths 1
    -mg_coarse_sub_pc_factor_shift_type NONZERO -ksp_view -options_left

    The results are fine and the run finishes. The output of -ksp_view
    does not say anything about the coarse solver being shifted.
    In fact the option -mg_coarse_sub_pc_factor_shift_type NONZERO is
    not used:


Please show us the COMPLETE output.

  Thanks,

     Matt

    #PETSc Option Table entries:
    -ksp_view
    -mg_coarse_sub_pc_factor_shift_type NONZERO
    -options_left
    -pc_gamg_agg_nsmooths 1
    -pc_mg_cycle_type v
    -pc_type gamg
    #End of PETSc Option Table entries
    There is one unused database option. It is:
    Option left: name:-mg_coarse_sub_pc_factor_shift_type value: NONZERO

    Michele


    On 05/24/2013 03:20 PM, Matthew Knepley wrote:
    On Fri, May 24, 2013 at 5:18 PM, Michele Rosso <[email protected]> wrote:

        Using

        -pc_type gamg -pc_mg_cycle_type v -pc_gamg_agg_nsmooths 1
        -mg_coarse_sub_pc_factor_shift_type NONZERO  -option_left
        -ksp_view


    This is what debugging is about. We are not running your problem.
    How could this be debugged?

    1) Run the problem w/o a null space so that it finishes

    2) Look at the output for -ksp_view

    3) Does the coarse solver say that it is shifted?

    4) Are there options which were unused?

       Matt
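
    A minimal sketch of what step 1) above could look like on the application side, assuming a C driver against petsc-3.3/3.4 and a constant (pure-Neumann) null space; the ksp handle and the use_nullspace flag are placeholder names, not taken from the actual code:

    #include <petscksp.h>

    /* Make the null-space attachment optional so the same driver can be run
     * once without the singular null space, as suggested in step 1) above.
     * petsc-3.3/3.4 API: the null space is attached to the KSP. */
    PetscErrorCode AttachConstantNullSpace(KSP ksp, PetscBool use_nullspace)
    {
      MatNullSpace   nullsp;
      PetscErrorCode ierr;

      PetscFunctionBeginUser;
      if (use_nullspace) {   /* pure Neumann BCs: singular system, constant null space */
        ierr = MatNullSpaceCreate(PETSC_COMM_WORLD,PETSC_TRUE,0,PETSC_NULL,&nullsp);CHKERRQ(ierr);
        ierr = KSPSetNullSpace(ksp,nullsp);CHKERRQ(ierr);
        ierr = MatNullSpaceDestroy(&nullsp);CHKERRQ(ierr);   /* KSP keeps its own reference */
      }                      /* Dirichlet BCs: skip it, the system is nonsingular */
      PetscFunctionReturn(0);
    }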

        still produces the same error:

        [0]PCSetData_AGG bs=1 MM=131072
        [0]PETSC ERROR: --------------------- Error Message ------------------------------------
        [0]PETSC ERROR: Detected zero pivot in LU factorization: see http://www.mcs.anl.gov/petsc/documentation/faq.html#ZeroPivot!
        [0]PETSC ERROR: Zero pivot row 280 value 6.56964e-17 tolerance 2.22045e-14!
        [0]PETSC ERROR: ------------------------------------------------------------------------
        [0]PETSC ERROR: Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012
        [0]PETSC ERROR: See docs/changes/index.html for recent updates.
        [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
        [0]PETSC ERROR: See docs/index.html for manual pages.
        [0]PETSC ERROR: ------------------------------------------------------------------------
        [0]PETSC ERROR: ./hit on a named nid15363 by Unknown Fri May 24 17:06:50 2013
        [0]PETSC ERROR: Libraries linked from
        [0]PETSC ERROR: Configure run at
        [0]PETSC ERROR: Configure options
        [0]PETSC ERROR: ------------------------------------------------------------------------
        [0]PETSC ERROR: MatPivotCheck_none() line 583 in src/mat/impls/aij/seq//ptmp/skelly/petsc/3.3.03/cray_interlagos_build/real/src/include/petsc-private/matimpl.h
        [0]PETSC ERROR: MatPivotCheck() line 602 in src/mat/impls/aij/seq//ptmp/skelly/petsc/3.3.03/cray_interlagos_build/real/src/include/petsc-private/matimpl.h
        [0]PETSC ERROR: MatLUFactorNumeric_SeqAIJ() line 585 in src/mat/impls/aij/seq/aijfact.c
        [0]PETSC ERROR: MatLUFactorNumeric() line 2803 in src/mat/interface/matrix.c
        [0]PETSC ERROR: PCSetUp_LU() line 160 in src/ksp/pc/impls/factor/lu/lu.c
        [0]PETSC ERROR: PCSetUp() line 832 in src/ksp/pc/interface/precon.c
        [0]PETSC ERROR: KSPSetUp() line 278 in src/ksp/ksp/interface/itfunc.c
        [0]PETSC ERROR: PCSetUpOnBlocks_BJacobi_Singleblock() line 715 in src/ksp/pc/impls/bjacobi/bjacobi.c
        [0]PETSC ERROR: PCSetUpOnBlocks() line 865 in src/ksp/pc/interface/precon.c
        [0]PETSC ERROR: KSPSetUpOnBlocks() line 154 in src/ksp/ksp/interface/itfunc.c
        [0]PETSC ERROR: KSPSolve() line 403 in src/ksp/ksp/interface/itfunc.c
        [0]PETSC ERROR: PCSetUpOnBlocks_BJacobi_Singleblock() line 715 in src/ksp/pc/impls/bjacobi/bjacobi.c
        [0]PETSC ERROR: PCSetUpOnBlocks() line 865 in src/ksp/pc/interface/precon.c
        [0]PETSC ERROR: KSPSetUpOnBlocks() line 154 in src/ksp/ksp/interface/itfunc.c
        [0]PETSC ERROR: KSPSolve() line 403 in src/ksp/ksp/interface/itfunc.c
        [0]PETSC ERROR: PCMGMCycle_Private() line 20 in src/ksp/pc/impls/mg/mg.c
        [0]PETSC ERROR: PCMGMCycle_Private() line 49 in src/ksp/pc/impls/mg/mg.c
        [0]PETSC ERROR: PCMGMCycle_Private() line 49 in src/ksp/pc/impls/mg/mg.c
        [0]PETSC ERROR: PCMGMCycle_Private() line 49 in src/ksp/pc/impls/mg/mg.c
        [0]PETSC ERROR: PCApply_MG() line 326 in src/ksp/pc/impls/mg/mg.c
        [0]PETSC ERROR: PCApply() line 384 in src/ksp/pc/interface/precon.c
        [0]PETSC ERROR: KSPSolve_CG() line 139 in src/ksp/ksp/impls/cg/cg.c
        [0]PETSC ERROR: KSPSolve() line 446 in src/ksp/ksp/interface/itfunc.c

        On 05/24/2013 02:51 PM, Jed Brown wrote:
        Michele Rosso<[email protected]>  <mailto:[email protected]>  writes:

             With petsc-3.4 (which you should upgrade to), use
             -mg_coarse_sub_pc_factor_shift_type NONZERO
        Actually, use this with petsc-3.3 also (and please upgrade to
        petsc-3.4).

        The option you were passing was not being used.
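
        One way to guarantee that the flag reaches the options database (equivalent to passing it on the command line) would be to insert it programmatically before KSPSetFromOptions(); whether it is then actually consumed is exactly what -options_left reports. A minimal sketch, assuming the two-argument PetscOptionsSetValue() of petsc-3.3/3.4 and a placeholder ksp handle:

        #include <petscksp.h>

        /* Put the shift option into the database, then let the usual
         * KSPSetFromOptions() call pick it up together with -pc_type gamg etc. */
        PetscErrorCode SetShiftOption(KSP ksp)
        {
          PetscErrorCode ierr;

          PetscFunctionBeginUser;
          ierr = PetscOptionsSetValue("-mg_coarse_sub_pc_factor_shift_type","NONZERO");CHKERRQ(ierr);
          ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);
          PetscFunctionReturn(0);
        }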









--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

[0]PCSetData_AGG bs=1 MM=16384
KSP Object: 2 MPI processes
  type: cg
  maximum iterations=10000
  tolerances:  relative=1e-14, absolute=1e-50, divergence=10000
  left preconditioning
  using nonzero initial guess
  using PRECONDITIONED norm type for convergence test
PC Object: 2 MPI processes
  type: gamg
    MG: type is MULTIPLICATIVE, levels=3 cycles=v
      Cycles per PCApply=1
      Using Galerkin computed coarse grid matrices
  Coarse grid solver -- level -------------------------------
    KSP Object:    (mg_coarse_)     2 MPI processes
      type: gmres
        GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
        GMRES: happy breakdown tolerance 1e-30
      maximum iterations=1, initial guess is zero
      tolerances:  relative=1e-05, absolute=1e-50, divergence=10000
      left preconditioning
      using NONE norm type for convergence test
    PC Object:    (mg_coarse_)     2 MPI processes
      type: bjacobi
        block Jacobi: number of blocks = 2
        Local solve info for each block is in the following KSP and PC objects:
      [0] number of local blocks = 1, first local block number = 0
        [0] local block number 0
        KSP Object:    (mg_coarse_sub_)     1 MPI processes
          type: preonly
          maximum iterations=10000, initial guess is zero
          tolerances:  relative=1e-05, absolute=1e-50, divergence=10000
          left preconditioning
          using NONE norm type for convergence test
        PC Object:    (mg_coarse_sub_)     1 MPI processes
          type: lu
            LU: out-of-place factorization
            tolerance for zero pivot 2.22045e-14
            matrix ordering: nd
            factor fill ratio given 5, needed 1.07862
              Factored matrix follows:
                Matrix Object:                 1 MPI processes
                  type: seqaij
                  rows=61, cols=61
                  package used to perform factorization: petsc
                  total: nonzeros=3677, allocated nonzeros=3677
                  total number of mallocs used during MatSetValues calls =0
                    not using I-node routines
          linear system matrix = precond matrix:
          Matrix Object:           1 MPI processes
            type: seqaij
            rows=61, cols=61
            total: nonzeros=3409, allocated nonzeros=3409
            total number of mallocs used during MatSetValues calls =0
              not using I-node routines
        - - - - - - - - - - - - - - - - - -
      [1] number of local blocks = 1, first local block number = 1
        [1] local block number 0
        KSP Object:    (mg_coarse_sub_)     1 MPI processes
          type: preonly
          maximum iterations=10000, initial guess is zero
          tolerances:  relative=1e-05, absolute=1e-50, divergence=10000
          left preconditioning
          using NONE norm type for convergence test
        PC Object:    (mg_coarse_sub_)     1 MPI processes
          type: lu
            LU: out-of-place factorization
            tolerance for zero pivot 2.22045e-14
            matrix ordering: nd
            factor fill ratio given 5, needed 0
              Factored matrix follows:
                Matrix Object:                 1 MPI processes
                  type: seqaij
                  rows=0, cols=0
                  package used to perform factorization: petsc
                  total: nonzeros=1, allocated nonzeros=1
                  total number of mallocs used during MatSetValues calls =0
                    not using I-node routines
          linear system matrix = precond matrix:
          Matrix Object:           1 MPI processes
            type: seqaij
            rows=0, cols=0
            total: nonzeros=0, allocated nonzeros=0
            total number of mallocs used during MatSetValues calls =0
              not using I-node routines
        - - - - - - - - - - - - - - - - - -
      linear system matrix = precond matrix:
      Matrix Object:       2 MPI processes
        type: mpiaij
        rows=61, cols=61
        total: nonzeros=3409, allocated nonzeros=3409
        total number of mallocs used during MatSetValues calls =0
          not using I-node (on process 0) routines
  Down solver (pre-smoother) on level 1 -------------------------------
    KSP Object:    (mg_levels_1_)     2 MPI processes
      type: chebyshev
        Chebyshev: eigenvalue estimates:  min = 0.0266892, max = 1.42507
      maximum iterations=2
      tolerances:  relative=1e-05, absolute=1e-50, divergence=10000
      left preconditioning
      using nonzero initial guess
      using NONE norm type for convergence test
    PC Object:    (mg_levels_1_)     2 MPI processes
      type: jacobi
      linear system matrix = precond matrix:
      Matrix Object:       2 MPI processes
        type: mpiaij
        rows=3102, cols=3102
        total: nonzeros=108330, allocated nonzeros=108330
        total number of mallocs used during MatSetValues calls =0
          not using I-node (on process 0) routines
  Up solver (post-smoother) same as down solver (pre-smoother)
  Down solver (pre-smoother) on level 2 -------------------------------
    KSP Object:    (mg_levels_2_)     2 MPI processes
      type: chebyshev
        Chebyshev: eigenvalue estimates:  min = 0.182939, max = 2.0291
      maximum iterations=2
      tolerances:  relative=1e-05, absolute=1e-50, divergence=10000
      left preconditioning
      using nonzero initial guess
      using NONE norm type for convergence test
    PC Object:    (mg_levels_2_)     2 MPI processes
      type: jacobi
      linear system matrix = precond matrix:
      Matrix Object:       2 MPI processes
        type: mpiaij
        rows=32768, cols=32768
        total: nonzeros=229376, allocated nonzeros=229376
        total number of mallocs used during MatSetValues calls =0
  Up solver (post-smoother) same as down solver (pre-smoother)
  linear system matrix = precond matrix:
  Matrix Object:   2 MPI processes
    type: mpiaij
    rows=32768, cols=32768
    total: nonzeros=229376, allocated nonzeros=229376
    total number of mallocs used during MatSetValues calls =0
#PETSc Option Table entries:
-ksp_view
-mg_coarse_sub_pc_factor_shift_type NONZERO
-options_left
-pc_gamg_agg_nsmooths 1
-pc_mg_cycle_type v
-pc_type gamg
#End of PETSc Option Table entries
There is one unused database option. It is:
Option left: name:-mg_coarse_sub_pc_factor_shift_type value: NONZERO
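
Since the prefixed option is reported as unused here, one possible workaround (a sketch, not something proposed in this thread) would be to set the shift directly on the LU sub-PC of the GAMG coarse level through the API, after the solver has been set up. It assumes the petsc-3.3/3.4 C interface, a placeholder ksp handle, and the mg_coarse_ bjacobi/LU structure shown in the -ksp_view output above:

#include <petscksp.h>

/* Walk down from the outer KSP to the LU factorization used inside the
 * block-Jacobi coarse solver of GAMG and request a NONZERO pivot shift.
 * Requires KSPSetUp(ksp) to have been called, so the coarse hierarchy and
 * the bjacobi sub-KSPs already exist. */
PetscErrorCode ShiftCoarseLU(KSP ksp)
{
  PC             pc,coarsepc,subpc;
  KSP            coarseksp,*subksp;
  PetscInt       nlocal,first;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr);
  ierr = PCMGGetCoarseSolve(pc,&coarseksp);CHKERRQ(ierr);                   /* mg_coarse_     */
  ierr = KSPGetPC(coarseksp,&coarsepc);CHKERRQ(ierr);                       /* bjacobi        */
  ierr = PCBJacobiGetSubKSP(coarsepc,&nlocal,&first,&subksp);CHKERRQ(ierr);
  if (nlocal > 0) {
    ierr = KSPGetPC(subksp[0],&subpc);CHKERRQ(ierr);                        /* mg_coarse_sub_ */
    ierr = PCFactorSetShiftType(subpc,MAT_SHIFT_NONZERO);CHKERRQ(ierr);     /* LU pivot shift */
  }
  PetscFunctionReturn(0);
}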
