Re: [petsc-users] Convergence of AMG
Var: 0,…,5 are the 6 variables that I am solving for: u, v, w, theta_x, theta_y, theta_z. The norms identified in my email are the L2 norms of all dofs corresponding to each variable in the solution vector. So, var: 0: u: norm is the L2 norm of the dofs for u only, and so on.

I expect u, v, theta_z to be zero for the solution, which ends up being the case. If I plot the solution, it looks sensible, but the reduction of the KSP norm is slow.

Thanks,
Manav

> On Oct 28, 2018, at 3:55 PM, Smith, Barry F. wrote:
>
>> On Oct 28, 2018, at 12:16 PM, Manav Bhatia wrote:
>>
>> Hi,
>>
>> I am attempting to solve a Mindlin plate bending problem with the AMG solver in petsc. This test case is with a mesh of 300x300 elements and 543,606 dofs.
>>
>> The discretization includes 6 variables (u, v, w, tx, ty, tz), but only three are relevant for plate bending (w, tx, ty).
>>
>> I am calling the solver with the following options:
>>
>> -pc_type gamg -pc_gamg_threshold 0. --node-major-dofs -mat_block_size 6 -ksp_rtol 1.e-8 -ksp_monitor -ksp_converged_reason -ksp_view
>>
>> The convergence behavior is shown below, along with the ksp_view information. Based on notes in the manual, this seems to be a subpar convergence rate. At the end of the solution the norm of each variable is:
>>
>> var: 0: u  : norm: 5.505909e-18
>> var: 1: v  : norm: 7.639640e-18
>> var: 2: w  : norm: 3.901464e-03
>> var: 3: tx : norm: 4.403576e-02
>> var: 4: ty : norm: 4.403576e-02
>> var: 5: tz : norm: 1.148409e-16
>
> What do you mean by var: 2: w : norm etc? Is this the norm of the error for that variable, the norm of the residual, something else? How exactly are you calculating it?
>
> Thanks
>
> Barry
>
>> I tried different values of -ksp_rtol from 1e-1 to 1e-8 and this does not make a lot of difference in the norms of (w, tx, ty).
>>
>> I do provide the solver with 6 rigid-body vectors to approximate the null-space of the problem. Without these the solver shows very poor convergence.
>>
>> I would appreciate advice on possible strategies to improve this behavior.
>>
>> Thanks,
>> Manav
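For reference, a minimal sketch of one way such per-variable L2 norms can be computed with VecStrideNorm, assuming the interlaced node-major layout with block size 6 implied by -mat_block_size 6; this is not necessarily how Manav's code computes them, and the helper name is hypothetical:

    #include <petscvec.h>

    /* Sketch: L2 norm of the dofs of each variable from an interlaced
     * solution vector. Assumes node-major ordering with 6 dofs per node
     * and requires the vector's block size to be set to 6. */
    PetscErrorCode PrintVariableNorms(Vec x)
    {
      const char    *name[6] = {"u", "v", "w", "tx", "ty", "tz"};
      PetscReal      nrm;
      PetscInt       i;
      PetscErrorCode ierr;

      for (i = 0; i < 6; i++) {
        /* Norm over component i of every node. */
        ierr = VecStrideNorm(x, i, NORM_2, &nrm);CHKERRQ(ierr);
        ierr = PetscPrintf(PETSC_COMM_WORLD, "var: %d: %s : norm: %e\n",
                           (int)i, name[i], (double)nrm);CHKERRQ(ierr);
      }
      return 0;
    }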
Re: [petsc-users] [SLEPc] ex5 fails, error in lapack
On Sun, 28 Oct 2018 at 21:46, Santiago Andres Triana wrote:
> Hi Dave,
>
> Indeed, I added that last arg myself after the configure script asked for it (--with-batch seems to need it). I just tried with petsc-3.9.1, without the --with-batch and --known-64-bit-blas-indices=1 options, and everything is working nicely.

Great. As a general rule, flags such as --known-64-bit-xxx only need to be specified by the user when using system-provided packages (actually, any package not installed by PETSc's configure). If you use --download-yyy, then PETSc's configure defines how package yyy is to be configured and built, hence it knows whether it used 64-bit ints or not; the user does not (and probably should not) provide a flag for something PETSc's configure already knows.

Thanks,
Dave

> I will try again later with the latest version.

Ok.

> Thanks!
>
> Santiago
>
> On Sun, Oct 28, 2018 at 10:31 AM Dave May wrote:
>>
>> I think this last arg is wrong if you use --download-fblaslapack.
>>
>> Did you explicitly add this option yourself?
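A hedged example of a configure invocation consistent with this advice: Santiago's download options with --known-64-bit-blas-indices=1 (and --with-batch, which he also dropped in his working build) removed, so that configure determines the BLAS/LAPACK index size of the downloaded fblaslapack itself:

    ./configure --with-scalar-type=complex --with-debugging=0 \
        --download-fblaslapack=1 --download-mumps=1 --download-scalapack=1 \
        --download-parmetis --download-metis --download-superlu_dist=1 \
        --download-ptscotch=1 \
        COPTFLAGS="-O3 -march=native" CXXOPTFLAGS="-O3 -march=native" \
        FOPTFLAGS="-O3 -march=native"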
Re: [petsc-users] Convergence of AMG
> On Oct 28, 2018, at 12:16 PM, Manav Bhatia wrote:
>
> Hi,
>
> I am attempting to solve a Mindlin plate bending problem with the AMG solver in petsc. This test case is with a mesh of 300x300 elements and 543,606 dofs.
>
> The discretization includes 6 variables (u, v, w, tx, ty, tz), but only three are relevant for plate bending (w, tx, ty).
>
> I am calling the solver with the following options:
>
> -pc_type gamg -pc_gamg_threshold 0. --node-major-dofs -mat_block_size 6 -ksp_rtol 1.e-8 -ksp_monitor -ksp_converged_reason -ksp_view
>
> The convergence behavior is shown below, along with the ksp_view information. Based on notes in the manual, this seems to be a subpar convergence rate. At the end of the solution the norm of each variable is:
>
> var: 0: u  : norm: 5.505909e-18
> var: 1: v  : norm: 7.639640e-18
> var: 2: w  : norm: 3.901464e-03
> var: 3: tx : norm: 4.403576e-02
> var: 4: ty : norm: 4.403576e-02
> var: 5: tz : norm: 1.148409e-16

What do you mean by var: 2: w : norm etc? Is this the norm of the error for that variable, the norm of the residual, something else? How exactly are you calculating it?

Thanks

Barry

> I tried different values of -ksp_rtol from 1e-1 to 1e-8 and this does not make a lot of difference in the norms of (w, tx, ty).
>
> I do provide the solver with 6 rigid-body vectors to approximate the null-space of the problem. Without these the solver shows very poor convergence.
>
> I would appreciate advice on possible strategies to improve this behavior.
>
> Thanks,
> Manav
Re: [petsc-users] [SLEPc] ex5 fails, error in lapack
Hi Dave,

Indeed, I added that last arg myself after the configure script asked for it (--with-batch seems to need it). I just tried with petsc-3.9.1, without the --with-batch and --known-64-bit-blas-indices=1 options, and everything is working nicely.

I will try again later with the latest version.

Thanks!

Santiago

On Sun, Oct 28, 2018 at 10:31 AM Dave May wrote:
> On Sun, 28 Oct 2018 at 09:37, Santiago Andres Triana wrote:
>> Hi petsc-users,
>>
>> I am experiencing problems running ex5 and ex7 from the slepc tutorial. This is after upgrading to petsc-3.10.2 and slepc-3.10.1. Has anyone run into this problem? See the error message below. Any help or advice would be highly appreciated. Thanks in advance!
>>
>> Santiago
>>
>> [0]PETSC ERROR: Configure options [...] --with-batch --known-64-bit-blas-indices=1
>
> I think this last arg is wrong if you use --download-fblaslapack.
>
> Did you explicitly add this option yourself?
[petsc-users] Convergence of AMG
Hi,

I am attempting to solve a Mindlin plate bending problem with the AMG solver in petsc. This test case is with a mesh of 300x300 elements and 543,606 dofs.

The discretization includes 6 variables (u, v, w, tx, ty, tz), but only three are relevant for plate bending (w, tx, ty).

I am calling the solver with the following options:

-pc_type gamg -pc_gamg_threshold 0. --node-major-dofs -mat_block_size 6 -ksp_rtol 1.e-8 -ksp_monitor -ksp_converged_reason -ksp_view

The convergence behavior is shown below, along with the ksp_view information. Based on notes in the manual, this seems to be a subpar convergence rate. At the end of the solution the norm of each variable is:

var: 0: u  : norm: 5.505909e-18
var: 1: v  : norm: 7.639640e-18
var: 2: w  : norm: 3.901464e-03
var: 3: tx : norm: 4.403576e-02
var: 4: ty : norm: 4.403576e-02
var: 5: tz : norm: 1.148409e-16

I tried different values of -ksp_rtol from 1e-1 to 1e-8 and this does not make a lot of difference in the norms of (w, tx, ty).

I do provide the solver with 6 rigid-body vectors to approximate the null-space of the problem. Without these the solver shows very poor convergence.

I would appreciate advice on possible strategies to improve this behavior.

Thanks,
Manav

  0 KSP Residual norm 1.696304497261e+00
  1 KSP Residual norm 1.120485505777e+00
  2 KSP Residual norm 8.324222302402e-01
  3 KSP Residual norm 6.477349534115e-01
  4 KSP Residual norm 5.080936471292e-01
  5 KSP Residual norm 4.051099646638e-01
  6 KSP Residual norm 3.260432664653e-01
  7 KSP Residual norm 2.560483838143e-01
  8 KSP Residual norm 2.029943986124e-01
  9 KSP Residual norm 1.560985741610e-01
 10 KSP Residual norm 1.163720702140e-01
 11 KSP Residual norm 8.488411085459e-02
 12 KSP Residual norm 5.888041729034e-02
 13 KSP Residual norm 4.027792209980e-02
 14 KSP Residual norm 2.819048087304e-02
 15 KSP Residual norm 1.904674196962e-02
 16 KSP Residual norm 1.289302447822e-02
 17 KSP Residual norm 9.162203296376e-03
 18 KSP Residual norm 7.016781679507e-03
 19 KSP Residual norm 5.399170865328e-03
 20 KSP Residual norm 4.254385887482e-03
 21 KSP Residual norm 3.530831740621e-03
 22 KSP Residual norm 2.946780747923e-03
 23 KSP Residual norm 2.339361361128e-03
 24 KSP Residual norm 1.815072489282e-03
 25 KSP Residual norm 1.408814185342e-03
 26 KSP Residual norm 1.063795714320e-03
 27 KSP Residual norm 7.828540233117e-04
 28 KSP Residual norm 5.683910750067e-04
 29 KSP Residual norm 4.131151010250e-04
 30 KSP Residual norm 3.065608221019e-04
 31 KSP Residual norm 2.634114273459e-04
 32 KSP Residual norm 2.198180137626e-04
 33 KSP Residual norm 1.748956510799e-04
 34 KSP Residual norm 1.317539710010e-04
 35 KSP Residual norm 9.790121566055e-05
 36 KSP Residual norm 7.465935386094e-05
 37 KSP Residual norm 5.689506626052e-05
 38 KSP Residual norm 4.413136619126e-05
 39 KSP Residual norm 3.512194236402e-05
 40 KSP Residual norm 2.877755408287e-05
 41 KSP Residual norm 2.340080556431e-05
 42 KSP Residual norm 1.904544450345e-05
 43 KSP Residual norm 1.504723478235e-05
 44 KSP Residual norm 1.141381950576e-05
 45 KSP Residual norm 8.206151384599e-06
 46 KSP Residual norm 5.911426091276e-06
 47 KSP Residual norm 4.233669089283e-06
 48 KSP Residual norm 2.898052944223e-06
 49 KSP Residual norm 2.023556779973e-06
 50 KSP Residual norm 1.459108043935e-06
 51 KSP Residual norm 1.097335545865e-06
 52 KSP Residual norm 8.440457332262e-07
 53 KSP Residual norm 6.705616854004e-07
 54 KSP Residual norm 5.404888680234e-07
 55 KSP Residual norm 4.391368084979e-07
 56 KSP Residual norm 3.697063014621e-07
 57 KSP Residual norm 3.021772094146e-07
 58 KSP Residual norm 2.479354520792e-07
 59 KSP Residual norm 2.013077841968e-07
 60 KSP Residual norm 1.553159612793e-07
 61 KSP Residual norm 1.400784224898e-07
 62 KSP Residual norm 9.707453662195e-08
 63 KSP Residual norm 7.263173080146e-08
 64 KSP Residual norm 5.593723572132e-08
 65 KSP Residual norm 4.448788809586e-08
 66 KSP Residual norm 3.613992590778e-08
 67 KSP Residual norm 2.946099051876e-08
 68 KSP Residual norm 2.408053564170e-08
 69 KSP Residual norm 1.945257374856e-08
 70 KSP Residual norm 1.572494535110e-08

KSP Object: 4 MPI processes
  type: gmres
    restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
    happy breakdown tolerance 1e-30
  maximum iterations=1, initial guess is zero
  tolerances: relative=1e-08, absolute=1e-50, divergence=1.
  left preconditioning
  using PRECONDITIONED norm type for convergence test
PC Object: 4 MPI processes
  type: gamg
    type is MULTIPLICATIVE, levels=6 cycles=v
      Cycles per PCApply=1
      Using externally compute Galerkin coarse grid matrices
      GAMG specific options
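Since the rigid-body vectors are central to this question, here is a minimal sketch of one standard way to attach them in PETSc, via MatNullSpaceCreateRigidBody from a vector of interlaced nodal coordinates. Note that this routine builds modes for purely translational dofs (6 modes in 3D from 3 dofs per node); for a 6-dof-per-node formulation like this one, the six vectors would typically be assembled by hand and passed through MatNullSpaceCreate instead. This is not necessarily how Manav's code does it:

    #include <petscmat.h>

    /* Sketch: attach rigid-body modes as the near-null space that GAMG
     * uses to build its smoothed-aggregation prolongators. "coords" is
     * assumed to hold interlaced nodal coordinates with block size equal
     * to the spatial dimension. */
    PetscErrorCode AttachRigidBodyModes(Mat K, Vec coords)
    {
      MatNullSpace   nearnull;
      PetscErrorCode ierr;

      /* Constructs the translational and rotational rigid-body vectors
         from the nodal coordinates. */
      ierr = MatNullSpaceCreateRigidBody(coords, &nearnull);CHKERRQ(ierr);
      /* Registers them as a near-null space: GAMG uses them to improve
         its coarse spaces; they are not treated as an exact null space. */
      ierr = MatSetNearNullSpace(K, nearnull);CHKERRQ(ierr);
      ierr = MatNullSpaceDestroy(&nearnull);CHKERRQ(ierr);
      return 0;
    }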
Re: [petsc-users] [petsc-maint] How to impose boundary conditions using DMDA
Matt,

How difficult would it be to impose such boundary conditions with DMPlex? Presumably you just connect the mesh up "properly" and it is straightforward?

Barry

> On Oct 27, 2018, at 10:23 AM, Matthew Knepley wrote:
>
> On Sat, Oct 27, 2018 at 2:02 AM Fengwen Wang wrote:
>> Dear Colleagues,
>>
>> I use the finite element method to solve my problem in PETSc.
>>
>> The mesh is defined as a regular mesh using DMDA. I have a special boundary condition which I do not know how to impose in PETSc.
>>
>> In a 2D problem, the domain is of unit size, with two degrees of freedom per node (u, v). I would like to impose the following boundary condition:
>>
>> u(x=1) = -v(y=1) and v(x=1) = -u(y=1).
>>
>> How can I impose such a boundary condition in PETSc?
>
> In a serial code, you could do this just by equating those variables, but in parallel we have no support for such a boundary condition.
>
> Thanks,
>
> Matt
>
>> Thanks a lot.
>>
>> Best regards
>> Fengwen
>>
>> Senior Researcher
>> Department of Mechanical Engineering, DTU
>> Nils Koppels Allé
>> Building 404
>> 2800 Kgs. Lyngby
>> f...@mek.dtu.dk
>
> --
> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
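For the serial approach Matt mentions, a minimal sketch of "equating those variables" by replacing one assembled equation with the constraint u(x=1) = -v(y=1). Here row_u and col_v are hypothetical global dof indices of u at a node on x=1 and of v at the matching node on y=1; finding that matching pair across processes is exactly the part with no built-in parallel support:

    #include <petscmat.h>

    /* Sketch, serial (SeqAIJ) only: replace the equation for u at a
     * boundary node by the constraint row  u + v = 0,  i.e.  u = -v. */
    PetscErrorCode ImposeCouplingConstraint(Mat A, Vec b, PetscInt row_u, PetscInt col_v)
    {
      PetscInt       cols[2];
      PetscScalar    vals[2];
      PetscErrorCode ierr;

      /* Wipe the assembled equation for u at this node (diagonal set to
         0 here; the constraint row is inserted explicitly below). */
      ierr = MatZeroRows(A, 1, &row_u, 0.0, NULL, NULL);CHKERRQ(ierr);
      cols[0] = row_u; vals[0] = 1.0;   /* u           */
      cols[1] = col_v; vals[1] = 1.0;   /* + v = 0     */
      /* SeqAIJ allows the new nonzero at (row_u, col_v) by default;
         MPIAIJ would require it to be preallocated. */
      ierr = MatSetValues(A, 1, &row_u, 2, cols, vals, INSERT_VALUES);CHKERRQ(ierr);
      ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
      ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
      ierr = VecSetValue(b, row_u, 0.0, INSERT_VALUES);CHKERRQ(ierr);
      ierr = VecAssemblyBegin(b);CHKERRQ(ierr);
      ierr = VecAssemblyEnd(b);CHKERRQ(ierr);
      return 0;
    }

The second condition, v(x=1) = -u(y=1), would be imposed the same way with the roles of the dofs swapped.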
Re: [petsc-users] [SLEPc] ex5 fails, error in lapack
On Sun, 28 Oct 2018 at 09:37, Santiago Andres Triana wrote:
> Hi petsc-users,
>
> I am experiencing problems running ex5 and ex7 from the slepc tutorial. This is after upgrading to petsc-3.10.2 and slepc-3.10.1. Has anyone run into this problem? See the error message below. Any help or advice would be highly appreciated. Thanks in advance!
>
> Santiago
>
> [0]PETSC ERROR: Configure options [...] --with-batch --known-64-bit-blas-indices=1

I think this last arg is wrong if you use --download-fblaslapack.

Did you explicitly add this option yourself?
[petsc-users] [SLEPc] ex5 fails, error in lapack
Hi petsc-users,

I am experiencing problems running ex5 and ex7 from the slepc tutorial. This is after upgrading to petsc-3.10.2 and slepc-3.10.1. Has anyone run into this problem? See the error message below. Any help or advice would be highly appreciated. Thanks in advance!

Santiago


trianas@hpcb-n02:/home/trianas/slepc-3.10.1/src/eps/examples/tutorials> ./ex5 -eps_nev 4

Markov Model, N=120 (m=15)

[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------
[0]PETSC ERROR: Error in external library
[0]PETSC ERROR: Error in LAPACK subroutine hseqr: info=0
[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.10.2, Oct, 09, 2018
[0]PETSC ERROR: ./ex5 on a arch-linux2-c-opt named hpcb-n02 by trianas Sun Oct 28 09:30:18 2018
[0]PETSC ERROR: Configure options --known-level1-dcache-size=32768 --known-level1-dcache-linesize=64 --known-level1-dcache-assoc=8 --known-sizeof-char=1 --known-sizeof-void-p=8 --known-sizeof-short=2 --known-sizeof-int=4 --known-sizeof-long=8 --known-sizeof-long-long=8 --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=8 --known-bits-per-byte=8 --known-memcmp-ok=1 --known-sizeof-MPI_Comm=4 --known-sizeof-MPI_Fint=4 --known-mpi-long-double=1 --known-mpi-int64_t=1 --known-mpi-c-double-complex=1 --known-has-attribute-aligned=1 --with-scalar-type=complex --download-mumps=1 --download-parmetis --download-metis --download-scalapack=1 --download-fblaslapack=1 --with-debugging=0 --download-superlu_dist=1 --download-ptscotch=1 CXXOPTFLAGS="-O3 -march=native" FOPTFLAGS="-O3 -march=native" COPTFLAGS="-O3 -march=native" --with-batch --known-64-bit-blas-indices=1
[0]PETSC ERROR: #1 DSSolve_NHEP() line 586 in /space/hpc-home/trianas/slepc-3.10.1/src/sys/classes/ds/impls/nhep/dsnhep.c
[0]PETSC ERROR: #2 DSSolve() line 586 in /space/hpc-home/trianas/slepc-3.10.1/src/sys/classes/ds/interface/dsops.c
[0]PETSC ERROR: #3 EPSSolve_KrylovSchur_Default() line 275 in /space/hpc-home/trianas/slepc-3.10.1/src/eps/impls/krylov/krylovschur/krylovschur.c
[0]PETSC ERROR: #4 EPSSolve() line 148 in /space/hpc-home/trianas/slepc-3.10.1/src/eps/interface/epssolve.c
[0]PETSC ERROR: #5 main() line 90 in /home/trianas/slepc-3.10.1/src/eps/examples/tutorials/ex5.c
[0]PETSC ERROR: PETSc Option Table entries:
[0]PETSC ERROR: -eps_nev 4
[0]PETSC ERROR: ----------------End of Error Message-------send entire error message to petsc-ma...@mcs.anl.gov----------
application called MPI_Abort(MPI_COMM_WORLD, 76) - process 0
[unset]: write_line error; fd=-1 buf=:cmd=abort exitcode=76
:
system msg for write_line failure : Bad file descriptor