Re: [petsc-users] a question on DMPlexSetAnchors

2017-01-05 Thread Matthew Knepley
On Thu, Jan 5, 2017 at 6:35 PM, Rochan Upadhyay wrote: > Thanks for the prompt reply. I don't need hanging nodes or Dirichlet conditions, which can be easily done by adding constraint DoFs in the Section as you mention. My requirement is the following: >>> Constraints among Fields: >>> I wou

Re: [petsc-users] Best way to scatter a Seq vector ?

2017-01-05 Thread Barry Smith
> On Jan 5, 2017, at 6:21 PM, Manuel Valera wrote: > Hello Devs, it's me again. I'm trying to distribute a vector to all processes; the vector would originally be on root as a sequential vector and I would like to scatter it. What would be the best call to do this? I already
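
A minimal sketch of the usual pattern (not code from this thread; vpar is assumed to be the already-created distributed vector with the right global size, and vseq the sequential vector holding the data on rank 0): build a scatter to rank 0 and drive it in SCATTER_REVERSE to push the data out to the parallel vector.

  #include <petscvec.h>
  /* Sketch: distribute data held in a sequential vector on rank 0
     into an existing parallel vector vpar. */
  VecScatter  ctx;
  Vec         vzero;                    /* sequential work vector on rank 0 */
  PetscMPIInt rank;

  MPI_Comm_rank(PETSC_COMM_WORLD, &rank);
  VecScatterCreateToZero(vpar, &ctx, &vzero);
  if (!rank) VecCopy(vseq, vzero);      /* load the data on root */
  /* reverse direction: sequential vector on rank 0 -> distributed vector */
  VecScatterBegin(ctx, vzero, vpar, INSERT_VALUES, SCATTER_REVERSE);
  VecScatterEnd(ctx, vzero, vpar, INSERT_VALUES, SCATTER_REVERSE);
  VecScatterDestroy(&ctx);
  VecDestroy(&vzero);

If instead a full copy of the vector is wanted on every rank, VecScatterCreateToAll() builds that scatter directly.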

Re: [petsc-users] a question on DMPlexSetAnchors

2017-01-05 Thread Rochan Upadhyay
Thanks for the prompt reply. I don't need hanging nodes or Dirichlet conditions, which can be easily done by adding constraint DoFs in the Section as you mention. My requirement is the following: >>> Constraints among Fields: >>> I would recommend just putting the constraint in as an equation. In your c
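
As a rough illustration of the anchor machinery being discussed (a sketch only, not code from this thread; the point numbers 7 and 2 are made up), DMPlexSetAnchors() takes a PetscSection giving the number of anchors for each constrained point plus an IS listing the anchor points:

  #include <petscdmplex.h>
  /* Sketch: make point 7 depend on anchor point 2; dm is an existing DMPlex. */
  PetscSection   anchorSec;
  IS             anchorIS;
  PetscInt       pStart, pEnd;
  const PetscInt anchors[1] = {2};

  DMPlexGetChart(dm, &pStart, &pEnd);
  PetscSectionCreate(PETSC_COMM_SELF, &anchorSec);
  PetscSectionSetChart(anchorSec, pStart, pEnd);
  PetscSectionSetDof(anchorSec, 7, 1);               /* point 7 has one anchor */
  PetscSectionSetUp(anchorSec);
  ISCreateGeneral(PETSC_COMM_SELF, 1, anchors, PETSC_COPY_VALUES, &anchorIS);
  DMPlexSetAnchors(dm, anchorSec, anchorIS);
  PetscSectionDestroy(&anchorSec);
  ISDestroy(&anchorIS);

The section/IS pair only records which points anchor which; the actual constraint coefficients are supplied separately.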

[petsc-users] Best way to scatter a Seq vector ?

2017-01-05 Thread Manuel Valera
Hello Devs, it's me again. I'm trying to distribute a vector to all processes; the vector would originally be on root as a sequential vector and I would like to scatter it. What would be the best call to do this? I already know how to gather a distributed vector to root with VecScatterCreateToZ

Re: [petsc-users] problems after glibc upgrade to 2.17-157

2017-01-05 Thread Satish Balay
On Thu, 5 Jan 2017, Satish Balay wrote: > Well it's more of RHEL than SL. And it's just the Intel .so files [as far as we know] that are triggering this issue. > RHEL generally doesn't make changes that break old binaries. But any code change [which bug fixes are] can introduce changed behavior

Re: [petsc-users] Fieldsplit with sub pc MUMPS in parallel

2017-01-05 Thread Barry Smith
This is not good. Something is out of whack. First run 1 and 2 processes with -ksp_view_mat binary -ksp_view_rhs binary; in each case this will generate a file called binaryoutput. Send both files to petsc-ma...@mcs.anl.gov. I want to confirm that the matrices are the same in both c
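
For context on what those files contain: the binary dump can be read back with MatLoad() and the two runs compared directly. A rough sketch (the file names are made-up renamed copies of binaryoutput, one per run):

  #include <petscmat.h>
  /* Sketch: read back two dumps written by -ksp_view_mat binary and check
     that they agree. */
  Mat         A1, A2;
  PetscViewer v;
  PetscBool   same;

  MatCreate(PETSC_COMM_WORLD, &A1);
  MatSetType(A1, MATAIJ);
  PetscViewerBinaryOpen(PETSC_COMM_WORLD, "binaryoutput.np1", FILE_MODE_READ, &v);
  MatLoad(A1, v);
  PetscViewerDestroy(&v);

  MatCreate(PETSC_COMM_WORLD, &A2);
  MatSetType(A2, MATAIJ);
  PetscViewerBinaryOpen(PETSC_COMM_WORLD, "binaryoutput.np2", FILE_MODE_READ, &v);
  MatLoad(A2, v);
  PetscViewerDestroy(&v);

  MatEqual(A1, A2, &same);   /* PETSC_TRUE iff the two matrices match */
  MatDestroy(&A1);
  MatDestroy(&A2);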

Re: [petsc-users] problems after glibc upgrade to 2.17-157

2017-01-05 Thread Satish Balay
On Thu, 5 Jan 2017, Matthew Knepley wrote: > On Thu, Jan 5, 2017 at 2:37 AM, Klaij, Christiaan wrote: > > So problem solved for now, thanks to you and Matt for all your help! In the long run I will go for Intel-17 on SL7.3. What worries me though is that a simple update (which happen

Re: [petsc-users] Fieldsplit with sub pc MUMPS in parallel

2017-01-05 Thread Barry Smith
> On Jan 5, 2017, at 5:58 AM, Dave May wrote: > Do you now see identical residual histories for a job using 1 rank and 4 ranks? Please send the residual histories with the extra options. I'm curious too, because a Krylov method should not be needed in the inner solve; I just asked for

Re: [petsc-users] make test freeze

2017-01-05 Thread Matthew Knepley
On Thu, Jan 5, 2017 at 6:31 AM, Patrick Begou <patrick.be...@legi.grenoble-inp.fr> wrote: > I am unable to run any test on petsc. It looks like the ex19 run freezes on the server, as it does not use any CPU time, and pstree shows sshd---bash-+-gedit `-make---sh-+-gmake---sh---gm

Re: [petsc-users] pc_gamg_threshold

2017-01-05 Thread Jeremy Theler
Yes, I read that page and it was that paragraph that made me want to learn more. For example, that page says: “-pc_gamg_threshold 0.0 is the most robust option (the reason for this is not obvious) ...” Where can I find more math-based background on this subject? I mean, some text that describe

Re: [petsc-users] problems after glibc upgrade to 2.17-157

2017-01-05 Thread Matthew Knepley
On Thu, Jan 5, 2017 at 2:37 AM, Klaij, Christiaan wrote: > Satish, Matt > Our sysadmin tells me Scientific Linux is still busy with the RedHat 7.3 update, so yes, this is a partial update somewhere between 7.2 and 7.3... > No luck with the quotes on my system, but the option --with-shar

Re: [petsc-users] pc_gamg_threshold

2017-01-05 Thread Mark Adams
You want the bottom of page 84 in the manual. On Wed, Jan 4, 2017 at 4:33 PM, Barry Smith wrote: > The manual page gives a high-level description http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/PC/PCGAMGSetThreshold.html; the exact details can be found in the code here htt
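
For anyone landing here from a search, the threshold itself is an ordinary options-database entry; a minimal C sketch (the value 0.05 is only a placeholder, not a recommendation):

  #include <petscksp.h>
  /* Sketch: set the GAMG coarsening threshold through the options database. */
  KSP ksp;
  PC  pc;

  KSPCreate(PETSC_COMM_WORLD, &ksp);
  KSPGetPC(ksp, &pc);
  PCSetType(pc, PCGAMG);
  PetscOptionsSetValue(NULL, "-pc_gamg_threshold", "0.05");
  KSPSetFromOptions(ksp);   /* reads -pc_gamg_threshold and the other -pc_gamg_* options */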

[petsc-users] make test freeze

2017-01-05 Thread Patrick Begou
I am unable to run any test on petsc. It looks like the ex19 run freezes on the server, as it does not use any CPU time, and pstree shows sshd---bash-+-gedit `-make---sh-+-gmake---sh---gmake---sh---mpiexec---ex19 `-tee I've tested petsc-3.7.5.tar.gz and the lat

Re: [petsc-users] Fieldsplit with sub pc MUMPS in parallel

2017-01-05 Thread Dave May
Do you now see identical residual histories for a job using 1 rank and 4 ranks? If not, I am inclined to believe that the IS's you are defining for the splits in the parallel case are incorrect. The operator created to approximate the Schur complement with selfp should not depend on the number of
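
For reference, splits defined by index sets are usually set up along the following lines (a sketch only; uIdx[nU] and pIdx[nP] are assumed to hold the locally owned global indices of each field on this rank, the two-field labelling is an assumption, and pc is the PCFIELDSPLIT preconditioner):

  #include <petscksp.h>
  /* Sketch: define the two splits by index sets; each rank passes only the
     global indices of the rows it owns. */
  IS isU, isP;

  ISCreateGeneral(PETSC_COMM_WORLD, nU, uIdx, PETSC_COPY_VALUES, &isU);
  ISCreateGeneral(PETSC_COMM_WORLD, nP, pIdx, PETSC_COPY_VALUES, &isP);
  PCFieldSplitSetIS(pc, "0", isU);   /* first field  */
  PCFieldSplitSetIS(pc, "1", isP);   /* second field */
  ISDestroy(&isU);
  ISDestroy(&isP);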

Re: [petsc-users] Fieldsplit with sub pc MUMPS in parallel

2017-01-05 Thread
Dear Barry, dear Dave, THANK YOU! You two pointed out the right problem. By using the options you provided (-fieldsplit_0_ksp_type gmres -fieldsplit_0_ksp_pc_side right -fieldsplit_1_ksp_type gmres -fieldsplit_1_ksp_pc_side right), the solver converges in 3 iterations whatever the size of the commu
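
For completeness, the same options can also be inserted programmatically before the solver reads its options; a small sketch (ksp is assumed to be the outer FieldSplit solver):

  #include <petscksp.h>
  /* Sketch: push the command-line options from the thread into the options
     database, then let the KSP pick them up. */
  PetscOptionsInsertString(NULL,
    "-fieldsplit_0_ksp_type gmres -fieldsplit_0_ksp_pc_side right "
    "-fieldsplit_1_ksp_type gmres -fieldsplit_1_ksp_pc_side right");
  KSPSetFromOptions(ksp);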

Re: [petsc-users] problems after glibc upgrade to 2.17-157

2017-01-05 Thread Klaij, Christiaan
Satish, Matt, Our sysadmin tells me Scientific Linux is still busy with the RedHat 7.3 update, so yes, this is a partial update somewhere between 7.2 and 7.3... No luck with the quotes on my system, but the option --with-shared-libraries=0 does work! make test gives: Running test examples to veri