Re: [petsc-users] What block size means in amg aggregation type

2016-07-13 Thread Barry Smith
Sorry no one answered it. I had hoped Mark Adams would since he knows much more about it than me. > On Jul 6, 2016, at 2:50 PM, Eduardo Jourdan > wrote: > > Hi, > > I am kind of new to algebraic multigrid methods. I tried to figure it out on my own but I'm not sure about it. > > How t
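For readers new to GAMG: the aggregation block size is normally taken from the block size of the operator, i.e. the number of DOFs per grid node. A minimal sketch (the sizes, the 3-DOF-per-node assumption, and the elided assembly are hypothetical, not from the thread):

#include <petscksp.h>

/* Sketch: an AIJ matrix with a hypothetical 3 DOFs per grid node.  PCGAMG
   reads the block size off the operator, so aggregation then works on
   node-blocks instead of scalar rows. */
int main(int argc, char **argv)
{
  Mat A;
  KSP ksp;
  PC  pc;

  PetscInitialize(&argc, &argv, NULL, NULL);

  MatCreate(PETSC_COMM_WORLD, &A);
  MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, 300, 300); /* hypothetical: 100 nodes x 3 DOFs */
  MatSetBlockSize(A, 3);        /* tell PETSc there are 3 DOFs per node */
  MatSetFromOptions(A);
  MatSetUp(A);
  /* ... fill with MatSetValuesBlocked() ... */
  MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
  MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);

  KSPCreate(PETSC_COMM_WORLD, &ksp);
  KSPSetOperators(ksp, A, A);
  KSPGetPC(ksp, &pc);
  PCSetType(pc, PCGAMG);        /* smoothed-aggregation AMG */
  KSPSetFromOptions(ksp);
  /* KSPSolve(ksp, b, x); */

  KSPDestroy(&ksp);
  MatDestroy(&A);
  PetscFinalize();
  return 0;
}

With the block size set, GAMG aggregates node-blocks rather than individual scalar equations; for elasticity one would usually also attach rigid-body modes with MatSetNearNullSpace().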

Re: [petsc-users] Multigrid with PML

2016-07-13 Thread Barry Smith
Can you run with the additional option -ksp_view_mat binary and email the resulting file, which will be called binaryoutput, to petsc-ma...@mcs.anl.gov Barry > On Jul 13, 2016, at 2:30 PM, Safin, Artur wrote: > > Dear PETSc community, > > I am working on solving a Helmholtz problem with P
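For reference, that option amounts to writing the KSP operator with the binary viewer; a sketch (A is assumed to be the assembled operator, and the function name is illustrative):

#include <petscmat.h>

/* Sketch: write an assembled matrix A to the binary file "binaryoutput",
   roughly what -ksp_view_mat binary produces for the KSP operator. */
static void DumpOperator(Mat A)
{
  PetscViewer viewer;
  PetscViewerBinaryOpen(PetscObjectComm((PetscObject)A), "binaryoutput", FILE_MODE_WRITE, &viewer);
  MatView(A, viewer);
  PetscViewerDestroy(&viewer);
}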

Re: [petsc-users] Question about memory usage in Multigrid preconditioner

2016-07-13 Thread Dave May
On 14 July 2016 at 01:07, frank wrote: > Hi Dave, > > Sorry for the late reply. > Thank you so much for your detailed reply. > > I have a question about the estimation of the memory usage. There are > 4223139840 allocated non-zeros and 18432 MPI processes. Double precision is > used. So the memor

Re: [petsc-users] Question about memory usage in Multigrid preconditioner

2016-07-13 Thread Barry Smith
> On Jul 13, 2016, at 6:07 PM, frank wrote: > > Hi Dave, > > Sorry for the late reply. > Thank you so much for your detailed reply. > > I have a question about the estimation of the memory usage. There are > 4223139840 allocated non-zeros and 18432 MPI processes. Double precision is > used.

Re: [petsc-users] Question about memory usage in Multigrid preconditioner

2016-07-13 Thread frank
Hi Dave, Sorry for the late reply. Thank you so much for your detailed reply. I have a question about the estimation of the memory usage. There are 4223139840 allocated non-zeros and 18432 MPI processes. Double precision is used. So the memory per process is: 4223139840 * 8 bytes / 18432 / 1
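To make the arithmetic concrete, a tiny sketch of that estimate (values only, using the figures quoted above):

#include <stdio.h>

int main(void)
{
  long long nnz   = 4223139840LL;              /* allocated non-zeros */
  int       ranks = 18432;                     /* MPI processes */
  double    bytes = (double)nnz * 8.0 / ranks; /* 8 bytes per double value */
  printf("~%.2f MB per process (values only)\n", bytes / 1.0e6);
  return 0;
}

That comes to roughly 1.8 MB per process for the matrix values alone; the AIJ (CSR) storage also holds integer column indices and row offsets on top of that.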

Re: [petsc-users] Distribution of DMPlex for FEM

2016-07-13 Thread Matthew Knepley
On Wed, Jul 13, 2016 at 3:57 AM, Morten Nobel-Jørgensen wrote: > I’m having problems distributing a simple FEM model using DMPlex. For the test > case I use 1x1x2 hex box elements (cells) with 12 vertices. Each vertex > has one DOF. > When I distribute the system to two processors, each gets a single

[petsc-users] Multigrid with PML

2016-07-13 Thread Safin, Artur
Dear PETSc community, I am working on solving a Helmholtz problem with PML. The issue is that I am finding it very hard to deal with the resulting matrix system; I can get the correct solution for coarse meshes, but it takes roughly 2-4 times as long to converge for each successively refined me

Re: [petsc-users] different convergence behaviour

2016-07-13 Thread Barry Smith
> On Jul 13, 2016, at 11:05 AM, Matthew Knepley wrote: > > On Wed, Jul 13, 2016 at 10:34 AM, Hoang Giang Bui wrote: > Thanks Barry > > This is a good comment. Since material behaviour depends very much on the > trajectory of the solution, I suspect that the error may accumulate during > tim

Re: [petsc-users] different convergence behaviour

2016-07-13 Thread Matthew Knepley
On Wed, Jul 13, 2016 at 10:34 AM, Hoang Giang Bui wrote: > Thanks Barry > > This is a good comment. Since material behaviour depends very much on the > trajectory of the solution, I suspect that the error may accumulate during > time stepping. > > I have re-run the simulation as you suggested an

Re: [petsc-users] different convergence behaviour

2016-07-13 Thread Hoang Giang Bui
Thanks Barry. This is a good comment. Since material behaviour depends very much on the trajectory of the solution, I suspect that the error may accumulate during time stepping. I have re-run the simulation as you suggested and posted the log file here: https://www.dropbox.com/s/d6l8ixme37uh47a/log

Re: [petsc-users] Question about memory usage in Multigrid preconditioner

2016-07-13 Thread Barry Smith
> On Jul 13, 2016, at 4:17 AM, Dave May wrote: > > Hi Barry, > > > Dave, > >MatPtAP has to generate some work space. Is it possible that the "guess" it > uses for the needed work space is so absurdly (and unnecessarily) large that it > triggers a memory issue? Is it possible that other place

Re: [petsc-users] Question about memory usage in Multigrid preconditioner

2016-07-13 Thread Dave May
Hi Barry, > Dave, > >MatPtAP has to generate some work space. Is it possible that the "guess" it > uses for the needed work space is so absurdly (and unnecessarily) large that it > triggers a memory issue? Is it possible that other places that require > "guesses" for work space produce a problem?
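For context, the explicit MatPtAP() interface exposes that estimate as its fill argument; a sketch (A and P are assumed to be assembled, and the fill value is purely illustrative):

#include <petscmat.h>

/* Sketch: fill is the caller's estimate of nnz(C)/(nnz(A)+nnz(P)) and
   drives how much work space MatPtAP allocates up front. */
static void FormCoarseOperator(Mat A, Mat P, Mat *C)
{
  PetscReal fill = 2.0; /* illustrative value, not a recommendation */
  MatPtAP(A, P, MAT_INITIAL_MATRIX, fill, C);
}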

[petsc-users] Distribution of DMPlex for FEM

2016-07-13 Thread Morten Nobel-Jørgensen
I’m having problems distributing a simple FEM model using DMPlex. For the test case I use 1x1x2 hex box elements (cells) with 12 vertices. Each vertex has one DOF. When I distribute the system to two processors, each gets a single element and the local vector has size 8 (one DOF for each vertex o
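For anyone comparing notes, a sketch of the general distribute-then-layout workflow (this assumes an interpolated DMPlex dm already exists, e.g. the 1x1x2 hex mesh, and is not the original poster's code):

#include <petscdmplex.h>

/* Sketch: distribute the mesh, then lay out one DOF per vertex with a
   PetscSection so DMCreateLocalVector/DMCreateGlobalVector get the sizes. */
static void DistributeOneDofPerVertex(DM *dm)
{
  DM           dmDist = NULL;
  PetscSection section;
  PetscInt     pStart, pEnd, vStart, vEnd, p;

  /* zero-overlap distribution; with 2 cells on 2 ranks each rank owns one */
  DMPlexDistribute(*dm, 0, NULL, &dmDist);
  if (dmDist) { DMDestroy(dm); *dm = dmDist; }

  /* one unknown on every vertex (depth-0 points), none elsewhere */
  DMPlexGetChart(*dm, &pStart, &pEnd);
  DMPlexGetDepthStratum(*dm, 0, &vStart, &vEnd);
  PetscSectionCreate(PetscObjectComm((PetscObject)*dm), &section);
  PetscSectionSetChart(section, pStart, pEnd);
  for (p = vStart; p < vEnd; ++p) PetscSectionSetDof(section, p, 1);
  PetscSectionSetUp(section);
  DMSetLocalSection(*dm, section); /* DMSetDefaultSection() in 2016-era PETSc */
  PetscSectionDestroy(&section);
}

With this layout a rank owning one hex cell sees 8 vertices locally, which matches the local vector size of 8 reported above.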