Re: [petsc-users] Question about memory usage in Multigrid preconditioner

2016-09-16 Thread Jed Brown
Barry Smith writes: >> On Sep 15, 2016, at 1:10 PM, Dave May wrote: >> >> >> >> On Thursday, 15 September 2016, Barry Smith wrote: >> >>Should we have some simple selection of default algorithms based on >> problem

Re: [petsc-users] Question about memory usage in Multigrid preconditioner

2016-09-16 Thread Hengjie Wang
Hi Dave, I added both options and tested them by solving the Poisson eqn on a 1024^3 cube with 32^3 cores. This test used to give the OOM error. Now it runs well. I attach the ksp_view and log_view output in case you want to look at it. I also tested my original code with those PETSc options by simulating

Re: [petsc-users] Question about memory usage in Multigrid preconditioner

2016-09-15 Thread Barry Smith
> On Sep 15, 2016, at 1:10 PM, Dave May wrote: > > > > On Thursday, 15 September 2016, Barry Smith wrote: > >Should we have some simple selection of default algorithms based on > problem size/number of processes? For example if using more

Re: [petsc-users] Question about memory usage in Multigrid preconditioner

2016-09-15 Thread Dave May
On Thursday, 15 September 2016, Barry Smith wrote: > >Should we have some simple selection of default algorithms based on > problem size/number of processes? For example if using more than 1000 > processes then use scalable version etc? How would we decide on the >

Re: [petsc-users] Question about memory usage in Multigrid preconditioner

2016-09-15 Thread Barry Smith
Should we have some simple selection of default algorithms based on problem size/number of processes? For example, if using more than 1000 processes then use the scalable version, etc.? How would we decide on the parameter values? Barry > On Sep 15, 2016, at 5:35 AM, Dave May

Re: [petsc-users] Question about memory usage in Multigrid preconditioner

2016-09-15 Thread Dave May
Hi all, the only unexpected memory usage I can see is associated with the call to MatPtAP(). Here is something you can try immediately: run your code with the additional options -matrap 0 -matptap_scalable. I didn't realize this before, but the default behaviour of MatPtAP in parallel is
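
For context, MatPtAP() computes the Galerkin triple product A_c = P^T A P, which is how the multigrid coarse-level operators are formed in this setup; the two options above only change which parallel algorithm assembles that product, not its result. A minimal sketch of the call itself (a user-level illustration, not code from this thread; the fill estimate 2.0 is a placeholder):

    #include <petscmat.h>

    /* Sketch: form the Galerkin coarse operator Ac = P^T A P from the
       fine operator A and the interpolation P. */
    PetscErrorCode FormCoarseOperator(Mat A, Mat P, Mat *Ac)
    {
      PetscErrorCode ierr;
      ierr = MatPtAP(A, P, MAT_INITIAL_MATRIX, 2.0, Ac);CHKERRQ(ierr);
      return 0;
    }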

Re: [petsc-users] Question about memory usage in Multigrid preconditioner

2016-09-15 Thread Dave May
On Thursday, 15 September 2016, Hengjie Wang wrote: > Hi Dave, > > Sorry, I should have put more comment to explain the code. > No problem. I was looking at the code after only 3 hrs of sleep > > The number of process in each dimension is the same: Px = Py=Pz=P. So is >

Re: [petsc-users] Question about memory usage in Multigrid preconditioner

2016-09-15 Thread Hengjie Wang
Hi Dave, Sorry, I should have put more comments in to explain the code. The number of processes in each dimension is the same: Px = Py = Pz = P, and so is the domain size. So if you want to run the code on a 512^3 grid with 16^3 cores, you need to set "-N 512 -P 16" on the command line. I add
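
In other words, a 512^3 run on 16^3 = 4096 ranks would be launched roughly as follows (the executable name is a placeholder; only -N and -P come from the description above, and presumably N must be divisible by P):

    mpiexec -n 4096 ./test_poisson -N 512 -P 16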

Re: [petsc-users] Question about memory usage in Multigrid preconditioner

2016-09-14 Thread Dave May
On Thursday, 15 September 2016, Dave May wrote: > > > On Thursday, 15 September 2016, frank > wrote: > >> Hi, >> >> I write a simple code to re-produce the error. I hope this can help to >> diagnose

Re: [petsc-users] Question about memory usage in Multigrid preconditioner

2016-09-14 Thread Dave May
On Thursday, 15 September 2016, frank wrote: > Hi, > > I write a simple code to re-produce the error. I hope this can help to > diagnose the problem. > The code just solves a 3d poisson equation. > Why is the stencil width a runtime parameter?? And why is the default value 2?

Re: [petsc-users] Question about memory usage in Multigrid preconditioner

2016-09-14 Thread Dave May
Hi Frank, On Thursday, 15 September 2016, frank wrote: > Hi, > > I write a simple code to re-produce the error. I hope this can help to > diagnose the problem. > The code just solves a 3d poisson equation. > I run the code on a 1024^3 mesh. The process partition is 32 * 32 *

Re: [petsc-users] Question about memory usage in Multigrid preconditioner

2016-09-14 Thread frank
Hi, I wrote a simple code to reproduce the error. I hope this can help to diagnose the problem. The code just solves a 3D Poisson equation. I ran the code on a 1024^3 mesh with a 32 * 32 * 32 process partition; that is where I reproduce the OOM error. Each core has about 2G of memory. I also
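
The attached reproduction code itself is not part of this archive. As a minimal sketch only (the boundary type is an assumption), a DMDA for the 7-point 3D Poisson stencil on an N^3 grid with a P*P*P process partition needs a stencil width of just 1, which keeps the ghost regions as small as possible; this is the point behind the stencil-width question in the reply above:

    #include <petscdmda.h>

    /* Sketch: DMDA for a 7-point 3D Poisson stencil (width 1, not 2). */
    PetscErrorCode CreatePoissonDA(MPI_Comm comm, PetscInt N, PetscInt P, DM *da)
    {
      PetscErrorCode ierr;
      ierr = DMDACreate3d(comm,
                          DM_BOUNDARY_PERIODIC, DM_BOUNDARY_PERIODIC, DM_BOUNDARY_PERIODIC,
                          DMDA_STENCIL_STAR,
                          N, N, N,          /* global grid, e.g. 1024^3        */
                          P, P, P,          /* process mesh, e.g. 32 * 32 * 32 */
                          1,                /* one dof per grid point          */
                          1,                /* stencil width for a 7-point op  */
                          NULL, NULL, NULL, da);CHKERRQ(ierr);
      return 0;
    }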

Re: [petsc-users] Question about memory usage in Multigrid preconditioner

2016-09-09 Thread Hengjie Wang
Hi Barry, I checked. On the supercomputer, I had the option "-ksp_view_pre" but it is not in the file I sent you. I am sorry for the confusion. Regards, Frank On Friday, September 9, 2016, Barry Smith wrote: > > > On Sep 9, 2016, at 3:11 PM, frank

Re: [petsc-users] Question about memory usage in Multigrid preconditioner

2016-09-09 Thread Barry Smith
> On Sep 9, 2016, at 3:11 PM, frank wrote: > > Hi Barry, > > I think the first KSP view output is from -ksp_view_pre. Before I submitted > the test, I was not sure whether there would be OOM error or not. So I added > both -ksp_view_pre and -ksp_view. But the options

Re: [petsc-users] Question about memory usage in Multigrid preconditioner

2016-09-09 Thread frank
Hi Barry, I think the first KSP view output is from -ksp_view_pre. Before I submitted the test, I was not sure whether there would be an OOM error or not, so I added both -ksp_view_pre and -ksp_view. Frank On 09/09/2016 12:38 PM, Barry Smith wrote: Why does ksp_view2.txt have two KSP

Re: [petsc-users] Question about memory usage in Multigrid preconditioner

2016-09-09 Thread Barry Smith
Why does ksp_view2.txt have two KSP views in it while ksp_view1.txt has only one KSP view in it? Did you run two different solves in the second case but not the first? Barry > On Sep 9, 2016, at 10:56 AM, frank wrote: > > Hi, > > I want to continue digging into the memory

Re: [petsc-users] Question about memory usage in Multigrid preconditioner

2016-09-09 Thread frank
Hi, I want to continue digging into the memory problem here. I did find a workaround in the past, which is to use fewer cores per node so that each core has 8G of memory. However, this is inefficient and expensive. I hope to locate the place that uses the most memory. Here is a brief summary of

Re: [petsc-users] Question about memory usage in Multigrid preconditioner

2016-07-13 Thread Dave May
On 14 July 2016 at 01:07, frank wrote: > Hi Dave, > > Sorry for the late reply. > Thank you so much for your detailed reply. > > I have a question about the estimation of the memory usage. There are > 4223139840 allocated non-zeros and 18432 MPI processes. Double precision is >

Re: [petsc-users] Question about memory usage in Multigrid preconditioner

2016-07-13 Thread Barry Smith
> On Jul 13, 2016, at 6:07 PM, frank wrote: > > Hi Dave, > > Sorry for the late reply. > Thank you so much for your detailed reply. > > I have a question about the estimation of the memory usage. There are > 4223139840 allocated non-zeros and 18432 MPI processes. Double

Re: [petsc-users] Question about memory usage in Multigrid preconditioner

2016-07-13 Thread frank
Hi Dave, Sorry for the late reply. Thank you so much for your detailed reply. I have a question about the estimation of the memory usage. There are 4223139840 allocated non-zeros and 18432 MPI processes. Double precision is used. So the memory per process is: 4223139840 * 8 bytes / 18432 /
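
The calculation is cut off above; completing it with the numbers quoted (and assuming 4-byte integer column indices in the AIJ storage) gives roughly:

    4223139840 non-zeros * 8 bytes  ~ 33.8 GB of matrix values in total
    33.8 GB / 18432 processes       ~ 1.8 MB of values per process
    4223139840 * 4 bytes / 18432    ~ another 0.9 MB per process for column indices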

Re: [petsc-users] Question about memory usage in Multigrid preconditioner

2016-07-12 Thread Barry Smith
> On Jul 11, 2016, at 3:18 PM, Dave May wrote: > > Hi Frank, > > > On 11 July 2016 at 19:14, frank wrote: > Hi Dave, > > I re-run the test using bjacobi as the preconditioner on the coarse mesh of > telescope. The Grid is 3072*256*768 and process

Re: [petsc-users] Question about memory usage in Multigrid preconditioner

2016-07-11 Thread Dave May
Hi Frank, On 11 July 2016 at 19:14, frank wrote: > Hi Dave, > > I re-ran the test using bjacobi as the preconditioner on the coarse mesh > of telescope. The grid is 3072*256*768 and the process mesh is 96*8*24. The > petsc option file is attached. > I still got the "Out Of

[petsc-users] Question about memory usage in Multigrid preconditioner

2016-07-08 Thread Dave May
On Saturday, 9 July 2016, frank wrote: > Hi Barry and Dave, > > Thank both of you for the advice. > > @Barry > I made a mistake in the file names in my last email. I attached the correct > files this time. > For all the three

Re: [petsc-users] Question about memory usage in Multigrid preconditioner

2016-07-08 Thread Barry Smith
Frank, I don't think we yet have enough information to figure out what is going on. Can you please run test 1 but on the larger number of processes? Our goal is to determine the memory usage scaling as you increase the mesh size with a fixed number of processes, from test 2 to

Re: [petsc-users] Question about memory usage in Multigrid preconditioner

2016-07-08 Thread frank
Hi Barry and Dave, Thank both of you for the advice. @Barry I made a mistake in the file names in my last email. I attached the correct files this time. For all three tests, 'Telescope' is used as the coarse preconditioner. == Test1: Grid: 1536*128*384, Process Mesh: 48*4*12 Part of the

Re: [petsc-users] Question about memory usage in Multigrid preconditioner

2016-07-07 Thread Dave May
Hi Frank, On 6 July 2016 at 00:23, frank wrote: > Hi, > > I am using the CG ksp solver and Multigrid preconditioner to solve a > linear system in parallel. > I chose to use the 'Telescope' as the preconditioner on the coarse mesh > for its good performance. > The petsc

Re: [petsc-users] Question about memory usage in Multigrid preconditioner

2016-07-06 Thread Barry Smith
> On Jul 6, 2016, at 4:19 PM, frank wrote: > > Hi Barry, > > Thank you for you advice. > I tried three test. In the 1st test, the grid is 3072*256*768 and the process > mesh is 96*8*24. > The linear solver is 'cg' the preconditioner is 'mg' and 'telescope' is used > as the

Re: [petsc-users] Question about memory usage in Multigrid preconditioner

2016-07-06 Thread frank
Hi Barry, Thank you for your advice. I tried three tests. In the 1st test, the grid is 3072*256*768 and the process mesh is 96*8*24. The linear solver is 'cg', the preconditioner is 'mg', and 'telescope' is used as the preconditioner on the coarse mesh. The system gives me the "Out of Memory"

Re: [petsc-users] Question about memory usage in Multigrid preconditioner

2016-07-05 Thread Barry Smith
Frank, You can run with -ksp_view_pre to have it "view" the KSP before the solve, so hopefully it gets that far. Please run the problem that does fit with -memory_info; when the problem completes it will show the "high water mark" for PETSc-allocated memory and total memory used. We
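
As a concrete but hypothetical invocation of this suggestion (the executable name, options file, and rank count are placeholders; only the two options come from the advice above):

    mpiexec -n 4608 ./my_solver -options_file petsc_options.txt -ksp_view_pre -memory_info

-ksp_view_pre prints the solver configuration before the solve starts, so it appears even if the run later fails with OOM, while -memory_info reports the high water mark and total memory once the smaller, fitting run completes.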

[petsc-users] Question about memory usage in Multigrid preconditioner

2016-07-05 Thread frank
Hi, I am using the CG KSP solver and a multigrid preconditioner to solve a linear system in parallel. I chose to use 'Telescope' as the preconditioner on the coarse mesh for its good performance. The PETSc options file is attached. The domain is a 3D box. It works well when the grid is
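
The options file referred to here is not included in this archive. Purely as a hypothetical sketch of the solver stack described above (CG Krylov method, multigrid preconditioner, 'telescope' on the MG coarse level), and assuming the usual PETSc option prefixes, the relevant entries would look something like the following, where the level count and reduction factor are made-up placeholders:

    -ksp_type cg
    -pc_type mg
    -pc_mg_levels 4
    -mg_coarse_pc_type telescope
    -mg_coarse_pc_telescope_reduction_factor 64
    -mg_coarse_telescope_pc_type mg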