Re: [petsc-users] Out of memory and parallel issues

2015-01-02 Thread Mark Adams
On Thu, Jan 1, 2015 at 12:20 AM, TAY wee-beng wrote:
> Hi,
>
> I used to run my CFD code with 96 procs, with a grid size of 231 x 461 x 368.
>
> I used MPI and partition my grid in the z direction. Hence with 96 procs (8 nodes, each with 12 procs), each proc has a size of 231 x 461 x 3 or 231 x …

Re: [petsc-users] Out of memory and parallel issues

2015-01-01 Thread Matthew Knepley
On Thu, Jan 1, 2015 at 2:15 AM, TAY wee-beng wrote:
> On 1/1/2015 2:06 PM, Matthew Knepley wrote:
>> On Wed, Dec 31, 2014 at 11:20 PM, TAY wee-beng wrote:
>>> Hi,
>>>
>>> I used to run my CFD code with 96 procs, with a grid size of 231 x 461 x 368.
>>>
>>> I used MPI and partition my grid i…

Re: [petsc-users] Out of memory and parallel issues

2015-01-01 Thread TAY wee-beng
On 1/1/2015 2:06 PM, Matthew Knepley wrote:
> On Wed, Dec 31, 2014 at 11:20 PM, TAY wee-beng wrote:
>> Hi,
>>
>> I used to run my CFD code with 96 procs, with a grid size of 231 x 461 x 368.
>>
>> I used MPI and partition my grid in the z direction. Hence with 96 …

Re: [petsc-users] Out of memory and parallel issues

2014-12-31 Thread Matthew Knepley
On Wed, Dec 31, 2014 at 11:20 PM, TAY wee-beng wrote:
> Hi,
>
> I used to run my CFD code with 96 procs, with a grid size of 231 x 461 x 368.
>
> I used MPI and partition my grid in the z direction. Hence with 96 procs (8 nodes, each with 12 procs), each proc has a size of 231 x 461 x 3 or 231 x …

[petsc-users] Out of memory and parallel issues

2014-12-31 Thread TAY wee-beng
Hi,

I used to run my CFD code with 96 procs, with a grid size of 231 x 461 x 368.

I used MPI and partition my grid in the z direction. Hence with 96 procs (8 nodes, each with 12 procs), each proc has a size of 231 x 461 x 3 or 231 x 461 x 4. It worked fine.

Now I modified the code and added s…
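[Editor's note: the arithmetic behind the 3-or-4-plane slabs described in the message above is just 368 z-planes divided over 96 ranks: 368 / 96 = 3 with a remainder of 80, so 80 ranks hold 4 planes and 16 ranks hold 3. The sketch below only illustrates that decomposition; it is not code from the thread, and the rule that the remainder planes go to the lowest-numbered ranks is an assumption — the poster's actual code may distribute them differently.]

/* Minimal, self-contained sketch (assumption: remainder planes assigned to
 * the lowest-numbered ranks) of the z-direction slab decomposition
 * described above: nz = 368 planes split across 96 MPI ranks. */
#include <stdio.h>

int main(void)
{
    const int nx = 231, ny = 461, nz = 368;   /* global grid from the thread      */
    const int nprocs = 96;                    /* 8 nodes x 12 procs per node      */

    const int base = nz / nprocs;             /* 368 / 96 = 3 planes per rank     */
    const int rem  = nz % nprocs;             /* 80 ranks get one extra plane     */

    for (int rank = 0; rank < nprocs; ++rank) {
        int zlocal = base + (rank < rem ? 1 : 0);
        if (rank < 2 || rank == nprocs - 1)   /* print a few representative ranks */
            printf("rank %2d: local block %d x %d x %d\n", rank, nx, ny, zlocal);
    }
    return 0;
}

[Compiling and running this prints the local block size for a few ranks (231 x 461 x 4 for the early ranks, 231 x 461 x 3 for the last), matching the sizes quoted in the thread; 80*4 + 16*3 = 368 planes in total.]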