Re: [petsc-users] Installation Error

2013-10-04 Thread Sonya Blade
> Date: Fri, 4 Oct 2013 23:48:10 -0500 > From: ba...@mcs.anl.gov > To: sonyablade2...@hotmail.com > CC: petsc-users@mcs.anl.gov > Subject: Re: [petsc-users] Installation Error > > On Sat, 5 Oct 2013, Sonya Blade wrote: > >> Dear All, >> >> I'm receiving the

Re: [petsc-users] Installation Error

2013-10-04 Thread Satish Balay
On Sat, 5 Oct 2013, Sonya Blade wrote: > Dear All, > > I'm receiving the following errors when I'm trying to install the > slepc library, I'm attaching the following error messages and content > of the configure.log file residing in slepc-3.4.2\arch-mswin-c-debug\conf > folder > > /cygdr

[petsc-users] Installation Error

2013-10-04 Thread Sonya Blade
Dear All, I'm receiving the following errors when I'm trying to install the slepc library, I'm attaching the following error messages and content of the configure.log file residing in slepc-3.4.2\arch-mswin-c-debug\conf folder. Here is the output from Cygwin terminal: noname@noname /cygd

Re: [petsc-users] Petsc-3.4.2 with MinGW-w64 on Windows 7

2013-10-04 Thread Satish Balay
On Thu, 3 Oct 2013, Mengda Wu wrote: > Hi Satish, > > I've tried to use mingw distributed with cygwin. And it worked well for > static libraries. > > Another problem I am facing is I cannot build shared libraries with it > even when I supplied "--with-shared-libraries=1". Only .a libs are > g

Re: [petsc-users] Unable to create >4GB sized HDF5 files on Cray XC30

2013-10-04 Thread Jed Brown
Juha Jäykkä writes: > On Sunday 18 August 2013 08:10:19 Jed Brown wrote: >> Output uses a collective write, so the granularity of the IO node is >> probably more relevant for writing (e.g., BG/Q would have one IO node >> per 128 compute nodes), but almost any chunk size should perform >> similarl

Re: [petsc-users] Writing a domain decomposition code with PETSc

2013-10-04 Thread Jed Brown
Matthew Knepley writes: > On Fri, Oct 4, 2013 at 2:10 PM, Åsmund Ervik wrote: > >> Is there any reason I should prefer Xdmf over Cgns? I think both use >> hdf5 in the background. >> > > I guess not. It was just easier for me to get Xdmf going. We have trial > CGNS as well. I don't think we hav

Re: [petsc-users] cannot find PetscViewerHDF5 functions in libraries

2013-10-04 Thread Roc Wang
Thanks a lot. It works. Date: Fri, 4 Oct 2013 14:00:09 -0500 Subject: Re: [petsc-users] cannot find PetscViewerHDF5 functions in libraries From: knep...@gmail.com To: pengxw...@hotmail.com CC: petsc-users@mcs.anl.gov On Fri, Oct 4, 2013 at 1:48 PM, Roc Wang wrote: Hello, The version of P

Re: [petsc-users] Writing a domain decomposition code with PETSc

2013-10-04 Thread Matthew Knepley
On Fri, Oct 4, 2013 at 2:10 PM, Åsmund Ervik wrote: > Is there any reason I should prefer Xdmf over Cgns? I think both use > hdf5 in the background. > I guess not. It was just easier for me to get Xdmf going. We have trial CGNS as well. Matt > Apologies by the way for my phone's inability

Re: [petsc-users] Writing a domain decomposition code with PETSc

2013-10-04 Thread Åsmund Ervik
Is there any reason I should prefer Xdmf over Cgns? I think both use hdf5 in the background. Apologies by the way for my phone's inability to reply inline properly. Åsmund Sent from my VT-102 Matthew Knepley wrote: On Fri, Oct 4, 2013 at 1:51 PM, Åsmund Ervik w

Re: [petsc-users] Writing a domain decomposition code with PETSc

2013-10-04 Thread Matthew Knepley
On Fri, Oct 4, 2013 at 1:51 PM, Åsmund Ervik wrote: > > Matthew Knepley wrote: > On Fri, Oct 4, 2013 at 12:57 PM, Åsmund Ervik wrote: > >> Barry, >> >> Thanks for the quick answer. >> >> Good to hear that I can use the DMDA framework for all variables. >> Should I put all scalars (e.g. pressu

Re: [petsc-users] cannot find PetscViewerHDF5 functions in libraries

2013-10-04 Thread Matthew Knepley
On Fri, Oct 4, 2013 at 1:48 PM, Roc Wang wrote: > Hello, > > The version of Petsc I am using is 3.3-p6. I have hdf5 1.8.11 > installed as well. I am trying to use PetscViewerHDF5Open(), > PetscViewerHDF5PushGroup(). But the compiling error shows undefined > reference to `PetscViewerHDF5Ope

Re: [petsc-users] Writing a domain decomposition code with PETSc

2013-10-04 Thread Åsmund Ervik
Matthew Knepley wrote: On Fri, Oct 4, 2013 at 12:57 PM, Åsmund Ervik wrote: Barry, Thanks for the quick answer. Good to hear that I can use the DMDA framework for all variables. Should I put all scalars (e.g. pressure, level set function, etc) in the same DA, or

[petsc-users] cannot find PetscViewerHDF5 functions in libraries

2013-10-04 Thread Roc Wang
Hello, The version of Petsc I am using is 3.3-p6. I have hdf5 1.8.11 installed as well. I am trying to use PetscViewerHDF5Open(), PetscViewerHDF5PushGroup(). But the compiling error shows undefined reference to `PetscViewerHDF5Open' and undefined reference to `PetscViewerHDF5PushGroup'.
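A likely cause of these undefined references (consistent with the "It works" follow-up in this archive) is that PETSc itself was built without HDF5 support: the PetscViewerHDF5* functions are only compiled into libpetsc when HDF5 is enabled at configure time. A hedged sketch of the relevant configure invocations, followed by a rebuild and relink of the application (the install path below is illustrative, not taken from the thread):

```shell
# Option 1: let PETSc download and build its own HDF5
./configure --download-hdf5

# Option 2: point PETSc at an existing HDF5 install (path is illustrative)
./configure --with-hdf5-dir=/path/to/hdf5-1.8.11
```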

Re: [petsc-users] Writing a domain decomposition code with PETSc

2013-10-04 Thread Matthew Knepley
On Fri, Oct 4, 2013 at 12:57 PM, Åsmund Ervik wrote: > Barry, > > Thanks for the quick answer. > > Good to hear that I can use the DMDA framework for all variables. Should > I put all scalars (e.g. pressure, level set function, etc) in the same DA, > or should I keep a distinct one for the pre

Re: [petsc-users] Writing a domain decomposition code with PETSc

2013-10-04 Thread Åsmund Ervik
Barry, Thanks for the quick answer. Good to hear that I can use the DMDA framework for all variables. Should I put all scalars (e.g. pressure, level set function, etc) in the same DA, or should I keep a distinct one for the pressure (where I want to use multigrid)? The reason I was unsure is t

Re: [petsc-users] Unable to create >4GB sized HDF5 files on Cray XC30

2013-10-04 Thread Juha Jäykkä
On Sunday 18 August 2013 08:10:19 Jed Brown wrote: > Output uses a collective write, so the granularity of the IO node is > probably more relevant for writing (e.g., BG/Q would have one IO node > per 128 compute nodes), but almost any chunk size should perform > similarly. It would make a lot more

Re: [petsc-users] Writing a domain decomposition code with PETSc

2013-10-04 Thread Barry Smith
Asmund, You can use the DMDA to manage the layout of your velocity variables as well as the pressure variables. You will have two DMDAs, one that manages the cell-centered pressure variables (this is created with the dof argument of 1) and one that handles the velocities (that is created
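The two-DMDA layout described above ultimately comes down to how each grid direction is block-partitioned across ranks. A minimal plain-Python sketch (no PETSc; the function name and the extras-go-to-low-ranks convention are illustrative assumptions, not PETSc API):

```python
def split_1d(n, nprocs):
    """Block-partition n grid points over nprocs ranks: every rank gets
    n // nprocs points, and the first n % nprocs ranks get one extra.
    Returns a list of (start, size) pairs, one per rank."""
    base, extra = divmod(n, nprocs)
    sizes = [base + (1 if r < extra else 0) for r in range(nprocs)]
    starts = [sum(sizes[:r]) for r in range(nprocs)]
    return list(zip(starts, sizes))

# 8 cell-centered pressure points split over 4 ranks along one axis:
print(split_1d(8, 4))   # → [(0, 2), (2, 2), (4, 2), (6, 2)]
```

The same partition applies whether the node carries 1 degree of freedom (pressure) or several (velocity components); the dof argument only changes how many values are stored per owned grid point.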

[petsc-users] Writing a domain decomposition code with PETSc

2013-10-04 Thread Åsmund Ervik
Dear all, We have a two-phase incompressible Navier-Stokes solver written in Fortran where we use PETSc for solving the pressure Poisson equation. Since both PETSc and parallelism were an afterthought to this code, it doesn't scale well at all, so I am tasked with re-writing the whole thing now. Be

Re: [petsc-users] accessing sequential data (e.g. image) at (i, j, k) in DM in parallel

2013-10-04 Thread Bishesh Khanal
On Fri, Oct 4, 2013 at 1:23 PM, Matthew Knepley wrote: > On Fri, Oct 4, 2013 at 6:11 AM, Bishesh Khanal wrote: > >> Dear all, >> >> I have a part of a code that does NOT use Petsc, and contains an image K >> of dimensions mXnXr. >> It also provides me with a function to access any value at (i,j,k

Re: [petsc-users] accessing sequential data (e.g. image) at (i, j, k) in DM in parallel

2013-10-04 Thread Matthew Knepley
On Fri, Oct 4, 2013 at 6:11 AM, Bishesh Khanal wrote: > Dear all, > > I have a part of a code that does NOT use Petsc, and contains an image K > of dimensions mXnXr. > It also provides me with a function to access any value at (i,j,k), e.g. > using K.getVoxelAt(i,j,k). > > > Now in my petsc code

[petsc-users] accessing sequential data (e.g. image) at (i, j, k) in DM in parallel

2013-10-04 Thread Bishesh Khanal
Dear all, I have a part of a code that does NOT use Petsc, and contains an image K of dimensions mXnXr. It also provides me with a function to access any value at (i,j,k), e.g. using K.getVoxelAt(i,j,k). Now in my petsc code I create a DMDA of size mXnXr to solve a pde. The coefficients of the p
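The access pattern this thread converges on — each process touching only its owned (i, j, k) sub-box of the m x n x r grid and pulling values through the sequential accessor — can be sketched in plain Python. FakeImage and the corner tuple stand in for the real K.getVoxelAt and the (xs, ys, zs, xm, ym, zm) values one would get from DMDAGetCorners; they are illustrations, not PETSc/MPI code:

```python
class FakeImage:
    """Stand-in for the sequential image K with its getVoxelAt(i,j,k) accessor."""
    def __init__(self, m, n, r):
        self.dims = (m, n, r)
    def getVoxelAt(self, i, j, k):
        # illustrative intensity: a simple function of the index
        return i + 10 * j + 100 * k

def fill_local_block(K, corners):
    """Fill this rank's portion only: corners = (xs, ys, zs, xm, ym, zm),
    i.e. the owned starting indices and widths in each direction."""
    xs, ys, zs, xm, ym, zm = corners
    local = {}
    for k in range(zs, zs + zm):
        for j in range(ys, ys + ym):
            for i in range(xs, xs + xm):
                local[(i, j, k)] = K.getVoxelAt(i, j, k)
    return local

K = FakeImage(4, 4, 4)
# this "rank" owns z-slices 2..3 of the 4x4x4 grid
block = fill_local_block(K, (0, 0, 2, 4, 4, 2))
print(len(block))   # → 32
```

In the real code each rank would write these values into the local DMDA array instead of a dict; the point is that no rank ever indexes outside its owned box, so the sequential image only needs to be readable, not distributed.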