> Date: Fri, 4 Oct 2013 23:48:10 -0500
> From: ba...@mcs.anl.gov
> To: sonyablade2...@hotmail.com
> CC: petsc-users@mcs.anl.gov
> Subject: Re: [petsc-users] Installation Error
>
> On Sat, 5 Oct 2013, Sonya Blade wrote:
>
>> Dear All,
>>
>> I'm receiving the
On Sat, 5 Oct 2013, Sonya Blade wrote:
> Dear All,
>
> I'm receiving the following errors when I'm trying to install the
> SLEPc library. I'm attaching the error messages and the content
> of the configure.log file residing in the slepc-3.4.2\arch-mswin-c-debug\conf
> folder.
>
> /cygdr
Dear All,
I'm receiving the following errors when I'm trying to install the
SLEPc library. I'm attaching the error messages and the content
of the configure.log file residing in the slepc-3.4.2\arch-mswin-c-debug\conf
folder.
Here is the output from the Cygwin terminal:
noname@noname /cygd
On Thu, 3 Oct 2013, Mengda Wu wrote:
> Hi Satish,
>
> I've tried to use the MinGW distributed with Cygwin, and it worked well
> for static libraries.
>
> Another problem I am facing is that I cannot build shared libraries with it
> even when I supply "--with-shared-libraries=1". Only .a libs are
> generated.
Juha Jäykkä writes:
> On Sunday 18 August 2013 08:10:19 Jed Brown wrote:
>> Output uses a collective write, so the granularity of the IO node is
>> probably more relevant for writing (e.g., BG/Q would have one IO node
>> per 128 compute nodes), but almost any chunk size should perform
>> similarly.
Matthew Knepley writes:
> On Fri, Oct 4, 2013 at 2:10 PM, Åsmund Ervik wrote:
>
>> Is there any reason I should prefer Xdmf over Cgns? I think both use
>> hdf5 in the background.
>>
>
> I guess not. It was just easier for me to get Xdmf going. We have trial
> CGNS as well.
I don't think we hav
Thanks a lot. It works.
Date: Fri, 4 Oct 2013 14:00:09 -0500
Subject: Re: [petsc-users] cannot find PetscViewerHDF5 functions in libraries
From: knep...@gmail.com
To: pengxw...@hotmail.com
CC: petsc-users@mcs.anl.gov
On Fri, Oct 4, 2013 at 1:48 PM, Roc Wang wrote:
Hello,
The version of P
On Fri, Oct 4, 2013 at 2:10 PM, Åsmund Ervik wrote:
> Is there any reason I should prefer Xdmf over Cgns? I think both use
> hdf5 in the background.
>
I guess not. It was just easier for me to get Xdmf going. We have trial
CGNS as well.
Matt
> Apologies by the way for my phone's inability
Is there any reason I should prefer Xdmf over Cgns? I think both use hdf5 in
the background.
Apologies by the way for my phone's inability to reply inline properly.
Åsmund
Sent from my VT-102
Matthew Knepley wrote:
On Fri, Oct 4, 2013 at 1:51 PM, Åsmund Ervik
<asmund.er...@ntnu.no> wrote:
On Fri, Oct 4, 2013 at 1:51 PM, Åsmund Ervik wrote:
>
> Matthew Knepley wrote:
> On Fri, Oct 4, 2013 at 12:57 PM, Åsmund Ervik wrote:
>
>> Barry,
>>
>> Thanks for the quick answer.
>>
>> Good to hear that I can use the DMDA framework for all variables.
>> Should I put all scalars (e.g. pressu
On Fri, Oct 4, 2013 at 1:48 PM, Roc Wang wrote:
> Hello,
>
> The version of PETSc I am using is 3.3-p6. I have HDF5 1.8.11
> installed as well. I am trying to use PetscViewerHDF5Open() and
> PetscViewerHDF5PushGroup(), but the build fails with undefined
> reference to `PetscViewerHDF5Ope
Matthew Knepley wrote:
On Fri, Oct 4, 2013 at 12:57 PM, Åsmund Ervik
<asmund.er...@ntnu.no> wrote:
Barry,
Thanks for the quick answer.
Good to hear that I can use the DMDA framework for all variables. Should I put
all scalars (e.g. pressure, level set function, etc) in the same DA, or
Hello,
The version of PETSc I am using is 3.3-p6. I have HDF5 1.8.11 installed as
well. I am trying to use PetscViewerHDF5Open() and PetscViewerHDF5PushGroup(),
but the build fails with undefined reference to `PetscViewerHDF5Open' and
undefined reference to `PetscViewerHDF5PushGroup'.
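Undefined references like these at link time usually mean the PETSc library itself was
configured without HDF5 support (e.g. no --download-hdf5 or --with-hdf5-dir at configure
time), so the functions are declared in the headers but never built into the library.
Assuming a 3.3/3.4-era PETSc that was built with HDF5, a minimal sketch of the calls
mentioned above could look like the following; the file name, group, and dataset name are
illustrative only, and error checking is omitted (newer PETSc releases declare these
functions in petscviewerhdf5.h):

  #include <petscvec.h>
  #include <petscviewer.h>   /* declares the HDF5 viewer calls when PETSc was built with HDF5 */

  PetscErrorCode WriteSolution(Vec x)
  {
    PetscViewer viewer;

    /* the HDF5 dataset is named after the PETSc object */
    PetscObjectSetName((PetscObject)x, "solution");
    PetscViewerHDF5Open(PETSC_COMM_WORLD, "output.h5", FILE_MODE_WRITE, &viewer);
    PetscViewerHDF5PushGroup(viewer, "/fields");   /* datasets go under /fields */
    VecView(x, viewer);
    PetscViewerHDF5PopGroup(viewer);
    PetscViewerDestroy(&viewer);
    return 0;
  }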
On Fri, Oct 4, 2013 at 12:57 PM, Åsmund Ervik wrote:
> Barry,
>
> Thanks for the quick answer.
>
> Good to hear that I can use the DMDA framework for all variables. Should
> I put all scalars (e.g. pressure, level set function, etc) in the same DA,
> or should I keep a distinct one for the pre
Barry,
Thanks for the quick answer.
Good to hear that I can use the DMDA framework for all variables. Should I put
all scalars (e.g. pressure, level set function, etc) in the same DA, or should
I keep a distinct one for the pressure (where I want to use multigrid)?
The reason I was unsure is t
On Sunday 18 August 2013 08:10:19 Jed Brown wrote:
> Output uses a collective write, so the granularity of the IO node is
> probably more relevant for writing (e.g., BG/Q would have one IO node
> per 128 compute nodes), but almost any chunk size should perform
> similarly. It would make a lot more
Asmund,
You can use the DMDA to manage the layout of your velocity variables as
well as the pressure variables. You will have two DMDAs: one that manages the
cell-centered pressure variables (this is created with the dof argument of 1)
and one that handles the velocities (that is created
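As a rough illustration of the two-DMDA layout described here (a sketch only, using the
PETSc 3.4-era API in which the boundary enums are named DMDA_BOUNDARY_*; the grid size,
stencil choices, and dof=3 for a 3-D velocity field are assumptions, not taken from this
mail):

  #include <petscdmda.h>

  int main(int argc, char **argv)
  {
    DM       da_p, da_u;                  /* pressure and velocity layouts */
    PetscInt nx = 64, ny = 64, nz = 64;   /* illustrative grid size */

    PetscInitialize(&argc, &argv, NULL, NULL);

    /* cell-centered pressure: dof = 1 */
    DMDACreate3d(PETSC_COMM_WORLD,
                 DMDA_BOUNDARY_NONE, DMDA_BOUNDARY_NONE, DMDA_BOUNDARY_NONE,
                 DMDA_STENCIL_STAR, nx, ny, nz,
                 PETSC_DECIDE, PETSC_DECIDE, PETSC_DECIDE,
                 1 /* dof */, 1 /* stencil width */, NULL, NULL, NULL, &da_p);

    /* velocities (u,v,w): dof = 3 */
    DMDACreate3d(PETSC_COMM_WORLD,
                 DMDA_BOUNDARY_NONE, DMDA_BOUNDARY_NONE, DMDA_BOUNDARY_NONE,
                 DMDA_STENCIL_BOX, nx, ny, nz,
                 PETSC_DECIDE, PETSC_DECIDE, PETSC_DECIDE,
                 3 /* dof */, 1, NULL, NULL, NULL, &da_u);

    /* ... create vectors with DMCreateGlobalVector() on each DMDA, set up solvers ... */

    DMDestroy(&da_u);
    DMDestroy(&da_p);
    PetscFinalize();
    return 0;
  }

Vectors for each field then come from DMCreateGlobalVector() on the corresponding DMDA.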
Dear all,
We have a two-phase incompressible Navier-Stokes solver written in
Fortran where we use PETSc for solving the pressure Poisson equation.
Since both PETSc and parallelism were an afterthought in this code, it
doesn't scale well at all, so I am tasked with re-writing the whole
thing now. Be
On Fri, Oct 4, 2013 at 1:23 PM, Matthew Knepley wrote:
> On Fri, Oct 4, 2013 at 6:11 AM, Bishesh Khanal wrote:
>
>> Dear all,
>>
>> I have part of a code that does NOT use PETSc, and contains an image K
>> of dimensions m x n x r.
>> It also provides me with a function to access any value at (i,j,k
On Fri, Oct 4, 2013 at 6:11 AM, Bishesh Khanal wrote:
> Dear all,
>
> I have part of a code that does NOT use PETSc, and contains an image K
> of dimensions m x n x r.
> It also provides me with a function to access any value at (i,j,k), e.g.
> using K.getVoxelAt(i,j,k).
>
>
> Now in my petsc code
Dear all,
I have part of a code that does NOT use PETSc, and contains an image K of
dimensions m x n x r.
It also provides me with a function to access any value at (i,j,k), e.g.
using K.getVoxelAt(i,j,k).
Now in my PETSc code I create a DMDA of size m x n x r to solve a PDE. The
coefficients of the p
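One way to bring such external values into the DMDA (a sketch only, not from the thread:
K_getVoxelAt() is a hypothetical C stand-in for the poster's K.getVoxelAt(i,j,k), a dof=1
DMDA and the 3.4-era API are assumed, and every rank is assumed to be able to read the
full image) is to loop over the locally owned block and fill a global vector:

  #include <petscdmda.h>

  /* hypothetical C wrapper around the external accessor K.getVoxelAt(i,j,k) */
  extern PetscScalar K_getVoxelAt(PetscInt i, PetscInt j, PetscInt k);

  PetscErrorCode FillCoefficientsFromImage(DM da, Vec coeff)
  {
    PetscInt       xs, ys, zs, xm, ym, zm, i, j, k;
    PetscScalar ***a;

    /* only the locally owned block of the m x n x r grid is touched */
    DMDAGetCorners(da, &xs, &ys, &zs, &xm, &ym, &zm);
    DMDAVecGetArray(da, coeff, &a);
    for (k = zs; k < zs + zm; k++)
      for (j = ys; j < ys + ym; j++)
        for (i = xs; i < xs + xm; i++)
          a[k][j][i] = K_getVoxelAt(i, j, k);   /* global (i,j,k) indexing */
    DMDAVecRestoreArray(da, coeff, &a);
    return 0;
  }

Here coeff would be a global vector obtained from DMCreateGlobalVector() on the same
m x n x r DMDA.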