On Wed, 16 Oct 2019 at 15:54, Jed Brown via petsc-dev
wrote:
> Stefano Zampini writes:
>
> > I just took a look at the ISGENERAL code. ISSetBlockSize_General just
> sets the block size of the layout (??)
> > ISGetIndices always return the data->idx memory.
> > So, a more profound question is:
On Mon, 19 Aug 2019 at 13:53, Matthew Knepley wrote:
[...]
>> OK, so I think I am getting there. Presently I am abusing
>> DMPlexCreateFromDAG to migrate a DM on oldComm onto newComm, but this
>> is very fragile. I attach what I have right now. You have to run it
>> with PTSCOTCH, because
On Wed, 7 Aug 2019 at 13:52, Matthew Knepley wrote:
>
> On Wed, Aug 7, 2019 at 7:13 AM Lawrence Mitchell via petsc-dev
> wrote:
>>
>> Dear petsc-dev,
>>
>> I would like to run with a geometric multigrid hierarchy built out of DMPlex
>> objects. On t
Dear petsc-dev,
I would like to run with a geometric multigrid hierarchy built out of DMPlex
objects. On the coarse grids, I would like to reduce the size of the
communicator being used. Since I build the hierarchy by regularly refining the
coarse grid problem, this means I need a way of
Hi Vaclav,
> On 21 Jun 2019, at 12:14, Hapla Vaclav via petsc-dev
> wrote:
>
> VecGetValuesSection() returns pointer values obtained as follows:
>
> VecGetArray(v, &array);
> *values = &array[s->atlasOff[p]];
> VecRestoreArray(v, &array);
>
> It looks scary to me.
> VecGetArray manpage says: If the underlying
> On 17 Jun 2019, at 19:50, Jed Brown via petsc-dev
> wrote:
[...]
>
> As for strategy, we have these primary forms of documentation:
>
> * Users Manual
>
> A lot of value here is in cross-references between the manual and man
> pages. PETSc has a special pass where it creates links
> On 14 Jun 2019, at 18:44, Zhang, Junchao via petsc-dev
> wrote:
>
> Hello,
> I am investigating petsc issue 306. One can produce the problem with
> src/snes/examples/tutorials/ex9.c and mpirun -n 3 ./ex9 -snes_grid_sequence 3
> -snes_converged_reason -pc_type mg
> The program can
> On 11 Apr 2019, at 21:02, Matthew Knepley via petsc-dev
> wrote:
>
> Jed, should we be doing this? My first impression is that our builds catch a
> lot of configure errors so we do not want it.
We still configure and build firedrake in the tests. This is just for
downstream applications
> On 12 Mar 2019, at 18:46, Zhang, Junchao via petsc-dev
> wrote:
>
> How to fix that? I do not think I need a Fortran stub for that. I found under
> ftn-auto, not all .c functions have a Fortran counterpart. What is the rule?
Comment block starts with /*@C, I think.
It's in the dev manual.
Hi Barry,
> On 6 Mar 2019, at 01:39, Smith, Barry F. wrote:
>
>
> Lawrence,
>
> Is this issue resolved or still stuck? I totally agree with you that
> Matt's change seems inane, how can one possibly just take the
> function/pointer that operates on the whole DM and assume it will work
Hi Matt, all,
> On 21 Feb 2019, at 18:17, Matthew Knepley wrote:
>
> Here is why I did this. I wanted this to work
>
>
> https://bitbucket.org/petsc/petsc/src/4dbc1805575afffed4e440f1353fcfccbc893081/src/snes/examples/tutorials/ex62.c#lines-1062
>
> The DMKSP is initialized by SNES to have
Dear petsc-dev,
95dbaa6faf01fdfd99114b7c9e5668e4b2aa754d
which does this:
commit 95dbaa6faf01fdfd99114b7c9e5668e4b2aa754d
Author: Matthew G. Knepley
Date: Fri Oct 26 14:12:14 2018 -0400
PC: FieldSplit must copy DMKSP routines to subDM
diff --git
> On 16 Jan 2019, at 13:21, Matthew Knepley via petsc-dev
> wrote:
>
> I think you just Push and do not Pop. Do we check for a non-empty stack? We
> silently ignore too many Pops.
If you don't pop, at least in the past, you would get an error.
Lawrence
> On 17 Dec 2018, at 11:56, Hapla Vaclav wrote:
>
> Matt, great that you reminded me of this email. I actually completely missed it
> that time.
>
>> On 14 Dec 2018, at 19:54, Matthew Knepley via petsc-dev
>> wrote:
[...]
>> I would like:
>>
>> - To be able to dump the DMPlex, and fields, on
Hi Matt, all,
> On 14 Dec 2018, at 19:54, Matthew Knepley wrote:
>
> On Fri, Jul 20, 2018 at 5:34 AM Lawrence Mitchell wrote:
> Dear petsc-dev,
>
> I'm once again revisiting doing "proper" checkpoint-restart cycles. I would
> like to leverage the existing PETSc stuff for this as much as