Thanks; using "-mat_superlu_dist_fact SamePattern_SameRowPerm" also works with options other than "-mat_superlu_dist_colperm NATURAL". The solution worked for 32-bit integers, but when I compile PETSc using 64-bit indices I get the following error regardless of the options used for
Hi, Matt: Thank you for your help. The exodus file which I used
was generated by PointWise. I should try to make sure whether I can get
a periodic topology using PointWise. Can you point me to software
that can generate an exodus file with periodic topology? Thanks.
leejearl

On Thu,
Hi, Matt: I find that the periodic boundary is specified in the
following way:
DMPlexCreateBoxMesh(comm, dim, user->simplex, user->cells, NULL, NULL,
user->periodicity, interpolate, dm);
If I use an exo file, how is the periodicity specified?
Thanks
leejearl

On Thu, 2018-12-06 at 21:42
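The DMPlexCreateBoxMesh call quoted above can be sketched in full. This is a hedged illustration only: the dimension, cell counts, and the choice of which direction is periodic are assumptions, not values from the thread.

```c
#include <petscdmplex.h>

/* Sketch: create a 2D box mesh that is periodic in x and not in y.
 * All concrete values (2D, 8x8 cells, tensor cells, interpolation on)
 * are illustrative assumptions. */
PetscErrorCode CreatePeriodicBox(MPI_Comm comm, DM *dm)
{
  PetscInt       cells[2]       = {8, 8};
  DMBoundaryType periodicity[2] = {DM_BOUNDARY_PERIODIC, DM_BOUNDARY_NONE};
  PetscErrorCode ierr;

  ierr = DMPlexCreateBoxMesh(comm, 2 /* dim */, PETSC_FALSE /* simplex */,
                             cells, NULL /* lower */, NULL /* upper */,
                             periodicity, PETSC_TRUE /* interpolate */,
                             dm);CHKERRQ(ierr);
  return 0;
}
```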
Hi, Matt: My mesh file is "*.exo". Can it work
fine? Thanks.
leejearl
On Thu, 2018-12-06 at 21:20 -0500, Matthew Knepley wrote:
> On Thu, Dec 6, 2018 at 9:08 PM leejearl wrote:
> > Hi, Matt:
> > Is the code in following page?
Hi, Matt: Is the code in the following page?
/petsc-master/src/snes/examples/tutorials/ex12.c.html
Thanks
leejearl
On Thu, 2018-12-06 at 20:26 -0500, Matthew Knepley wrote:
> On Thu, Dec 6, 2018 at 8:06 PM leejearl wrote:
> > Hi Matt:
> > Thank you for your help. I want to
Hi, Matt: Thanks for your help. I will tell you whether the code of
"SNES ex12" can solve my problem.
leejearl

On Thu, 2018-12-06 at 20:26 -0500, Matthew Knepley wrote:
> On Thu, Dec 6, 2018 at 8:06 PM leejearl wrote:
> > Hi Matt:
> > Thank you for your help. I want to implement the
Hi Matt: Thank you for your help. I want to implement the
periodic boundary in DMPlex. For a cell or face, the donor cell or face
might not be distributed in the same local DM. Maybe I should ask
how I can implement the topology of the periodic boundary in
DMPlex. Thanks again
Thanks for the suggestion; I will try this option. I already used "-mat_superlu_dist_colperm NATURAL", which seems to help, but I did not test it thoroughly.
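For reference, the options discussed in this thread might be combined on a command line as follows. This is a hedged sketch: the binary name `./app`, the process count, and the KSP/PC options are placeholder assumptions, not taken from the thread.

```shell
# Sketch: solve with SuperLU_DIST as the LU factorization package,
# reusing the sparsity pattern and row permutation across factorizations
# and using the natural column ordering. './app' is a placeholder.
mpiexec -n 4 ./app \
  -ksp_type preonly -pc_type lu \
  -pc_factor_mat_solver_type superlu_dist \
  -mat_superlu_dist_fact SamePattern_SameRowPerm \
  -mat_superlu_dist_colperm NATURAL
```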
Subject: Re: [petsc-users] seg fault with superlu
On Thu, Dec 6, 2018 at 2:55 AM Matthew Knepley via petsc-users
I am already using superlu_dist 6.1, but I used "--download-superlu_dist=1 --download-superlu_dist-commit=001d994a22684fd60cab5530cc2192f19fc58e83" when configuring PETSc; is this ok?
For parallel computations on the CPU, we allocate our own vectors and then give
PETSc a pointer to our vectors using VecCreateMPIWithArray. For computations only
on the GPU, we allocate vectors on the GPU and we'd like to do the same thing
and give PETSc a pointer to our vector.
1. Does
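The CPU case described above can be sketched as follows. This is an illustration under assumptions: the array contents and local size are invented, and the error-checking style is the conventional ierr/CHKERRQ pattern, not code from the thread.

```c
#include <petscvec.h>

/* Sketch: wrap a user-owned host array in a PETSc Vec without copying.
 * The sizes and values are illustrative assumptions. */
int main(int argc, char **argv)
{
  Vec            v;
  PetscScalar    data[4] = {1.0, 2.0, 3.0, 4.0}; /* user-owned storage */
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;
  /* block size 1, 4 local entries, global size computed from local sizes */
  ierr = VecCreateMPIWithArray(PETSC_COMM_WORLD, 1, 4, PETSC_DETERMINE,
                               data, &v);CHKERRQ(ierr);
  ierr = VecView(v, PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr);
  ierr = VecDestroy(&v);CHKERRQ(ierr); /* does not free 'data' */
  ierr = PetscFinalize();
  return ierr;
}
```

Note that the Vec does not take ownership: the user array must outlive the Vec and be freed by the caller.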
I'm attaching the file vector.dat and the file PETSc saves (vec_old.dat)
with this email.
Thank You,
Sajid Ali
Applied Physics
Northwestern University
Hi, PETSc developers:
I have a problem and need your help.
There is a DMPlex object, and I have distributed it using the
routine DMPlexDistribute(). Then I want to get the global integer ids
in the label "Face Sets". The code is as follows:
ierr = DMGetLabelIdIS(dm, "Face Sets",
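One way the truncated call above might be completed is sketched below, reading the returned IS. This is a hedged illustration: the helper name and variable names are assumptions, and DMGetLabelIdIS returns the label values present on the calling process, so after DMPlexDistribute() a reduction across ranks would be needed to obtain a truly global set.

```c
#include <petscdmplex.h>

/* Sketch: print the ids stored in the "Face Sets" label on this rank.
 * PrintFaceSetIds is a hypothetical helper, not from the thread. */
PetscErrorCode PrintFaceSetIds(DM dm)
{
  IS              ids;
  PetscInt        n, i;
  const PetscInt *vals;
  PetscErrorCode  ierr;

  ierr = DMGetLabelIdIS(dm, "Face Sets", &ids);CHKERRQ(ierr);
  ierr = ISGetLocalSize(ids, &n);CHKERRQ(ierr);
  ierr = ISGetIndices(ids, &vals);CHKERRQ(ierr);
  for (i = 0; i < n; i++) {
    ierr = PetscPrintf(PETSC_COMM_SELF, "Face Sets id: %D\n", vals[i]);CHKERRQ(ierr);
  }
  ierr = ISRestoreIndices(ids, &vals);CHKERRQ(ierr);
  ierr = ISDestroy(&ids);CHKERRQ(ierr);
  return 0;
}
```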
Don't remember, but you can try the latest superlu_dist and see if
the problem persists. [it has a bunch of fixes]
To try the latest superlu_dist, you can use the branch
balay/update-superlu_dist_6.1.0/maint_no_merge
[this branch is off maint, so to try with master you can merge it with
master]
git
I was wrong about the decomposed case structure:
In the case of a multicore simulation, I get a decomposed case with local data
(matrix + rhs + x, neighbour processor (== rank) numbers) on each process/rank.
There are no halo regions (instead, the original application works with data
streams to