Re: [petsc-users] Domain decomposition using DMPLEX

2019-11-26 Thread Matthew Knepley
On Tue, Nov 26, 2019 at 12:24 PM Danyang Su wrote: On 2019-11-26 10:18 a.m., Matthew Knepley wrote: On Tue, Nov 26, 2019 at 11:43 AM Danyang Su wrote: On 2019-11-25 7:54 p.m., Matthew Knepley wrote: On Mon, Nov 25, 2019 at 6:25 PM Swarnava Ghosh wrote: Dear PETSc

Re: [petsc-users] Domain decomposition using DMPLEX

2019-11-26 Thread Danyang Su
On 2019-11-26 10:18 a.m., Matthew Knepley wrote: On Tue, Nov 26, 2019 at 11:43 AM Danyang Su wrote: On 2019-11-25 7:54 p.m., Matthew Knepley wrote: On Mon, Nov 25, 2019 at 6:25 PM Swarnava Ghosh <swarnav...@gmail.com> wrote: Dear PETSc

Re: [petsc-users] Domain decomposition using DMPLEX

2019-11-26 Thread Matthew Knepley
On Tue, Nov 26, 2019 at 11:43 AM Danyang Su wrote: On 2019-11-25 7:54 p.m., Matthew Knepley wrote: On Mon, Nov 25, 2019 at 6:25 PM Swarnava Ghosh wrote: Dear PETSc users and developers, I am working with dmplex to distribute a 3D unstructured mesh made of tetrahedrons in a

Re: [petsc-users] Domain decomposition using DMPLEX

2019-11-26 Thread Danyang Su
On 2019-11-25 7:54 p.m., Matthew Knepley wrote: On Mon, Nov 25, 2019 at 6:25 PM Swarnava Ghosh wrote: Dear PETSc users and developers, I am working with dmplex to distribute a 3D unstructured mesh made of tetrahedrons in a cuboidal domain. I had a few
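As a sketch of the API under discussion, here is a minimal DMPlex distribution driver in 2019-era PETSc style. The file name "mesh.msh", the zero overlap, and the -dm_view option are illustrative assumptions, not from the thread; note that DMPlexCreateFromFile gained an extra plex-name argument in later PETSc releases.

  #include <petscdmplex.h>

  int main(int argc, char **argv)
  {
    DM             dm, dmDist = NULL;
    PetscErrorCode ierr;

    ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;
    /* "mesh.msh" is a placeholder; interpolate = PETSC_TRUE also creates
       faces and edges, which most DMPlex operations expect */
    ierr = DMPlexCreateFromFile(PETSC_COMM_WORLD, "mesh.msh", PETSC_TRUE, &dm);CHKERRQ(ierr);
    /* Partition and distribute the mesh; overlap = 0 requests no ghost cells,
       and passing NULL discards the migration PetscSF */
    ierr = DMPlexDistribute(dm, 0, NULL, &dmDist);CHKERRQ(ierr);
    if (dmDist) { /* DMPlexDistribute returns NULL on a single process */
      ierr = DMDestroy(&dm);CHKERRQ(ierr);
      dm   = dmDist;
    }
    ierr = DMViewFromOptions(dm, NULL, "-dm_view");CHKERRQ(ierr);
    ierr = DMDestroy(&dm);CHKERRQ(ierr);
    ierr = PetscFinalize();
    return ierr;
  }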

Re: [petsc-users] Memory optimization

2019-11-26 Thread Smith, Barry F.
> I am basically trying to solve a finite element problem, which is why in 3D I have 7 non-zero diagonals that are quite far apart from one another. In 2D I only have 5 non-zero diagonals that are less far apart. So is it normal that the set-up time is around 400 times greater in the 3D
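One usual suspect in set-up-time threads like this is missing preallocation, which makes MatSetValues repeatedly reallocate. A minimal sketch for the 7-nonzeros-per-row structure described above (the helper name, the size N, and the off-process bound of 6 are illustrative placeholders, not from the thread):

  #include <petscmat.h>

  /* Create and preallocate an AIJ matrix for a 7-point 3D stencil so that
     assembly does not trigger repeated mallocs. */
  static PetscErrorCode CreateStencilMatrix(MPI_Comm comm, PetscInt N, Mat *A)
  {
    PetscErrorCode ierr;

    ierr = MatCreate(comm, A);CHKERRQ(ierr);
    ierr = MatSetSizes(*A, PETSC_DECIDE, PETSC_DECIDE, N, N);CHKERRQ(ierr);
    ierr = MatSetType(*A, MATAIJ);CHKERRQ(ierr);
    /* At most 7 nonzeros per local row; the MPI variant also bounds the
       off-process block (6 here, generously, for rows cut by the partition).
       Each call is a no-op if *A turns out to be the other type. */
    ierr = MatSeqAIJSetPreallocation(*A, 7, NULL);CHKERRQ(ierr);
    ierr = MatMPIAIJSetPreallocation(*A, 7, NULL, 6, NULL);CHKERRQ(ierr);
    return 0;
  }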

Re: [petsc-users] Memory optimization

2019-11-26 Thread Perceval Desforges
Hello, This is the output of -log_view. I selected what I thought were the important parts. I don't know if this is the best format to send the logs. If a text file is better let me know. Thanks again, -- PETSc Performance Summary:
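For what it's worth, the -log_view summary is printed during PetscFinalize, and (if memory serves) running with -log_view :log.txt redirects it to a file, which tends to be easier to attach to a mail than pasted terminal output.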

Re: [petsc-users] Petsc Matrix modifications

2019-11-26 Thread Matthew Knepley
On Tue, Nov 26, 2019 at 8:04 AM Brandon Denton wrote: Good Morning, Is it possible to expand a matrix in PETSc? I have created and loaded a matrix (6 x 5) which holds information required later in my program. I would like to store additional information in the matrix by expanding

[petsc-users] Petsc Matrix modifications

2019-11-26 Thread Brandon Denton
Good Morning, Is it possible to expand a matrix in PETSc? I have created and loaded a matrix (6 x 5) which holds information required later in my program. I would like to store additional information in the matrix by expanding its size, let's say making it a 10 x 5 matrix. How is this
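A PETSc Mat cannot change size after creation, so the usual route is to create a larger matrix and copy the old entries across. A minimal sequential sketch of that approach (ExpandMatrix is a hypothetical helper name, and the dense type is an assumption based on the matrix holding general data rather than a sparse operator):

  #include <petscmat.h>

  /* Copy an assembled m x n matrix into a new, taller Mnew x n dense matrix;
     the extra rows can then be filled with MatSetValues before reassembly. */
  static PetscErrorCode ExpandMatrix(Mat Aold, PetscInt Mnew, Mat *Anew)
  {
    PetscInt       m, n, i, j, *rows, *cols;
    PetscScalar    *vals;
    PetscErrorCode ierr;

    ierr = MatGetSize(Aold, &m, &n);CHKERRQ(ierr);
    ierr = MatCreateDense(PETSC_COMM_SELF, Mnew, n, Mnew, n, NULL, Anew);CHKERRQ(ierr);
    /* Gather every entry of the old matrix in one MatGetValues call */
    ierr = PetscMalloc3(m, &rows, n, &cols, m*n, &vals);CHKERRQ(ierr);
    for (i = 0; i < m; i++) rows[i] = i;
    for (j = 0; j < n; j++) cols[j] = j;
    ierr = MatGetValues(Aold, m, rows, n, cols, vals);CHKERRQ(ierr);
    ierr = MatSetValues(*Anew, m, rows, n, cols, vals, INSERT_VALUES);CHKERRQ(ierr);
    ierr = PetscFree3(rows, cols, vals);CHKERRQ(ierr);
    ierr = MatAssemblyBegin(*Anew, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
    ierr = MatAssemblyEnd(*Anew, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
    return 0;
  }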

Re: [petsc-users] petsc without MPI

2019-11-26 Thread Balay, Satish
Generally - even when one wants a sequential build - it's best to use MPICH [or Open MPI] when using multiple MPI-based packages. [This is to avoid conflicts - if any - in the sequential MPI stubs of these packages.] And run the code sequentially. Satish On Tue, 26 Nov 2019, Smith, Barry F. wrote:
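For reference, both routes exist in PETSc's configure: --download-mpich builds an MPICH that PETSc and all downloaded packages share, while --with-mpi=0 produces a build with no MPI at all. The advice above prefers the former whenever several MPI-based packages are in play, with the code then simply run on one process.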