On 2015-11-30 11:18, Lawrence Mitchell wrote:
The block size of the submatrix comes from the block size that lives
on the IS used to define it. So set a block size on the IS you make
(ISSetBlockSize).
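A minimal sketch of the suggested fix, assuming PETSc 3.5-era API as used in this thread (the function name ExtractBlockedSubmatrix and the parameters first/n_local are illustrative, not from the original messages):

```c
/* Hedged sketch: preserve block size when extracting a submatrix.
 * The submatrix inherits its block size from the IS, not from A,
 * so the IS must carry the block size explicitly. */
#include <petscmat.h>

PetscErrorCode ExtractBlockedSubmatrix(Mat A, PetscInt first, PetscInt n_local, Mat *A00)
{
  IS             is;
  PetscErrorCode ierr;

  /* n_local contiguous row indices starting at 'first' (stride 1) */
  ierr = ISCreateStride(PETSC_COMM_WORLD, n_local, first, 1, &is);CHKERRQ(ierr);
  /* Mark the IS as respecting blocks of size 3, matching A's block size */
  ierr = ISSetBlockSize(is, 3);CHKERRQ(ierr);
  /* MatGetSubMatrix is the PETSc 3.5 name (later renamed MatCreateSubMatrix) */
  ierr = MatGetSubMatrix(A, is, is, MAT_INITIAL_MATRIX, A00);CHKERRQ(ierr);
  ierr = ISDestroy(&is);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}
```

n_local must be a multiple of 3 and 'first' block-aligned for the block size to be consistent.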
Great! It works! :)
Thanks!
Eric
Cheers,
Lawrence
Hi Matt
I don’t think the problem is within PETSc - rather somewhere in my code. When I
dump the DMPlex using DMView (ascii_info_detail) the ghost mapping seems to be
set up correctly.
Is there a better way to determine if a local point is a ghost point?
The way I iterate the DMPlex is like
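One way to answer the ghost-point question, sketched under the assumption that "ghost" here means a leaf point in the DMPlex point SF (the helper name PointIsGhost is illustrative):

```c
/* Hedged sketch: test whether a local DMPlex point is a ghost.
 * Points that appear as leaves in the DM's point SF are owned
 * elsewhere, i.e. ghosts on this rank. Error checking abbreviated. */
#include <petscdmplex.h>

PetscErrorCode PointIsGhost(DM dm, PetscInt point, PetscBool *isGhost)
{
  PetscSF         sf;
  PetscInt        nleaves, l;
  const PetscInt *ilocal;
  PetscErrorCode  ierr;

  ierr = DMGetPointSF(dm, &sf);CHKERRQ(ierr);
  ierr = PetscSFGetGraph(sf, NULL, &nleaves, &ilocal, NULL);CHKERRQ(ierr);
  *isGhost = PETSC_FALSE;
  for (l = 0; l < nleaves; ++l) {
    /* ilocal may be NULL, meaning the leaves are 0..nleaves-1 */
    if ((ilocal ? ilocal[l] : l) == point) { *isGhost = PETSC_TRUE; break; }
  }
  PetscFunctionReturn(0);
}
```

For repeated queries one would sort or hash the leaf list rather than scan it linearly per point.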
On 30/11/15 16:14, Eric Chamberland wrote:
> Hi,
>
> Using PETSc 3.5.3.
>
> We have a "A" matrix, mpi_aij with block_size=3.
>
> We create an IS with ISCreateStride, then extract A_00 with
> MatGetSubMatrix(..., MAT_INITIAL_MATRIX,...).
>
> We know that A_00 is block_size = 3 and mpi_aij,
Hi,
Using PETSc 3.5.3.
We have a "A" matrix, mpi_aij with block_size=3.
We create an IS with ISCreateStride, then extract A_00 with
MatGetSubMatrix(..., MAT_INITIAL_MATRIX,...).
We know that A_00 is block_size = 3 and mpi_aij; however, the matrix
created by PETSc doesn't have the
Is there an option for outputting the Newton step after my linear solve?
Alex
Thanks for the reply.
The error message shows
[0]PETSC ERROR: Invalid argument
[0]PETSC ERROR: Scalar value must be same on all processes, argument # 3
[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for
trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.6.1, Jul,
Andrey Ovsyannikov writes:
> Thanks for your quick response. I like the Massif tool and I have been using it
> recently. However, I was not able to run Valgrind for large jobs. I am
> interested in memory analysis of large scale runs with more than 1000 MPI
> ranks.
PETSc reporting of memory usage for objects is unfortunately not that great;
for example, it is not always clear which allocations are temporary work space
and which are kept for the life of the object. Associating memory
with particular objects requires the PETSc source code to
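For the per-rank numbers discussed in this thread, a small sketch of the local memory queries (assuming PETSc; the function name ReportMemory is illustrative):

```c
/* Hedged sketch: per-rank memory queries. Each call is purely local,
 * so this scales to runs with many MPI ranks. */
#include <petscsys.h>

PetscErrorCode ReportMemory(MPI_Comm comm)
{
  PetscLogDouble rss, mal;
  PetscMPIInt    rank;
  PetscErrorCode ierr;

  ierr = MPI_Comm_rank(comm, &rank);CHKERRQ(ierr);
  ierr = PetscMemoryGetCurrentUsage(&rss);CHKERRQ(ierr);  /* resident set size of the process */
  ierr = PetscMallocGetCurrentUsage(&mal);CHKERRQ(ierr);  /* bytes currently from PetscMalloc */
  ierr = PetscSynchronizedPrintf(comm, "[%d] rss %g bytes, PetscMalloc %g bytes\n",
                                 rank, rss, mal);CHKERRQ(ierr);
  ierr = PetscSynchronizedFlush(comm, PETSC_STDOUT);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}
```

PetscMemoryGetCurrentUsage reports the whole process (including non-PETSc allocations), while PetscMallocGetCurrentUsage covers only memory obtained through PetscMalloc.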
> On Nov 30, 2015, at 2:19 PM, Alex Lindsay wrote:
>
> Is there an option for outputting the Newton step after my linear solve?
>
> Alex
If you want the solution of the linear system before the line search (the line
search may shrink the vector), use -ksp_view_solution or
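A hedged example of how such monitoring options might be passed on the command line (./app is a placeholder for the user's SNES application; -snes_monitor is a standard PETSc monitoring option added here for context):

```shell
# View each linear solve's solution vector alongside the Newton convergence history
./app -snes_monitor -ksp_view_solution
```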
Dear PETSc team,
I am working on optimization of Chombo-Crunch CFD code for next-generation
supercomputer architectures at NERSC (Berkeley Lab) and we use PETSc AMG
solver. During a memory analysis study I had difficulty getting memory
usage data from PETSc for all MPI ranks. I am looking
Hi Matt,
Thanks for your quick response. I like the Massif tool and I have been using it
recently. However, I was not able to run Valgrind for large jobs. I am
interested in memory analysis of large scale runs with more than 1000 MPI
ranks. PetscMemoryGetCurrentUsage() works fine for this purpose
Andrey,
Maybe this is what you tried, but did you try running only a handful of MPI
ranks (out of your 1000) with Massif? I've had success doing things that
way. You won't know what every rank is doing, but you may be able to get a
good idea from your sample.
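A sketch of the sampling approach Richard describes, assuming Open MPI (the rank environment variable OMPI_COMM_WORLD_RANK is implementation-specific; ./app, the rank cutoff, and the launcher line are placeholders):

```shell
# Wrapper that runs Massif only on the first few ranks; all others run normally.
cat > massif_wrap.sh <<'EOF'
#!/bin/sh
RANK=${OMPI_COMM_WORLD_RANK:-0}
if [ "$RANK" -lt 4 ]; then
  # Sampled ranks run under Massif, one output file per rank
  exec valgrind --tool=massif --massif-out-file=massif.out.$RANK ./app "$@"
else
  exec ./app "$@"
fi
EOF
chmod +x massif_wrap.sh
mpirun -np 1000 ./massif_wrap.sh
```

With MPICH-based launchers the variable is typically PMI_RANK instead; check your MPI's documentation.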
--Richard
On Mon, Nov 30, 2015 at
On Mon, Nov 30, 2015 at 5:20 PM, Andrey Ovsyannikov
wrote:
> Dear PETSc team,
>
> I am working on optimization of Chombo-Crunch CFD code for next-generation
> supercomputer architectures at NERSC (Berkeley Lab) and we use PETSc AMG
> solver. During memory analysis study I
Hi all,
Is weighted Jacobi available as a preconditioner? I can't find it in the
list of preconditioners. If not, what is the rationale behind this choice?
It is pretty straightforward to code, so if it is not available I can
implement it myself without problem, I guess, but I am just wondering. In the
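One hedged observation on this question: PETSc has no PC named "weighted Jacobi", but Richardson iteration with a Jacobi preconditioner and a damping scale performs the same update, x_{k+1} = x_k + omega * D^{-1} (b - A x_k). On the command line (./app and the scale value are placeholders):

```shell
# Damped/weighted Jacobi as Richardson + Jacobi PC with scale omega = 0.66
./app -ksp_type richardson -ksp_richardson_scale 0.66 -pc_type jacobi
```

This is likely why a separate preconditioner is not provided: the composition of existing pieces already covers it.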
It is a PETSc error. And I just wanted to know if it runs without an error
on your machine.
On Nov 30, 2015 4:34 AM, "Jose E. Roman" wrote:
>
> I am not going to run your code. We are not a free debugging service. You
> have to debug the code yourself, and let us know only if the
On Mon, Nov 30, 2015 at 7:59 AM, Soumya Mukherjee wrote:
> It is a PETSc error. And I just wanted to know if it runs without an error
> on your machine.
>
This is not a PETSc error, as such. PETSc installs a signal handler so that
we can try to get more information
I have a very simple unstructured mesh composed of two triangles (four
vertices) with one shared edge using a DMPlex:
/|\
/ | \
\ | /
\|/
After distributing this mesh to two processes, each process owns a triangle.
However one process owns three vertices, while the last vertex is owned by the
On Mon, Nov 30, 2015 at 7:01 AM, Morten Nobel-Jørgensen
wrote:
> I have a very simple unstructured mesh composed of two triangles (four
> vertices) with one shared edge using a DMPlex:
>
> /|\
> / | \
> \ | /
> \|/
>
> After distributing this mesh to two processes, each