On Fri, May 31, 2019 at 3:48 PM Sanjay Govindjee via petsc-users
<petsc-users@mcs.anl.gov> wrote:
Thanks Stefano.
Reading the manual pages a bit more carefully,
I think I can see what I should be doing, which should roughly be to:
1. Set up target Seq vectors on PETSC_COMM_SELF
2. Use ISCreateGeneral to create ISs for the target Vecs and the source
Vec which will be MPI on
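Not speaking for the thread, but a minimal C sketch of that outline (GatherToSeq, nloc, and idx_from are made-up names; I use ISCreateStride for the trivial 0..nloc-1 target indexing, though ISCreateGeneral with an explicit index array works equally well):

#include <petscvec.h>

/* Sketch: gather selected global entries of the parallel Vec "sol" into a
   sequential Vec on this rank. nloc and idx_from[] (the global indices this
   rank wants) are illustrative inputs, not names from the thread. */
PetscErrorCode GatherToSeq(Vec sol, PetscInt nloc, const PetscInt idx_from[], Vec *seqout)
{
  Vec            seq;
  IS             is_from, is_to;
  VecScatter     scat;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  /* 1. target Seq vector on PETSC_COMM_SELF */
  ierr = VecCreateSeq(PETSC_COMM_SELF, nloc, &seq);CHKERRQ(ierr);
  /* 2. ISs: global indices to read from sol, local positions 0..nloc-1 to fill */
  ierr = ISCreateGeneral(PETSC_COMM_SELF, nloc, idx_from, PETSC_COPY_VALUES, &is_from);CHKERRQ(ierr);
  ierr = ISCreateStride(PETSC_COMM_SELF, nloc, 0, 1, &is_to);CHKERRQ(ierr);
  /* 3. build and run the scatter */
  ierr = VecScatterCreate(sol, is_from, seq, is_to, &scat);CHKERRQ(ierr);
  ierr = VecScatterBegin(scat, sol, seq, INSERT_VALUES, SCATTER_FORWARD);CHKERRQ(ierr);
  ierr = VecScatterEnd(scat, sol, seq, INSERT_VALUES, SCATTER_FORWARD);CHKERRQ(ierr);
  ierr = VecScatterDestroy(&scat);CHKERRQ(ierr);
  ierr = ISDestroy(&is_from);CHKERRQ(ierr);
  ierr = ISDestroy(&is_to);CHKERRQ(ierr);
  *seqout = seq;
  PetscFunctionReturn(0);
}

For the special cases of gathering the whole Vec to every rank or to rank 0, VecScatterCreateToAll and VecScatterCreateToZero are ready-made shortcuts.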
Yes, the issue is running out of memory on long runs.
Perhaps some clean-up happens later when the memory pressure builds, but that
is a bit non-ideal.
-sanjay
On 5/31/19 12:53 PM, Zhang, Junchao wrote:
Sanjay,
I tried PETSc with MPICH and OpenMPI on my MacBook. I inserted
PetscMemoryGetCurrentUsage/PetscMallocGetCurrentUsage at the beginning and end
of KSPSolve and then computed the delta and summed over processes. Then I
tested with
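For reference, a minimal sketch of that measurement (variable names and the exact reduction are mine; PetscLogDouble is a plain double, so MPI_DOUBLE is used in the Allreduce):

/* Sketch: bracket a KSPSolve with memory queries, take the per-rank delta,
   and sum it over all ranks. ksp, b, x are assumed to exist already. */
PetscLogDouble rss0, rss1, mal0, mal1;
double         delta[2], deltasum[2];
PetscErrorCode ierr;

ierr = PetscMemoryGetCurrentUsage(&rss0);CHKERRQ(ierr);   /* resident set size   */
ierr = PetscMallocGetCurrentUsage(&mal0);CHKERRQ(ierr);   /* PetscMalloc'd bytes */
ierr = KSPSolve(ksp, b, x);CHKERRQ(ierr);
ierr = PetscMemoryGetCurrentUsage(&rss1);CHKERRQ(ierr);
ierr = PetscMallocGetCurrentUsage(&mal1);CHKERRQ(ierr);
delta[0] = (double)(rss1 - rss0);
delta[1] = (double)(mal1 - mal0);
ierr = MPI_Allreduce(delta, deltasum, 2, MPI_DOUBLE, MPI_SUM, PETSC_COMM_WORLD);CHKERRQ(ierr);
ierr = PetscPrintf(PETSC_COMM_WORLD, "KSPSolve delta: RSS %g bytes, malloc %g bytes\n",
                   deltasum[0], deltasum[1]);CHKERRQ(ierr);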
Hi Satish,
I have added these configure options
--batch=1 --known-64-bit-blas-indices=0 --known-mpi-shared-libraries=0
It is still hanging.
Best,
Xiao
From: Balay, Satish
Sent: Friday, May 31, 2019 12:35
To: Ma, Xiao
Cc: petsc-users@mcs.anl.gov
Subject: Re:
Matt,
Here is the process as it currently stands:
1) I have a PETSc Vec (sol), which comes from a KSPSolve
2) Each processor grabs its section of sol via VecGetOwnershipRange and
VecGetArrayReadF90
and inserts parts of its section of sol into a local array (locarr) using
a complex but easily
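For concreteness, a rough C sketch of steps 1)-2) (the actual code uses the F90 interface, VecGetArrayReadF90, and its own mapping into locarr, which is not shown in the thread, so the straight copy below is only a placeholder):

PetscErrorCode     ierr;
PetscInt           rstart, rend, i, nlocal;
const PetscScalar *a;
PetscScalar       *locarr;      /* placeholder for the application's local array */

ierr = VecGetOwnershipRange(sol, &rstart, &rend);CHKERRQ(ierr);  /* global rows owned by this rank */
nlocal = rend - rstart;
ierr = PetscMalloc1(nlocal, &locarr);CHKERRQ(ierr);
ierr = VecGetArrayRead(sol, &a);CHKERRQ(ierr);        /* VecGetArrayReadF90 in Fortran */
for (i = 0; i < nlocal; i++) locarr[i] = a[i];        /* a[] is indexed by local position */
ierr = VecRestoreArrayRead(sol, &a);CHKERRQ(ierr);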
PETSc configure is attempting to run some MPI binaries - and that is hanging
with this MPI.
You can retry with the options:
--batch=1 --known-64-bit-blas-indices=0 --known-mpi-shared-libraries=0
[and follow instructions provided by configure]
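For example, something along these lines (your existing configure options are only a placeholder here):

./configure <your existing options> --batch=1 --known-64-bit-blas-indices=0 --known-mpi-shared-libraries=0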
Satish
On Fri, 31 May 2019, Ma, Xiao via
Hi,
I am trying to install Pylith, which is an earthquake simulator using the PETSc
library. I am building it on the PSC Bridges cluster, and during the PETSc build
steps the configure stage hangs at
TESTING: configureMPITypes from
config.packages.MPI(config/BuildSystem/config/packages/MPI.py:247)