Dear All,
I have encountered a peculiar problem when fiddling with a code with PETSc
3.16.3 (which worked fine with PETSc 3.15). It is a very straightforward
PDE-based optimization code which repeatedly solves a linearized PDE problem
with KSP in a subroutine (the rest of the code does not con
Hi Samar,
Yes, with mpich, there is no such error. I will just use this configuration for
now.
Thanks,
Danyang
From: Samar Khatiwala
Date: Thursday, January 13, 2022 at 1:16 AM
To: Danyang Su
Cc: PETSc
Subject: Re: [petsc-users] PETSc configuration error on macOS Monterey with
On Thu, Jan 13, 2022 at 1:15 PM Nicolás Barnafi wrote:
> Dear all,
>
> I have created a first implementation. For now it must be called after
> setting the fields; eventually I would like to move it to the setup phase.
> The implementation seems clean, but it is giving me some memory errors
> (free(): corrupted unsorted chunks).
> You may find the code below. After
Try:
https://petsc.org/release/faq/#when-should-can-i-use-the-configure-option-with-64-bit-indices
Also best to use the current release 3.16
Satish
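[For reference, the 64-bit-indices build that the FAQ above describes is selected when configuring PETSc. A minimal sketch, assuming a typical from-source build; only --with-64-bit-indices comes from this thread, the other option is illustrative:

```shell
# Reconfigure PETSc with 64-bit integer indices so index arithmetic
# on large problems does not overflow 32-bit ints.
./configure --with-64-bit-indices \
            --with-debugging=0
make all
```

Note that a library built with 64-bit indices is not binary-compatible with code compiled against a 32-bit-index build, so dependent codes must be recompiled.]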
On Thu, 13 Jan 2022, 佟莹 wrote:
> Dear PETSc developers:
> Recently the following problem appeared in my code:
>
> [0]PETSC ERROR: --------------------- Error Message --------------------------------
> [0]PETSC ERROR: Overflow in integer operation:
> http://www.mcs.anl.gov/petsc/documentation/faq.html#64-bit
On Wed, Jan 12, 2022 at 7:55 PM Ferrand, Jesus A. wrote:
> Dear PETSc Team:
>
> Hi! I'm working on a parallel version of a PETSc script that I wrote in
> serial using DMPlex. After calling DMPlexDistribute() each rank is assigned
> its own DAG where the points are numbered locally. For example, I
Hi Danyang,
Just to reiterate, the presence of -Wl,-flat_namespace *is* the problem. I got
rid of it by configuring MPICH with --enable-two-level-namespace. I reported
this problem to the PETSc
folks a few weeks ago and they were going to patch MPICH.py (under
config/BuildSystem/config/package
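[For anyone hitting the same flat-namespace failure on macOS, the MPICH rebuild described above would look roughly like the following sketch. Only --enable-two-level-namespace comes from this thread; the install prefix and the PETSc --with-mpi-dir hookup are illustrative:

```shell
# Build MPICH with two-level namespace support so its compiler wrappers
# stop injecting -Wl,-flat_namespace on macOS.
./configure --prefix=$HOME/mpich-install --enable-two-level-namespace
make && make install

# Then point PETSc's configure at this MPICH installation:
# ./configure --with-mpi-dir=$HOME/mpich-install ...
```

]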