PETSc is not necessarily faster than scipy for your problem when executed in
serial, but you do get benefits when running in parallel. Also note that the
PETSc code you wrote uses float64 while your scipy code uses complex128, so
the comparison may not be fair.
In addition, using the RHS Jacobian does not necess
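To make the dtype point above concrete, here is a minimal standalone sketch (not from the original codes; the matrix is a generic 1-D Laplacian stand-in) showing the same scipy sparse solve in float64 versus complex128 -- complex arithmetic stores and multiplies twice the data per entry, so mixing dtypes across a PETSc-vs-scipy comparison skews the timing:

```python
# Sketch: time one sparse direct solve in float64 vs complex128 with scipy.
# The tridiagonal Laplacian here is a stand-in, not the poster's system.
import time
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

n = 20000
diagonals = [np.full(n, 2.0), np.full(n - 1, -1.0), np.full(n - 1, -1.0)]
A_real = sp.diags(diagonals, [0, -1, 1], format="csr")
b_real = np.ones(n)

# The identical system, promoted to complex128.
A_cplx = A_real.astype(np.complex128)
b_cplx = b_real.astype(np.complex128)

t0 = time.perf_counter()
x_real = spla.spsolve(A_real, b_real)
t_real = time.perf_counter() - t0

t0 = time.perf_counter()
x_cplx = spla.spsolve(A_cplx, b_cplx)
t_cplx = time.perf_counter() - t0

print(f"float64:    {t_real:.4f} s")
print(f"complex128: {t_cplx:.4f} s")
```

The solutions agree; only the scalar type (and hence memory traffic and flop cost) differs, which is why a fair benchmark should fix the scalar type on both sides.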
On Tue, Aug 15, 2023 at 2:06 AM Martin Diehl wrote:
> Dear PETSc team,
>
> my simulation crashes after updating from 3.18.5 to 3.19.4.
>
> The error message is attached, and so is the main code. The mesh (variable
> named geomMesh) is read with DMPlexCreateFromFile in a different part
> of the code.
I don't see a problem in the matrix assembly.
If you point me to your repo and show me how to build it, I can try to
reproduce.
--Junchao Zhang
On Mon, Aug 14, 2023 at 2:53 PM Vanella, Marcos (Fed) <marcos.vane...@nist.gov> wrote:
> Hi Junchao, I've tried for my case using the -ksp_type gmres
On Mon, Aug 14, 2023 at 11:03 AM Stephan Kramer wrote:
> Many thanks for looking into this, Mark
> > My 3D tests were not that different and I see you lowered the threshold.
> > Note, you can set the threshold to zero, but your test is running so
> > differently from mine that there is something else going on.
Yeah, it looks like ex60 was run correctly.
Double check your code again and if you still run into errors, we can try
to reproduce on our end.
Thanks.
--Junchao Zhang
On Mon, Aug 14, 2023 at 1:05 PM Vanella, Marcos (Fed) <marcos.vane...@nist.gov> wrote:
> Hi Junchao, I compiled and run ex60 th
Dear PETSc team,
my simulation crashes after updating from 3.18.5 to 3.19.4.
The error message is attached, and so is the main code. The mesh (variable
named geomMesh) is read with DMPlexCreateFromFile in a different part
of the code.
I did not start serious debugging yet in the hope that you can p
Many thanks for looking into this, Mark
My 3D tests were not that different and I see you lowered the threshold.
Note, you can set the threshold to zero, but your test is running so
differently from mine that there is something else going on.
Note, the new, bad, coarsening rate of 30:1 is what we
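For context on the threshold discussion above: PETSc's smoothed-aggregation AMG exposes the coarsening drop tolerance as the runtime option -pc_gamg_threshold. A hypothetical invocation setting it to zero (the application name ./my_app is a placeholder, not from this thread) might look like:

```shell
# Hypothetical PETSc run with GAMG and the coarsening threshold set to zero.
# ./my_app is a placeholder executable; the options are standard PETSc ones.
./my_app -ksp_type cg -pc_type gamg \
    -pc_gamg_threshold 0.0 \
    -ksp_monitor -ksp_converged_reason
```

Adding -ksp_view to such a run prints the per-level sizes, which is the easiest way to inspect the coarsening rate being discussed.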
Got it, thanks Pierre & Jose.
On Mon, Aug 14, 2023 at 12:50 PM Jose E. Roman wrote:
> See for instance ex3.c and ex9.c
> https://slepc.upv.es/documentation/current/src/eps/tutorials/index.html
>
> Jose
>
> > On 14 Aug 2023, at 10:45, Pierre Jolivet wrote:
> >
> >> On 14 Aug 20
See for instance ex3.c and ex9.c
https://slepc.upv.es/documentation/current/src/eps/tutorials/index.html
Jose
> On 14 Aug 2023, at 10:45, Pierre Jolivet wrote:
>
>> On 14 Aug 2023, at 10:39 AM, maitri ksh wrote:
>>
>> Hi,
>> I need to solve an eigenvalue problem Ax = lambda*x
> On 14 Aug 2023, at 10:39 AM, maitri ksh wrote:
>
> Hi,
> I need to solve an eigenvalue problem Ax = lambda*x, where A = (B^-H)*Q*B^-1
> is a Hermitian matrix and 'B^-H' denotes the Hermitian transpose of the inverse
> of the matrix B. Theoretically it would take around 1.8TB to explicitly compute the
Hi,
I need to solve an eigenvalue problem Ax = lambda*x, where A = (B^-H)*Q*B^-1
is a Hermitian matrix, and 'B^-H' denotes the Hermitian transpose of the inverse
of the matrix B. Theoretically it would take around 1.8TB to explicitly compute
the matrix B^-1. A feasible way to solve this eigenvalue problem would
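The SLEPc examples Jose points to (ex3.c, ex9.c) use shell matrices for exactly this situation. As a language-agnostic illustration of the same idea, here is a standalone scipy sketch (B and Q below are small random stand-ins, not the poster's 1.8TB operators) that applies A = B^{-H} Q B^{-1} matrix-free, reusing one LU factorization of B so that B^{-1} is never formed:

```python
# Sketch of the shell-matrix idea behind SLEPc's ex3.c/ex9.c, in scipy:
# apply A = B^{-H} Q B^{-1} through solves with B instead of forming B^{-1}.
# B and Q are small random stand-ins, not the poster's actual operators.
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

rng = np.random.default_rng(0)
n = 200

# Stand-ins: B is a well-conditioned sparse complex matrix, Q is Hermitian.
B = (sp.eye(n, dtype=np.complex128)
     + 0.01 * sp.random(n, n, density=0.05, random_state=1)).tocsc()
M = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
Q = (M + M.conj().T) / 2

lu = spla.splu(B)  # one LU factorization of B, reused for every matvec

def matvec(x):
    y = lu.solve(x)                # y = B^{-1} x
    z = Q @ y                      # z = Q y
    return lu.solve(z, trans="H")  # solves B^H w = z, i.e. w = B^{-H} z

A = spla.LinearOperator((n, n), matvec=matvec, dtype=np.complex128)

# Largest-magnitude eigenvalues of the (Hermitian) implicit operator A.
vals = spla.eigsh(A, k=4, which="LM", return_eigenvectors=False)
print(np.sort(vals))
```

In SLEPc the same structure is a MATSHELL whose MATOP_MULT callback does the two triangular solves and the multiply by Q; the eigensolver only ever sees the matvec.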