Re: [petsc-users] 3D Poisson solver failed in KSPSolve when number of cores is larger than one

2023-07-24 Thread Thuc Bui
Thank you so much, Matt, for getting back to me so quickly. Yes, using another PC fixes the issue. Many thanks again for your help, Thuc
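The thread does not say which preconditioner replaced the failing one, but the fix described here (swapping the PC) needs no code change in PETSc: the preconditioner can be selected from the options database at run time. A hedged sketch, with `./poisson3d` standing in for the user's executable and `gamg` as one parallel-capable choice among many:

```shell
# Run the solver on 4 ranks with CG and an algebraic-multigrid PC,
# monitoring the residual at each iteration:
mpiexec -n 4 ./poisson3d -ksp_type cg -pc_type gamg -ksp_monitor
```

This works for any code that calls KSPSetFromOptions(); other PC types (e.g. `-pc_type hypre` if PETSc was built with hypre) can be tried the same way.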

Re: [petsc-users] 3D Poisson solver failed in KSPSolve when number of cores is larger than one

2023-07-24 Thread Matthew Knepley
On Mon, Jul 24, 2023 at 8:16 PM Thuc Bui wrote: > Dear PETSc Users/Developers, > I have been successfully using PETSc on Windows without MPI for a while now. I have now attempted to implement PETSc with MPI on Windows 10. I have built a release version of PETSc 3.18.6 with MS MPI

[petsc-users] 3D Poisson solver failed in KSPSolve when number of cores is larger than one

2023-07-24 Thread Thuc Bui
Dear PETSc Users/Developers, I have been successfully using PETSc on Windows without MPI for a while now. I have now attempted to implement PETSc with MPI on Windows 10. I have built a release version of PETSc 3.18.6 with MS MPI 10.1.2, Intel MKL 3.279 (2020) and Visual Studio 2019 as a static

Re: [petsc-users] support for mixed block size matrices/AIM in PETSc?

2023-07-24 Thread Mark Adams
I interpreted Daniel's question, like Satish did, as referring to small block sizes for multiple dofs per point/vertex/cell in the mesh, but please clarify. With that assumption: as you know, PETSc does not support explicit variable block sizes, as ML does for instance, but the conventional wisdom has

Re: [petsc-users] (no subject)

2023-07-24 Thread Barry Smith
Perhaps you need > make PETSC_DIR=~/asd/petsc-3.19.3 PETSC_ARCH=arch-mswin-c-opt all > On Jul 24, 2023, at 1:11 PM, Константин via petsc-users > wrote: > > Good evening. After configuring PETSc I had to write this command on cygwin64. > $ make PETSC_DIR=/home/itugr/asd/petsc-3.19.3
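Barry's suggestion amounts to pointing make at the PETSC_DIR that configure actually recorded (the `~/...` form rather than the expanded `/home/itugr/...` path). A sketch of the suggested invocation, with paths taken from the thread:

```shell
# From the PETSc source tree, using the same PETSC_DIR spelling
# that configure was given:
cd ~/asd/petsc-3.19.3
make PETSC_DIR=~/asd/petsc-3.19.3 PETSC_ARCH=arch-mswin-c-opt all
```

If `rules.utils` is still reported missing, that file is generated by configure, so re-running configure to completion (and checking configure.log for errors) is the usual next step.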

[petsc-users] (no subject)

2023-07-24 Thread Константин via petsc-users
Good evening. After configuring PETSc I had to run this command on cygwin64: $ make PETSC_DIR=/home/itugr/asd/petsc-3.19.3 PETSC_ARCH=arch-mswin-c-opt all But I get this error: makefile:26: /home/itugr/asd/petsc-3.19.3/lib/petsc/conf/rules.utils: No such file or directory make[1]: *** No rule

Re: [petsc-users] support for mixed block size matrices/AIM in PETSc?

2023-07-24 Thread Matthew Knepley
On Mon, Jul 24, 2023 at 6:34 AM Daniel Stone wrote: > Hello PETSc Users/Developers, > > A colleague of mine is looking into implementing an adaptive implicit > method (AIM) over > PETSc in our simulator. This has led to some interesting questions about > what can > be done with blocked matrices,

Re: [petsc-users] support for mixed block size matrices/AIM in PETSc?

2023-07-24 Thread Satish Balay via petsc-users
One way to boost performance [of MatVec etc.] on sparse matrices with block structure is to avoid loading (from memory to CPU registers) the row/col indices for entries within a block, when possible. [The performance boost here comes from the reduced memory-bandwidth requirement.] So we have BAIJ
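The blocked (BAIJ) storage Satish describes can be tried without source changes in codes that call MatSetFromOptions(). A hedged sketch, assuming a hypothetical application `./app` whose unknowns come in blocks of 3 dofs per mesh point:

```shell
# Switch the matrix storage to blocked AIJ with block size 3;
# only one set of indices is stored per 3x3 block, cutting bandwidth needs:
./app -mat_type baij -mat_block_size 3
```

Comparing `-mat_type aij` vs `-mat_type baij` with `-log_view` is a simple way to measure the MatMult speedup on a given machine.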

[petsc-users] support for mixed block size matrices/AIM in PETSc?

2023-07-24 Thread Daniel Stone
Hello PETSc Users/Developers, A colleague of mine is looking into implementing an adaptive implicit method (AIM) over PETSc in our simulator. This has led to some interesting questions about what can be done with blocked matrices, which I'm not able to answer myself - does anyone have any insight?

Re: [petsc-users] Confusion/failures about the tests involved in including Hypre

2023-07-24 Thread Daniel Stone
On the hypre versioning - aha. For this project I pinned the PETSc version a little while ago (3.19.1), but I've been using a fresh clone of hypre, so clearly it's too new a version. Using the version of hypre that PETSc expects (2.28.0, according to hypre.py) might fix some things. I may have other
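Rather than cloning hypre by hand and matching versions manually, PETSc's configure can fetch the hypre version it was tested against (the one pinned in hypre.py, 2.28.0 for this PETSc release). A hedged sketch of that approach:

```shell
# Let PETSc download and build its pinned hypre version during configure;
# add the rest of your usual configure options alongside this flag:
./configure --download-hypre
```

This sidesteps version-mismatch failures in the hypre tests, since configure builds exactly the release its interface code expects.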