On Sun, Jul 2, 2023 at 2:24 AM 王赫萌 wrote:
> Dear PETSc Team,
>
> Sorry to bother! My name is Hemeng Wang, and I am currently learning to
> use the PETSc software package. I am confused about calculating the norm
> of the residual.
>
> I computed the residual norm myself with:
> ```
> PetscCall(VecNo
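The snippet is cut off above, but forming the true residual by hand in PETSc typically looks something like the following sketch (hedged: it assumes `A`, `x`, and `b` are the already-assembled operator, current solution, and right-hand side from the surrounding code):

```c
/* Sketch: form r = b - A*x and take its 2-norm -- this is the
 * "true" residual that -ksp_monitor_true_residual style output reports. */
Vec       r;
PetscReal norm;
PetscCall(VecDuplicate(b, &r));
PetscCall(MatMult(A, x, r));           /* r = A*x       */
PetscCall(VecAYPX(r, -1.0, b));        /* r = b - A*x   */
PetscCall(VecNorm(r, NORM_2, &norm));
PetscCall(VecDestroy(&r));
```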
On Sun, Jul 2, 2023 at 7:53 AM 王赫萌 wrote:
> Thanks for your reply!
> So sorry that I made a mistake in the description.
> I set the tolerances by:
> PetscCall(KSPSetTolerances(ksp, 1e-12, DBL_MIN, PETSC_DEFAULT,
> PETSC_DEFAULT));
> and got (by passing `-ksp_norm_type unpreconditioned
> -ksp_moni
On Sun, Jul 2, 2023 at 8:05 AM Matthew Knepley wrote:
> On Sun, Jul 2, 2023 at 7:53 AM 王赫萌 wrote:
>
>> Thanks for your reply!
>> So sorry that I made a mistake in the description.
>> I set the tolerances by:
>> PetscCall(KSPSetTolerances(ksp, 1e-12, DBL_MIN, PETSC_DEFAULT,
>> PETSC_DEFAULT));
>>
On Sun, Jul 2, 2023 at 8:19 AM 王赫萌 wrote:
> Here are the matrix and RHS used in the code! (You may need to change the data path.)
>
> mat:
>
> https://studentcupeducn-my.sharepoint.com/:u:/g/personal/wanghemeng_student_cup_edu_cn/Ed76oGtC1ttDriZsObbPR74BCnDPUP8aicVXQEL4sO1AyQ?e=zeszik
> rhs:
>
> https://stude
On Sun, Jul 2, 2023 at 8:45 AM 王赫萌 wrote:
> Thanks so much for your patience! I'm really grateful for that!
>
> Could you explain the calculation of "1.31278e+06"?
> My own calculation of this value gives "18.0468".
>
100 KSP unpreconditioned resid norm 1.312782529439e+06 true resid norm 1
Are you sure the NN is correct? I cannot see how you set that and know that it exactly matches the way PCREDISTRIBUTE selects rows. I suggest making a tiny problem with artificial matrix values, chosen so that you "slice off" parts of the grid, so you can see exactly on the grid that the selecte
Also look at
https://petsc.org/release/manualpages/KSP/KSPSetPCSide/#kspsetpcside and
https://petsc.org/release/manualpages/KSP/KSPSetNormType/#kspsetnormtype;
in PETSc, different Krylov solvers have different default values for these.
> On Jul 2, 2023, at 1:47 AM, 王赫萌 wrote:
>
> Dear PET
Hi! Good advice!
I set values with the MatSetValues() API, which sets one part of a row at a
time (I use a kind of tiling technique, so I cannot get all values of a row
at once).
I tested the number of mallocs in these three cases. The number of
mallocs decreases as the number of proces
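Calling MatSetValues() several times for the same row is fully supported; the entries simply accumulate until assembly. A minimal sketch (hedged: `A` is assumed to be an AIJ matrix of adequate size, and the row/column indices and values are made up for illustration):

```c
/* Sketch: row 0 is filled in two separate "tiles" by two calls;
 * nothing is final until MatAssemblyBegin/End. */
PetscInt    i = 0;
PetscInt    cols1[2] = {0, 1}, cols2[2] = {4, 5};
PetscScalar vals1[2] = {1.0, 2.0}, vals2[2] = {3.0, 4.0};
PetscCall(MatSetValues(A, 1, &i, 2, cols1, vals1, INSERT_VALUES));
PetscCall(MatSetValues(A, 1, &i, 2, cols2, vals2, INSERT_VALUES));
PetscCall(MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY));
PetscCall(MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY));
```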
The main branch of PETSc now supports filling sparse matrices without
providing any preallocation information.
You can give it a try. Use your current fastest code but just remove ALL the
preallocation calls. I would be interested in what kind of performance you get
compared to your best
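Under that suggestion, matrix creation would shrink to something like the following sketch (hedged: `n` and the fill loop are placeholders for whatever the existing code uses):

```c
/* Sketch: on recent PETSc (main branch), an AIJ matrix can be filled
 * without any preallocation information. */
Mat A;
PetscCall(MatCreate(PETSC_COMM_WORLD, &A));
PetscCall(MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n));
PetscCall(MatSetFromOptions(A));
/* No MatSeqAIJSetPreallocation()/MatMPIAIJSetPreallocation() calls here:
 * just insert entries with MatSetValues() as before, then assemble. */
PetscCall(MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY));
PetscCall(MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY));
```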