15:52
To: feng wang
Cc: Jose E. Roman ; petsc-users@mcs.anl.gov
Subject: Re: [petsc-users] Slepc, shell matrix, parallel, halo exchange
On Mon, Oct 10, 2022 at 11:42 AM feng wang wrote:
Hi Mat,

Thanks for your reply. It seems I have to use "VecSetValues

> On Fri, Oct 7, 2022 at 5:48 PM feng wang wrote:
> > halo cells. it seems not doing that.

double precision.

Thanks,
Feng
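For readers of the archive: the usual VecSetValues pattern for setting entries that may be owned by another rank is sketched below. This is not the poster's code; the sizes and values are invented for illustration.

```c
/* Minimal VecSetValues sketch (invented sizes/values, not the thread's code).
 * Indices passed to VecSetValues are global; values destined for another
 * rank are cached and routed during VecAssemblyBegin/End. */
#include <petscvec.h>

int main(int argc, char **argv)
{
  Vec         x;
  PetscInt    i, rstart, rend;
  PetscScalar v;

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
  PetscCall(VecCreate(PETSC_COMM_WORLD, &x));
  PetscCall(VecSetSizes(x, PETSC_DECIDE, 16));
  PetscCall(VecSetFromOptions(x));
  PetscCall(VecGetOwnershipRange(x, &rstart, &rend));
  for (i = rstart; i < rend; i++) {
    v = 1.0 * i;
    PetscCall(VecSetValues(x, 1, &i, &v, INSERT_VALUES)); /* global index */
  }
  /* assembly is required even when every value happened to be local */
  PetscCall(VecAssemblyBegin(x));
  PetscCall(VecAssemblyEnd(x));
  PetscCall(VecDestroy(&x));
  PetscCall(PetscFinalize());
  return 0;
}
```

Compile and run against a PETSc installation (e.g. `mpiexec -n 2 ./ex`).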
From: Matthew Knepley
Sent: 09 October 2022 12:11
To: feng wang
Cc: Jose E. Roman ; petsc-users@mcs.anl.gov
Subject: Re: [petsc-users] Slepc, shell matrix, parallel, halo exchange
On Fri, Oct 7, 2022 at 5:48 PM feng wang wrote:

I can only think you are misinterpreting the result. There are many examples,
such as

  src/vec/tutorials/ex9.c (and ex9f.F)

I would start there and try to change that into the communication you want,
since it definitely works. I cannot see a problem with the code snippet above.
Thanks,
Matt
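For anyone finding this thread later: the core of the ex9.c pattern Matt points to is a ghosted vector plus a forward scatter. A reduced sketch follows; the layout and the periodic ghost indices are invented for the example.

```c
/* Sketch of the ex9.c-style halo exchange (invented layout, periodic halo).
 * Each rank owns n entries and keeps 2 ghost slots for its neighbours'
 * boundary entries; VecGhostUpdate moves owned values into ghost slots. */
#include <petscvec.h>

int main(int argc, char **argv)
{
  Vec         gx, lx;
  PetscMPIInt rank, size;
  PetscInt    n = 4, N, rstart, ghosts[2];

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
  PetscCallMPI(MPI_Comm_rank(PETSC_COMM_WORLD, &rank));
  PetscCallMPI(MPI_Comm_size(PETSC_COMM_WORLD, &size));
  N      = n * size;
  rstart = n * rank;
  /* halo = last entry of the left neighbour, first of the right (periodic) */
  ghosts[0] = (rstart + N - 1) % N;
  ghosts[1] = (rstart + n) % N;
  PetscCall(VecCreateGhost(PETSC_COMM_WORLD, n, N, 2, ghosts, &gx));

  PetscCall(VecSet(gx, (PetscScalar)(rank + 1))); /* fill owned entries */

  /* the halo exchange: owned values -> ghost slots on the neighbours */
  PetscCall(VecGhostUpdateBegin(gx, INSERT_VALUES, SCATTER_FORWARD));
  PetscCall(VecGhostUpdateEnd(gx, INSERT_VALUES, SCATTER_FORWARD));

  /* local form = n owned entries followed by the 2 received ghost values */
  PetscCall(VecGhostGetLocalForm(gx, &lx));
  /* ... read owned + halo values through lx here ... */
  PetscCall(VecGhostRestoreLocalForm(gx, &lx));

  PetscCall(VecDestroy(&gx));
  PetscCall(PetscFinalize());
  return 0;
}
```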
> Thanks,
> for(iv=0; iv

From: Matthew Knepley
Sent: 21 September 2022 14:36
To: feng wang
Cc: Jose E. Roman ; petsc-users@mcs.anl.gov
Subject: Re: [petsc-users] Slepc, shell matrix, parallel, halo exchange

On Wed, Sep 21, 2022 at 10:35 AM feng wang wrote:
Hi Jose,

For your 2nd suggestion on
what you mean?

Yes.

Matt
> Thanks,
> Feng
>
> --
> From: Jose E. Roman
> Sent: 21 September 2022 13:07
> To: feng wang
> Cc: Matthew Knepley ; petsc-users@mcs.anl.gov
> Subject: Re: [petsc-users] Slepc, shell matrix, parallel, halo exchange
> On 21 Sep 2022, at 14:47, feng wang wrote:
>
> Thanks Jose, I will try this and will come back to this thread if I have any
> issue.
>
> Besides, for EPSGetEigenpair, I guess each rank gets

> From: Jose E. Roman
> Sent: 21 September 2022 12:34
> To: feng wang
> Cc: Matthew Knepley ; petsc-users@mcs.anl.gov
> Subject: Re: [petsc-users] Slepc, shell matrix, parallel, halo exchange
>
> If you define the MATOP_CREATE_VECS operation in your shell matrix so that it
> creates a ghost vector, then all vectors within EPS will be ghost vectors
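In code, Jose's suggestion might look roughly like the sketch below. The context struct, its fields, and `MyCreateVecs` are hypothetical names invented for the example, not code from the thread.

```c
/* Hypothetical sketch of the MATOP_CREATE_VECS idea: make the shell matrix
 * hand out ghosted vectors, so every Vec created inside EPS has halo slots. */
#include <petscmat.h>

typedef struct {
  PetscInt  nlocal, nglobal;  /* owned and global sizes */
  PetscInt  nghost;           /* number of halo cells on this rank */
  PetscInt *ghosts;           /* global indices of the halo cells */
} ShellCtx;

static PetscErrorCode MyCreateVecs(Mat A, Vec *right, Vec *left)
{
  ShellCtx *ctx;

  PetscFunctionBeginUser;
  PetscCall(MatShellGetContext(A, &ctx));
  if (right) PetscCall(VecCreateGhost(PetscObjectComm((PetscObject)A), ctx->nlocal,
                                      ctx->nglobal, ctx->nghost, ctx->ghosts, right));
  if (left)  PetscCall(VecCreateGhost(PetscObjectComm((PetscObject)A), ctx->nlocal,
                                      ctx->nglobal, ctx->nghost, ctx->ghosts, left));
  PetscFunctionReturn(PETSC_SUCCESS);
}

/* registration, after MatCreateShell(..., &A):
 *   PetscCall(MatShellSetOperation(A, MATOP_CREATE_VECS, (void (*)(void))MyCreateVecs)); */
```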
a ghost vector and how many ghost cells there are?

Thanks,
Feng

From: Matthew Knepley
Sent: 21 September 2022 11:58
To: feng wang
Cc: petsc-users@mcs.anl.gov
Subject: Re: [petsc-users] Slepc, shell matrix, parallel, halo exchange
On Wed, Sep 21, 2022 at 7:41 AM feng wang wrote:

> Hello,
>
> I am using Slepc with a shell matrix. The sequential version seems to be
> working and now I am trying to make it run in parallel.
>
> The partition of the domain is done. I am not sure how to do the halo
> exchange in the shell matrix in Slepc. I have a parallel version of
> matrix-free GMRES in my code
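For completeness, a matrix-free MatMult in a shell matrix usually does the halo exchange up front, before applying the operator. A hypothetical sketch follows (`MyMatMult` and the operator body are placeholders); it assumes the input vector is ghosted, e.g. created via MATOP_CREATE_VECS as Jose suggests in this thread.

```c
/* Hypothetical sketch (not the poster's code): a shell MatMult that
 * refreshes the halo before applying a matrix-free operator. Assumes
 * x was created as a ghosted vector. */
#include <petscmat.h>

static PetscErrorCode MyMatMult(Mat A, Vec x, Vec y)
{
  Vec                xl;
  const PetscScalar *xa;

  PetscFunctionBeginUser;
  /* halo exchange: copy owned values into the neighbours' ghost slots */
  PetscCall(VecGhostUpdateBegin(x, INSERT_VALUES, SCATTER_FORWARD));
  PetscCall(VecGhostUpdateEnd(x, INSERT_VALUES, SCATTER_FORWARD));

  /* local form: owned entries followed by the received halo values */
  PetscCall(VecGhostGetLocalForm(x, &xl));
  PetscCall(VecGetArrayRead(xl, &xa));
  /* ... apply the matrix-free operator to xa, writing into y ... */
  PetscCall(VecRestoreArrayRead(xl, &xa));
  PetscCall(VecGhostRestoreLocalForm(x, &xl));
  PetscFunctionReturn(PETSC_SUCCESS);
}

/* registration (sizes come from the domain partition):
 *   PetscCall(MatCreateShell(PETSC_COMM_WORLD, nlocal, nlocal, N, N, ctx, &A));
 *   PetscCall(MatShellSetOperation(A, MATOP_MULT, (void (*)(void))MyMatMult)); */
```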