On Sun, Aug 6, 2023 at 3:03 AM K. Wu wrote:
> Dear Matthew,
>
> Thanks for your kind help, please see in attachment the configure.log file
> for PETSc.
>
Okay, you told PETSc to build MPICH, so you should use

/lhome/kai/Documents/petsc/linux-c-debug-directsolver/bin/mpiexec -n <np> ./myprog

(where <np> is the number of processes you want).
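The symptom described in this thread (every rank behaving as an independent rank 0 of 1) arises because the launcher hands each process its rank and the total process count through implementation-specific environment variables or a PMI channel; a foreign mpirun exports nothing the linked MPI library recognizes, so each process initializes as a singleton. A minimal sketch of that fallback logic in Python — the variable names are the ones some MPICH-family launchers (PMI_*) and Open MPI (OMPI_COMM_WORLD_*) commonly export, and the helper name itself is hypothetical, purely for illustration:

```python
import os

def launcher_rank_size(env=os.environ):
    """Best-effort guess of (rank, size) from the launcher environment.

    A launcher that matches the linked MPI library exports variables
    like these; a mismatched launcher exports none of them, so every
    process falls back to the singleton default (0, 1) -- exactly the
    "all processes run independently" symptom seen with mpirun here.
    """
    candidates = (
        ("PMI_RANK", "PMI_SIZE"),                          # some MPICH launchers
        ("OMPI_COMM_WORLD_RANK", "OMPI_COMM_WORLD_SIZE"),  # Open MPI
    )
    for rank_var, size_var in candidates:
        if rank_var in env and size_var in env:
            return int(env[rank_var]), int(env[size_var])
    return 0, 1  # singleton fallback: process thinks it is alone

if __name__ == "__main__":
    rank, size = launcher_rank_size()
    print(f"rank {rank} of {size}")
```

If four copies of a program all print "rank 0 of 1", the launcher and the MPI library the program was linked against do not match, which is why using the mpiexec from PETSc's own MPICH build fixes it.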
On Sat, Aug 5, 2023 at 10:10 AM K. Wu wrote:
> Dear Matthew,
>
> Thanks for your reply!
>
> Is there any way that I can choose to use the previous MPI installation
> used to build PETSc?
>
First we would need to know what that was. You can send configure.log,
which will have that information in it.
Dear Matthew,
Thanks for your reply!
Is there any way that I can choose to use the previous MPI installation
used to build PETSc?
Regards,
Kai
On Sat, Aug 5, 2023 at 2:23 PM, Matthew Knepley wrote:
> On Sat, Aug 5, 2023 at 3:22 AM K. Wu wrote:
>
>> Hi all,
>>
>> Good day!
>>
>> After installing ParaView on my desktop, PETSc starts to work anomalously
>> even after reconfiguration: ...
On Sat, Aug 5, 2023 at 3:22 AM K. Wu wrote:
> Hi all,
>
> Good day!
>
> After installing ParaView on my desktop, PETSc starts to work anomalously
> even after reconfiguration:
> 1. If I use mpirun (frequently used before), it seems that now all the
> processes run the program independently, without communication, while
> mpiexec seems to work properly.
Hi all,
Good day!
After installing ParaView on my desktop, PETSc starts to work anomalously
even after reconfiguration:
1. If I use mpirun (frequently used before), it seems that now all the
processes run the program independently, without communication, while
mpiexec seems to work properly.