Re: [petsc-dev] Cannot locate file: share/petsc/datafiles/matrices/small

2021-09-14 Thread Junchao Zhang
Yes, we can turn it off. Code without real use is just a maintenance
burden.

--Junchao Zhang

On Tue, Sep 14, 2021 at 10:45 AM Barry Smith  wrote:

>
>   Ok, so it could be a bug in PETSc, but if it appears with particular MPI
> implementations, shouldn't we turn off the support in the cases where we
> know it will fail?
>
>   Barry
>
>
> On Sep 14, 2021, at 11:10 AM, Junchao Zhang 
> wrote:
>
> MPI one-sided is tricky and needs careful synchronization (like OpenMP).
> Incorrect code can work with one interface but fail with another.
>
> --Junchao Zhang
>
>
> On Tue, Sep 14, 2021 at 10:01 AM Barry Smith  wrote:
>
>>
>>    It sounds reproducible and related to using particular versions of
>> OpenMPI and even particular interfaces.
>>
>>   Barry
>>
>>On Tue, Sep 14, 2021 at 2:35 AM Stefano Zampini <
>> stefano.zamp...@gmail.com> wrote:
>>>
>>> I can reproduce it even with OpenMPI 4.1.1 on a different machine
>>> (Ubuntu 18 + AMD Milan + clang from AOCC) and it is definitely an OpenMPI
>>> bug in the vader BTL. If I use tcp, everything runs smoothly.
>>>
>>>
>>>
>>>
>> On Sep 14, 2021, at 10:54 AM, Junchao Zhang 
>> wrote:
>>
>> Without a standalone, valid MPI example that reproduces the error, we
>> cannot be sure it is an OpenMPI bug.
>>
>> --Junchao Zhang
>>
>>
>> On Tue, Sep 14, 2021 at 6:17 AM Matthew Knepley 
>> wrote:
>>
>>> Okay, we have to send this to OpenMPI. Volunteers?
>>>
>>> Maybe we should note this in the FAQ, or installation, so we remember
>>> how to fix it if someone else asks?
>>>
>>>   Thanks,
>>>
>>>  Matt
>>>
>>> On Tue, Sep 14, 2021 at 2:35 AM Stefano Zampini <
>>> stefano.zamp...@gmail.com> wrote:
>>>
 I can reproduce it even with OpenMPI 4.1.1 on a different machine
 (Ubuntu 18 + AMD Milan + clang from AOCC) and it is definitely an OpenMPI
 bug in the vader BTL. If I use tcp, everything runs smoothly.

 zampins@kanary:~/Devel/petsc$ cat
 /home/zampins/local/etc/openmpi-mca-params.conf | grep btl
 btl = tcp,self
 zampins@kanary:~/Devel/petsc$ make -f gmakefile.test
 vec_is_sf_tutorials-ex1_4
 Using MAKEFLAGS:
 TEST arch-debug/tests/counts/vec_is_sf_tutorials-ex1_4.counts
  ok
 vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-create
  ok
 diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-create
  ok
 vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-dynamic
  ok
 diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-dynamic
  ok
 vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-allocate
  ok
 diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-allocate
  ok
 vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-create
  ok
 diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-create
  ok
 vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-dynamic
  ok
 diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-dynamic
  ok
 vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-allocate
  ok
 diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-allocate
  ok
 vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-create
  ok
 diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-create
  ok
 vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-dynamic
  ok
 diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-dynamic
  ok
 vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-allocate
  ok
 diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-allocate


 zampins@kanary:~/Devel/petsc$ cat
 /home/zampins/local/etc/openmpi-mca-params.conf | grep btl
 btl = vader,tcp,self
 zampins@kanary:~/Devel/petsc$ make -f gmakefile.test
 vec_is_sf_tutorials-ex1_4
 Using MAKEFLAGS:
 TEST arch-debug/tests/counts/vec_is_sf_tutorials-ex1_4.counts
  ok
 vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-create
 not ok
 diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-create
 # Error code: 1
 # 43,46c43,46
 # < [0] 0: 4001 2000 2002 3002 4002
 # < [1] 0: 1001 3000
 # < [2] 0: 2001 4000
 # < [3] 0: 3001 1000
 # ---
 # > [0] 0: 2002 2146435072 2 2146435072 38736240
 # > [1] 0: 3000 2146435072
 # > [2] 0: 2001 2146435072
 # > [3] 0: 3001 2146435072
  ok
 vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-dynamic
 not ok
 diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-dynamic
 # Error code: 1
 # 43,46c43,46
 # < [0] 0: 4001 2000 2002 3002 4002
 # < [1] 0: 1001 3000
 # < [2] 0: 2001 4000
 # < [3] 0: 3001 1000
 # ---
 # > [0] 0: 2002 2146435072 2 

Re: [petsc-dev] Cannot locate file: share/petsc/datafiles/matrices/small

2021-09-14 Thread Barry Smith

  Ok, so it could be a bug in PETSc, but if it appears with particular MPI
implementations, shouldn't we turn off the support in the cases where we know
it will fail?

  Barry


> On Sep 14, 2021, at 11:10 AM, Junchao Zhang  wrote:
> 
> MPI one-sided is tricky and needs careful synchronization (like OpenMP).
> Incorrect code can work with one interface but fail with another.
> 
> --Junchao Zhang
> 
> 
> On Tue, Sep 14, 2021 at 10:01 AM Barry Smith wrote:
> 
>    It sounds reproducible and related to using particular versions of
> OpenMPI and even particular interfaces.
> 
>   Barry
> 
>    On Tue, Sep 14, 2021 at 2:35 AM Stefano Zampini wrote:
> I can reproduce it even with OpenMPI 4.1.1 on a different machine (Ubuntu 18
> + AMD Milan + clang from AOCC) and it is definitely an OpenMPI bug in the
> vader BTL. If I use tcp, everything runs smoothly.
> 
> 
> 
> 
>> On Sep 14, 2021, at 10:54 AM, Junchao Zhang wrote:
>> 
>> Without a standalone, valid MPI example that reproduces the error, we
>> cannot be sure it is an OpenMPI bug.
>> 
>> --Junchao Zhang
>> 
>> 
>> On Tue, Sep 14, 2021 at 6:17 AM Matthew Knepley wrote:
>> Okay, we have to send this to OpenMPI. Volunteers?
>> 
>> Maybe we should note this in the FAQ, or installation, so we remember how to 
>> fix it if someone else asks?
>> 
>>   Thanks,
>> 
>>  Matt
>> 
>> On Tue, Sep 14, 2021 at 2:35 AM Stefano Zampini wrote:
>> I can reproduce it even with OpenMPI 4.1.1 on a different machine (Ubuntu 18
>> + AMD Milan + clang from AOCC) and it is definitely an OpenMPI bug in the
>> vader BTL. If I use tcp, everything runs smoothly.
>> 
>> zampins@kanary:~/Devel/petsc$ cat 
>> /home/zampins/local/etc/openmpi-mca-params.conf | grep btl
>> btl = tcp,self
>> zampins@kanary:~/Devel/petsc$ make -f gmakefile.test 
>> vec_is_sf_tutorials-ex1_4 
>> Using MAKEFLAGS:
>> TEST arch-debug/tests/counts/vec_is_sf_tutorials-ex1_4.counts
>>  ok vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-create
>>  ok 
>> diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-create
>>  ok vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-dynamic
>>  ok 
>> diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-dynamic
>>  ok vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-allocate
>>  ok 
>> diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-allocate
>>  ok vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-create
>>  ok 
>> diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-create
>>  ok vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-dynamic
>>  ok 
>> diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-dynamic
>>  ok vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-allocate
>>  ok 
>> diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-allocate
>>  ok vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-create
>>  ok 
>> diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-create
>>  ok vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-dynamic
>>  ok 
>> diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-dynamic
>>  ok vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-allocate
>>  ok 
>> diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-allocate
>> 
>> 
>> zampins@kanary:~/Devel/petsc$ cat 
>> /home/zampins/local/etc/openmpi-mca-params.conf | grep btl
>> btl = vader,tcp,self
>> zampins@kanary:~/Devel/petsc$ make -f gmakefile.test 
>> vec_is_sf_tutorials-ex1_4 
>> Using MAKEFLAGS:
>> TEST arch-debug/tests/counts/vec_is_sf_tutorials-ex1_4.counts
>>  ok vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-create
>> not ok 
>> diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-create 
>> # Error code: 1
>> # 43,46c43,46
>> # < [0] 0: 4001 2000 2002 3002 4002
>> # < [1] 0: 1001 3000
>> # < [2] 0: 2001 4000
>> # < [3] 0: 3001 1000
>> # ---
>> # > [0] 0: 2002 2146435072 2 2146435072 38736240
>> # > [1] 0: 3000 2146435072
>> # > [2] 0: 2001 2146435072
>> # > [3] 0: 3001 2146435072
>>  ok vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-dynamic
>> not ok 
>> diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-dynamic 
>> # Error code: 1
>> # 43,46c43,46
>> # < [0] 0: 4001 2000 2002 3002 4002
>> # < [1] 0: 1001 3000
>> # < [2] 0: 2001 4000
>> # < [3] 0: 3001 1000
>> # ---
>> # > [0] 0: 2002 2146435072 2 2146435072 0
>> # > [1] 0: 3000 2146435072
>> # > [2] 0: 2001 2146435072
>> # > [3] 0: 3001 2146435072
>>  ok vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-allocate
>>  ok 
>> 

Re: [petsc-dev] Cannot locate file: share/petsc/datafiles/matrices/small

2021-09-14 Thread Junchao Zhang
MPI one-sided is tricky and needs careful synchronization (like OpenMP).
Incorrect code can work with one interface but fail with another.

--Junchao Zhang
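
For illustration, correctly synchronized one-sided code looks like the
sketch below: every communication epoch is opened and closed collectively,
here with MPI_Win_fence, and the fetched buffer may only be read after the
closing fence. This is an editorial sketch (a ring exchange with MPI_Get),
not code taken from PetscSF.

/* Minimal correctly fenced one-sided exchange; editorial sketch, not
 * PetscSF code. Build with mpicc, run with mpiexec -n <N>. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
  int rank, size, recv = -1;
  MPI_Win win;

  MPI_Init(&argc, &argv);
  MPI_Comm_rank(MPI_COMM_WORLD, &rank);
  MPI_Comm_size(MPI_COMM_WORLD, &size);

  int expose = 1000 * rank;               /* datum this rank exposes */
  MPI_Win_create(&expose, sizeof(int), sizeof(int), MPI_INFO_NULL,
                 MPI_COMM_WORLD, &win);

  MPI_Win_fence(0, win);                  /* open access epoch (collective) */
  MPI_Get(&recv, 1, MPI_INT, (rank + 1) % size, 0, 1, MPI_INT, win);
  MPI_Win_fence(0, win);                  /* close epoch; recv now valid */

  printf("[%d] got %d\n", rank, recv);
  MPI_Win_free(&win);
  MPI_Finalize();
  return 0;
}

Skipping either fence is exactly the kind of error that one MPI
implementation may appear to tolerate while another fails.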


On Tue, Sep 14, 2021 at 10:01 AM Barry Smith  wrote:

>
>    It sounds reproducible and related to using particular versions of
> OpenMPI and even particular interfaces.
>
>   Barry
>
>On Tue, Sep 14, 2021 at 2:35 AM Stefano Zampini <
> stefano.zamp...@gmail.com> wrote:
>>
>> I can reproduce it even with OpenMPI 4.1.1 on a different machine (Ubuntu
>> 18 + AMD Milan + clang from AOCC) and it is definitely an OpenMPI bug in
>> the vader BTL. If I use tcp, everything runs smoothly.
>>
>>
>>
>>
> On Sep 14, 2021, at 10:54 AM, Junchao Zhang 
> wrote:
>
> Without a standalone, valid MPI example that reproduces the error, we
> cannot be sure it is an OpenMPI bug.
>
> --Junchao Zhang
>
>
> On Tue, Sep 14, 2021 at 6:17 AM Matthew Knepley  wrote:
>
>> Okay, we have to send this to OpenMPI. Volunteers?
>>
>> Maybe we should note this in the FAQ, or installation, so we remember how
>> to fix it if someone else asks?
>>
>>   Thanks,
>>
>>  Matt
>>
>> On Tue, Sep 14, 2021 at 2:35 AM Stefano Zampini <
>> stefano.zamp...@gmail.com> wrote:
>>
>>> I can reproduce it even with OpenMPI 4.1.1 on a different machine
>>> (Ubuntu 18 + AMD Milan + clang from AOCC) and it is definitely an OpenMPI
>>> bug in the vader BTL. If I use tcp, everything runs smoothly.
>>>
>>> zampins@kanary:~/Devel/petsc$ cat
>>> /home/zampins/local/etc/openmpi-mca-params.conf | grep btl
>>> btl = tcp,self
>>> zampins@kanary:~/Devel/petsc$ make -f gmakefile.test
>>> vec_is_sf_tutorials-ex1_4
>>> Using MAKEFLAGS:
>>> TEST arch-debug/tests/counts/vec_is_sf_tutorials-ex1_4.counts
>>>  ok
>>> vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-create
>>>  ok
>>> diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-create
>>>  ok
>>> vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-dynamic
>>>  ok
>>> diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-dynamic
>>>  ok
>>> vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-allocate
>>>  ok
>>> diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-allocate
>>>  ok
>>> vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-create
>>>  ok
>>> diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-create
>>>  ok
>>> vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-dynamic
>>>  ok
>>> diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-dynamic
>>>  ok
>>> vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-allocate
>>>  ok
>>> diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-allocate
>>>  ok vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-create
>>>  ok
>>> diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-create
>>>  ok
>>> vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-dynamic
>>>  ok
>>> diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-dynamic
>>>  ok
>>> vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-allocate
>>>  ok
>>> diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-allocate
>>>
>>>
>>> zampins@kanary:~/Devel/petsc$ cat
>>> /home/zampins/local/etc/openmpi-mca-params.conf | grep btl
>>> btl = vader,tcp,self
>>> zampins@kanary:~/Devel/petsc$ make -f gmakefile.test
>>> vec_is_sf_tutorials-ex1_4
>>> Using MAKEFLAGS:
>>> TEST arch-debug/tests/counts/vec_is_sf_tutorials-ex1_4.counts
>>>  ok
>>> vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-create
>>> not ok
>>> diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-create
>>> # Error code: 1
>>> # 43,46c43,46
>>> # < [0] 0: 4001 2000 2002 3002 4002
>>> # < [1] 0: 1001 3000
>>> # < [2] 0: 2001 4000
>>> # < [3] 0: 3001 1000
>>> # ---
>>> # > [0] 0: 2002 2146435072 2 2146435072 38736240
>>> # > [1] 0: 3000 2146435072
>>> # > [2] 0: 2001 2146435072
>>> # > [3] 0: 3001 2146435072
>>>  ok
>>> vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-dynamic
>>> not ok
>>> diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-dynamic
>>> # Error code: 1
>>> # 43,46c43,46
>>> # < [0] 0: 4001 2000 2002 3002 4002
>>> # < [1] 0: 1001 3000
>>> # < [2] 0: 2001 4000
>>> # < [3] 0: 3001 1000
>>> # ---
>>> # > [0] 0: 2002 2146435072 2 2146435072 0
>>> # > [1] 0: 3000 2146435072
>>> # > [2] 0: 2001 2146435072
>>> # > [3] 0: 3001 2146435072
>>>  ok
>>> vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-allocate
>>>  ok
>>> diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-allocate
>>> # retrying
>>> vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-create
>>> not ok
>>> vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-create #
>>> Error code: 98
>>> # [1]PETSC ERROR: - Error Message
>>> 

Re: [petsc-dev] Cannot locate file: share/petsc/datafiles/matrices/small

2021-09-14 Thread Barry Smith

   It sounds reproducible and related to using particular versions of OpenMPI
and even particular interfaces.

  Barry

   On Tue, Sep 14, 2021 at 2:35 AM Stefano Zampini <stefano.zamp...@gmail.com> wrote:
I can reproduce it even with OpenMPI 4.1.1 on a different machine (Ubuntu 18 +
AMD Milan + clang from AOCC) and it is definitely an OpenMPI bug in the vader
BTL. If I use tcp, everything runs smoothly.




> On Sep 14, 2021, at 10:54 AM, Junchao Zhang  wrote:
> 
> Without a standalone, valid MPI example that reproduces the error, we
> cannot be sure it is an OpenMPI bug.
> 
> --Junchao Zhang
> 
> 
> On Tue, Sep 14, 2021 at 6:17 AM Matthew Knepley wrote:
> Okay, we have to send this to OpenMPI. Volunteers?
> 
> Maybe we should note this in the FAQ, or installation, so we remember how to 
> fix it if someone else asks?
> 
>   Thanks,
> 
>  Matt
> 
> On Tue, Sep 14, 2021 at 2:35 AM Stefano Zampini wrote:
> I can reproduce it even with OpenMPI 4.1.1 on a different machine (Ubuntu 18
> + AMD Milan + clang from AOCC) and it is definitely an OpenMPI bug in the
> vader BTL. If I use tcp, everything runs smoothly.
> 
> zampins@kanary:~/Devel/petsc$ cat 
> /home/zampins/local/etc/openmpi-mca-params.conf | grep btl
> btl = tcp,self
> zampins@kanary:~/Devel/petsc$ make -f gmakefile.test 
> vec_is_sf_tutorials-ex1_4 
> Using MAKEFLAGS:
> TEST arch-debug/tests/counts/vec_is_sf_tutorials-ex1_4.counts
>  ok vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-create
>  ok 
> diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-create
>  ok vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-dynamic
>  ok 
> diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-dynamic
>  ok vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-allocate
>  ok 
> diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-allocate
>  ok vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-create
>  ok 
> diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-create
>  ok vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-dynamic
>  ok 
> diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-dynamic
>  ok vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-allocate
>  ok 
> diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-allocate
>  ok vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-create
>  ok diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-create
>  ok vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-dynamic
>  ok 
> diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-dynamic
>  ok vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-allocate
>  ok 
> diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-allocate
> 
> 
> zampins@kanary:~/Devel/petsc$ cat 
> /home/zampins/local/etc/openmpi-mca-params.conf | grep btl
> btl = vader,tcp,self
> zampins@kanary:~/Devel/petsc$ make -f gmakefile.test 
> vec_is_sf_tutorials-ex1_4 
> Using MAKEFLAGS:
> TEST arch-debug/tests/counts/vec_is_sf_tutorials-ex1_4.counts
>  ok vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-create
> not ok 
> diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-create # 
> Error code: 1
> # 43,46c43,46
> # < [0] 0: 4001 2000 2002 3002 4002
> # < [1] 0: 1001 3000
> # < [2] 0: 2001 4000
> # < [3] 0: 3001 1000
> # ---
> # > [0] 0: 2002 2146435072 2 2146435072 38736240
> # > [1] 0: 3000 2146435072
> # > [2] 0: 2001 2146435072
> # > [3] 0: 3001 2146435072
>  ok vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-dynamic
> not ok 
> diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-dynamic 
> # Error code: 1
> # 43,46c43,46
> # < [0] 0: 4001 2000 2002 3002 4002
> # < [1] 0: 1001 3000
> # < [2] 0: 2001 4000
> # < [3] 0: 3001 1000
> # ---
> # > [0] 0: 2002 2146435072 2 2146435072 0
> # > [1] 0: 3000 2146435072
> # > [2] 0: 2001 2146435072
> # > [3] 0: 3001 2146435072
>  ok vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-allocate
>  ok 
> diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-allocate
> # retrying 
> vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-create
> not ok 
> vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-create # 
> Error code: 98
> # [1]PETSC ERROR: - Error Message 
> --
> # [1]PETSC ERROR: General MPI error 
> # [1]PETSC ERROR: MPI error 6 MPI_ERR_RANK: invalid rank
> # [1]PETSC ERROR: See https://petsc.org/release/faq/ 
>  for trouble shooting.
> # [1]PETSC ERROR: Petsc Development GIT revision: v3.15.4-783-g168bb9f76b  
> GIT Date: 2021-09-13 14:01:22 +
> # 

Re: [petsc-dev] Cannot locate file: share/petsc/datafiles/matrices/small

2021-09-14 Thread Junchao Zhang
Without a standalone, valid MPI example that reproduces the error, we cannot
be sure it is an OpenMPI bug.

--Junchao Zhang
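
A standalone reproducer would be plain MPI with no PETSc, exercising the
operation the logs later in this thread report as failing (MPI_Accumulate
on a window, with MPI_ERR_RANK and MPI_ERR_RMA_RANGE errors). The sketch
below is an editorial guess at that pattern, not code extracted from
PetscSF; if it also misbehaves under the vader BTL, it is something one
could send to OpenMPI.

/* Hypothetical standalone reproducer: MPI_Accumulate inside a fence
 * epoch, the operation reported as failing in this thread.
 * Editorial sketch only. Run with: mpiexec -n 4 ./repro */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
  int rank, size, local[2] = {0, 0};
  MPI_Win win;

  MPI_Init(&argc, &argv);
  MPI_Comm_rank(MPI_COMM_WORLD, &rank);
  MPI_Comm_size(MPI_COMM_WORLD, &size);

  MPI_Win_create(local, 2 * sizeof(int), sizeof(int), MPI_INFO_NULL,
                 MPI_COMM_WORLD, &win);

  MPI_Win_fence(0, win);
  int contrib = 1000 + rank;
  /* each rank adds its contribution into slot 0 of the next rank */
  MPI_Accumulate(&contrib, 1, MPI_INT, (rank + 1) % size,
                 0, 1, MPI_INT, MPI_SUM, win);
  MPI_Win_fence(0, win);

  printf("[%d] local[0] = %d\n", rank, local[0]);
  MPI_Win_free(&win);
  MPI_Finalize();
  return 0;
}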


On Tue, Sep 14, 2021 at 6:17 AM Matthew Knepley  wrote:

> Okay, we have to send this to OpenMPI. Volunteers?
>
> Maybe we should note this in the FAQ, or installation, so we remember how
> to fix it if someone else asks?
>
>   Thanks,
>
>  Matt
>
> On Tue, Sep 14, 2021 at 2:35 AM Stefano Zampini 
> wrote:
>
>> I can reproduce it even with OpenMPI 4.1.1 on a different machine (Ubuntu
>> 18 + AMD Milan + clang from AOCC) and it is definitely an OpenMPI bug in
>> the vader BTL. If I use tcp, everything runs smoothly.
>>
>> zampins@kanary:~/Devel/petsc$ cat
>> /home/zampins/local/etc/openmpi-mca-params.conf | grep btl
>> btl = tcp,self
>> zampins@kanary:~/Devel/petsc$ make -f gmakefile.test
>> vec_is_sf_tutorials-ex1_4
>> Using MAKEFLAGS:
>> TEST arch-debug/tests/counts/vec_is_sf_tutorials-ex1_4.counts
>>  ok vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-create
>>  ok
>> diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-create
>>  ok
>> vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-dynamic
>>  ok
>> diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-dynamic
>>  ok
>> vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-allocate
>>  ok
>> diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-allocate
>>  ok
>> vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-create
>>  ok
>> diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-create
>>  ok
>> vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-dynamic
>>  ok
>> diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-dynamic
>>  ok
>> vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-allocate
>>  ok
>> diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-allocate
>>  ok vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-create
>>  ok
>> diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-create
>>  ok vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-dynamic
>>  ok
>> diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-dynamic
>>  ok
>> vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-allocate
>>  ok
>> diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-allocate
>>
>>
>> zampins@kanary:~/Devel/petsc$ cat
>> /home/zampins/local/etc/openmpi-mca-params.conf | grep btl
>> btl = vader,tcp,self
>> zampins@kanary:~/Devel/petsc$ make -f gmakefile.test
>> vec_is_sf_tutorials-ex1_4
>> Using MAKEFLAGS:
>> TEST arch-debug/tests/counts/vec_is_sf_tutorials-ex1_4.counts
>>  ok vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-create
>> not ok
>> diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-create
>> # Error code: 1
>> # 43,46c43,46
>> # < [0] 0: 4001 2000 2002 3002 4002
>> # < [1] 0: 1001 3000
>> # < [2] 0: 2001 4000
>> # < [3] 0: 3001 1000
>> # ---
>> # > [0] 0: 2002 2146435072 2 2146435072 38736240
>> # > [1] 0: 3000 2146435072
>> # > [2] 0: 2001 2146435072
>> # > [3] 0: 3001 2146435072
>>  ok
>> vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-dynamic
>> not ok
>> diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-dynamic
>> # Error code: 1
>> # 43,46c43,46
>> # < [0] 0: 4001 2000 2002 3002 4002
>> # < [1] 0: 1001 3000
>> # < [2] 0: 2001 4000
>> # < [3] 0: 3001 1000
>> # ---
>> # > [0] 0: 2002 2146435072 2 2146435072 0
>> # > [1] 0: 3000 2146435072
>> # > [2] 0: 2001 2146435072
>> # > [3] 0: 3001 2146435072
>>  ok
>> vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-allocate
>>  ok
>> diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-allocate
>> # retrying
>> vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-create
>> not ok
>> vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-create #
>> Error code: 98
>> # [1]PETSC ERROR: - Error Message
>> --
>> # [1]PETSC ERROR: General MPI error
>> # [1]PETSC ERROR: MPI error 6 MPI_ERR_RANK: invalid rank
>> # [1]PETSC ERROR: See https://petsc.org/release/faq/ for trouble
>> shooting.
>> # [1]PETSC ERROR: Petsc Development GIT revision: v3.15.4-783-g168bb9f76b
>>  GIT Date: 2021-09-13 14:01:22 +
>> # [1]PETSC ERROR: ../ex1 on a arch-debug named kanary.kaust.edu.sa by
>> zampins Tue Sep 14 09:31:42 2021
>> # [1]PETSC ERROR: [2]PETSC ERROR: - Error Message
>> --
>> # [2]PETSC ERROR: General MPI error
>> # [2]PETSC ERROR: MPI error 6 MPI_ERR_RANK: invalid rank
>> # [2]PETSC ERROR: See https://petsc.org/release/faq/ for trouble
>> shooting.
>> # [2]PETSC ERROR: Petsc Development GIT revision: v3.15.4-783-g168bb9f76b
>>  GIT 

Re: [petsc-dev] Cannot locate file: share/petsc/datafiles/matrices/small

2021-09-14 Thread Barry Smith

   Have mpi.py turn off the use of MPI windows for these versions of OpenMPI.
Note this is already standard practice; we have the comment "Skip buggy MPICH
versions" and a variety of checks keyed to known-bad versions.

  Barry
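
The check Barry describes belongs in PETSc's configure (mpi.py); purely for
illustration, the same idea can be sketched at compile time with the version
macros OpenMPI defines in mpi.h. This is an editorial sketch of a swapped-in
compile-time variant, and the guard macro PETSC_SKIP_SF_WINDOW is a
hypothetical name, not an existing PETSc symbol.

/* Compile-time sketch of the proposed guard. The real check would live
 * in configure's mpi.py; PETSC_SKIP_SF_WINDOW is a hypothetical name. */
#include <mpi.h>

#if defined(OMPI_MAJOR_VERSION) && (OMPI_MAJOR_VERSION == 4)
/* OpenMPI 4.x: window support reported broken over the vader BTL */
#define PETSC_SKIP_SF_WINDOW 1
#endif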


> On Sep 14, 2021, at 7:17 AM, Matthew Knepley  wrote:
> 
> Okay, we have to send this to OpenMPI. Volunteers?
> 
> Maybe we should note this in the FAQ, or installation, so we remember how to 
> fix it if someone else asks?
> 
>   Thanks,
> 
>  Matt
> 
> On Tue, Sep 14, 2021 at 2:35 AM Stefano Zampini wrote:
> I can reproduce it even with OpenMPI 4.1.1 on a different machine (Ubuntu 18
> + AMD Milan + clang from AOCC) and it is definitely an OpenMPI bug in the
> vader BTL. If I use tcp, everything runs smoothly.
> 
> zampins@kanary:~/Devel/petsc$ cat 
> /home/zampins/local/etc/openmpi-mca-params.conf | grep btl
> btl = tcp,self
> zampins@kanary:~/Devel/petsc$ make -f gmakefile.test 
> vec_is_sf_tutorials-ex1_4 
> Using MAKEFLAGS:
> TEST arch-debug/tests/counts/vec_is_sf_tutorials-ex1_4.counts
>  ok vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-create
>  ok 
> diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-create
>  ok vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-dynamic
>  ok 
> diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-dynamic
>  ok vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-allocate
>  ok 
> diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-allocate
>  ok vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-create
>  ok 
> diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-create
>  ok vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-dynamic
>  ok 
> diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-dynamic
>  ok vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-allocate
>  ok 
> diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-allocate
>  ok vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-create
>  ok diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-create
>  ok vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-dynamic
>  ok 
> diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-dynamic
>  ok vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-allocate
>  ok 
> diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-allocate
> 
> 
> zampins@kanary:~/Devel/petsc$ cat 
> /home/zampins/local/etc/openmpi-mca-params.conf | grep btl
> btl = vader,tcp,self
> zampins@kanary:~/Devel/petsc$ make -f gmakefile.test 
> vec_is_sf_tutorials-ex1_4 
> Using MAKEFLAGS:
> TEST arch-debug/tests/counts/vec_is_sf_tutorials-ex1_4.counts
>  ok vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-create
> not ok 
> diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-create # 
> Error code: 1
> # 43,46c43,46
> # < [0] 0: 4001 2000 2002 3002 4002
> # < [1] 0: 1001 3000
> # < [2] 0: 2001 4000
> # < [3] 0: 3001 1000
> # ---
> # > [0] 0: 2002 2146435072 2 2146435072 38736240
> # > [1] 0: 3000 2146435072
> # > [2] 0: 2001 2146435072
> # > [3] 0: 3001 2146435072
>  ok vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-dynamic
> not ok 
> diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-dynamic 
> # Error code: 1
> # 43,46c43,46
> # < [0] 0: 4001 2000 2002 3002 4002
> # < [1] 0: 1001 3000
> # < [2] 0: 2001 4000
> # < [3] 0: 3001 1000
> # ---
> # > [0] 0: 2002 2146435072 2 2146435072 0
> # > [1] 0: 3000 2146435072
> # > [2] 0: 2001 2146435072
> # > [3] 0: 3001 2146435072
>  ok vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-allocate
>  ok 
> diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-allocate
> # retrying 
> vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-create
> not ok 
> vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-create # 
> Error code: 98
> # [1]PETSC ERROR: - Error Message 
> --
> # [1]PETSC ERROR: General MPI error 
> # [1]PETSC ERROR: MPI error 6 MPI_ERR_RANK: invalid rank
> # [1]PETSC ERROR: See https://petsc.org/release/faq/ 
>  for trouble shooting.
> # [1]PETSC ERROR: Petsc Development GIT revision: v3.15.4-783-g168bb9f76b  
> GIT Date: 2021-09-13 14:01:22 +
> # [1]PETSC ERROR: ../ex1 on a arch-debug named kanary.kaust.edu.sa 
>  by zampins Tue Sep 14 09:31:42 2021
> # [1]PETSC ERROR: [2]PETSC ERROR: - Error Message 
> --
> # [2]PETSC ERROR: General MPI error 
> # [2]PETSC ERROR: MPI error 6 MPI_ERR_RANK: invalid rank
> # [2]PETSC ERROR: See https://petsc.org/release/faq/ 
> 

Re: [petsc-dev] Cannot locate file: share/petsc/datafiles/matrices/small

2021-09-14 Thread Satish Balay via petsc-dev
cc: Antonio

On Tue, 14 Sep 2021, Matthew Knepley wrote:

> Okay, we have to send this to OpenMPI. Volunteers?
> 
> Maybe we should note this in the FAQ, or installation, so we remember how
> to fix it if someone else asks?
> 
>   Thanks,
> 
>  Matt
> 
> On Tue, Sep 14, 2021 at 2:35 AM Stefano Zampini 
> wrote:
> 
> > I can reproduce it even with OpenMPI 4.1.1 on a different machine (Ubuntu
> > 18 + AMD Milan + clang from AOCC) and it is definitely an OpenMPI bug in
> > the vader BTL. If I use tcp, everything runs smoothly.
> >
> > zampins@kanary:~/Devel/petsc$ cat
> > /home/zampins/local/etc/openmpi-mca-params.conf | grep btl
> > btl = tcp,self
> > zampins@kanary:~/Devel/petsc$ make -f gmakefile.test
> > vec_is_sf_tutorials-ex1_4
> > Using MAKEFLAGS:
> > TEST arch-debug/tests/counts/vec_is_sf_tutorials-ex1_4.counts
> >  ok vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-create
> >  ok
> > diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-create
> >  ok vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-dynamic
> >  ok
> > diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-dynamic
> >  ok
> > vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-allocate
> >  ok
> > diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-allocate
> >  ok vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-create
> >  ok
> > diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-create
> >  ok
> > vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-dynamic
> >  ok
> > diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-dynamic
> >  ok
> > vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-allocate
> >  ok
> > diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-allocate
> >  ok vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-create
> >  ok
> > diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-create
> >  ok vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-dynamic
> >  ok
> > diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-dynamic
> >  ok vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-allocate
> >  ok
> > diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-allocate
> >
> >
> > zampins@kanary:~/Devel/petsc$ cat
> > /home/zampins/local/etc/openmpi-mca-params.conf | grep btl
> > btl = vader,tcp,self
> > zampins@kanary:~/Devel/petsc$ make -f gmakefile.test
> > vec_is_sf_tutorials-ex1_4
> > Using MAKEFLAGS:
> > TEST arch-debug/tests/counts/vec_is_sf_tutorials-ex1_4.counts
> >  ok vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-create
> > not ok
> > diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-create
> > # Error code: 1
> > # 43,46c43,46
> > # < [0] 0: 4001 2000 2002 3002 4002
> > # < [1] 0: 1001 3000
> > # < [2] 0: 2001 4000
> > # < [3] 0: 3001 1000
> > # ---
> > # > [0] 0: 2002 2146435072 2 2146435072 38736240
> > # > [1] 0: 3000 2146435072
> > # > [2] 0: 2001 2146435072
> > # > [3] 0: 3001 2146435072
> >  ok vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-dynamic
> > not ok
> > diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-dynamic
> > # Error code: 1
> > # 43,46c43,46
> > # < [0] 0: 4001 2000 2002 3002 4002
> > # < [1] 0: 1001 3000
> > # < [2] 0: 2001 4000
> > # < [3] 0: 3001 1000
> > # ---
> > # > [0] 0: 2002 2146435072 2 2146435072 0
> > # > [1] 0: 3000 2146435072
> > # > [2] 0: 2001 2146435072
> > # > [3] 0: 3001 2146435072
> >  ok
> > vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-allocate
> >  ok
> > diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-allocate
> > # retrying
> > vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-create
> > not ok
> > vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-create #
> > Error code: 98
> > # [1]PETSC ERROR: - Error Message
> > --
> > # [1]PETSC ERROR: General MPI error
> > # [1]PETSC ERROR: MPI error 6 MPI_ERR_RANK: invalid rank
> > # [1]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
> > # [1]PETSC ERROR: Petsc Development GIT revision: v3.15.4-783-g168bb9f76b
> >  GIT Date: 2021-09-13 14:01:22 +
> > # [1]PETSC ERROR: ../ex1 on a arch-debug named kanary.kaust.edu.sa by
> > zampins Tue Sep 14 09:31:42 2021
> > # [1]PETSC ERROR: [2]PETSC ERROR: - Error Message
> > --
> > # [2]PETSC ERROR: General MPI error
> > # [2]PETSC ERROR: MPI error 6 MPI_ERR_RANK: invalid rank
> > # [2]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
> > # [2]PETSC ERROR: Petsc Development GIT revision: v3.15.4-783-g168bb9f76b
> >  GIT Date: 2021-09-13 14:01:22 +
> > # 

Re: [petsc-dev] Cannot locate file: share/petsc/datafiles/matrices/small

2021-09-14 Thread Matthew Knepley
Okay, we have to send this to OpenMPI. Volunteers?

Maybe we should note this in the FAQ, or installation, so we remember how
to fix it if someone else asks?

  Thanks,

 Matt

On Tue, Sep 14, 2021 at 2:35 AM Stefano Zampini 
wrote:

> I can reproduce it even with OpenMPI 4.1.1 on a different machine (Ubuntu
> 18 + AMD Milan + clang from AOCC) and it is definitely an OpenMPI bug in
> the vader BTL. If I use tcp, everything runs smoothly.
>
> zampins@kanary:~/Devel/petsc$ cat
> /home/zampins/local/etc/openmpi-mca-params.conf | grep btl
> btl = tcp,self
> zampins@kanary:~/Devel/petsc$ make -f gmakefile.test
> vec_is_sf_tutorials-ex1_4
> Using MAKEFLAGS:
> TEST arch-debug/tests/counts/vec_is_sf_tutorials-ex1_4.counts
>  ok vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-create
>  ok
> diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-create
>  ok vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-dynamic
>  ok
> diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-dynamic
>  ok
> vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-allocate
>  ok
> diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-allocate
>  ok vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-create
>  ok
> diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-create
>  ok
> vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-dynamic
>  ok
> diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-dynamic
>  ok
> vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-allocate
>  ok
> diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-allocate
>  ok vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-create
>  ok
> diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-create
>  ok vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-dynamic
>  ok
> diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-dynamic
>  ok vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-allocate
>  ok
> diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-allocate
>
>
> zampins@kanary:~/Devel/petsc$ cat
> /home/zampins/local/etc/openmpi-mca-params.conf | grep btl
> btl = vader,tcp,self
> zampins@kanary:~/Devel/petsc$ make -f gmakefile.test
> vec_is_sf_tutorials-ex1_4
> Using MAKEFLAGS:
> TEST arch-debug/tests/counts/vec_is_sf_tutorials-ex1_4.counts
>  ok vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-create
> not ok
> diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-create
> # Error code: 1
> # 43,46c43,46
> # < [0] 0: 4001 2000 2002 3002 4002
> # < [1] 0: 1001 3000
> # < [2] 0: 2001 4000
> # < [3] 0: 3001 1000
> # ---
> # > [0] 0: 2002 2146435072 2 2146435072 38736240
> # > [1] 0: 3000 2146435072
> # > [2] 0: 2001 2146435072
> # > [3] 0: 3001 2146435072
>  ok vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-dynamic
> not ok
> diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-dynamic
> # Error code: 1
> # 43,46c43,46
> # < [0] 0: 4001 2000 2002 3002 4002
> # < [1] 0: 1001 3000
> # < [2] 0: 2001 4000
> # < [3] 0: 3001 1000
> # ---
> # > [0] 0: 2002 2146435072 2 2146435072 0
> # > [1] 0: 3000 2146435072
> # > [2] 0: 2001 2146435072
> # > [3] 0: 3001 2146435072
>  ok
> vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-allocate
>  ok
> diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-allocate
> # retrying
> vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-create
> not ok
> vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-create #
> Error code: 98
> # [1]PETSC ERROR: - Error Message
> --
> # [1]PETSC ERROR: General MPI error
> # [1]PETSC ERROR: MPI error 6 MPI_ERR_RANK: invalid rank
> # [1]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
> # [1]PETSC ERROR: Petsc Development GIT revision: v3.15.4-783-g168bb9f76b
>  GIT Date: 2021-09-13 14:01:22 +
> # [1]PETSC ERROR: ../ex1 on a arch-debug named kanary.kaust.edu.sa by
> zampins Tue Sep 14 09:31:42 2021
> # [1]PETSC ERROR: [2]PETSC ERROR: - Error Message
> --
> # [2]PETSC ERROR: General MPI error
> # [2]PETSC ERROR: MPI error 6 MPI_ERR_RANK: invalid rank
> # [2]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
> # [2]PETSC ERROR: Petsc Development GIT revision: v3.15.4-783-g168bb9f76b
>  GIT Date: 2021-09-13 14:01:22 +
> # [2]PETSC ERROR: ../ex1 on a arch-debug named kanary.kaust.edu.sa by
> zampins Tue Sep 14 09:31:42 2021
> # [2]PETSC ERROR: Configure options
> --with-cc=/home/zampins/local/bin/mpicc --with-cxx-dialect=c++14
> --with-cxx=/home/zampins/local/bin/mpicxx --with-debugging=1
> 

Re: [petsc-dev] Cannot locate file: share/petsc/datafiles/matrices/small

2021-09-14 Thread Stefano Zampini
I can reproduce it even with OpenMPI 4.1.1 on a different machine (Ubuntu
18 + AMD Milan + clang from AOCC) and it is definitely an OpenMPI bug in
the vader BTL. If I use tcp, everything runs smoothly.

zampins@kanary:~/Devel/petsc$ cat
/home/zampins/local/etc/openmpi-mca-params.conf | grep btl
btl = tcp,self
zampins@kanary:~/Devel/petsc$ make -f gmakefile.test
vec_is_sf_tutorials-ex1_4
Using MAKEFLAGS:
TEST arch-debug/tests/counts/vec_is_sf_tutorials-ex1_4.counts
 ok vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-create
 ok
diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-create
 ok vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-dynamic
 ok
diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-dynamic
 ok vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-allocate
 ok
diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-allocate
 ok vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-create
 ok
diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-create
 ok vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-dynamic
 ok
diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-dynamic
 ok
vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-allocate
 ok
diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-allocate
 ok vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-create
 ok
diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-create
 ok vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-dynamic
 ok
diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-dynamic
 ok vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-allocate
 ok
diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-allocate


zampins@kanary:~/Devel/petsc$ cat
/home/zampins/local/etc/openmpi-mca-params.conf | grep btl
btl = vader,tcp,self
zampins@kanary:~/Devel/petsc$ make -f gmakefile.test
vec_is_sf_tutorials-ex1_4
Using MAKEFLAGS:
TEST arch-debug/tests/counts/vec_is_sf_tutorials-ex1_4.counts
 ok vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-create
not ok
diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-create
# Error code: 1
# 43,46c43,46
# < [0] 0: 4001 2000 2002 3002 4002
# < [1] 0: 1001 3000
# < [2] 0: 2001 4000
# < [3] 0: 3001 1000
# ---
# > [0] 0: 2002 2146435072 2 2146435072 38736240
# > [1] 0: 3000 2146435072
# > [2] 0: 2001 2146435072
# > [3] 0: 3001 2146435072
 ok vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-dynamic
not ok
diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-dynamic
# Error code: 1
# 43,46c43,46
# < [0] 0: 4001 2000 2002 3002 4002
# < [1] 0: 1001 3000
# < [2] 0: 2001 4000
# < [3] 0: 3001 1000
# ---
# > [0] 0: 2002 2146435072 2 2146435072 0
# > [1] 0: 3000 2146435072
# > [2] 0: 2001 2146435072
# > [3] 0: 3001 2146435072
 ok vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-allocate
 ok
diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-allocate
# retrying
vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-create
not ok
vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-create #
Error code: 98
# [1]PETSC ERROR: - Error Message
--
# [1]PETSC ERROR: General MPI error
# [1]PETSC ERROR: MPI error 6 MPI_ERR_RANK: invalid rank
# [1]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
# [1]PETSC ERROR: Petsc Development GIT revision: v3.15.4-783-g168bb9f76b
 GIT Date: 2021-09-13 14:01:22 +
# [1]PETSC ERROR: ../ex1 on a arch-debug named kanary.kaust.edu.sa by
zampins Tue Sep 14 09:31:42 2021
# [1]PETSC ERROR: [2]PETSC ERROR: - Error Message
--
# [2]PETSC ERROR: General MPI error
# [2]PETSC ERROR: MPI error 6 MPI_ERR_RANK: invalid rank
# [2]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
# [2]PETSC ERROR: Petsc Development GIT revision: v3.15.4-783-g168bb9f76b
 GIT Date: 2021-09-13 14:01:22 +
# [2]PETSC ERROR: ../ex1 on a arch-debug named kanary.kaust.edu.sa by
zampins Tue Sep 14 09:31:42 2021
# [2]PETSC ERROR: Configure options --with-cc=/home/zampins/local/bin/mpicc
--with-cxx-dialect=c++14 --with-cxx=/home/zampins/local/bin/mpicxx
--with-debugging=1 --with-fc=/home/zampins/local/bin/mpifort
--with-fortran-bindings=0 --with-hip-dir=/opt/rocm --with-hip=1
--with-hypre-dir=/home/zampins/local-petsc
--with-kokkos-dir=/home/zampins/local-petsc
--with-kokkos-kernels-dir=/home/zampins/local-petsc
--with-blaslapack-include=/home/zampins/local-aocl/aocc/3.0-6/include

Re: [petsc-dev] Cannot locate file: share/petsc/datafiles/matrices/small

2021-09-13 Thread Stefano Zampini
I'll see if I can reproduce

On Tue, Sep 14, 2021 at 06:58, Junchao Zhang wrote:

> Hi, Stefano,
>    Pinging you again to see if you want to resolve this problem before
> petsc-3.16.
>
> --Junchao Zhang
>
>
> On Sun, Sep 12, 2021 at 3:06 PM Antonio T. sagitter <
> sagit...@fedoraproject.org> wrote:
>
>> Unfortunately, it's not possible. I must use the OpenMPI provided by the
>> Fedora build system (these rpm builds of PETSc are for Fedora's
>> repositories); downloading external software is not permitted.
>>
>> On 9/12/21 21:10, Pierre Jolivet wrote:
>> >
>> >> On 12 Sep 2021, at 8:56 PM, Matthew Knepley wrote:
>> >>
>> >> On Sun, Sep 12, 2021 at 2:49 PM Antonio T. sagitter
>> >> <sagit...@fedoraproject.org> wrote:
>> >>
>> >> Those attached are configure.log/make.log from an MPI build in
>> >> Fedora 34
>> >> x86_64 where the error below occurred.
>> >>
>> >>
>> >> This is OpenMPI 4.1.0. Is that the only MPI you build? My first
>> >> inclination is that this is an MPI implementation bug.
>> >>
>> >> Junchao, do we have an OpenMPI build in the CI?
>> >
>> > config/examples/arch-ci-linux-cuda-double-64idx.py:
>> >   '--download-openmpi=1',
>> > config/examples/arch-ci-linux-pkgs-dbg-ftn-interfaces.py:
>> >   '--download-openmpi=1',
>> > config/examples/arch-ci-linux-pkgs-opt.py:  '--download-openmpi=1',
>> >
>> > config/BuildSystem/config/packages/OpenMPI.py uses version 4.1.0 as
>> well.
>> > I’m not sure PETSc is to blame here, Antonio. You may want to try to
>> > ditch the OpenMPI shipped by your package manager and try
>> > --download-openmpi as well, just for a quick sanity check.
>> >
>> > Thanks,
>> > Pierre
>> >
>>
>> --
>> ---
>> Antonio Trande
>> Fedora Project
>> mailto: sagit...@fedoraproject.org
>> GPG key: 0x29FBC85D7A51CC2F
>> GPG key server: https://keyserver1.pgp.com/
>>
>


Re: [petsc-dev] Cannot locate file: share/petsc/datafiles/matrices/small

2021-09-13 Thread Junchao Zhang
Hi, Stefano,
   Pinging you again to see if you want to resolve this problem before
petsc-3.16.

--Junchao Zhang


On Sun, Sep 12, 2021 at 3:06 PM Antonio T. sagitter <
sagit...@fedoraproject.org> wrote:

> Unfortunately, it's not possible. I must use the OpenMPI provided by the
> Fedora build system (these rpm builds of PETSc are for Fedora's
> repositories); downloading external software is not permitted.
>
> On 9/12/21 21:10, Pierre Jolivet wrote:
> >
> >> On 12 Sep 2021, at 8:56 PM, Matthew Knepley wrote:
> >>
> >> On Sun, Sep 12, 2021 at 2:49 PM Antonio T. sagitter
> >> <sagit...@fedoraproject.org> wrote:
> >>
> >> Those attached are configure.log/make.log from an MPI build in
> >> Fedora 34
> >> x86_64 where the error below occurred.
> >>
> >>
> >> This is OpenMPI 4.1.0. Is that the only MPI you build? My first
> >> inclination is that this is an MPI implementation bug.
> >>
> >> Junchao, do we have an OpenMPI build in the CI?
> >
> > config/examples/arch-ci-linux-cuda-double-64idx.py:
> >   '--download-openmpi=1',
> > config/examples/arch-ci-linux-pkgs-dbg-ftn-interfaces.py:
> >   '--download-openmpi=1',
> > config/examples/arch-ci-linux-pkgs-opt.py:  '--download-openmpi=1',
> >
> > config/BuildSystem/config/packages/OpenMPI.py uses version 4.1.0 as well.
> > I’m not sure PETSc is to blame here, Antonio. You may want to try to
> > ditch the OpenMPI shipped by your package manager and try
> > --download-openmpi as well, just for a quick sanity check.
> >
> > Thanks,
> > Pierre
> >
>
> --
> ---
> Antonio Trande
> Fedora Project
> mailto: sagit...@fedoraproject.org
> GPG key: 0x29FBC85D7A51CC2F
> GPG key server: https://keyserver1.pgp.com/
>


Re: [petsc-dev] Cannot locate file: share/petsc/datafiles/matrices/small

2021-09-12 Thread Antonio T. sagitter
Unfortunately, it's not possible. I must use the OpenMPI provided by the
Fedora build system (these rpm builds of PETSc are for Fedora's
repositories); downloading external software is not permitted.


On 9/12/21 21:10, Pierre Jolivet wrote:


On 12 Sep 2021, at 8:56 PM, Matthew Knepley wrote:


On Sun, Sep 12, 2021 at 2:49 PM Antonio T. sagitter 
<sagit...@fedoraproject.org> wrote:


Those attached are configure.log/make.log from an MPI build in
Fedora 34
x86_64 where the error below occurred.


This is OpenMPI 4.1.0. Is that the only MPI you build? My first 
inclination is that this is an MPI implementation bug.


Junchao, do we have an OpenMPI build in the CI?


config/examples/arch-ci-linux-cuda-double-64idx.py:   
  '--download-openmpi=1',
config/examples/arch-ci-linux-pkgs-dbg-ftn-interfaces.py: 
  '--download-openmpi=1',

config/examples/arch-ci-linux-pkgs-opt.py:  '--download-openmpi=1',

config/BuildSystem/config/packages/OpenMPI.py uses version 4.1.0 as well.
I’m not sure PETSc is to blame here, Antonio. You may want to try to
ditch the OpenMPI shipped by your package manager and try
--download-openmpi as well, just for a quick sanity check.


Thanks,
Pierre



--
---
Antonio Trande
Fedora Project
mailto: sagit...@fedoraproject.org
GPG key: 0x29FBC85D7A51CC2F
GPG key server: https://keyserver1.pgp.com/


Re: [petsc-dev] Cannot locate file: share/petsc/datafiles/matrices/small

2021-09-12 Thread Junchao Zhang
An old issue with SF_Window is at
https://gitlab.com/petsc/petsc/-/issues/555, though that is a different
error.

--Junchao Zhang


On Sun, Sep 12, 2021 at 2:20 PM Junchao Zhang 
wrote:

> We have met SF + windows errors before. Stefano wrote the code, which I don't
> think was worth doing. SF with MPI one-sided is hard to get correct (due to
> shared-memory programming), performs poorly, and no users use it.
> I would suggest we just disable the test and the feature? Stefano, what do
> you think?
>
> --Junchao Zhang
>
>
> On Sun, Sep 12, 2021 at 2:10 PM Pierre Jolivet  wrote:
>
>>
>> On 12 Sep 2021, at 8:56 PM, Matthew Knepley  wrote:
>>
>> On Sun, Sep 12, 2021 at 2:49 PM Antonio T. sagitter <
>> sagit...@fedoraproject.org> wrote:
>>
>>> Those attached are configure.log/make.log from an MPI build in Fedora 34
>>> x86_64 where the error below occurred.
>>>
>>
>> This is OpenMPI 4.1.0. Is that the only MPI you build? My first
>> inclination is that this is an MPI implementation bug.
>>
>> Junchao, do we have an OpenMPI build in the CI?
>>
>>
>> config/examples/arch-ci-linux-cuda-double-64idx.py:
>>  '--download-openmpi=1',
>> config/examples/arch-ci-linux-pkgs-dbg-ftn-interfaces.py:
>>  '--download-openmpi=1',
>> config/examples/arch-ci-linux-pkgs-opt.py:  '--download-openmpi=1',
>>
>> config/BuildSystem/config/packages/OpenMPI.py uses version 4.1.0 as well.
>> I’m not sure PETSc is to blame here, Antonio. You may want to try to ditch
>> the OpenMPI shipped by your package manager and try --download-openmpi as
>> well, just for a quick sanity check.
>>
>> Thanks,
>> Pierre
>>
>>   Thanks,
>>
>>  Matt
>>
>>
>>> On 9/12/21 19:18, Antonio T. sagitter wrote:
>>> > Okay. I will try to set the DATAFILESPATH options correctly.
>>> >
>>> > I also see this error:
>>> >
>>> > not ok
>>> > vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-create
>>> #
>>> > Error code: 68
>>> >
>>> > #PetscSF Object: 4 MPI processes
>>> >
>>> > #  type: window
>>> >
>>> > #  [0] Number of roots=3, leaves=2, remote ranks=2
>>> >
>>> > #  [0] 0 <- (3,1)
>>> >
>>> > #  [0] 1 <- (1,0)
>>> >
>>> > #  [1] Number of roots=2, leaves=3, remote ranks=2
>>> >
>>> > #  [1] 0 <- (0,1)
>>> >
>>> > #  [1] 1 <- (2,0)
>>> >
>>> > #  [1] 2 <- (0,2)
>>> >
>>> > #  [2] Number of roots=2, leaves=3, remote ranks=3
>>> >
>>> > #  [2] 0 <- (1,1)
>>> >
>>> > #  [2] 1 <- (3,0)
>>> >
>>> > #  [2] 2 <- (0,2)
>>> >
>>> > #  [3] Number of roots=2, leaves=3, remote ranks=2
>>> >
>>> > #  [3] 0 <- (2,1)
>>> >
>>> > #  [3] 1 <- (0,0)
>>> >
>>> > #  [3] 2 <- (0,2)
>>> >
>>> > #  [0] Roots referenced by my leaves, by rank
>>> >
>>> > #  [0] 1: 1 edges
>>> >
>>> > #  [0]1 <- 0
>>> >
>>> > #  [0] 3: 1 edges
>>> >
>>> > #  [0]0 <- 1
>>> >
>>> > #  [1] Roots referenced by my leaves, by rank
>>> >
>>> > #  [1] 0: 2 edges
>>> >
>>> > #  [1]0 <- 1
>>> >
>>> > #  [1]2 <- 2
>>> >
>>> > #  [1] 2: 1 edges
>>> >
>>> > #  [1]1 <- 0
>>> >
>>> > #  [2] Roots referenced by my leaves, by rank
>>> >
>>> > #  [2] 0: 1 edges
>>> >
>>> > #  [2]2 <- 2
>>> >
>>> > #  [2] 1: 1 edges
>>> >
>>> > #  [2]0 <- 1
>>> >
>>> > #  [2] 3: 1 edges
>>> >
>>> > #  [2]1 <- 0
>>> >
>>> > #  [3] Roots referenced by my leaves, by rank
>>> >
>>> > #  [3] 0: 2 edges
>>> >
>>> > #  [3]1 <- 0
>>> >
>>> > #  [3]2 <- 2
>>> >
>>> > #  [3] 2: 1 edges
>>> >
>>> > #  [3]0 <- 1
>>> >
>>> > #  current flavor=CREATE synchronization=FENCE MultiSF
>>> sort=rank-order
>>> >
>>> > #current info=MPI_INFO_NULL
>>> >
>>> > #[buildhw-x86-09:1135574] *** An error occurred in MPI_Accumulate
>>> >
>>> > #[buildhw-x86-09:1135574] *** reported by process [3562602497,3]
>>> >
>>> > #[buildhw-x86-09:1135574] *** on win rdma window 4
>>> >
>>> > #[buildhw-x86-09:1135574] *** MPI_ERR_RMA_RANGE: invalid RMA
>>> address
>>> > range
>>> >
>>> > #[buildhw-x86-09:1135574] *** MPI_ERRORS_ARE_FATAL (processes in
>>> > this win will now abort,
>>> >
>>> > #[buildhw-x86-09:1135574] ***and potentially your MPI job)
>>> >
>>> > #[buildhw-x86-09.iad2.fedoraproject.org:1135567] 3 more processes
>>> > have sent help message help-mpi-errors.txt / mpi_errors_are_fatal
>>> >
>>> > #[buildhw-x86-09.iad2.fedoraproject.org:1135567] Set MCA
>>> parameter
>>> > "orte_base_help_aggregate" to 0 to see all help / error messages
>>> >
>>> > Looks like an error related to OpenMPI-4*:
>>> > https://github.com/open-mpi/ompi/issues/6374
>>> >
>>>
>>> --
>>> ---
>>> Antonio Trande
>>> Fedora Project
>>> mailto: sagit...@fedoraproject.org
>>> GPG key: 0x29FBC85D7A51CC2F
>>> GPG key server: https://keyserver1.pgp.com/
>>
>>
>>
>> --
>> What most experimenters take for granted before they begin their
>> experiments is infinitely more interesting than any results to which their
>> experiments lead.
>> -- 

Re: [petsc-dev] Cannot locate file: share/petsc/datafiles/matrices/small

2021-09-12 Thread Junchao Zhang
We have met SF + windows errors before. Stefano wrote the code, which I don't
think was worth doing. SF with MPI one-sided is hard to get correct (due to
shared-memory programming), performs poorly, and no users use it.
I would suggest we just disable the test and the feature? Stefano, what do
you think?

--Junchao Zhang
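
For readers following the test names: sf_window_flavor-{create,dynamic,
allocate} presumably map onto the three standard MPI-3 ways of building a
window, sketched below. This is an editorial example; the mapping to
PetscSF's window type is an assumption.

/* The three window-creation "flavors" named in the failing tests.
 * Editorial sketch; assumes PetscSF's flavors map onto these calls. */
#include <mpi.h>

int main(int argc, char **argv)
{
  MPI_Win win;
  int buf[4] = {0};
  void *base;

  MPI_Init(&argc, &argv);

  /* flavor=create: expose a user-supplied buffer */
  MPI_Win_create(buf, sizeof(buf), sizeof(int), MPI_INFO_NULL,
                 MPI_COMM_WORLD, &win);
  MPI_Win_free(&win);

  /* flavor=allocate: MPI allocates the exposed memory itself */
  MPI_Win_allocate(sizeof(buf), sizeof(int), MPI_INFO_NULL,
                   MPI_COMM_WORLD, &base, &win);
  MPI_Win_free(&win);

  /* flavor=dynamic: empty window first, memory attached afterwards */
  MPI_Win_create_dynamic(MPI_INFO_NULL, MPI_COMM_WORLD, &win);
  MPI_Win_attach(win, buf, sizeof(buf));
  MPI_Win_detach(win, buf);
  MPI_Win_free(&win);

  MPI_Finalize();
  return 0;
}

The sync variants in the same test names (fence, active, lock) presumably
correspond to MPI_Win_fence, MPI_Win_start/post/complete/wait, and
MPI_Win_lock/unlock.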


On Sun, Sep 12, 2021 at 2:10 PM Pierre Jolivet  wrote:

>
> On 12 Sep 2021, at 8:56 PM, Matthew Knepley  wrote:
>
> On Sun, Sep 12, 2021 at 2:49 PM Antonio T. sagitter <
> sagit...@fedoraproject.org> wrote:
>
>> Those attached are configure.log/make.log from an MPI build in Fedora 34
>> x86_64 where the error below occurred.
>>
>
> This is OpenMPI 4.1.0. Is that the only MPI you build? My first
> inclination is that this is an MPI implementation bug.
>
> Junchao, do we have an OpenMPI build in the CI?
>
>
> config/examples/arch-ci-linux-cuda-double-64idx.py:
>  '--download-openmpi=1',
> config/examples/arch-ci-linux-pkgs-dbg-ftn-interfaces.py:
>  '--download-openmpi=1',
> config/examples/arch-ci-linux-pkgs-opt.py:  '--download-openmpi=1',
>
> config/BuildSystem/config/packages/OpenMPI.py uses version 4.1.0 as well.
> I’m not sure PETSc is to blame here, Antonio. You may want to try to ditch
> the OpenMPI shipped by your package manager and try --download-openmpi as
> well, just for a quick sanity check.
>
> Thanks,
> Pierre
>
>   Thanks,
>
>  Matt
>
>
>> On 9/12/21 19:18, Antonio T. sagitter wrote:
>> > Okay. I will try to set the DATAFILESPATH options correctly.
>> >
>> > I also see this error:
>> >
>> > not ok vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-create # Error code: 68
>> > [... full PetscSF window test output snipped; the complete log is in the original report below ...]
>> > #[buildhw-x86-09:1135574] *** An error occurred in MPI_Accumulate
>> > #[buildhw-x86-09:1135574] *** MPI_ERR_RMA_RANGE: invalid RMA address range
>> >
>> > Looks like an error related to OpenMPI-4*:
>> > https://github.com/open-mpi/ompi/issues/6374
>> >
>>
>> --
>> ---
>> Antonio Trande
>> Fedora Project
>> mailto: sagit...@fedoraproject.org
>> GPG key: 0x29FBC85D7A51CC2F
>> GPG key server: https://keyserver1.pgp.com/
>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
> 
>
>
>


Re: [petsc-dev] Cannot locate file: share/petsc/datafiles/matrices/small

2021-09-12 Thread Pierre Jolivet

> On 12 Sep 2021, at 8:56 PM, Matthew Knepley  wrote:
> 
> On Sun, Sep 12, 2021 at 2:49 PM Antonio T. sagitter <sagit...@fedoraproject.org> wrote:
> Those attached are configure.log/make.log from an MPI build in Fedora 34
> x86_64 where the error below occurred.
> 
> This is OpenMPI 4.1.0. Is that the only MPI you build? My first inclination 
> is that this is an MPI implementation bug.
> 
> Junchao, do we have an OpenMPI build in the CI?

config/examples/arch-ci-linux-cuda-double-64idx.py: '--download-openmpi=1',
config/examples/arch-ci-linux-pkgs-dbg-ftn-interfaces.py: '--download-openmpi=1',
config/examples/arch-ci-linux-pkgs-opt.py:  '--download-openmpi=1',

config/BuildSystem/config/packages/OpenMPI.py uses version 4.1.0 as well.
I’m not sure PETSc is to blame here, Antonio. You may want to ditch the
OpenMPI shipped by your package manager and try --download-openmpi as well,
just for a quick sanity check.
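
[A sketch of that sanity check; the bracketed part is a placeholder for
whatever options you already pass to configure:]

$ ./configure --download-openmpi=1 [your usual configure options]
$ make all
$ make check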

Thanks,
Pierre

>   Thanks,
> 
>  Matt
>  
> On 9/12/21 19:18, Antonio T. sagitter wrote:
> > Okay. I will try to set the DATAFILESPATH option correctly.
> > 
> > I also see this error:
> > 
> > not ok vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-create # Error code: 68
> > [... full PetscSF window test output snipped; the complete log is in the original report below ...]
> > #[buildhw-x86-09:1135574] *** An error occurred in MPI_Accumulate
> > #[buildhw-x86-09:1135574] *** MPI_ERR_RMA_RANGE: invalid RMA address range
> > 
> > Looks like an error related to OpenMPI-4*:
> > https://github.com/open-mpi/ompi/issues/6374
> 
> -- 
> ---
> Antonio Trande
> Fedora Project
> mailto: sagit...@fedoraproject.org 
> GPG key: 0x29FBC85D7A51CC2F
> GPG key server: https://keyserver1.pgp.com/ 
> 
> -- 
> What most experimenters take for granted before they begin their experiments 
> is infinitely more interesting than any results to which their experiments 
> lead.
> -- Norbert Wiener
> 
> https://www.cse.buffalo.edu/~knepley/ 



Re: [petsc-dev] Cannot locate file: share/petsc/datafiles/matrices/small

2021-09-12 Thread Matthew Knepley
On Sun, Sep 12, 2021 at 2:49 PM Antonio T. sagitter <
sagit...@fedoraproject.org> wrote:

> Those attached are configure.log/make.log from an MPI build in Fedora 34
> x86_64 where the error below occurred.
>

This is OpenMPI 4.1.0. Is that the only MPI you build? My first inclination
is that this is an MPI implementation bug.

Junchao, do we have an OpenMPI build in the CI?

  Thanks,

 Matt


> On 9/12/21 19:18, Antonio T. sagitter wrote:
> > Okay. I will try to set the DATAFILESPATH option correctly.
> >
> > I also see this error:
> >
> > not ok vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-create # Error code: 68
> > [... full PetscSF window test output snipped; the complete log is in the original report below ...]
> > #[buildhw-x86-09:1135574] *** An error occurred in MPI_Accumulate
> > #[buildhw-x86-09:1135574] *** MPI_ERR_RMA_RANGE: invalid RMA address range
> >
> > Looks like an error related to OpenMPI-4*:
> > https://github.com/open-mpi/ompi/issues/6374
> >
>
> --
> ---
> Antonio Trande
> Fedora Project
> mailto: sagit...@fedoraproject.org
> GPG key: 0x29FBC85D7A51CC2F
> GPG key server: https://keyserver1.pgp.com/



-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/ 


Re: [petsc-dev] Cannot locate file: share/petsc/datafiles/matrices/small

2021-09-12 Thread Pierre Jolivet
Could you please forward configure.log to petsc-ma...@mcs.anl.gov?
It’ll help the SF people figure out if it’s a false positive or if something 
needs fixing.

Thanks,
Pierre

> On 12 Sep 2021, at 7:33 PM, Antonio T. sagitter  
> wrote:
> 
> On 9/12/21 19:29, Pierre Jolivet wrote:
>> In your configure.log, I see --with-mpi=0, so I’m surprised this is even 
>> running, and even more surprised that you are encountering OpenMPI errors.
>> Is this from a different build?
> 
> Yes, sorry. This is another build with MPI.
> 
>> Thanks,
>> Pierre
>>> On 12 Sep 2021, at 7:18 PM, Antonio T. sagitter 
>>>  wrote:
>>> 
>>> Okay. I will try to set the DATAFILESPATH option correctly.
>>> 
>>> I also see this error:
>>> 
>>> not ok vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-create # Error code: 68
>>> [... full PetscSF window test output snipped; the complete log is in the original report below ...]
>>> #   [buildhw-x86-09:1135574] *** An error occurred in MPI_Accumulate
>>> #   [buildhw-x86-09:1135574] *** MPI_ERR_RMA_RANGE: invalid RMA address range
>>> 
>>> Looks like an error related to OpenMPI-4*:
>>> https://github.com/open-mpi/ompi/issues/6374
>>> 
>>> Thank you.
>>> 
>>> On 9/12/21 18:09, Pierre Jolivet wrote:
 Hello,
 Did you copy the files from https://gitlab.com/petsc/datafiles before the make check?
 cfd.1.10 is not part of https://gitlab.com/petsc/petsc, so you either need to copy the files from the datafiles repository, or remove the --DATAFILESPATH=/home/sagitter/rpmbuild/BUILD/petsc-3.15.4/petsc-3.15.4/share/petsc/datafiles configure option.
 Thanks,
 Pierre
> On 12 Sep 2021, at 6:02 PM, Antonio T. sagitter <sagit...@fedoraproject.org> wrote:
> 
> Hi all.
> 
> This error is repeated when the tests are compiled in PETSc-3.15.4 
> compilation:
> https://paste.in/MhdNjf 
> 
> This happens in Fedora 34 x86_64 with gcc-11.2.1
> 
> make.log and configure.log are attached
> -- 
> ---
> Antonio Trande
> Fedora Project
> mailto: sagit...@fedoraproject.org
> GPG key: 0x29FBC85D7A51CC2F
> GPG key server: https://keyserver1.pgp.com/
> 
>>> 
>>> -- 
>>> ---
>>> Antonio Trande
>>> Fedora Project
>>> mailto: sagit...@fedoraproject.org
>>> GPG key: 0x29FBC85D7A51CC2F
>>> GPG key server: https://keyserver1.pgp.com/
> 
> -- 
> ---
> Antonio Trande
> Fedora Project
> mailto: sagit...@fedoraproject.org
> GPG key: 0x29FBC85D7A51CC2F
> GPG key server: https://keyserver1.pgp.com/



Re: [petsc-dev] Cannot locate file: share/petsc/datafiles/matrices/small

2021-09-12 Thread Antonio T. sagitter

On 9/12/21 19:29, Pierre Jolivet wrote:

In your configure.log, I see --with-mpi=0, so I’m surprised this is even 
running, and even more surprised that you are encountering OpenMPI errors.
Is this from a different build?



Yes, sorry. This is another build with MPI.


Thanks,
Pierre


On 12 Sep 2021, at 7:18 PM, Antonio T. sagitter  
wrote:

Okay. I will try to set the DATAFILESPATH option correctly.

I also see this error:

not ok vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-create # Error code: 68
[... full PetscSF window test output snipped; the complete log is in the original report below ...]
#   [buildhw-x86-09:1135574] *** An error occurred in MPI_Accumulate
#   [buildhw-x86-09:1135574] *** MPI_ERR_RMA_RANGE: invalid RMA address range

Looks like an error related to OpenMPI-4*:
https://github.com/open-mpi/ompi/issues/6374

Thank you.

On 9/12/21 18:09, Pierre Jolivet wrote:

Hello,
Did you copy the files from https://gitlab.com/petsc/datafiles before the make check?
cfd.1.10 is not part of https://gitlab.com/petsc/petsc, so you either need to copy the files from the datafiles repository, or remove the --DATAFILESPATH=/home/sagitter/rpmbuild/BUILD/petsc-3.15.4/petsc-3.15.4/share/petsc/datafiles configure option.
Thanks,
Pierre

On 12 Sep 2021, at 6:02 PM, Antonio T. sagitter <sagit...@fedoraproject.org> wrote:

Hi all.

This error is repeated when the tests are compiled in PETSc-3.15.4 compilation:
https://paste.in/MhdNjf 

This happens in Fedora 34 x86_64 with gcc-11.2.1

make.log and configure.log are attached
--
---
Antonio Trande
Fedora Project
mailto: sagit...@fedoraproject.org
GPG key: 0x29FBC85D7A51CC2F
GPG key server: https://keyserver1.pgp.com/



--
---
Antonio Trande
Fedora Project
mailto: sagit...@fedoraproject.org
GPG key: 0x29FBC85D7A51CC2F
GPG key server: https://keyserver1.pgp.com/




--
---
Antonio Trande
Fedora Project
mailto: sagit...@fedoraproject.org
GPG key: 0x29FBC85D7A51CC2F
GPG key server: https://keyserver1.pgp.com/


Re: [petsc-dev] Cannot locate file: share/petsc/datafiles/matrices/small

2021-09-12 Thread Pierre Jolivet
In your configure.log, I see --with-mpi=0, so I’m surprised this is even 
running, and even more surprised that you are encountering OpenMPI errors.
Is this from a different build?

Thanks,
Pierre

> On 12 Sep 2021, at 7:18 PM, Antonio T. sagitter  
> wrote:
> 
> Okay. I will try to set the DATAFILESPATH option correctly.
> 
> I also see this error:
> 
> not ok vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-create # Error code: 68
> [... full PetscSF window test output snipped; the complete log is in the original report below ...]
> # [buildhw-x86-09:1135574] *** An error occurred in MPI_Accumulate
> # [buildhw-x86-09:1135574] *** MPI_ERR_RMA_RANGE: invalid RMA address range
> 
> Looks like an error related to OpenMPI-4*:
> https://github.com/open-mpi/ompi/issues/6374
> 
> Thank you.
> 
> On 9/12/21 18:09, Pierre Jolivet wrote:
>> Hello,
>> Did you copy the files from https://gitlab.com/petsc/datafiles before the make check?
>> cfd.1.10 is not part of https://gitlab.com/petsc/petsc, so you either need to copy the files from the datafiles repository, or remove the --DATAFILESPATH=/home/sagitter/rpmbuild/BUILD/petsc-3.15.4/petsc-3.15.4/share/petsc/datafiles configure option.
>> Thanks,
>> Pierre
>>> On 12 Sep 2021, at 6:02 PM, Antonio T. sagitter <sagit...@fedoraproject.org> wrote:
>>> 
>>> Hi all.
>>> 
>>> This error is repeated when the tests are compiled in PETSc-3.15.4 
>>> compilation:
>>> https://paste.in/MhdNjf 
>>> 
>>> This happens in Fedora 34 x86_64 with gcc-11.2.1
>>> 
>>> make.log and configure.log are attached
>>> -- 
>>> ---
>>> Antonio Trande
>>> Fedora Project
>>> mailto: sagit...@fedoraproject.org
>>> GPG key: 0x29FBC85D7A51CC2F
>>> GPG key server: https://keyserver1.pgp.com/
>>> 
> 
> -- 
> ---
> Antonio Trande
> Fedora Project
> mailto: sagit...@fedoraproject.org
> GPG key: 0x29FBC85D7A51CC2F
> GPG key server: https://keyserver1.pgp.com/



Re: [petsc-dev] Cannot locate file: share/petsc/datafiles/matrices/small

2021-09-12 Thread Antonio T. sagitter

Okay. I will try to set the DATAFILESPATH option correctly.

I also see this error:

not ok vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-create # Error code: 68

#   PetscSF Object: 4 MPI processes
#     type: window
#     [0] Number of roots=3, leaves=2, remote ranks=2
#     [0] 0 <- (3,1)
#     [0] 1 <- (1,0)
#     [1] Number of roots=2, leaves=3, remote ranks=2
#     [1] 0 <- (0,1)
#     [1] 1 <- (2,0)
#     [1] 2 <- (0,2)
#     [2] Number of roots=2, leaves=3, remote ranks=3
#     [2] 0 <- (1,1)
#     [2] 1 <- (3,0)
#     [2] 2 <- (0,2)
#     [3] Number of roots=2, leaves=3, remote ranks=2
#     [3] 0 <- (2,1)
#     [3] 1 <- (0,0)
#     [3] 2 <- (0,2)
#     [0] Roots referenced by my leaves, by rank
#     [0] 1: 1 edges
#     [0]    1 <- 0
#     [0] 3: 1 edges
#     [0]    0 <- 1
#     [1] Roots referenced by my leaves, by rank
#     [1] 0: 2 edges
#     [1]    0 <- 1
#     [1]    2 <- 2
#     [1] 2: 1 edges
#     [1]    1 <- 0
#     [2] Roots referenced by my leaves, by rank
#     [2] 0: 1 edges
#     [2]    2 <- 2
#     [2] 1: 1 edges
#     [2]    0 <- 1
#     [2] 3: 1 edges
#     [2]    1 <- 0
#     [3] Roots referenced by my leaves, by rank
#     [3] 0: 2 edges
#     [3]    1 <- 0
#     [3]    2 <- 2
#     [3] 2: 1 edges
#     [3]    0 <- 1
#     current flavor=CREATE synchronization=FENCE MultiSF sort=rank-order
#   current info=MPI_INFO_NULL
#   [buildhw-x86-09:1135574] *** An error occurred in MPI_Accumulate
#   [buildhw-x86-09:1135574] *** reported by process [3562602497,3]
#   [buildhw-x86-09:1135574] *** on win rdma window 4
#   [buildhw-x86-09:1135574] *** MPI_ERR_RMA_RANGE: invalid RMA address range
#   [buildhw-x86-09:1135574] *** MPI_ERRORS_ARE_FATAL (processes in this win will now abort,
#   [buildhw-x86-09:1135574] ***    and potentially your MPI job)
#   [buildhw-x86-09.iad2.fedoraproject.org:1135567] 3 more processes have sent help message help-mpi-errors.txt / mpi_errors_are_fatal
#   [buildhw-x86-09.iad2.fedoraproject.org:1135567] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages

Looks like an error related to OpenMPI-4*:
https://github.com/open-mpi/ompi/issues/6374
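
[As the MCA hint at the end of the log suggests, OpenMPI parameters can be
set per run on the mpirun command line; the executable name below is purely
illustrative:]

$ mpirun -n 4 --mca orte_base_help_aggregate 0 ./ex1

[Other MCA parameters, e.g. the BTL selection, can be toggled the same way
when isolating an implementation bug.]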

Thank you.

On 9/12/21 18:09, Pierre Jolivet wrote:

Hello,
Did you copy the files from https://gitlab.com/petsc/datafiles before the make check?
cfd.1.10 is not part of https://gitlab.com/petsc/petsc, so you either need to copy the files from the datafiles repository, or remove the --DATAFILESPATH=/home/sagitter/rpmbuild/BUILD/petsc-3.15.4/petsc-3.15.4/share/petsc/datafiles configure option.


Thanks,
Pierre

On 12 Sep 2021, at 6:02 PM, Antonio T. sagitter <sagit...@fedoraproject.org> wrote:


Hi all.

This error is repeated when the tests are compiled in PETSc-3.15.4 
compilation:

https://paste.in/MhdNjf 

This happens in Fedora 34 x86_64 with gcc-11.2.1

make.log and configure.log are attached
--
---
Antonio Trande
Fedora Project
mailto: sagit...@fedoraproject.org
GPG key: 0x29FBC85D7A51CC2F
GPG key server: https://keyserver1.pgp.com/





--
---
Antonio Trande
Fedora Project
mailto: sagit...@fedoraproject.org
GPG key: 0x29FBC85D7A51CC2F
GPG key server: https://keyserver1.pgp.com/


Re: [petsc-dev] Cannot locate file: share/petsc/datafiles/matrices/small

2021-09-12 Thread Satish Balay via petsc-dev
> you either need to copy the files from the datafiles repository

Actually, clone https://gitlab.com/petsc/datafiles and use its location with
the DATAFILESPATH option.

But yeah - it's not really needed. If DATAFILESPATH is missing, the
corresponding tests won't be run.
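
[For example, with an illustrative clone path:]

$ git clone https://gitlab.com/petsc/datafiles /path/to/datafiles
$ ./configure --DATAFILESPATH=/path/to/datafiles [other options]
$ make check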

Satish

On Sun, 12 Sep 2021, Pierre Jolivet wrote:

> Hello,
> Did you copy the files from https://gitlab.com/petsc/datafiles before the make check?
> cfd.1.10 is not part of https://gitlab.com/petsc/petsc, so you either need to copy the files from the datafiles repository, or remove the --DATAFILESPATH=/home/sagitter/rpmbuild/BUILD/petsc-3.15.4/petsc-3.15.4/share/petsc/datafiles configure option.
> 
> Thanks,
> Pierre
> 
> > On 12 Sep 2021, at 6:02 PM, Antonio T. sagitter 
> >  wrote:
> > 
> > Hi all.
> > 
> > This error is repeated when the tests are compiled in PETSc-3.15.4 
> > compilation:
> > https://paste.in/MhdNjf
> > 
> > This happens in Fedora 34 x86_64 with gcc-11.2.1
> > 
> > make.log and configure.log are attached
> > -- 
> > ---
> > Antonio Trande
> > Fedora Project
> > mailto: sagit...@fedoraproject.org
> > GPG key: 0x29FBC85D7A51CC2F
> > GPG key server: https://keyserver1.pgp.com/
> > 
> 
> 



Re: [petsc-dev] Cannot locate file: share/petsc/datafiles/matrices/small

2021-09-12 Thread Pierre Jolivet
Hello,
Did you copy the files from https://gitlab.com/petsc/datafiles before the make check?
cfd.1.10 is not part of https://gitlab.com/petsc/petsc, so you either need to copy the files from the datafiles repository, or remove the --DATAFILESPATH=/home/sagitter/rpmbuild/BUILD/petsc-3.15.4/petsc-3.15.4/share/petsc/datafiles configure option.

Thanks,
Pierre

> On 12 Sep 2021, at 6:02 PM, Antonio T. sagitter  
> wrote:
> 
> Hi all.
> 
> This error is repeated when the tests are compiled in PETSc-3.15.4 
> compilation:
> https://paste.in/MhdNjf
> 
> This happens in Fedora 34 x86_64 with gcc-11.2.1
> 
> make.log and configure.log are attached
> -- 
> ---
> Antonio Trande
> Fedora Project
> mailto: sagit...@fedoraproject.org
> GPG key: 0x29FBC85D7A51CC2F
> GPG key server: https://keyserver1.pgp.com/
>