Re: [Wien] WARNING elpa_setup during lapw1_mpi

2023-07-04 Thread Peter Blaha

When elpa is installed, it needs an extra configure flag, --enable-openmp;
otherwise multiple threads are not supported.


The elpa tarball contains a file INSTALL.md which explains 
clearly how to configure, compile and install elpa.
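
For illustration, a threaded elpa build then follows the usual autotools 
sequence; the compiler wrappers and the install prefix below are only 
placeholders, and INSTALL.md remains the authoritative reference:

# sketch only -- adjust compilers, flags and prefix to your own toolchain
./configure FC=mpif90 CC=mpicc --enable-openmp --prefix=$HOME/elpa-openmp
make -j 4
make check      # runs elpa's own test programs
make install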



---

Anyway, if you do not have a threaded elpa version, set OMP_NUM_THREADS 
to one and use 2x mpi parallelization.


Depending on your hardware (network,...) it may be a little slower or 
not, but only tests can tell you this for sure.
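
As a concrete sketch of that fallback (the rank counts are only an example, 
based on the 8-rank mpirun shown later in this thread):

export OMP_NUM_THREADS=1   # one OpenMP thread per MPI rank
# ...and correspondingly more MPI ranks in .machines, e.g. 16 instead of 8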




On 04.07.2023 at 16:33, Ilias, Miroslav wrote:


Dear Professor Marks,


concerning 
https://www.mail-archive.com/wien@zeus.theochem.tuwien.ac.at/msg22621.html :




I am trying to find out why the ELPA module is complaining about missing 
MPI_THREAD_MULTIPLE.



We have a debate on this 
https://git.gsi.de/SDEGroup/SIR/-/issues/85#note_55392 




If somebody could comment on this topic, I would like to 
investigate it more deeply. Can somebody advise me on a short ELPA testing 
program for this?



Best, Miro


--
---
Peter Blaha,  Inst. f. Materials Chemistry, TU Vienna, A-1060 Vienna
Phone: +43-158801165300
Email: peter.bl...@tuwien.ac.at
WWW: http://www.imc.tuwien.ac.at    WIEN2k: http://www.wien2k.at

-


Re: [Wien] WARNING elpa_setup during lapw1_mpi

2023-07-04 Thread Gavin Abo

That might be an issue worth asking about over in the Spack Community [1].


Not sure if I'm interpreting the message correctly, but if I am, I believe the 
"@1.5.4:2" in the message from spack info indicates that 
thread_multiple will be disabled unless an OpenMPI version from 1.5.4 [2] to 2 
[3] is being used.



In the Spack source code of its current latest version, 0.20.0 [4], the 
openmpi/package.py file has the following on lines 662-668:



# thread_multiple
if version in spack.version.ver("1.5.4:2"):
    match = re.search(r"MPI_THREAD_MULTIPLE: (\S+?),?", output)
    if match and is_enabled(match.group(1)):
        variants.append("+thread_multiple")
    else:
        variants.append("~thread_multiple")


Since you are using OpenMPI 4.1.5, I suspect line 663 might need to be changed, 
for example to "if version in spack.version.ver("1.5.4:4.1.5"):".  
Perhaps there are other changes a Spack developer would know about that 
might also be needed for MPI_THREAD_MULTIPLE to work with Spack when 
using OpenMPI 4.1.5.
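
A quick way to compare the two views is to put what spack recorded for the 
installed spec next to what the installed library reports about itself (a 
sketch; the -v/--variants flag of spack find is quoted from memory):

# variants spack recorded for the installed openmpi spec
spack find -v openmpi
# threading support the installed OpenMPI reports about itself
ompi_info | grep -i "thread support"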



[1] https://github.com/spack/spack/blob/develop/README.md#community

[2] https://www.open-mpi.org/software/ompi/v1.5/

[3] https://www.open-mpi.org/software/ompi/v2.0/

[4] 
https://github.com/spack/spack/blob/v0.20.0/var/spack/repos/builtin/packages/openmpi/package.py



Gavin

WIEN2k user


On 7/4/2023 11:59 AM, Ilias, Miroslav wrote:


Hello,


sorry for hidden link;


the point is that the "ompi_info -a | grep THREAD" says 
MPI_THREAD_MULTIPLE: yes,



but  "spack info openmpi@4.1.5" gives "thread_multiple [off] 
[@1.5.4:2] on, off Enable MPI_THREAD_MULTIPLE support"



Maybe this is the cause of the ELPA warning: "WARNING elpa_setup: MPI threading 
level MPI_THREAD_SERALIZED or MPI_THREAD_MULTIPLE required but your 
implementation does not support this! The number of OpenMP threads 
within ELPA will be limited to 1"



I will investigate it deeper, as I need Wien2k parallel as fast as 
possible.


Intel MPI will come later; again, I need ELPA built with Intel. I 
guess that an IntelMPI+MKL WIEN2k build is faster than OpenMPI+OpenBLAS?


Best, Miro


From: Wien  on behalf of Laurence Marks
Sent: 04 July 2023 18:02
To: A Mailing list for WIEN2k users
Subject: Re: [Wien] WARNING elpa_setup during lapw1_mpi
That is a private site, so I cannot read anything.

All I can suggest is doing a Google search on "missing  
MPI_THREAD_MULTIPLE". It looks as if you have to enable this in 
openmpi configure, although there might be some bugs. There could also 
be some environmental parameters that need to be set.


Or use Intel mpi. While it does not have everything that some sysadmins 
want, if you are using mkl & ifort (both free) it integrates 
trivially. I personally dislike openmpi as it keeps changing and is 
not always user friendly.


--
Professor Laurence Marks (Laurie)
Department of Materials Science and Engineering, Northwestern University
www.numis.northwestern.edu
"Research is to see what everybody else has seen, and to think what 
nobody else has thought" Albert Szent-Györgyi


On Tue, Jul 4, 2023, 09:33 Ilias, Miroslav  wrote:

Dear Professor Marks,


concerning
https://www.mail-archive.com/wien@zeus.theochem.tuwien.ac.at/msg22621.html :



I am trying to find out why the ELPA module is complaining about missing
MPI_THREAD_MULTIPLE.


We have a debate on this
https://git.gsi.de/SDEGroup/SIR/-/issues/85#note_55392


If somebody could comment on this topic, I would like to
investigate it more deeply. Can somebody advise me on a short ELPA testing
program for this?


Best, Miro



Re: [Wien] WARNING elpa_setup during lapw1_mpi

2023-07-04 Thread Ilias, Miroslav
Hello,


sorry for hidden link;


the point is that  the "ompi_info -a | grep THREAD" says MPI_THREAD_MULTIPLE: 
yes,


but  "spack info openmpi@4.1.5" gives "thread_multiple [off] [@1.5.4:2] on, off 
Enable MPI_THREAD_MULTIPLE support"


Maybe this is the cause of the ELPA warning: "WARNING elpa_setup: MPI threading level 
MPI_THREAD_SERALIZED or MPI_THREAD_MULTIPLE required but your implementation 
does not support this! The number of OpenMP threads within ELPA will be limited 
to 1"

I will investigate it deeper, as I need Wien2k parallel as fast as possible.

Intel MPI will come later; again, I need ELPA built with Intel. I guess that 
an IntelMPI+MKL WIEN2k build is faster than OpenMPI+OpenBLAS?

Best, Miro


From: Wien  on behalf of Laurence 
Marks 
Sent: 04 July 2023 18:02
To: A Mailing list for WIEN2k users
Subject: Re: [Wien] WARNING elpa_setup during lapw1_mpi

That is a private site, so I cannot read anything.

All I can suggest is doing a Google search on "missing  MPI_THREAD_MULTIPLE". 
It looks as if you have to enable this in openmpi configure, although there 
might be some bugs. There could also be some environmental parameters that need 
to be set.

Or use Intel mpi. While it does not have everything that some sysadmins want, if 
you are using mkl & ifort (both free) it integrates trivially. I personally 
dislike openmpi as it keeps changing and is not always user friendly.

--
Professor Laurence Marks (Laurie)
Department of Materials Science and Engineering, Northwestern University
www.numis.northwestern.edu
"Research is to see what everybody else has seen, and to think what nobody else 
has thought" Albert Szent-Györgyi

On Tue, Jul 4, 2023, 09:33 Ilias, Miroslav <m.il...@gsi.de> wrote:

Dear Professor Marks,


concerning 
https://www.mail-archive.com/wien@zeus.theochem.tuwien.ac.at/msg22621.html  :


I am trying to find out why the ELPA module is complaining about missing 
MPI_THREAD_MULTIPLE.


We have a debate on this  https://git.gsi.de/SDEGroup/SIR/-/issues/85#note_55392


If somebody could comment on this topic, I would like to investigate it more 
deeply. Can somebody advise me on a short ELPA testing program for this?


Best, Miro



Re: [Wien] WARNING elpa_setup during lapw1_mpi

2023-07-04 Thread Laurence Marks
That is a private site, so I cannot read anything.

All I can suggest is doing a Google search on "missing
MPI_THREAD_MULTIPLE". It looks as if you have to enable this in openmpi
configure, although there might be some bugs. There could also be some
environmental parameters that need to be set.
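
One hedged note on the configure route, from memory (the switch below only 
applied to older OpenMPI releases):

# the configure-time switch only existed for older OpenMPI releases (1.x/2.x), e.g.
./configure --enable-mpi-thread-multiple --prefix=$HOME/openmpi-mt
# 3.x/4.x builds are said to support MPI_THREAD_MULTIPLE unconditionally, which would
# match the "MPI_THREAD_MULTIPLE: yes" that ompi_info reports elsewhere in this thread.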

Or use Intel mpi. While it does not have everything that some sysadmins
want, if you are using mkl & ifort (both free) it integrates trivially. I
personally dislike openmpi as it keeps changing and is not always user
friendly.

--
Professor Laurence Marks (Laurie)
Department of Materials Science and Engineering, Northwestern University
www.numis.northwestern.edu
"Research is to see what everybody else has seen, and to think what nobody
else has thought" Albert Szent-Györgyi

On Tue, Jul 4, 2023, 09:33 Ilias, Miroslav  wrote:

> Dear Professor Marks,
>
>
> concerning
> https://www.mail-archive.com/wien@zeus.theochem.tuwien.ac.at/msg22621.html  :
>
>
>
> I am trying to find out why the ELPA module is complaining about missing
> MPI_THREAD_MULTIPLE.
>
>
> We have a debate on this
> https://git.gsi.de/SDEGroup/SIR/-/issues/85#note_55392
>
>
> If somebody could comment on this topic, I would like to investigate it more
> deeply. Can somebody advise me on a short ELPA testing program for this?
>
>
> Best, Miro


Re: [Wien] WARNING elpa_setup during lapw1_mpi

2023-07-04 Thread Ilias, Miroslav
Dear Professor Marks,


concerning 
https://www.mail-archive.com/wien@zeus.theochem.tuwien.ac.at/msg22621.html  :


I am trying to find out why the ELPA module is complaining about missing 
MPI_THREAD_MULTIPLE.


We have a debate on this  https://git.gsi.de/SDEGroup/SIR/-/issues/85#note_55392


If somebody could comment on this topic, I would like to investigate it more 
deeply. Can somebody advise me on a short ELPA testing program for this?
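
A minimal MPI-level check (a sketch, independent of ELPA and WIEN2k) that 
shows which threading level the MPI library actually provides could look 
like this:

cat > mpi_thread_check.c <<'EOF'
#include <mpi.h>
#include <stdio.h>
int main(int argc, char **argv)
{
    int provided = 0;
    /* request the level ELPA warns about and print what the library grants */
    MPI_Init_thread(&argc, &argv, MPI_THREAD_MULTIPLE, &provided);
    /* the levels are ordered: SINGLE < FUNNELED < SERIALIZED < MULTIPLE */
    printf("provided thread level = %d (SERIALIZED or higher: %s)\n",
           provided, provided >= MPI_THREAD_SERIALIZED ? "yes" : "no");
    MPI_Finalize();
    return 0;
}
EOF
mpicc -o mpi_thread_check mpi_thread_check.c
mpirun -np 2 ./mpi_thread_check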


Best, Miro



Re: [Wien] WARNING elpa_setup during lapw1_mpi

2023-06-28 Thread Laurence Marks
The message states the issue. You are requesting both mpi & omp in lapw1,
but one or both of your openmpi and elpa have not been compiled for this.

Modules are not a Wien2k issue; you will have to see what versions you have
available, and maybe ask your sysadmin.

It is a warning, lapw1 should still have run.
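
Two quick checks (a sketch; ELPA_DIR is just a placeholder for the prefix your 
spack elpa module points to, and $WIENROOT is assumed to point at the WIEN2k 
installation shown in the logs below):

# an OpenMP-enabled elpa typically installs a separate libelpa_openmp.* library
ls $ELPA_DIR/lib | grep -i elpa
# if lapw1c_mpi is dynamically linked, see which elpa/mpi libraries it picked up
ldd $WIENROOT/lapw1c_mpi | grep -i -E 'elpa|mpi'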

---
Professor Laurence Marks (Laurie)
Department of Materials Science and Engineering
Northwestern University
www.numis.northwestern.edu
"Research is to see what everybody else has seen, and to think what nobody
else has thought" Albert Szent-Györgyi

On Wed, Jun 28, 2023, 16:06 Ilias Miroslav, doc. RNDr., PhD. <
miroslav.il...@umb.sk> wrote:

>
> Hello,
>
> with parallel wien2k - here lapw1 - I am getting the warning below. On
> that machine we are using spack modules,
> https://github.com/miroi/open-collection/blob/master/theoretical_chemistry/software/wien2k/runs/LvO2_on_small_quartz/wien2k/LvO2onQg/lxir127_bash_wien2k_gnu_openmpi_openblas.01/ ;
> both openmpi and elpa are loaded...
>
> The logfile
> https://github.com/miroi/open-collection/blob/master/theoretical_chemistry/software/wien2k/runs/LvO2_on_small_quartz/wien2k/LvO2onQg/lxir127_bash_wien2k_gnu_openmpi_openblas.01_logfile_unfinished
>
>
> Maybe there are some ill interactions between the modules?
>
> milias   18493 14653  0 22:56 pts/0    00:00:00 /bin/tcsh -f
> /data.local1/milias/software/wien2k/WIEN2k_23.2/gnu_openmpi_openblas/x
> lapw1 -p -c
> milias   18515 18493  1 22:56 pts/0    00:00:00 /bin/tcsh -f
> /data.local1/milias/software/wien2k/WIEN2k_23.2/gnu_openmpi_openblas/lapw1cpara
> -c lapw1.def
> milias   18633 18515  0 22:56 pts/0    00:00:00 /bin/tcsh -f
> /data.local1/milias/software/wien2k/WIEN2k_23.2/gnu_openmpi_openblas/lapw1cpara
> -c lapw1.def
> milias   18636 18633  0 22:56 pts/0    00:00:00 mpirun -np 8 -machinefile
> .machine1
> /data.local1/milias/software/wien2k/WIEN2k_23.2/gnu_openmpi_openblas/lapw1c_mpi
> lapw1_1.def
> milias   18640 18636 99 22:56 pts/0    00:00:22
> /data.local1/milias/software/wien2k/WIEN2k_23.2/gnu_openmpi_openblas/lapw1c_mpi
> lapw1_1.def
> milias   18641 18636 99 22:56 pts/0    00:00:22
> /data.local1/milias/software/wien2k/WIEN2k_23.2/gnu_openmpi_openblas/lapw1c_mpi
> lapw1_1.def
> milias   18642 18636 99 22:56 pts/0    00:00:22
> /data.local1/milias/software/wien2k/WIEN2k_23.2/gnu_openmpi_openblas/lapw1c_mpi
> lapw1_1.def
> milias   18643 18636 99 22:56 pts/0    00:00:22
> /data.local1/milias/software/wien2k/WIEN2k_23.2/gnu_openmpi_openblas/lapw1c_mpi
> lapw1_1.def
> milias   18644 18636 99 22:56 pts/0    00:00:22
> /data.local1/milias/software/wien2k/WIEN2k_23.2/gnu_openmpi_openblas/lapw1c_mpi
> lapw1_1.def
> milias   18645 18636 99 22:56 pts/0    00:00:22
> /data.local1/milias/software/wien2k/WIEN2k_23.2/gnu_openmpi_openblas/lapw1c_mpi
> lapw1_1.def
> milias   18646 18636 99 22:56 pts/0    00:00:22
> /data.local1/milias/software/wien2k/WIEN2k_23.2/gnu_openmpi_openblas/lapw1c_mpi
> lapw1_1.def
> milias   18648 18636 99 22:56 pts/0    00:00:22
> /data.local1/milias/software/wien2k/WIEN2k_23.2/gnu_openmpi_openblas/lapw1c_mpi
> lapw1_1.def
>
>
> WARNING elpa_setup: MPI threading level MPI_THREAD_SERALIZED or
> MPI_THREAD_MULTIPLE required but your implementation does not support this!
> The number of OpenMP threads
> within ELPA will be limited to 1
> WARNING elpa_setup: MPI threading level MPI_THREAD_SERALIZED or
> MPI_THREAD_MULTIPLE required but your implementation does not support this!
> The number of OpenMP threads
> within ELPA will be limited to 1
>
>
___
Wien mailing list
Wien@zeus.theochem.tuwien.ac.at
http://zeus.theochem.tuwien.ac.at/mailman/listinfo/wien
SEARCH the MAILING-LIST at:  
http://www.mail-archive.com/wien@zeus.theochem.tuwien.ac.at/index.html