Will you help me now to locate the file "libmpi_usempif08.so.40"?
The libmpi_usempif08.so.40 seems to be an Open MPI file, based on the
webpage at:
https://superuser.com/questions/1500931/error-in-linking-libmpi-so
It looks like a shared-memory Open MPI might have got linked in when you
compiled.
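Whether Open MPI really got linked in can be checked from the `ldd` output of the binary. A minimal sketch follows; `classify_mpi` and the library-name heuristics are assumptions invented for this example, not a Wien2k or Open MPI tool:

```shell
#!/bin/sh
# classify_mpi: guess the MPI implementation from `ldd` output on stdin,
# by looking for implementation-specific shared-library names (heuristic).
classify_mpi() {
    out=$(cat)
    case "$out" in
        *libmpi_usempif08*|*libopen-pal*) echo "openmpi" ;;
        *libmpich*|*libmpi_cray*)         echo "mpich/cray" ;;
        *)                                echo "unknown" ;;
    esac
}

# Typical usage on the cluster (environment-dependent, so commented out):
#   ldd "$WIENROOT/lapw0_mpi" | classify_mpi
```

If this prints "openmpi" while the intended toolchain was Intel MPI or Cray MPICH, the wrong wrapper compiler was picked up at install time.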
Dear Prof. Blaha,
Thanks for your clarification. Will you help me now to locate the file
"libmpi_usempif08.so.40"?
I would like to know whether there is any relation between the mentioned
error and OMP_SWITCH.
Do you suggest that I must install wien2k by loading the cray-mpich module
instead of the intel module?
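One common way to locate such a library is to ask the dynamic linker what the binary actually resolves and to search the directories on LD_LIBRARY_PATH. A minimal POSIX-shell sketch; `find_lib` is a helper name made up for this example, and the `lapw0_mpi` path depends on your $WIENROOT:

```shell
#!/bin/sh
# find_lib: print every copy of a shared library found on LD_LIBRARY_PATH.
# (Helper name invented for this sketch; not part of Wien2k.)
find_lib() {
    lib="$1"
    # Split LD_LIBRARY_PATH on ':' and test each directory for the file.
    echo "$LD_LIBRARY_PATH" | tr ':' '\n' | while read -r dir; do
        [ -n "$dir" ] && [ -e "$dir/$lib" ] && echo "$dir/$lib"
    done
}

# Typical usage on the cluster (environment-dependent, so commented out):
#   ldd "$WIENROOT/lapw0_mpi" | grep mpi        # what does the binary resolve?
#   find_lib libmpi_usempif08.so.40             # where does it live?
```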
In any queuing system there is a way for your job to find out which
nodes it has been assigned.
Then use a script to write the .machines file on the fly. Examples at:
http://www.wien2k.at/reg_user/faq/
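The "write the .machines file on the fly" step can be sketched roughly as below. The `write_machines` helper and the one-MPI-process-per-node layout are assumptions for illustration only; real job scripts take the node list from the scheduler (e.g. $PBS_NODEFILE or $SLURM_JOB_NODELIST), and the exact .machines layout for k-point and mpi parallelization is described in the UG and the FAQ scripts:

```shell
#!/bin/sh
# write_machines: emit a simplified .machines file for the given nodes,
# with lapw0 running mpi-parallel across all nodes and one
# k-point-parallel slot per node (layout is an illustrative example).
write_machines() {
    line="lapw0:"
    for n in "$@"; do
        line="$line $n:1"      # one lapw0 mpi process per node
    done
    echo "$line"
    for n in "$@"; do
        echo "1:$n:1"          # one k-point-parallel job per node
    done
    echo "granularity:1"
    echo "extrafine:1"
}

# e.g. inside the job script (scheduler variable is an assumption):
#   write_machines $(sort -u "$PBS_NODEFILE") > .machines
```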
On 8/6/21 at 8:12 AM, venky ch wrote:
Dear Prof. Blaha,
Thank you for your reply.
Yes, I have also loaded the intel module in the job script.
Further, I would like to know: if there is no way to get the nodelist
from an HPC system, how could one write the .machines file to run the mpi
parallelization? Is there any way to have a un…
You mentioned that you loaded the intel module when compiling wien2k.
Did you also load this module when running the code?
PS: Please read the UG about k-point and mpi parallelization.
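In practice that means the batch job script loads the same toolchain that was used at install time before starting the parallel run. A hypothetical job-script fragment (the module name "intel" is taken from this thread; the sanity-check commands are generic and site-dependent):

```shell
# Inside the batch job script, before starting Wien2k:
module load intel          # same module as at compile time
which mpirun               # is the expected MPI launcher first in PATH?
mpirun --version           # which MPI implementation will actually run?
run_lapw -p                # parallel run driven by the .machines file
```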
On 06.08.2021 at 06:50, venky ch wrote:
Dear Prof. Marks,
Thanks for your reply. Before installing Wien2k, I loaded the intel
module, which gives me the mpi path as defined in the WIEN2k_OPTIONS.
Further, the switches "current:OMP_SWITCH:-qopenmp" and
"current:OMP_SWITCHP:-qopenmp" were selected automatically.
I would like to…
Use of openmpi versus Intel mpi, Cray, mvapich, etc. is not a Wien2k
question. You need to ensure that you have a working mpi which you can
compile against. Did you read
http://www.serc.iisc.ac.in/message-passing-toolkit-cray/ and load the
relevant modules?
On Thu, Aug 5, 2021 at 2:46 PM venky ch wrote:
Dear Wien2k users,
I have successfully installed the Wien2k_21 version on the HPC cluster
(Cray XC40) of my institute
(http://www.serc.iisc.ac.in/supercomputer/for-traditional-hpc-simulations-sahasrat/).
While running parallel calculations there, I noticed a "lapw0_mpi:
error" as given below.