[gmx-users] configure: error: cannot compute sizeof (off_t)
I'm not quite sure if this is the right place to look, but a bit before the part in config.log where it reports the "cannot compute sizeof (off_t)" error, there is this:

  configure:25048: checking size of off_t
  configure:25053: /usr/local/bin/mpicc -o conftest -O3 -fomit-frame-pointer
      -finline-functions -Wall -Wno-unused -msse2 -funroll-all-loops -std=gnu99
      -I/usr/local/include -L/usr/local/lib conftest.c -lnsl -lm >&5
  configure:25053: $? = 0
  configure:25053: ./conftest
  ./conftest: error while loading shared libraries: libmpi.so.0: cannot open
      shared object file: No such file or directory
  configure:25053: $? = 127
  configure: program exited with status 127
  configure: failed program was:
  | /* confdefs.h */
  | #define PACKAGE_NAME gromacs
  | #define PACKAGE_TARNAME gromacs
  | #define PACKAGE_VERSION 4.5.3
  | #define PACKAGE_STRING gromacs 4.5.3
  | #define PACKAGE_BUGREPORT gmx-users@gromacs.org
  | #define PACKAGE_URL
  | #define PACKAGE gromacs
  | #define VERSION 4.5.3
  | #define GMX_SOFTWARE_INVSQRT /**/
  | #define GMX_QMMM_GAUSSIAN /**/

Also before this there are multiple chunks like

  configure:25019: $? = 0
  configure:25019: ./conftest
  ./conftest: error while loading shared libraries: libmpi.so.0: cannot open
      shared object file: No such file or directory
  configure:25019: $? = 127
  configure: program exited with status 127
  configure: failed program was:
  | /* confdefs.h */

all over the place. It seems like I am missing some files somewhere?

On Feb 20, 2011, at 9:30 PM, Justin Kat wrote:

> Dear experts,
>
> I am still unable to overcome this error during the configuration:
>
>   configure: error: cannot compute sizeof (off_t)
>   See `config.log' for more details.

So what does config.log say about "cannot compute sizeof (off_t)"?
Carsten

> I came across this thread with the exact same setup as I have:
>
>   http://lists.gromacs.org/pipermail/gmx-users/2011-February/058369.html
>
> I have tried uninstalling Open MPI 1.4.4 and installing the more stable
> Open MPI 1.4.3, but I am still experiencing the same error.
>
>   ./configure --enable-mpi --program-suffix=_mpi MPICC=/usr/local/bin/mpicc --with-fft=fftw3
>
> I have also tried to explicitly provide the path to mpicc as above, but it
> still gives me the same error.
>
> This may or may not be relevant, but at the end of config.log there is
> also this line:
>
>   configure: exit 77
>
> Does that mean anything?
>
> Any help at all is appreciated!
>
> Thanks,
> Justin

--
Dr. Carsten Kutzner
Max Planck Institute for Biophysical Chemistry
Theoretical and Computational Biophysics
Am Fassberg 11, 37077 Goettingen, Germany
Tel. +49-551-2012313, Fax: +49-551-2012302
http://www.mpibpc.mpg.de/home/grubmueller/ihp/ckutzne

--
gmx-users mailing list    gmx-users@gromacs.org
http://lists.gromacs.org/mailman/listinfo/gmx-users
Please search the archive at http://www.gromacs.org/Support/Mailing_Lists/Search before posting!
Please don't post (un)subscribe requests to the list. Use the www interface or send it to gmx-users-request@gromacs.org.
Can't post? Read http://www.gromacs.org/Support/Mailing_Lists
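For what it's worth, the conftest transcript in the message above shows the test programs compiling fine ($? = 0) and failing only when configure runs them: the dynamic linker cannot find libmpi.so.0. A minimal sketch of the usual workaround, assuming Open MPI's libraries live in /usr/local/lib (an assumption read off the -L/usr/local/lib flag in the failing compile line):

```shell
# Point the runtime linker at Open MPI's library directory before
# re-running configure. /usr/local/lib is an assumption taken from the
# -L/usr/local/lib flag in the failing compile line.
export LD_LIBRARY_PATH=/usr/local/lib:$LD_LIBRARY_PATH

# Sanity check: a trivial MPI program should now both compile and run.
cat > conftest.c <<'EOF'
#include <mpi.h>
int main(int argc, char **argv) {
    MPI_Init(&argc, &argv);
    MPI_Finalize();
    return 0;
}
EOF
if command -v mpicc >/dev/null 2>&1; then
    mpicc -o conftest conftest.c && ./conftest && echo "MPI runtime OK"
fi
```

A system-wide alternative is to add /usr/local/lib to /etc/ld.so.conf and run ldconfig as root, so configure picks the library up without any per-shell environment tweaking.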
[gmx-users] configure: error: cannot compute sizeof (off_t)
Dear experts,

I am still unable to overcome this error during the configuration:

  configure: error: cannot compute sizeof (off_t)
  See `config.log' for more details.

I came across this thread with the exact same setup as I have:

  http://lists.gromacs.org/pipermail/gmx-users/2011-February/058369.html

I have tried uninstalling Open MPI 1.4.4 and installing the more stable Open MPI 1.4.3, but I am still experiencing the same error.

  ./configure --enable-mpi --program-suffix=_mpi MPICC=/usr/local/bin/mpicc --with-fft=fftw3

I have also tried to explicitly provide the path to mpicc as above, but it still gives me the same error.

This may or may not be relevant, but at the end of config.log there is also this line:

  configure: exit 77

Does that mean anything? Any help at all is appreciated!

Thanks,
Justin
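On the "configure: exit 77" question: that is Autoconf's generic exit status for a configure script that aborted, and it carries no information about the cause. The useful part is always earlier in config.log, around the first test program that failed to compile or run. A sketch of how to pull that out (the log lines written below are an illustrative stand-in for a real config.log):

```shell
# Illustrative stand-in for a real config.log; with a real build you
# would skip this step and grep the actual file in the build directory.
printf '%s\n' \
    'configure:25053: ./conftest' \
    './conftest: error while loading shared libraries: libmpi.so.0: cannot open shared object file' \
    'configure: program exited with status 127' > config.log

# Show each failure together with the line just before it, which
# usually names the missing library or the failing command.
grep -n -B1 'exited with status' config.log
```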
[gmx-users] configure: error: cannot compute sizeof (off_t)
Dear experts,

I have installed GROMACS under the normal configuration and am now attempting to install the MPI version of mdrun on a machine; however, I am getting this error during the configuration:

  configure: error: cannot compute sizeof (off_t)
  See `config.log' for more details.

I can't seem to find anything in the config.log file that might point me in the right direction. The command that I used to attempt to configure the MPI version is:

  ./configure --enable-mpi --program-suffix=_mpi MPICC=/usr/lib64/openmpi/1.4-gcc/bin/mpicc

I have also tried to append my PATH so that it includes these paths:

  /usr/lib64/openmpi/1.4-gcc/include
  /usr/lib64/openmpi/1.4-gcc/

But it seems to have no effect, so I'm guessing there is somewhere else I must look.

Appreciate any help! Thank you!
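One thing worth noting: PATH is only searched for executables, so appending include directories to it has no effect. If the failure matches the rest of this thread (conftest binaries that compile but cannot load libmpi.so.0 at run time), the relevant variable is LD_LIBRARY_PATH. A hedged sketch, reusing the /usr/lib64/openmpi/1.4-gcc prefix from the post above; the library subdirectory may be lib or lib64 depending on how the package was built:

```shell
MPI_HOME=/usr/lib64/openmpi/1.4-gcc            # prefix taken from the post above

export PATH=$MPI_HOME/bin:$PATH                        # so mpicc/mpirun are found
export LD_LIBRARY_PATH=$MPI_HOME/lib:$LD_LIBRARY_PATH  # so libmpi.so.0 loads

# Then re-run configure (shown as a comment; it needs the GROMACS sources):
# ./configure --enable-mpi --program-suffix=_mpi MPICC=$MPI_HOME/bin/mpicc
```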
[gmx-users] mpirun error?
Dear Gromacs,

My colleague has attempted to issue this command:

  mpirun -np 8 (or 7) mdrun_mpi .. (etc)

According to him, he gets the following error message:

  MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
  with errorcode -1.

  NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
  You may or may not see output from other processes, depending on
  exactly when Open MPI kills them.
  -----
  Program mdrun_mpi, VERSION 4.0.7
  Source code file: domdec.c, line: 5888

  Fatal error:
  There is no domain decomposition for 7 nodes that is compatible with
  the given box and a minimum cell size of 0.955625 nm
  Change the number of nodes or mdrun option -rcon or -dds or your
  LINCS settings

However, when he uses, say, -np 6, he seems to get no error. Any insight on why this might be happening?

Also, when he saves the output to a file, sometimes he sees the following:

  NOTE: Turning on dynamic load balancing

Is this another flag that might be causing the crash? What does that line mean?

Thanks!
Justin
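On the -np 7 failure: mdrun splits the box into an nx x ny x nz grid of cells, one per rank, and every cell must stay wider than the minimum cell size (0.955625 nm here). Since 7 is prime, the only grids are 7x1x1 and its rotations, which makes the cells too thin for this box; 6 and 8 factor into much more compact grids. A toy enumeration (not GROMACS's actual decomposition code, which also weighs cell shape and communication cost) makes the difference visible:

```shell
# Enumerate the (nx, ny, nz) grids that multiply out to a given rank
# count; mdrun must pick one of these for its domain decomposition.
grids() {
    n=$1
    for x in $(seq 1 "$n"); do
        for y in $(seq 1 "$n"); do
            z=$(( n / (x * y) ))               # candidate third factor
            [ $(( x * y * z )) -eq "$n" ] && echo "$x x $y x $z"
        done
    done
}

echo "grids for 7 ranks:"; grids 7   # only 1x1x7, 1x7x1, 7x1x1
echo "grids for 8 ranks:"; grids 8   # includes the compact 2x2x2
```

If an awkward rank count is unavoidable, the error message itself points at the knobs: mdrun's -dd flag can force a grid and -rcon can relax the minimum cell size, at the risk of LINCS warnings. The "Turning on dynamic load balancing" note, by contrast, is informational: mdrun starts shifting cell boundaries when it detects load imbalance, and it is not itself the cause of the crash.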
[gmx-users] Output error mpirun/mdrun_mpi
Dear gmx-users,

My colleague seems to be experiencing an output containing errors after issuing the command below:

  mpirun -np 8 mdrun_mpi -s *.tpr -o *.tpr -c out -v outt.mdrun_md

The output reads as follows:

  NNODES=8, MYRANK=0, HOSTNAME=node3.reyclus.loc
  NODEID=0 argc=8
            :-)  G  R  O  M  A  C  S  (-:
  NNODES=8, MYRANK=7, HOSTNAME=node3.reyclus.loc
  NNODES=8, MYRANK=5, HOSTNAME=node3.reyclus.loc
     GRoups of Organic Molecules in ACtion for Science
            :-)  VERSION 4.0.7  (-:
  Written by David van der Spoel, Erik Lindahl, Berk Hess, and others.
  Copyright (c) 1991-2000, University of Groningen, The Netherlands.
  Copyright (c) 2001-2008, The GROMACS development team,
  check out http://www.gromacs.org for more information.

  This program is free software; you can redistribute it and/or modify
  it under the terms of the GNU General Public License as published by
  the Free Software Foundation; either version 2 of the License, or
  (at your option) any later version.
            :-)  mdrun_mpi  (-:
  NNODES=8, MYRANK=4, HOSTNAME=node3.reyclus.loc
  NODEID=4 argc=8
  NNODES=8, MYRANK=1, HOSTNAME=node3.reyclus.loc
  NODEID=1 argc=8
  NODEID=5 argc=8
  NNODES=8, MYRANK=2, HOSTNAME=node3.reyclus.loc
  NODEID=2 argc=8
  NNODES=8, MYRANK=6, HOSTNAME=node3.reyclus.loc
  NODEID=6 argc=8
  100 steps, 1000.0 ps.
  step 0 imb F 3%
  step 100, will finish Sat Jan 29 14:50:02 2011 imb F 394%
  step 200, will finish Sat Jan 29 17:50:38 2011 imb F 3%
  step 300, will finish Sat Jan 29 17:00:30 2011 imb F 3%
  step 400, will finish Sat Jan 29 15:12:15 2011 imb F 3%
  step 500, will finish Sat Jan 29 14:40:28 2011 imb F 4%
  step 600, will finish Sat Jan 29 13:51:32 2011 imb F 5%
  step 700, will finish Sat Jan 29 13:16:34 2011

It would be much appreciated if some light could be shed on what is going wrong.

Thank you,
Justin
Re: [gmx-users] mdrun_mpi executable not found
Thank you, I have been to that page probably a good 100 times by now. Was the 'No.' response with regard to my primary question, or to the one within the parentheses?

Suppose I remove my existing installation and reinstall; I am hoping to figure out when/where exactly I should specify --program-suffix=_mpi so as to not overwrite the pre-existing serial mdrun, as I have mistakenly done with my current installation:

  ./configure --enable-mpi --program-suffix=_mpi
  make mdrun
  make install-mdrun
  make links

Lastly, if the above set of commands is incorrect, or will not carry out what I intend (to build a separate mdrun_mpi executable apart from the existing mdrun after a normal build), I would appreciate a suitable revision.

Thanks,
Justin

On 26/01/2011 8:50 AM, Justin Kat wrote:

> Alright. So meaning I should have instead issued:
>
>   ./configure --enable-mpi --program-suffix=_mpi
>   make mdrun
>   make install-mdrun
>   make links
>
> to have installed an MPI-enabled executable called mdrun_mpi apart
> from the existing mdrun executable? (Would I also need to append the
> _mpi suffix when issuing the first two make and make install commands
> above?)

No. See http://www.gromacs.org/Downloads/Installation_Instructions

Mark

> Thanks,
> Justin
>
> On Mon, Jan 24, 2011 at 8:08 PM, Justin A. Lemkul <jalemkul at vt.edu> wrote:
>
>> Justin Kat wrote:
>>> Thank you for the reply!
>>>
>>> hmm, mdrun_mpi does not appear in the list of executables in
>>> /usr/local/gromacs/bin (and therefore not in /usr/local/bin).
>>>
>>> Which set of installation commands that I used should have compiled
>>> the mdrun_mpi executable? And how should I go about getting the
>>> mdrun_mpi executable at this point?
>>
>> I see it now. When you configured with --enable-mpi, you didn't specify
>> --program-suffix=_mpi, so the installation procedure over-wrote your
>> existing (serial) mdrun with an MPI-enabled one simply called mdrun.
>> The configure output should have warned you about this. You could, in
>> theory, simply re-name your existing executable mdrun_mpi and then
>> re-install a serial mdrun, if you need it.
>>
>> -Justin
>>
>> --
>> Justin A. Lemkul
>> Ph.D. Candidate
>> ICTAS Doctoral Scholar
>> MILES-IGERT Trainee
>> Department of Biochemistry
>> Virginia Tech
>> Blacksburg, VA
>> jalemkul[at]vt.edu | (540) 231-9080
>> http://www.bevanlab.biochem.vt.edu/Pages/Personal/justin
Re: [gmx-users] mdrun_mpi executable not found
./configure --enable-mpi --program-suffix=_mpi
make mdrun
make install-mdrun
make links

Sorry for the random asterisk* symbols; they must have come through from some formatting.

On Wed, Jan 26, 2011 at 12:53 PM, Justin Kat <justin@mail.mcgill.ca> wrote:

> Thank you, I have been to that page probably a good 100 times by now.
> Was the 'No.' response with regard to my primary question, or to the
> one within the parentheses?
Re: [gmx-users] mdrun_mpi executable not found
Alright. So meaning I should have instead issued:

  ./configure --enable-mpi --program-suffix=_mpi
  make mdrun
  make install-mdrun
  make links

to have installed an MPI-enabled executable called mdrun_mpi apart from the existing mdrun executable? (Would I also need to append the _mpi suffix when issuing the first two make and make install commands above?)

Thanks,
Justin

On Mon, Jan 24, 2011 at 8:08 PM, Justin A. Lemkul <jalem...@vt.edu> wrote:

> Justin Kat wrote:
>> Thank you for the reply!
>>
>> hmm, mdrun_mpi does not appear in the list of executables in
>> /usr/local/gromacs/bin (and therefore not in /usr/local/bin).
>>
>> Which set of installation commands that I used should have compiled
>> the mdrun_mpi executable? And how should I go about getting the
>> mdrun_mpi executable at this point?
>
> I see it now. When you configured with --enable-mpi, you didn't specify
> --program-suffix=_mpi, so the installation procedure over-wrote your
> existing (serial) mdrun with an MPI-enabled one simply called mdrun.
> The configure output should have warned you about this. You could, in
> theory, simply re-name your existing executable mdrun_mpi and then
> re-install a serial mdrun, if you need it.
>
> -Justin
>
> --
> Justin A. Lemkul
> Ph.D. Candidate
> ICTAS Doctoral Scholar
> MILES-IGERT Trainee
> Department of Biochemistry
> Virginia Tech
> Blacksburg, VA
> jalemkul[at]vt.edu | (540) 231-9080
> http://www.bevanlab.biochem.vt.edu/Pages/Personal/justin
[gmx-users] mdrun_mpi executable not found
Dear gmx users,

I have installed the parallel version 4.0.7 of GROMACS on one of the nodes of my cluster. Here are the steps I've done through root.

First, the normal installation:

  ./configure
  make
  make install
  make links

Then I issued the commands below for the MPI build:

  ./configure --enable-mpi
  make mdrun
  make install-mdrun
  make links

I don't see any errors and everything seems to install fine. I then switch to a normal user to do my work, and after issuing the grompp_md command as usual, I entered the command below:

  mpirun -np 8 mdrun_mpi -s *.tpr -o *.tpr -c *_after_md -v output.mdrun_md

However, output.mdrun_md gives:

  mpirun was unable to launch the specified application as it could not
  find an executable:

  Executable: mdrun_mpi
  Node: node3.reyclus.loc

  while attempting to start process rank 0.

Was the installation procedure incorrect? Or do I need to go through a separate installation for the mdrun_mpi executable? Please guide me on what is wrong.

Thanks,
Justin
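mpirun's "could not find an executable" message means exactly what it says: mdrun_mpi was not on the PATH of the user and node that mpirun launched from. A sketch for checking what actually got installed, assuming the default /usr/local/gromacs prefix mentioned elsewhere in this thread (the topol.tpr name in the last comment is a placeholder):

```shell
PREFIX=/usr/local/gromacs   # default GROMACS 4.x install prefix (assumed)

# What mdrun binaries exist? Without --program-suffix=_mpi the MPI build
# installs plain "mdrun", silently replacing the serial one.
ls "$PREFIX/bin" 2>/dev/null | grep mdrun

# Is mdrun_mpi reachable the way mpirun will look for it?
command -v mdrun_mpi || echo "mdrun_mpi is not on PATH"

# Giving mpirun an absolute path sidesteps the PATH lookup entirely:
# mpirun -np 8 $PREFIX/bin/mdrun_mpi -s topol.tpr -deffnm md
```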
Re: [gmx-users] mdrun_mpi executable not found
Thank you for the reply!

Hmm, mdrun_mpi does not appear in the list of executables in /usr/local/gromacs/bin (and therefore not in /usr/local/bin).

Which set of installation commands that I used should have compiled the mdrun_mpi executable? And how should I go about getting the mdrun_mpi executable at this point?

Justin

On Mon, Jan 24, 2011 at 6:57 PM, Justin A. Lemkul <jalem...@vt.edu> wrote:

> Justin Kat wrote:
>> Dear gmx users,
>>
>> I have installed the parallel version 4.0.7 of GROMACS on one of the
>> nodes of my cluster. Here are the steps I've done through root.
>>
>> First, the normal installation:
>>
>>   ./configure
>>   make
>>   make install
>>   make links
>>
>> Then I issued the commands below for the MPI build:
>>
>>   ./configure --enable-mpi
>>   make mdrun
>>   make install-mdrun
>>   make links
>>
>> I don't see any errors and everything seems to install fine. I then
>> switch to a normal user to do my work, and after issuing the grompp_md
>> command as usual, I entered the command below:
>>
>>   mpirun -np 8 mdrun_mpi -s *.tpr -o *.tpr -c *_after_md -v output.mdrun_md
>>
>> However, output.mdrun_md gives:
>>
>>   mpirun was unable to launch the specified application as it could not
>>   find an executable:
>>
>>   Executable: mdrun_mpi
>>   Node: node3.reyclus.loc
>>
>>   while attempting to start process rank 0.
>>
>> Was the installation procedure incorrect? Or do I need to go through a
>> separate installation for the mdrun_mpi executable? Please guide me on
>> what is wrong.
>
> No, the commands you gave should have built mdrun_mpi, as long as they
> finished successfully. Were there errors in the installation? With make
> links you should have links to all the Gromacs executables in
> /usr/local/bin - are they there?
>
> You don't need to make links; instead you can follow the steps here:
>
> http://www.gromacs.org/Downloads/Installation_Instructions#Getting_access_to_GROMACS_after_installation
>
> -Justin
>
>> Thanks,
>> Justin
>
> --
> Justin A. Lemkul
> Ph.D. Candidate
> ICTAS Doctoral Scholar
> MILES-IGERT Trainee
> Department of Biochemistry
> Virginia Tech
> Blacksburg, VA
> jalemkul[at]vt.edu | (540) 231-9080
> http://www.bevanlab.biochem.vt.edu/Pages/Personal/justin
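The installation-instructions link in the reply above boils down to sourcing the GMXRC script that a GROMACS install drops into its bin directory, instead of relying on make links. A sketch, assuming the default /usr/local/gromacs prefix:

```shell
GMXRC=/usr/local/gromacs/bin/GMXRC   # default 4.x install location (assumed)

# GMXRC extends PATH (and sets GMXBIN, GMXDATA, ...) for the current
# shell, making every installed GROMACS tool visible without links in
# /usr/local/bin. Adding this to each user's ~/.bashrc makes it stick.
if [ -f "$GMXRC" ]; then
    . "$GMXRC"
fi
```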