Hi Fred,

For us, PBS (free) is installed on a shared filesystem that is mounted via NFS 
on the Galaxy server (and via GPFS on the rest of the HPC). The server home 
directory is also in a custom location (/opt), but it is local to disk on each 
HPC node (including the Galaxy node).

I have tested running a simple PBS script from the Galaxy node as the galaxy 
user, and it ran successfully. In your case, did you install PBS Pro with all 
of the default installation paths? It seems to me that the pbs-python module is 
happy with a default PBS install but has trouble with a custom one. I'm now 
digging through the pbs-python installer script to hunt down the relevant 
variables.
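For reference, the PBS test I mentioned was nothing more than a hello-world job 
along these lines (the resource request is a placeholder, and queue selection is 
site-specific):

# submitted from the Galaxy node as the galaxy user
cat > /tmp/hello.pbs <<'EOF'
#!/bin/bash
#PBS -N galaxy_hello
#PBS -l walltime=00:01:00
echo "hello from $(hostname)"
EOF

qsub /tmp/hello.pbs    # prints a job ID if submission works
qstat -u galaxy        # the job should appear, run, and complete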

Re: the parse error – I had configured a basic job_conf.xml to test with Galaxy, 
but it seems it was too basic. Would you be able to share an example of your 
job_conf.xml for reference?

Thanks,
Sandra

From: SAPET, Frederic via galaxy-dev <galaxy-dev@lists.galaxyproject.org>
Sent: Wednesday, 28 August 2019 5:09 PM
To: Sandra Maksimovic <sandra.maksimo...@mcri.edu.au>; 
'galaxy-dev@lists.galaxyproject.org' <galaxy-dev@lists.galaxyproject.org>
Subject: [galaxy-dev] Re: pbs-python issues

Hi Sandra

We're running Galaxy with PBS Pro as the job scheduler, and I've never had any 
problems with pbs-python.

In our case, the PBS client is installed locally (on the VM that hosts Galaxy).

Are you sure that PBS is installed correctly? Are you able to launch a simple 
PBS script as the galaxy user?
The error you see (xml.etree.ElementTree.ParseError) may simply mean that one 
of your XML config files has a problem.
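If you want to narrow it down, you can run the same parse Galaxy performs on 
each XML config file (job_conf.xml is the usual suspect), for example:

# the path is an example; point it at whichever config files your instance loads
python -c "import xml.etree.ElementTree as ET; ET.parse('config/job_conf.xml')"
# no output means the file is well-formed; a ParseError reports the offending line and column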

Fred

-----Original Message-----
From: Sandra Maksimovic <sandra.maksimo...@mcri.edu.au>
Sent: Wednesday, 28 August 2019 02:36
To: 'galaxy-dev@lists.galaxyproject.org' <galaxy-dev@lists.galaxyproject.org>
Subject: [galaxy-dev] pbs-python issues

Hi all,

As a Galaxy newbie, I'm struggling to get the pbs-python module working on our 
Galaxy instance. PBS is installed in a custom location on a shared filesystem 
mounted via NFS on our Galaxy server. However, installing pbs-python using the 
git clone / python venv method in the documentation fails because it can't find 
PBS, and there does not appear to be any way to tell pbs-python where it is (I 
could be wrong?). Judging by the -I/usr/include/torque and -DTORQUE_4 flags in 
the failing command below, the installer seems to assume a TORQUE layout rather 
than our custom PBS install.

...
gcc -pthread -fno-strict-aliasing -O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 
-fexceptions -fstack-protector-strong --param=ssp-buffer-size=4 
-grecord-gcc-switches -m64 -mtune=generic -D_GNU_SOURCE -fPIC -fwrapv -DNDEBUG 
-O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions 
-fstack-protector-strong --param=ssp-buffer-size=4 -grecord-gcc-switches -m64 
-mtune=generic -D_GNU_SOURCE -fPIC -fwrapv -fPIC -DTORQUE_4 
-I/usr/include/torque -Isrc/C++ -I/usr/include/python2.7 -c 
src/C++/pbs_wrap.cxx -o build/temp.linux-x86_64-2.7/src/C++/pbs_wrap.o
In file included from src/C++/pbs_wrap.cxx:2978:0:
src/C++/pbs_ifl.h:90:32: fatal error: u_hash_map_structs.h: No such file or directory
 #include "u_hash_map_structs.h"
                                ^
compilation terminated.
error: command 'gcc' failed with exit status 1

So I went ahead and installed pbs_python from source (which does allow you to 
define a PBS_PYTHON_INCLUDEDIR environment variable). However, Galaxy does not 
seem to like this, as evidenced by errors during startup. I suspect this has to 
do with pbs_python not being installed into the Galaxy virtual environment.

galaxy[97486]: Traceback (most recent call last):
galaxy[97486]:   File "<string>", line 1, in <module>
galaxy[97486]:   File "/hpc/software/installed/galaxy/19.05/lib/galaxy/dependencies/__init__.py", line 179, in optional
galaxy[97486]:     conditional = ConditionalDependencies(config_file)
galaxy[97486]:   File "/hpc/software/installed/galaxy/19.05/lib/galaxy/dependencies/__init__.py", line 32, in __init__
galaxy[97486]:     self.parse_configs()
galaxy[97486]:   File "/hpc/software/installed/galaxy/19.05/lib/galaxy/dependencies/__init__.py", line 41, in parse_configs
galaxy[97486]:     for plugin in ElementTree.parse(job_conf_xml).find('plugins').findall('plugin'):
galaxy[97486]:   File "/usr/lib64/python2.7/xml/etree/ElementTree.py", line 1182, in parse
galaxy[97486]:     tree.parse(source, parser)
galaxy[97486]:   File "/usr/lib64/python2.7/xml/etree/ElementTree.py", line 656, in parse
galaxy[97486]:     parser.feed(data)
galaxy[97486]:   File "/usr/lib64/python2.7/xml/etree/ElementTree.py", line 1642, in feed
galaxy[97486]:     self._raiseerror(v)
galaxy[97486]:   File "/usr/lib64/python2.7/xml/etree/ElementTree.py", line 1506, in _raiseerror
galaxy[97486]:     raise err
galaxy[97486]: xml.etree.ElementTree.ParseError: junk after document element: line 4, column 0

I've re-attempted the git clone / python venv install with the PBS environment 
variable set, to no avail. Does anyone have ideas for working around this 
roadblock, or has anyone run into a similar module installation issue in the 
past?
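In the meantime, the workaround I'm planning to try is to redo the pbs_python 
source build with Galaxy's virtualenv active, so the module ends up in the 
environment Galaxy actually runs from. Roughly (all paths are placeholders for 
our layout, and the exact build steps depend on the pbs_python version):

# activate Galaxy's virtualenv (ours lives under the Galaxy root as .venv)
. /hpc/software/installed/galaxy/19.05/.venv/bin/activate

# point the pbs_python build at the custom PBS install instead of /usr/include/torque
export PBS_PYTHON_INCLUDEDIR=/shared/pbs/include    # placeholder path

# build and install from the pbs_python source tree into the active venv
cd pbs_python
python setup.py build
python setup.py install

# sanity check: the module should now import from inside the venv
python -c "import pbs"

If that still leaves Galaxy unhappy at startup, it should at least rule the 
virtualenv theory in or out.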

Also, I was hoping to get some ideas/examples of generic job_conf.xml 
definitions for PBS clusters – best practices, caveats, and so on (a rough 
sketch of what I have in mind is below). My understanding is that, unless 
configured otherwise, Galaxy will submit jobs as the galaxy user, and that 
configuring the server to run jobs as the end users themselves is 
difficult/risky. I'd welcome any opinions/thoughts/recommendations on this.
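For concreteness, here is that sketch, written as a heredoc so it is 
copy-pasteable. The runner plugin paths and worker counts are my best guesses, 
so corrections are very welcome:

# write a minimal config/job_conf.xml (a single <job_conf> root element, nothing after it)
cat > config/job_conf.xml <<'EOF'
<?xml version="1.0"?>
<job_conf>
    <plugins>
        <plugin id="local" type="runner" load="galaxy.jobs.runners.local:LocalJobRunner" workers="2"/>
        <plugin id="pbs" type="runner" load="galaxy.jobs.runners.pbs:PBSJobRunner" workers="2"/>
    </plugins>
    <destinations default="pbs_default">
        <destination id="local" runner="local"/>
        <destination id="pbs_default" runner="pbs"/>
    </destinations>
</job_conf>
EOF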

And finally, what would be a good way to test that Galaxy is submitting jobs to 
the queue properly? Is there some generic test data or procedure to verify that 
the instance is working as expected?
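My current plan is just to upload a small text file, run a simple tool on it 
from the UI, and watch the queue and the Galaxy log at the same time, roughly:

# watch for Galaxy-submitted jobs appearing in the PBS queue
watch -n 5 'qstat -u galaxy'

# in another terminal, follow the Galaxy log for the PBS runner's messages
# (log path is a placeholder; use wherever your instance writes its logs)
tail -f /hpc/software/installed/galaxy/19.05/galaxy.log | grep -i pbs

...but if there is a more standard smoke test, I'd love to hear about it.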

Thanks,

Sandra Maksimovic
Systems Administrator
Information Technology

Murdoch Children's Research Institute
The Royal Children's Hospital, 50 Flemington Road Parkville, Victoria 3052 
Australia

T +61 3 8341 6498
E sandra.maksimo...@mcri.edu.au
W mcri.edu.au <https://www.mcri.edu.au/>

___________________________________________________________
Please keep all replies on the list by using "reply all"
in your mail client.  To manage your subscriptions to this
and other Galaxy lists, please use the interface at:
  %(web_page_url)s

To search Galaxy mailing lists use the unified search at:
  http://galaxyproject.org/search/
