Hi,
If I have a tool wrapper with this:
I get an error when I start Galaxy:
galaxy.tools ERROR 2012-09-19 12:09:08,056 Error reading tool from
path: RGalaxy_test_tool/foo.xml
Traceback (most recent call last):
File "/Users/dtenenba/dev/galaxy-dist/lib/galaxy/tools/__init__.py",
line 332,
From: ja...@taylorlab.org [mailto:ja...@taylorlab.org] On Behalf Of James Taylor
Sent: Wednesday, September 19, 2012 12:36 PM
To: Marc Logghe
Cc: galaxy-dev@lists.bx.psu.edu
Subject: Re: [galaxy-dev] concatenate datasets: how to fixate first dataset
No, but this is a good idea that can be implemented
On Sep 18, 2012, at 6:31 PM, Anthonius deBoer wrote:
> Hi,
>
> Jobs that I start on my in-house Galaxy instance now take up to 3-4 min to go
> from queued to Running, even though there is nothing much going on on the
> galaxy server...
>
> I have been running this instance since June and use a
Repost to galaxy-dev:
On 9/7/12 6:39 PM, Lukasz Lacinski wrote:
Dear All,
I use an init script that comes with Galaxy in the contrib/ subdirectory
to start Galaxy. The log file
--log-file /home/galaxy/galaxy.log
specified in the script grows really quickly. How do I logrotate the file?
Thanks,
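The usual snag is that the Galaxy server keeps the log file handle open, so a plain rotate leaves it writing to the renamed file; copytruncate sidesteps that without restarting the server. A minimal sketch (the log path is taken from the script above; the filename under /etc/logrotate.d and the rotation policy are assumptions):

```
# /etc/logrotate.d/galaxy  (hypothetical filename)
/home/galaxy/galaxy.log {
    weekly
    rotate 8
    compress
    missingok
    notifempty
    # truncate in place instead of moving the file, since the
    # running server keeps the original file handle open
    copytruncate
}
```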
Odd, it works for me on EC2/Cloudman.
jorrit
On 09/19/2012 03:29 PM, Peter Cock wrote:
On Tue, Sep 18, 2012 at 7:11 PM, Scott McManus wrote:
> Sorry - that's changeset 7714:3f12146d6d81
>
> -Scott
Hi Scott,
The good news is this error does seem to be fixed as of that commit:
TypeError: check_tool_output() takes exactly 5 arguments (4 given)
The bad news is my cluster jobs still
I added this snippet to the top of my extract_dataset_part.py (the end of the loop was cut off here; the loop body below is reconstructed from the comment's stated intent):

import os, time
import pkg_resources
pkg_resources.require("simplejson")

# wait until this process' PID is the first PID of all processes with
# the same name, then import
while True:
    pids = os.popen("ps ax | grep extract_dataset_part.py | grep -v grep | awk '{print $1}'").read().split()
    if pids and pids[0] == str(os.getpid()):
        break
    time.sleep(1)
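The "check_tool_output() takes exactly 5 arguments (4 given)" error above is a plain arity mismatch between caller and callee across the changeset. A toy reproduction with invented names, not Galaxy's actual code (the message wording differs between Python 2 and 3):

```python
# Hypothetical stand-in for check_tool_output(); the parameter names
# are made up for illustration only.
def check_tool_output(job, stdout, stderr, exit_code, tool):
    return True

try:
    # Caller still passes the old 4 arguments to the 5-parameter function.
    check_tool_output("job", "out", "err", 0)
    message = None
except TypeError as e:
    message = str(e)
```

The fix is on whichever side lags behind the changeset: update the caller to pass the new fifth argument, or the definition to match the caller.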
No, but this is a good idea that can be implemented as a new type of
workflow module (fixed data set, like input dataset but specified at
workflow build time). Can you file an issue in bitbucket describing how you
would like this feature to work? Thanks!
On Wednesday, September 19, 2012, Marc Logghe wrote:
For completeness, here's two tracebacks (there were more similar ones)
from the same job:
/mnt/galaxyData/tmp/job_working_directory/000/75/task_4:
Traceback (most recent call last):
File "./scripts/extract_dataset_part.py", line 25, in
import galaxy.model.mapping #need to load this before
Hi,
In a workflow the last step is 'concatenate datasets', where the results table
(dataset 2) is concatenated to a table header (dataset 1). The issue is that
whenever the workflow is run, the table header dataset is required as input at
runtime.
Is there a way to fix that, e.g. by making it the input
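The step itself amounts to prepending a fixed header file to the results table. A minimal outside-Galaxy sketch of the intended output (file names are placeholders, not anything Galaxy defines):

```python
import shutil

def prepend_header(header_path, results_path, out_path):
    # Write the fixed header first, then append the results table,
    # mirroring what the 'concatenate datasets' step produces.
    with open(out_path, "wb") as out:
        for path in (header_path, results_path):
            with open(path, "rb") as part:
                shutil.copyfileobj(part, out)
```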