Hi Piotr,
Regarding data parallelism: Galaxy can split a single large file into small
parts and process them in parallel, then merge the outputs into a single file.
That's not what we need, as we already have multiple input files. But as I
understand, there's a possibility to write our own
Hi guys,
We've been using Galaxy for a year now; we created our own Galaxy fork where
we were making changes to adapt Galaxy to our requirements. As we need a
multiple-file dataset, we were initially using John's fork for that.
Now we are trying to use the most updated version of the
Hi John,
One more problem with multifile upload - when I display a workflow which
includes the multi-upload tool, I get:
Module workflow_run_mako:476 in render_row_for_param
http://140.79.7.98/workflow/run?id=f597429621d6eb2b
__M_writer(unicode(param.get_label()))
AttributeError:
Hi Peter,
Of course I added def get_label(self), as a matter of fact, I copied
get_label() from ToolParameter class. That's why I'm a bit confused.
The get_label function returns a string which is supposed to be displayed, but
instead something is trying to execute it?
Best Regards,
Alex
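For context, here is a minimal sketch of the kind of get_label() method being discussed; the class and attribute names are simplified assumptions, not Galaxy's actual ToolParameter. It only illustrates that the mako template calls get_label() as a method and expects a plain string back:

```python
# Hedged stand-in for a Galaxy-style parameter class; the real one lives
# in lib/galaxy/tools/parameters/basic.py and is more involved.
class MultiFileParameter(object):  # hypothetical parameter class
    def __init__(self, name, label=None):
        self.name = name
        self.label = label

    def get_label(self):
        # Fall back to the parameter name when no explicit label is set,
        # which is what ToolParameter.get_label() effectively does.
        return self.label if self.label else self.name

param = MultiFileParameter("input_files", label="Input files")
print(param.get_label())  # prints "Input files"
```

The AttributeError in the traceback suggests the parameter object the template received simply did not have a get_label attribute at all, rather than that the string was being "executed".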
Hi John,
Can you please have a look at Neil's question?
Thank you,
-Alex
From: Burdett, Neil (ICT Centre, Herston - RBWH)
Sent: Thursday, 30 May 2013 4:30 PM
To: Khassapov, Alex (CSIRO IMT, Clayton)
Subject: Multi File upload api
Hi Alex,
The file
Hi Dannon,
I understand that instead of having one dataset with multiple files you are
planning to use existing datasets and combine them in a 'collection'. My
concerns are:
1. Our data consists of 200-8000 files, can you imagine how many datasets we'll
end up with? It will be a mess.
2. All
Thanks Peter. I see, parallelism works on a single large file by splitting it
and using multiple instances to process the bits in parallel.
In our case we use a 'composite' data type, which is simply an array of input
files, and we would like to process the files in parallel instead of having a
'foreach' loop
Hi All,
Can anybody please add a few words on how we can use the “initial
implementation” which “exists in the tasks framework”?
-Alex
From: Trello [mailto:do-not-re...@trello.com]
Sent: Wednesday, 6 February 2013 10:58 AM
To: Khassapov, Alex (CSIRO IMT, Clayton)
Subject: 4 new notifications
Thanks John, works fine.
-Alex
-Original Message-
From: Burdett, Neil (ICT Centre, Herston - RBWH)
Sent: Tuesday, 4 December 2012 9:57 AM
To: Khassapov, Alex (CSIRO IMT, Clayton)
Cc: Szul, Piotr (ICT Centre, Marsfield)
Subject: RE: [galaxy-dev] pass more information on a dataset merge
Hi John,
My colleague (Neil) has a bit of a problem with the multi file support:
When I try to use the 'Upload directory of files' option, I get the error below
Error Traceback:
View as: Interactive | Text | XML (full)
⇝ AttributeError: 'Bunch' object has no attribute 'multifiles'
URL:
Hi John,
Do you think it's possible to create a test for your 'm:' format? I couldn't
find how to specify multiple input files for the test.
-Alex
-Original Message-
From: jmchil...@gmail.com [mailto:jmchil...@gmail.com] On Behalf Of John Chilton
Sent: Tuesday, 23 October 2012 7:59 AM
To:
This can be done in validate(), see
http://wiki.g2.bx.psu.edu/Admin/Tools/Custom%20Code
Alex
From: galaxy-dev-boun...@lists.bx.psu.edu
[mailto:galaxy-dev-boun...@lists.bx.psu.edu] On Behalf Of Lukasse, Pieter
Sent: Tuesday, 23 October 2012 9:05 PM
To: galaxy-dev@lists.bx.psu.edu
Subject:
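The validate() hook mentioned above can be sketched roughly like this. This is a guess at its general shape based on the Custom Code wiki page, not the exact API, and the particular check is a made-up example:

```python
# Hedged sketch of a custom validation hook: raising an exception
# rejects the value, returning normally accepts it.
def validate(value):
    if not str(value).strip():
        raise ValueError("a non-empty value is required")
    return value

validate("reads.fastq")  # accepted, returns the value unchanged
```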
1) One more question,
My colleague likes the idea, but his composite dataset's dataset_id.dat file
contains only a plain list of the uploaded files, not HTML like yours.
I was wondering if it is possible to pass somehow a parameter to
CompositeMultifile.regenerate_primary_file(dataset) to switch
By default Galaxy checks stderr; if it's not empty, it returns an error. So if
your tool doesn't fail (returns 0) but prints something to stderr, the tool
will still fail in Galaxy. There's a stderr_wrapper.py workaround for
that.
On the other hand, if your tool returns non-zero but
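The stderr_wrapper idea can be sketched like this; this is an assumption about its shape, not the actual script. Run the real command, swallow stderr on success, and only forward it when the exit code is non-zero:

```python
import subprocess
import sys

def run_wrapped(argv):
    """Run argv; forward stderr only when the exit code is non-zero."""
    proc = subprocess.Popen(argv, stdout=subprocess.PIPE,
                            stderr=subprocess.PIPE)
    out, err = proc.communicate()
    sys.stdout.write(out.decode())
    if proc.returncode != 0:
        # Only now does stderr reach Galaxy, so warnings printed on a
        # successful run no longer mark the job as failed.
        sys.stderr.write(err.decode())
    return proc.returncode

# A tool that warns on stderr but exits 0 is treated as a success:
run_wrapped([sys.executable, "-c",
             "import sys; sys.stderr.write('warning\\n'); print('done')"])
```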
The API seems a bit of an overkill; as I understand, it's useful for 'external'
access via HTTP. My tools run inside Galaxy and I should be able to use Python
code directly.
From: Anthonius deBoer [mailto:thondeb...@me.com]
Sent: Tuesday, 23 October 2012 12:16 PM
To: Khassapov, Alex (CSIRO IMT,
$filename.ext
From: galaxy-dev-boun...@lists.bx.psu.edu
[mailto:galaxy-dev-boun...@lists.bx.psu.edu] On Behalf Of Simon Gladman
Sent: Friday, 19 October 2012 4:38 PM
To: galaxy-dev@lists.bx.psu.edu
Subject: Re: [galaxy-dev] Retreiving a library data type in command
oops, clicked send too
Hi John,
what I don't get: I specify the output format m:grd, and my tool generates
multiple output files in the dataset_id_files folder, but the dataset_id.dat
file is empty. I need to call regenerate_primary_file() to add the HTML with
the file list to the .dat file, but I'm not sure where.
I tried the galaxy-central-homogeneous-composite-datatypes fork, and it works
great. I have a similar problem where the number of output files varies; it
seems that your approach might work for output files as well (not only input).
Currently I'm trying to work out how to implement it, any help is
Hi John,
I tried your galaxy-central-homogeneous-composite-datatypes implementation,
works great thank you (and Jorrit).
A couple of fixes:
1. Add multi_upload.xml to tool_conf.xml
2. lib/galaxy/tools/parameters/grouping.py line 322 (in get_filenames( context
)) -
if ftp_files is not
Hi List,
Our tools require an array of input files and also produce an array of files.
This seems like a pretty standard task, but unfortunately Galaxy doesn't
support this yet.
I wonder if somebody has already implemented this kind of 'file array' data
type?
I.e. the user selects a local