Hi Dan
I added this to my Rscript_wrapper.sh
script and all is well.
I am happy to hear that
and no need to apologize for answering your own question. There are always
other people on the list who will learn from the comments and/or find the
solution later in the mail archive ;)
regards,
Dear List,
I'm trying to check my toolshed tools by installing first on a local toolshed,
but running into an error every time I try to add my tools.
In order to rule out possible idiosyncrasies of my system I've reproduced the
error on an Ubuntu VM ... the system was not completely clean
Hello all,
The shell script scripts/cleanup_datasets/update_metadata.sh tries to
call the non-existent file scripts/get_python.sh
What is this intended to do?
Thanks,
Peter
Hi Edward,
I've started work on this in earnest now. I see you only defined one
new datatype, blastdb, which worked for nucleotide databases.
I want to handle protein databases too, so I think two datatypes
makes sense - which I am currently calling blastdbn and blastdbp.
That won't be
Hi Todd,
History export is a beta feature and hasn't been fully tested yet. We'll look
into this, but it's difficult to diagnose a bug on a local instance. Can you
reproduce on either our main or test server?
Thanks,
J.
On Apr 25, 2012, at 8:08 PM, Todd Oakley wrote:
Hello,
I am
Hello Ira,
Sorry you bumped into this problem. I've cloned your protk-toolshed repository
and made a bz2 compressed archive as you did. I was able to upload it to a
local tool shed, and was successful installing from there to my local Galaxy
instance, where my environment is running Galaxy
Hello all,
I'm looking at 'basic' composite datatypes in Galaxy, based on how
Edward did BLAST databases in his BLAST+ fork on the toolshed.
I've extended his work to handle both protein and nucleotide databases,
and have got this working for tools to create a database (i.e. wrapped
makeblastdb)
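To make the two proposed datatypes concrete, here is a minimal sketch of the member files each composite BLAST database would need to track. The extensions are the standard makeblastdb outputs; the dictionary and function names are illustrative only and are not Galaxy's datatype API.

```python
# Hypothetical sketch: member files for the proposed composite BLAST
# database datatypes. Extensions match the core files makeblastdb writes;
# the names BLASTDB_EXTENSIONS and expected_files are illustrative.
BLASTDB_EXTENSIONS = {
    "blastdbn": [".nhr", ".nin", ".nsq"],  # nucleotide db (makeblastdb -dbtype nucl)
    "blastdbp": [".phr", ".pin", ".psq"],  # protein db (makeblastdb -dbtype prot)
}

def expected_files(base_name, db_type):
    """Return the member file names a composite dataset of db_type should contain."""
    return [base_name + ext for ext in BLASTDB_EXTENSIONS[db_type]]
```

For example, expected_files("mydb", "blastdbn") lists mydb.nhr, mydb.nin and mydb.nsq, which is why a single blastdb datatype cannot cover both cases: the protein and nucleotide file sets are disjoint.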
On Fri, Mar 16, 2012 at 11:00 AM, Peter Cock p.j.a.c...@googlemail.com wrote:
On Mon, Feb 13, 2012 at 5:02 PM, Nate Coraor n...@bx.psu.edu wrote:
On Feb 10, 2012, at 6:47 AM, Peter Cock wrote:
Hello all,
I've noticed we have about a dozen stalled upload jobs on our server
from several
I set library_import_dir to a path and tried uploading a directory of bam
files. After fixing the situation so galaxy could find samtools in that
subshell, I was able to upload links to the history. But moving things to
one directory did not appear to be terribly useful, so I tested what
Hi all,
I am trying to add a tool to galaxy. I want to collect some data from the
user. The data may contain hundreds of records, each with 7 or 8 fields.
Either user can upload a spreadsheet for that or I shall provide a template
file to fill up the data. When I checked there is no provision
Either user can upload a spreadsheet for that or I shall provide a template
file to fill up the data. When I checked there is no provision for uploading a
spreadsheet in my galaxy instance.
Two simple options:
(1) have users convert their spreadsheet data to csv/tabular format; Galaxy
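The conversion in option (1) is mechanical: spreadsheets export to CSV, and Galaxy's tabular format is the same rows with tab separators. A minimal sketch using only the Python standard library (the function name is an example, not part of Galaxy):

```python
import csv
import io

def csv_to_tabular(csv_text):
    """Convert CSV text (as exported from a spreadsheet) to tab-separated text."""
    out = io.StringIO()
    reader = csv.reader(io.StringIO(csv_text))
    writer = csv.writer(out, delimiter="\t", lineterminator="\n")
    for row in reader:  # re-emit each parsed row with tabs between fields
        writer.writerow(row)
    return out.getvalue()
```

Using the csv module rather than a plain replace(",", "\t") matters because quoted fields in the export may themselves contain commas.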
Thank you J.
I will try that.
Deepthi
On 4/26/12, Jeremy Goecks jeremy.goe...@emory.edu wrote:
Either user can upload a spreadsheet for that or I shall provide a
template file to fill up the data. When I checked there is no provision
for uploading a
spreadsheet in my galaxy instance.
Two
On our local installation of Galaxy, we get the following error when running
Macs:
Usage: macs -t tfile [-n name] [-g genomesize] [options]
Example: macs -t ChIP.bam -c Control.bam -f BAM -g h -n test -w --call-subpeaks
macs: error: no such option: --lambdaset
What is the latest version of
your suggestion for blastdbn and blastdbp sounds fine.
it's okay if a few of our users need to edit the metadata of the dbs in
their history.
thanks for asking and doing this.
On Thu, Apr 26, 2012 at 5:37 AM, Peter Cock p.j.a.c...@googlemail.com wrote:
Hi Edward,
I've started work on this in
Hello,
I am performing NGS data analysis using Galaxy cloudman. I am working with
very large FASTQ files that together approach the 1 TB limit of the EBS
volume when I perform my analysis. Is there a way to use multiple EBS
volumes with one EC2 instance of Galaxy cloudman. For example, could I
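One generic Linux approach, independent of anything CloudMan itself manages, is to pool several attached EBS volumes into a single logical volume with LVM. This is only a sketch under assumptions: the device names (/dev/xvdf, /dev/xvdg) and the mount point are examples, and the commands need root on the instance.

```shell
# Hypothetical sketch: span two attached EBS volumes with LVM.
# Device names and mount point are examples; check what the instance shows.
pvcreate /dev/xvdf /dev/xvdg                 # register both EBS devices with LVM
vgcreate galaxy_vg /dev/xvdf /dev/xvdg       # pool them into one volume group
lvcreate -l 100%FREE -n galaxy_lv galaxy_vg  # one logical volume spanning both
mkfs.ext4 /dev/galaxy_vg/galaxy_lv           # create a filesystem on it
mount /dev/galaxy_vg/galaxy_lv /mnt/galaxyData  # mount where the data should live
```

Note this sits outside CloudMan's own volume management, so CloudMan features such as snapshotting the data volume may not apply to a volume assembled this way.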