Excellent, it was the trailing dot that I was missing!
Thanks so much for the help. I will most certainly be using Galaxy again;
it's been very useful so far.
karl
You're almost there, the command should be executed from your local machine
(home directory is fine) and it should look as follows:
scp -i <your key file> ubuntu@<instance address>:/mnt/galaxyData/files/000/dataset_11.dat .
(note the 'ubuntu@' before the instance address and a trailing dot (.) - the dot
means your current directory on your current machine)
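To make the pieces of that command concrete, here is a minimal sketch; the key file name and the IP address below are hypothetical placeholders and must be replaced with your own instance's values:

```shell
# Sketch only: KEY and HOST are hypothetical placeholders -- use your own.
KEY="$HOME/galaxy_key.pem"   # the key pair chosen when launching the instance
HOST="203.0.113.10"          # the instance's public IP address
# The trailing dot means "copy into the current directory on the local machine".
CMD="scp -i $KEY ubuntu@$HOST:/mnt/galaxyData/files/000/dataset_11.dat ."
echo "$CMD"   # once KEY and HOST are real, run the command instead of echoing it
```

Run from your local machine (not the instance), this copies the dataset file into whatever directory you are currently in.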
Hi Karl,
Hmm, not having Galaxy accessible is definitely not a step in the right
direction.
Being signed into command line is not an issue; something else must have
gone wrong. To start, please take a look at the (bottom of) galaxy log file
(and email the relevant part if you don't see how to fix it).
Hello Enis,
Thanks for the quick response and suggestions. I actually did have a job
running while I tried to download a file the first time, that's the first
time it gave the error message. But the jobs have long since finished and
it's still giving the error message.
I've been able to edit t
Hi Karl,
As you see from the error message, you seem to be getting this error because
the machine is running out of memory. This can in part be caused by a
configuration option that might be set in Galaxy's universe_wsgi.ini file
(see below).
Did you have any jobs running while trying to download the file?
Hello,
I'm trying to download the library files I processed on my Galaxy cloud
instance, but I'm getting an error. At the top (on the right panel) it
says "Server Error", then lists the URL where the data should be, followed by:
Module paste.exceptions.errormiddleware:143 in __call__