Hello Florent,

On Oct 4, 2011, at 11:42 PM, Florent Angly wrote:

> Hi Greg,
> 
> Thank you for your help. Your suggestion worked like a charm!
> 
> I was expecting to see an error because I did not create the ../shed_tool/ 
> folder that contains the tools installed from the shed, but I was happy to 
> see that it was automatically created.
> 
> I have a few additional comments and questions though:
> 1/ In the universe_wsgi.ini file, maybe the tool_config_file parameter could 
> be renamed to tool_config_files (plural) to indicate that it takes a list of 
> files.

Change set 6095:540ff06d44b4 makes both "tool_config_file" and 
"tool_config_files" behave the same way.


> 2/ It seems like the Tools Search box cannot find newly installed tools. 
> However, after I restart Galaxy, it works as intended again.

Fixed in change set 6095:540ff06d44b4.


> 3/ In the Grinder wrapper, I relied on installing the wrapper under a 
> specific folder: ./tools/ngs_simulation/grinder. The new wrapper installation 
> procedure installs the tools in the ../shed_tools/ folder and the admins can 
> choose under what category the tool is to be placed. This means that my 
> Grinder wrapper fails since it does not know where to find the scripts it 
> needs. Is there a way to get the directory where a tool is installed? Here is 
> an excerpt of the Grinder wrapper so you can better understand what I am 
> trying to do. This wrapper first runs Grinder and then moves its files (the 
> number of files is hard to determine ahead of time) to a place where Galaxy 
> will find them (see the wiki page 
> http://wiki.g2.bx.psu.edu/Admin/Tools/Multiple%20Output%20Files under section 
> "Number of Output datasets cannot be determined until tool run").
> 

I've looked at this a bit more now that I'm back in the lab and feel 
differently than I did when I initially looked at it - I was on vacation, after 
all ;)

Galaxy automatically adds the path to the tool config file (in your case, 
grinder.xml), no matter where it is located, so your $tool_dir setting is really 
not necessary.  To simplify the discussion, I'll remove the if blocks from your 
command, and add an interpreter attribute.  So we have something like:

  <command interpreter="python">
    stderr_wrapper.py grinder grinder_multiple_outputs.py $output_dir $base_name
  </command>

The problem with the above is that the interpreter attribute applies only to 
stderr_wrapper.py, not to grinder_multiple_outputs.py.  However, stderr_wrapper.py 
and grinder_multiple_outputs.py do not have to be separate scripts; they can be 
merged into one overall script that uses the same approach as the 
~/tools/gatk/gatk_wrapper.py script (among several other examples in the Galaxy 
distribution).  Doing this would allow for something like the following:

<command interpreter="python">
  grinder_wrapper.py -reference_file<etc> $output_dir $base_name
</command>

The code inside grinder_wrapper.py would parse the command line, and call the 
required grinder binary itself.  It could also include code like the following, 
eliminating the need for the separate stderr_wrapper.py.

    #set up stdout and stderr output options
    stdout = open_file_from_option( options.stdout, mode = 'wb' )
    stderr = open_file_from_option( options.stderr, mode = 'wb' )
    #if no stderr file is specified, we'll use our own
    if stderr is None:
        stderr = tempfile.NamedTemporaryFile( prefix="grinder-stderr-", 
dir=tmp_dir )
    
    proc = subprocess.Popen( args=cmd, stdout=stdout, stderr=stderr, shell=True, cwd=tmp_dir )
    return_code = proc.wait()
    
    if return_code:
        stderr_target = sys.stderr
    else:
        stderr_target = sys.stdout
    stderr.flush()
    stderr.seek(0)
    while True:
        chunk = stderr.read( CHUNK_SIZE )
        if chunk:
            stderr_target.write( chunk )
        else:
            break
    stderr.close()
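
To make this concrete, here is a rough sketch of what a merged grinder_wrapper.py 
could look like.  It is only an outline under assumptions I can't verify from 
here: the Grinder option names are taken from your <command> block, Grinder is 
assumed to write its output files as <base_name>* inside its output directory, 
and the renaming that Galaxy needs for the extra datasets (per the wiki page you 
linked) is left out.

#!/usr/bin/env python
# grinder_wrapper.py -- a minimal sketch, not the actual wrapper.
import glob, optparse, os, shutil, subprocess, sys, tempfile

def main():
    parser = optparse.OptionParser()
    parser.add_option( '--reference_file', dest='reference_file' )
    parser.add_option( '--output_dir', dest='output_dir' )
    parser.add_option( '--base_name', dest='base_name' )
    options, args = parser.parse_args()

    # Run Grinder in a scratch directory, capturing stderr so it can be routed
    # to stdout or stderr depending on the exit code (as in gatk_wrapper.py).
    tmp_dir = tempfile.mkdtemp( prefix='grinder-' )
    cmd = 'grinder -reference_file %s -output_dir %s -base_name %s' % \
        ( options.reference_file, tmp_dir, options.base_name )
    stderr = tempfile.NamedTemporaryFile( prefix='grinder-stderr-', dir=tmp_dir )
    proc = subprocess.Popen( args=cmd, stderr=stderr, shell=True, cwd=tmp_dir )
    return_code = proc.wait()
    stderr.flush()
    stderr.seek( 0 )
    target = sys.stderr if return_code else sys.stdout
    target.write( stderr.read() )
    stderr.close()

    # Move whatever Grinder produced to the directory Galaxy watches for
    # additional output datasets (the role of your grinder_multiple_outputs.py);
    # the glob pattern below is an assumption about Grinder's file naming.
    for path in glob.glob( os.path.join( tmp_dir, '%s*' % options.base_name ) ):
        shutil.move( path, os.path.join( options.output_dir, os.path.basename( path ) ) )

    sys.exit( return_code )

if __name__ == '__main__':
    main()

The <command> tag shown above would then invoke this single script via the 
interpreter="python" attribute.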




> 
> 
>> <command>
>>    #set $tool_dir = os.path.join( os.path.abspath($__root_dir__), 'tools', 'ngs_simulation' )
>>    #set $script1  = os.path.join( $tool_dir, 'stderr_wrapper.py' )
>>    #set $script2  = os.path.join( $tool_dir, 'grinder_multiple_outputs.py' )
>> 
>>    $script1
>>      grinder
>>      #if $reference_file.specify == "builtin":
>>        -reference_file   ${ filter( lambda x: str( x[0] ) == str( $reference_file.value ), $__app__.tool_data_tables[ 'all_fasta' ].get_fields() )[0][-1] }
>>      #else if $reference_file.specify == "uploaded":
>>        -reference_file   $reference_file.value
>>      #end if
>>      [...]
>>      #if str($homopolymer_dist):
>>        -homopolymer_dist $homopolymer_dist
>>      #end if
>> 
>>      #set $output_dir = $__new_file_path__
>>      -output_dir         $output_dir
>> 
>>      #set $base_name  = $output.id
>>      -base_name          $base_name
>>    ;
>> 
>>    $script2 $output_dir $base_name
>> 
>> </command>
> 
> 
> 
> On 04/10/11 22:55, Greg Von Kuster wrote:
>> Hello Florent,
>> 
>> Sorry for the confusion on this - we are preparing a new Galaxy 
>> distribution, and the tool shed wiki has been written in preparation for it. 
>>  The new distribution will be available fairly soon, and the Galaxy News 
>> Brief will include information about these new tool shed features.  In any 
>> case, you have already discovered that you can use it if you update your 
>> Galaxy instance to the latest Galaxy development repository ( Galaxy central 
>> ).
>> 
>> The problem you see is most likely caused by your not having configured an 
>> additional tool_config_file setting in your universe_wsgi.ini.  Look for 
>> something like the following in your latest version of the 
>> universe_wsgi.ini.sample that you got when you updated from Galaxy central.
>> 
>> # Locally installed tools and tools installed from tool sheds
>> tool_config_file = tool_conf.xml,shed_tool_conf.xml
>> 
>> If you add an additional file name like shed_tool_conf.xml, you should 
>> not have a problem installing from a tool shed.  I'll have a fix for the bug 
>> you've discovered shortly, but making this change will fix the behavior 
>> until then.
>> 
>> Let me know if you bump into any additional problems.
>> 
>> Thanks for finding this!
>> 
>> Greg Von Kuster
>> 
>> On Oct 4, 2011, at 2:53 AM, Florent Angly wrote:
>> 
>>> Hi all,
>>> 
>>> I tried the latest stable version of Galaxy: 
>>> http://wiki.g2.bx.psu.edu/News%20Briefs/2011_08_30. This page has links to 
>>> how to use the new tool shed including how to automatically deploy tools 
>>> from the shed in a local Galaxy server.
>>> The documentation mentioned some tool shed options available from the 
>>> admin section of Galaxy but I could not locate these options in my instance 
>>> of galaxy. So my question is: Can one only take advantage of the tool 
>>> deployment from the shed in the development version of Galaxy? If so, I 
>>> think the Tool shed wiki should be more clear about this.
>>> 
>>> Then I tried the latest development version of Galaxy and could locate the 
>>> tool shed deployment options. I attempted to install the Grinder wrapper 
>>> (http://toolshed.g2.bx.psu.edu/repository/manage_repository?sort=name&webapp=community&id=3d8312720a69a558&f-deleted=False&show_item_checkboxes=false&async=false&operation=view_or_manage_repository&f-free-text-search=grinder&page=1)
>>>  but ran into an error that I am pasting below:
>>>> URL: 
>>>> http://localhost:8080/admin/install_tool_shed_repository?tool_shed_url=toolshed.g2.bx.psu.edu&name=grinder&description=Genomic,%20metagenomic%20and%20amplicon%20read%20simulator&repository_clone_url=http://fan...@toolshed.g2.bx.psu.edu/repos/fangly/grinder&changeset_revision=5ba7c9ac056a
>>>> File 
>>>> '/home/floflooo_mint/Software/galaxy-central/eggs/WebError-0.8a-py2.6.egg/weberror/evalexception/middleware.py',
>>>>  line 364 in respond
>>>>  app_iter = self.application(environ, detect_start_response)
>>>> File 
>>>> '/home/floflooo_mint/Software/galaxy-central/eggs/Paste-1.6-py2.6.egg/paste/debug/prints.py',
>>>>  line 98 in __call__
>>>>  environ, self.app)
>>>> File 
>>>> '/home/floflooo_mint/Software/galaxy-central/eggs/Paste-1.6-py2.6.egg/paste/wsgilib.py',
>>>>  line 539 in intercept_output
>>>>  app_iter = application(environ, replacement_start_response)
>>>> File 
>>>> '/home/floflooo_mint/Software/galaxy-central/eggs/Paste-1.6-py2.6.egg/paste/recursive.py',
>>>>  line 80 in __call__
>>>>  return self.application(environ, start_response)
>>>> File 
>>>> '/home/floflooo_mint/Software/galaxy-central/eggs/Paste-1.6-py2.6.egg/paste/httpexceptions.py',
>>>>  line 632 in __call__
>>>>  return self.application(environ, start_response)
>>>> File 
>>>> '/home/floflooo_mint/Software/galaxy-central/lib/galaxy/web/framework/base.py',
>>>>  line 160 in __call__
>>>>  body = method( trans, **kwargs )
>>>> File 
>>>> '/home/floflooo_mint/Software/galaxy-central/lib/galaxy/web/framework/__init__.py',
>>>>  line 173 in decorator
>>>>  return func( self, trans, *args, **kwargs )
>>>> File 
>>>> '/home/floflooo_mint/Software/galaxy-central/lib/galaxy/web/controllers/admin.py',
>>>>  line 805 in install_tool_shed_repository
>>>>  shed_tool_conf = trans.app.toolbox.shed_tool_confs.keys()[0].lstrip( './' )
>>>> IndexError: list index out of range
>>> I get the same problem with other wrappers such as FastQC. Am I doing 
>>> something wrong?
>>> 
>>> Thanks,
>>> 
>>> Florent
>>> 
>>> 
>> Greg Von Kuster
>> Galaxy Development Team
>> g...@bx.psu.edu
>> 
>> 
>> 
> 

Greg Von Kuster
Galaxy Development Team
g...@bx.psu.edu



___________________________________________________________
Please keep all replies on the list by using "reply all"
in your mail client.  To manage your subscriptions to this
and other Galaxy lists, please use the interface at:

  http://lists.bx.psu.edu/
